A New Proposal For Policing Copyrighted Material On the Internet

Friday, March 14, 2014 - 01:19 PM

(Craig Damlo/flickr)

Yesterday, the House Judiciary Committee held a hearing on copyright reform - one of many since the SOPA bill met strong public resistance two years ago. One of the proposals outlined yesterday was a modification of the current "safe harbors" described in the Digital Millennium Copyright Act (DMCA).

To understand the potentially devastating effect of the changes, you first need to understand the "notice and takedown" provisions of the DMCA. The way it works now: when a piece of copyrighted material is uploaded to, say, a web forum, the site host is shielded from a lawsuit as long as it didn't know the material was there and removes it promptly once notified. The DMCA was specifically designed to be business-friendly and to limit a company's liability.

As Techdirt reports, a new proposal would replace "notice and takedown" with "notice and staydown." Under that change, once a piece of infringing content is removed, the host would be legally responsible for preventing it from ever being uploaded again. As Techdirt notes, that's basically impossible.

This may sound good if you're not very knowledgeable about (a) technology and (b) copyright law. But if you understand either, or both, you quickly realize this is a really, really stupid solution that won't work and will have all sorts of dangerous unintended consequences that harm both creativity and the wider internet itself. 

Policing content like this would basically require websites either to stop allowing user uploads entirely (and user-uploaded content is how nearly every site we use regularly works these days - think Instagram, Twitter, Facebook) or to build a massive budget into every site to screen each new upload against everything that has already been taken down.
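To get a feel for why that screening is harder than it sounds, here's a minimal sketch of the cheapest possible "staydown" filter, written in Python with hypothetical function names (record_takedown and is_blocked are my own illustration, not any real site's code): it remembers the hash of everything taken down and rejects byte-for-byte identical re-uploads - and catches nothing else.

```python
import hashlib

# Hypothetical "staydown" filter: remember the hashes of files that were
# removed after a takedown notice, and reject any upload whose bytes hash
# to the same value.
removed_hashes = set()

def record_takedown(file_bytes: bytes) -> None:
    """Store the hash of content removed after a DMCA notice."""
    removed_hashes.add(hashlib.sha256(file_bytes).hexdigest())

def is_blocked(upload_bytes: bytes) -> bool:
    """Block only byte-for-byte identical re-uploads."""
    return hashlib.sha256(upload_bytes).hexdigest() in removed_hashes

# The weakness: any re-encode, crop, or single changed byte produces a new
# hash, so the "same" song or video sails right past this check.
original = b"...copyrighted file bytes..."
record_takedown(original)
print(is_blocked(original))            # True  - exact copy is caught
print(is_blocked(original + b"\x00"))  # False - trivially altered copy is not
```

Anything smarter than this - actual fuzzy matching of audio and video - is exactly the kind of expensive infrastructure only the biggest platforms can afford.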

I understand why content owners want to do this - it would be nice to send a takedown once and never have to worry about the material ending up on that site again. But technically, that's just impossible. Websites like SoundCloud and YouTube already have algorithms that detect copyrighted audio and video in order to automate the prevention of infringement, but those detection algorithms are easily fooled. YouTubers often flip a movie's image horizontally to fool YouTube's system, and I have heard of people adding flutter to samples to keep SoundCloud's audio detection from flagging them. Just the other day, a podcast I like played a copyrighted song and the hosts simply sang over it to make sure they'd beat the detection system.
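To give a rough sense of why a simple horizontal flip can work, here's a toy Python sketch of perceptual ("average") hashing - a stand-in I'm using for illustration, not YouTube's actual Content ID system - where mirroring a frame produces a fingerprint that looks completely different to the matcher.

```python
# Toy perceptual ("average") hash, the family of fingerprinting that
# content-matching systems build on: shrink a frame to an 8x8 grayscale
# grid, then set each bit to 1 if that pixel is brighter than the average.
def average_hash(pixels):
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    """Count how many fingerprint bits differ."""
    return sum(x != y for x, y in zip(a, b))

# Toy 8x8 "frame": a simple left-to-right brightness gradient.
frame = [[col * 32 for col in range(8)] for _ in range(8)]

# The trick: mirror the frame horizontally before uploading.
mirrored = [list(reversed(row)) for row in frame]

original_hash = average_hash(frame)
mirrored_hash = average_hash(mirrored)

print(hamming(original_hash, original_hash))  # 0  - exact copy matches
print(hamming(original_hash, mirrored_hash))  # 64 - mirrored copy shares no bits
```

A real system can of course also check the flipped orientation, pitch-shifted audio, and so on - but every countermeasure invites another cheap workaround, which is why "make sure it never comes back" is not something a law can simply demand.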

Techdirt describes this as SOPA 2.0, which I think might be a little strong. SOPA had a number of pretty vicious tactics baked into it to keep websites from infringing. But this proposal is definitely of a piece with SOPA - a totally unworkable solution from a technical standpoint that puts the onus on website owners and essentially breaks the sharing components of the internet.

The likelihood of this making it into law is unclear. As I mentioned, SOPA received quite a bit of popular resistance. But this move telegraphs that the content owners, after all these years, still haven't come up with a better way of policing content. And their preferred method is just incompatible with the internet as it exists today.
