NNSquad - Network Neutrality Squad


[ NNSquad ] Re: Speculation, how AT&T can implement "copyright filtering" without wiretapping/dpi...

I think you're exaggerating a bit, Kevin. My slide from Mininova shows a list of torrents that all carry Microsoft Office and a key generator. There is nothing ambiguous about these torrents: they're not fair use, they're not mashups, and they're not Digital Culture; they're simply theft. What's needed to shut down these illegal transactions is a request from the copyright owner to the ISP that's carrying the traffic to shut it off. Due process doesn't come into it unless somebody is prosecuted.

There is a risk of unfair shut-offs, but it's very, very small and can be dealt with after the fact in some reasonable way.

I agree that the system involved is non-neutral, but that doesn't mean it's bad a priori. I imagine most copyright thieves would rather have their streams blocked than go to jail or pay a fine, wouldn't you?


Kevin McArthur wrote:
There's nothing inherently offensive in this methodology until you realize that it bypasses due process of law.

It is not possible for a piece of software to decide the difference between copyright violation and fair use. While many uses are clearly infringing, there currently exists no system that can tell the difference between legitimate fair use (like appropriation art or criticism) and true copyright infringement with any level of accuracy. Some questions arise:

Will the software, network or content owner be liable for a false claim?
Will a user falsely accused of downloading be able to make a libel claim?
Will the artist be able to make a claim for censorship, undue preference or collusion between the network owner and big media companies?
If the ISP _can_ police the network, do they not then have a _duty_ to police it -- and do they not waive their special intermediary immunity, instead opting for the legal environment of broadcasters and publishers?

These proposals bring more questions than answers, and I'm frankly surprised that these ISPs would even think about opening the Pandora's box that is intermediary liability.

Even the very basic idea that one could take a currently-downloading torrent and unilaterally decide it is infringing is ludicrous; you're talking about acting on an allegation before proving it in a court of law. From a legal perspective, it's shoot first and ask questions later.

And that's not good enough.


Richard Bennett wrote:
I presented this technique at the NN2008 symposium yesterday. I showed a screen-grab from Mininova showing pirated Microsoft Office, and the peer list from an Azureus leecher. It's pretty easy to connect the dots from Microsoft's monitoring of the tracker to action by an ISP in response to electronic requests from the copyright owner. One technique that comes to mind for stopping piracy transactions is Reset Spoofing, of course.
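For readers unfamiliar with Reset Spoofing: the injecting device forges a TCP segment with the RST flag set, with addresses and ports copied from the targeted connection, so each peer believes the other has aborted. A minimal sketch of the forged segment, assuming only Python's standard library; the function name is mine, real injection would also need a spoofed IP header and an in-window sequence number:

```python
import struct

def forge_rst_header(src_port, dst_port, seq):
    """Build a bare 20-byte TCP header with only the RST flag set.

    A real reset-spoofing box would wrap this in an IP header bearing
    one peer's spoofed address and inject it toward the other peer; the
    sequence number must land inside the receiver's window for the
    reset to be honored.
    """
    data_offset = 5 << 4   # header length: 5 x 32-bit words, no options
    flags = 0x04           # RST bit only
    window = 0
    checksum = 0           # left zero here; injection code would compute it
    urgent = 0
    return struct.pack("!HHLLBBHHH",
                       src_port, dst_port,
                       seq, 0,             # ack number unused without ACK flag
                       data_offset, flags,
                       window, checksum, urgent)

hdr = forge_rst_header(6881, 51413, 123456789)
```

The point is only that the forged segment is trivially cheap to generate, which is what makes the technique attractive to an ISP.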

I showed the technique to clarify that enforcement of copyright doesn't involve Deep Packet Inspection or anything that scary.

Is there any reason that such an automated system should not be used, or does Net Neutrality now connote a license to steal?


Nick Weaver wrote:
I've done some speculation on how AT&T might actually implement their
proposed copyright-filtering mechanism, without actually having to do
deep-packet inspection or even providing new hardware. After all, if
their motive is to save money, they will select a mechanism which
doesn't cost money.

The idea is to rely on someone else (MPAA or an affiliate) to spider
the torrents and identify participants, and once the graph of
participants is generated, to dynamically block just that graph using
router ACLs, which would allow the MPAA to play Whack-a-mole on
individual torrents.  This would be very different in practice
compared with either deep packet inspection or traffic analysis.
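The spider-and-block scheme can be sketched in a few lines. This is a hypothetical illustration of the idea, not AT&T's actual mechanism: it assumes the rights-holder's spider has already harvested the swarm's peer IPs from the tracker, and it emits Cisco-style extended ACL lines of the kind a router could apply. The function name and ACL number are my own choices:

```python
def acl_rules_from_peers(peer_ips, acl_id=101):
    """Render a tracker-derived peer list as Cisco-style deny rules.

    peer_ips: addresses harvested by the rights-holder's tracker spider
    (hypothetical input -- the collection step is out of scope here).
    """
    rules = [f"access-list {acl_id} deny ip host {ip} any"
             for ip in sorted(peer_ips)]
    # Trailing default permit, so only the flagged swarm is blocked.
    rules.append(f"access-list {acl_id} permit ip any any")
    return rules

rules = acl_rules_from_peers({"198.51.100.7", "203.0.113.42"})
```

Because swarm membership churns, the ACL would have to be regenerated continuously per torrent, which is exactly the Whack-a-mole quality described above.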

I think such speculation is useful because if AT&T really does follow
through on their stated goals, we should get ahead of the curve and
understand what this might look like in practice, and how to detect
the difference between possible techniques (DPI vs generic traffic
analysis vs spider-blocking).

The solution space is actually pretty limited.

More details/thoughts on my blog: