NNSquad - Network Neutrality Squad


[ NNSquad ] Re: Google, Net Neutrality, and Fairness


One point we keep forgetting is that the vast bulk of the first/last mile
capacity is off the table, making any claim of neutrality suspect. I use
"first/last mile" here because broadband is a pipe model with first and last
miles, even though we should be thinking of the first square mile instead.

Too bad QoS creates a fiduciary responsibility to limit capacity in order
to generate revenue from selling a premium service. Why can't video just
deal with the capacity available? After all, if there isn't enough capacity,
why should all other packets be thrown on the floor so that all those cute
kittens can be seen? Given how many bits are associated with video, how much
room is left over for my medical information? And given how easy it is to
buffer video, it seems the least important candidate for priority.
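The buffering point can be illustrated with a toy sketch (hypothetical numbers, not a claim about any real player): a video client that buffers a few chunks ahead rides out arrival jitter that would otherwise argue for network-level priority.

```python
# Toy model: chunks nominally arrive once per second, but one arrives
# 3 seconds late. A small playout buffer absorbs the delay entirely.

def playback_stalls(arrival_times, chunk_duration, buffer_chunks):
    """Count playback stalls given chunk arrival times (in seconds).

    Playback starts once `buffer_chunks` chunks have arrived; each chunk
    then plays for `chunk_duration` seconds. A stall occurs whenever the
    next chunk hasn't arrived by the time it is needed.
    """
    if len(arrival_times) <= buffer_chunks:
        return 0
    play_clock = arrival_times[buffer_chunks - 1]  # playback start time
    stalls = 0
    for i, arrived in enumerate(arrival_times):
        due = play_clock + i * chunk_duration      # when chunk i is needed
        if arrived > due:
            stalls += 1
            play_clock += arrived - due            # re-buffer after stall
    return stalls

jittery = [0, 1, 2, 6, 7, 8, 9, 10]  # chunk 3 is three seconds late
print(playback_stalls(jittery, chunk_duration=1, buffer_chunks=1))  # -> 1
print(playback_stalls(jittery, chunk_duration=1, buffer_chunks=4))  # -> 0
```

With essentially no buffer the late chunk causes a stall; with a four-chunk buffer the same arrival pattern plays smoothly, with no prioritization anywhere in the network.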

QoS is a very seductive idea because it allows us to indulge our most naïve
notions of network management. It has a pervasive corrupting influence while
diverting our attention from the real issue -- a dysfunctional marketplace
that requires limiting capacity to create value (http://rmf.vc/?n=as).


-----Original Message-----
From: nnsquad-bounces+nnsquad=bobf.frankston.com@nnsquad.org
[mailto:nnsquad-bounces+nnsquad=bobf.frankston.com@nnsquad.org] On Behalf Of
Lauren Weinstein
Sent: Sunday, August 08, 2010 21:54
To: nnsquad@nnsquad.org
Subject: [ NNSquad ] Google, Net Neutrality, and Fairness



                       Google, Net Neutrality, and Fairness

                   http://lauren.vortex.com/archive/000740.html


Greetings.  In "Evil or misunderstood? Google and net neutrality"
posted earlier today ( http://bit.ly/c8BTWA [Quoderat] ), David Megginson
speculates that a key issue in the now widely-known (but little understood)
recent "secret" meetings between Google and Verizon might have been
discussions regarding giving particular types of content differing
priorities -- what I'd call Quality of Service
(QoS) -- rather than negotiations about giving priority to any particular
content providers per se.  David also argues that content-based priority
(e.g., giving video priority over e-mail) may make technical sense but is
still a bad idea.

I have mixed feelings about all this.  Theoretically, given sufficient
bandwidth and well-behaved applications, QoS shouldn't really ever have to
be an issue.  After all, QoS is fundamentally a mechanism to manage
scarcity.  In the presence of abundance, QoS basically should be
unnecessary.

Of course in the real world we have bandwidth limitations, bottlenecks, and
the reality that not all Internet-based apps are necessarily well-behaved in
their data usage -- so the concept of QoS probably needs to be on the table
for discussion at least.

But it also seems clear that QoS (especially relatively straightforward
implementations) has a number of potential problems -- some of which David
discusses.

There are fundamentally two ways to do QoS: either you analyze the traffic
itself (based on actual content, traffic patterns, or both, perhaps using
DPI -- Deep Packet Inspection), or you base your QoS decisions on flags
generated by the content providers themselves.
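To make the second approach concrete: in IP networks, sender-generated flags typically mean setting the DSCP bits in the packet header. A minimal sketch (assuming a Unix-like host; note that any router along the path is free to ignore or rewrite these bits, which is precisely how self-tagging can be gamed):

```python
import socket

# DSCP "Expedited Forwarding" (EF, decimal 46) is the code point commonly
# requested for latency-sensitive traffic such as interactive voice/video.
# The TOS byte carries DSCP in its upper six bits, hence the shift by 2.
DSCP_EF = 46
TOS_VALUE = DSCP_EF << 2  # 184 (0xB8)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_VALUE)

# Datagrams sent on this socket now carry the EF marking -- but whether
# any network honors it is purely a policy decision.
print(sock.getsockopt(socket.IPPROTO_IP, socket.IP_TOS))  # -> 184
sock.close()
```

The ease of setting this one socket option is the whole problem: nothing stops every application from claiming EF, so the marking is only meaningful where some policy enforces who may use it.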

What makes this all so complicated is that we must assume the presence of
"bad actors" on the Net who will attempt to game any QoS system
inappropriately.

Also, moves toward pervasive, ubiquitous encryption by default -- a concept
of which I'm a strong proponent -- could significantly interfere with
"automatically determined" QoS, putting more reliance on accurate tagging of
data by content providers.  Even traffic analysis can be fooled via various
techniques.
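One of the simplest such techniques is padding: if every transmitted unit is the same size (and the contents are encrypted), a classifier keying on packet-length patterns sees nothing distinctive. A toy sketch of the framing, not any particular protocol:

```python
import struct

def pad_to_uniform(payload: bytes, block: int = 1024) -> bytes:
    """Frame a payload so every transmitted unit is exactly `block` bytes.

    A 4-byte length prefix records the real size; zero bytes fill the
    remainder. All units on the wire are then indistinguishable by size.
    """
    framed = struct.pack("!I", len(payload)) + payload
    if len(framed) > block:
        raise ValueError("payload too large for one block")
    return framed + b"\x00" * (block - len(framed))

def unpad(unit: bytes) -> bytes:
    """Recover the original payload from a padded unit."""
    (n,) = struct.unpack("!I", unit[:4])
    return unit[4:4 + n]

msg = b"short message"
unit = pad_to_uniform(msg)
print(len(unit), unpad(unit) == msg)  # -> 1024 True
```

The cost, of course, is wasted capacity -- which is one more reason automated traffic classification is an arms race rather than a stable foundation for QoS.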

The upshot of all this is -- gosh darn it! -- these issues are anything but
trivial, and anyone who tells you otherwise is either misinformed or being
purposely misleading.

Small wonder then, that trying to discuss these issues publicly, with all
the money and emotions tangled up in the topic, tends to have a comfort
level similar to that of sticking your hand down deep into the blades of a
running blender.

While I don't presume to offer any magic-wand solutions to these dilemmas,
issues of prioritization among Internet content types are ignored at our
peril.

If we do end up moving toward QoS systems of some sort, the real challenge
will be finding ways to implement such mechanisms that do not interfere with
users' ability to make full use of encryption, are as minimally vulnerable
to unfair manipulation as possible, and that do not create distortions
resulting in anticompetitive or other unfair behaviors.

A tall order.

And that's the truth.

--Lauren--
Lauren Weinstein (lauren@vortex.com)
http://www.vortex.com/lauren
Tel: +1 (818) 225-2800
Co-Founder, PFIR (People For Internet Responsibility): http://www.pfir.org
Co-Founder, NNSquad (Network Neutrality Squad): http://www.nnsquad.org
Founder, GCTIP (Global Coalition for Transparent Internet Performance):
   http://www.gctip.org
Founder, PRIVACY Forum: http://www.vortex.com
Member, ACM Committee on Computers and Public Policy
Lauren's Blog: http://lauren.vortex.com
Twitter: https://twitter.com/laurenweinstein
Google Buzz: http://bit.ly/lauren-buzz