NNSquad - Network Neutrality Squad


[ NNSquad ] Re: Monitoring user logins via unsecure Wi-Fi networks


I must admit that when it comes to the attitudes of the EU (and EU
member countries) toward these issues, it's hard not to sense a sort
of "schizophrenic" sensibility.

On one hand, data retention limits are imposed on companies like
Google, and a "big show" is made of such "clamping down."  On the
other hand, there's the EU push to *require* even vaster data
retention for longer periods of time by ISPs and other telecom 
firms -- e.g., the Data Retention Directive -- which may even require 
hotels providing guests with Wi-Fi to register as ISPs 
"in order to monitor crime and terrorism" by falling into the purview 
of the Directive ( http://bit.ly/a6dJzy [PC World 10/14/10] ).

As for "gslite" -- it's obvious that most of the persons who express
"difficulties" (aka "suspicions") over its behavior (and associated
accidental data collection by Google) have never had to debug an
Internet connection at the packet level.

The ability to see the payload data of connections is invaluable in the
legitimate analysis of network problems -- the "tcpdump" program included
with (or available for) every copy of Linux on the planet explicitly
includes such a function (the "-w" raw packet dump option).  

The enhancement of only saving unencrypted packets is entirely
reasonable, since in most situations encrypted packets cannot be
usefully interpreted for debugging purposes.
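To see how mechanically simple that distinction is in capture code, consider the sketch below. The function names and the in-memory "sink" are purely illustrative assumptions, not actual gslite or tcpdump code; the only factual input is that an 802.11 MAC header begins with a two-byte Frame Control field whose "Protected Frame" flag (bit 6 of the second byte) is set when the payload is encrypted.

```python
# Hypothetical sketch: deciding whether to keep a captured 802.11
# frame by testing a single header bit.  No parsing of the payload
# itself is required.

PROTECTED_FRAME = 0x40  # bit 6 of the second frame-control byte

def is_encrypted(frame: bytes) -> bool:
    """True if the frame's Protected Frame flag is set."""
    if len(frame) < 2:
        raise ValueError("truncated 802.11 frame")
    return bool(frame[1] & PROTECTED_FRAME)

def maybe_save(frame: bytes, sink: list) -> None:
    """Keep the frame only when its payload is unencrypted."""
    if not is_encrypted(frame):
        sink.append(frame)

# Two minimal frames: a plain Data frame, and one with the
# Protected Frame bit set.
open_frame = bytes([0x08, 0x00]) + b"plaintext payload"
wep_frame  = bytes([0x08, 0x40]) + b"ciphertext"

captured: list = []
maybe_save(open_frame, captured)
maybe_save(wep_frame, captured)
print(len(captured))  # → 1 (only the unencrypted frame is kept)
```

The point is that "checking whether the payload is encrypted" amounts to a one-line header test, exactly the kind of option that can be toggled in capture code and then forgotten.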

For anyone with an engineering background who has actually worked with
such tools, it's very easy to see how such an option could
accidentally be left enabled and be widely deployed.  Should Google's
procedures have caught this error?  Yes.  But the hue and cry over
Google's harmless mistake -- no abuse of the data was involved in any
way, while genuine bad guys remain free to collect all of the
unsecured Wi-Fi data they wish for actual exploits -- strikes me as
exploitative behavior by many of the accusers involved.

--Lauren--
Lauren Weinstein (lauren@vortex.com)
http://www.vortex.com/lauren
Tel: +1 (818) 225-2800
Co-Founder, PFIR (People For Internet Responsibility): http://www.pfir.org
Co-Founder, NNSquad (Network Neutrality Squad): http://www.nnsquad.org
Founder, GCTIP (Global Coalition for Transparent Internet Performance): 
   http://www.gctip.org
Founder, PRIVACY Forum: http://www.vortex.com
Member, ACM Committee on Computers and Public Policy
Lauren's Blog: http://lauren.vortex.com
Twitter: https://twitter.com/laurenweinstein
Google Buzz: http://bit.ly/lauren-buzz

  - - - -

On 10/25 19:38, Andrea Glorioso wrote:
> On 10/25/2010 6:14 PM, Lauren Weinstein wrote:
> >
> > Monitoring user logins via unsecure Wi-Fi networks
> >
> > http://bit.ly/cVyYrK   (Techcrunch - "An AOL Company")
> >
> > The existence of the exploit ("Firesheep") described at
> > http://bit.ly/d7nPNH  (Code Butler) should surprise nobody.
> >
> > The browser plugin "workaround" described at the Techcrunch/AOL link
...

> 
> I do not work for a Data Protection Authority (what in some countries is
> called "privacy commissioner") so I certainly do not speak on behalf of
> any of them. But I do work for the European Commission, so I want to
> make clear that what follows is NOT the official position or opinion of
> the European Commission.
> 
> I have some difficulties reconciling the notion that Google engaged in
> "accidental" payload data collection with the document "Source Code
> Analysis of gstumbler - prepared for Google and Perkins Coie by STROZ
> FRIEDBERG" of 3 June 2010
> (http://www.google.com/en//googleblogs/pdfs/friedberg_sourcecode_analysis_060910.pdf).
> 
> 
> In particular, the conclusions of the report (which, to the best of my
> knowledge, has not been challenged by Google or others - but feel free
> to update me if I'm wrong) state:
> 
> "Gslite is an executable program that captures, parses, and writes to
> disk 802.11 wireless frame data. In particular, it parses all frame
> header data and associates it with its GPS coordinates for easy storage
> and use in mapping network locations. The program does not analyze or
> parse the body of Data frames, which contain user content. The data in
> the Data frame body passes through memory and is written to disk in
> unparsed format if the frame is sent over an unencrypted wireless
> network, and is discarded if the frame is sent over an encrypted
> network".
> 
> Admittedly, the program itself does not "analyse or parse" the payload,
> but it does discard the encrypted payload while writing the
> non-encrypted payload to secondary storage.
> 
> What is the reason for this different treatment? And more to the point,
> how can someone write a program that *accidentally* verifies whether the
> payload is encrypted or not and *accidentally* saves such payload on
> disk if it is not encrypted?
> 
> I would honestly like to hear opinions on these questions, which have
> been puzzling me for quite some time.
> 
> For what concerns Firesheep and similar tools, in the EU data protection
> law broadly regulates the behaviour of "personal data processing", not
> the technology that could be used for such processing. But the existence
> of the tool might be worth signalling to Data Protection Authorities.
> 
> Last, not least, we can all agree that end-to-end encryption should be
> the default for all communications. But suggesting, as some did, that
> those who do not use end-to-end encryption should see their personal
> data collected and processed is akin to saying that since you do not
> cover your face with a scarf when walking on the street, there is no
> problem at all with companies building a database of your movements
> using face-recognition software.
> 
> Perhaps we have a different perspective in the EU - where the concept of
> "reasonable expectation of privacy" is not as widely used as in the USA
> - but on this side of the Atlantic, the fact that something can be seen
> does not mean that it can be processed (in the sense of "personal data
> processing" ex Data Protection Directive).
> 
> Best,
> 
> Andrea