NNSquad - Network Neutrality Squad


[ NNSquad ] Re: Monitoring user logins via unsecure Wi-Fi networks

On 10/25/2010 6:14 PM, Lauren Weinstein wrote:
> Monitoring user logins via unsecure Wi-Fi networks
> http://bit.ly/cVyYrK   (Techcrunch - "An AOL Company")
> The existence of the exploit ("Firesheep") described at
> http://bit.ly/d7nPNH  (Code Butler) should surprise nobody.
> The browser plugin "workaround" described at the Techcrunch/AOL link
> is useful as a transitional tool in the absence of integral crypto
> protection, but what percentage of vulnerable users will be using it
> in the long run?
> Unsecured Wi-Fi is ... unsecure.  Unless end-to-end connections (wired
> and wireless) are protected by strong encryption (and that does not
> necessarily mean SSL/TLS within the current certificate-based PKI
> with all its problems) users will be increasingly vulnerable.
> I'm now waiting for the privacy commissioners and other parties who
> have had such a field day crucifying Google over *accidental* Wi-Fi
> payload data collection to take a similar hard line against Firesheep
> and the multitude of other purpose-built Wi-Fi payload monitoring
> tools available for all manner of applications both fair and foul.

I do not work for a Data Protection Authority (what in some countries is
called "privacy commissioner") so I certainly do not speak on behalf of
any of them. But I do work for the European Commission, so I want to
make clear that what follows is NOT the official position or opinion of
the European Commission.

I have some difficulties reconciling the notion that Google engaged in
"accidental" payload data collection with the document "Source Code
Analysis of gstumbler - prepared for Google and Perkins Coie by STROZ
FRIEDBGERG" of 3 June 2010

In particular, the conclusions of the report (which, to the best of my
knowledge, has not been challenged by Google or others - but feel free
to update me if I'm wrong) state:

"Gslite is an executable program that captures, parses, and writes to
disk 802.11 wireless frame data. In particular, it parses all frame
header data and associates it with its GPS coordinates for easy storage
and use in mapping network locations. The program does not analyze or
parse the body of Data frames, which contain user content. The data in
the Data frame body passes through memory and is written to disk in
unparsed format if the frame is sent over an unencrypted wireless
network, and is discarded if the frame is sent over an encrypted
network."

Admittedly, the program itself does not "analyse or parse" the payload,
but it does discard the encrypted payload while writing the unencrypted
payload to secondary storage.

What is the reason for this different treatment? And more to the point,
how can someone write a program that *accidentally* verifies whether the
payload is encrypted or not and *accidentally* saves such payload on
disk if it is not encrypted?
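To make the point concrete: in an 802.11 frame the encryption status is
signalled by the "Protected Frame" bit in the Frame Control field, so any
program that treats encrypted and unencrypted frames differently must
deliberately test that bit and branch on it. The following is a
hypothetical sketch of such a filter (it is NOT Google's gslite code; the
function names, the toy frames, and the simplified 24-byte header are my
own assumptions for illustration only):

```python
# Hypothetical sketch: storing only unencrypted 802.11 payloads requires
# an explicit, deliberate check of the Protected Frame bit -- it cannot
# happen as a side effect of merely capturing frames.

def is_protected(frame: bytes) -> bool:
    """True if the Protected Frame bit (mask 0x40 in the second byte
    of the 2-byte Frame Control field) is set."""
    return bool(frame[1] & 0x40)

def capture(frames, sink):
    """Append the body of each *unencrypted* frame to `sink`;
    encrypted bodies are deliberately discarded."""
    HEADER_LEN = 24  # minimal 802.11 data-frame header (no QoS fields)
    for frame in frames:
        if not is_protected(frame):          # the explicit branch
            sink.append(frame[HEADER_LEN:])  # payload written out
        # else: encrypted payload silently dropped

# Two toy frames: identical except for the Protected Frame bit.
clear_frame = bytes([0x08, 0x00]) + b"\x00" * 22 + b"payload-in-clear"
enc_frame   = bytes([0x08, 0x40]) + b"\x00" * 22 + b"ciphertext"

stored = []
capture([clear_frame, enc_frame], stored)
# stored now contains only the unencrypted payload.
```

The sketch is trivial, which is exactly the point: the encrypted/unencrypted
distinction sits in a single, intentional `if` statement.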

I would honestly like to hear opinions on these questions, which have
been puzzling me for quite some time.

As for Firesheep and similar tools: in the EU, data protection law
broadly regulates the activity of "personal data processing", not the
technology that could be used for such processing. But the existence of
the tool might still be worth signalling to Data Protection Authorities.

Last but not least, we can all agree that end-to-end encryption should be
the default for all communications. But suggesting, as some did, that
those who do not use end-to-end encryption should see their personal
data collected and processed is akin to saying that since you do not
cover your face with a scarf when walking on the street, there is no
problem at all with companies building a database of your movements
using face-recognition software.

Perhaps we have a different perspective in the EU - where the concept of
"reasonable expectation of privacy" is not as widely used as in the USA
- but on this side of the Atlantic, the fact that something can be seen
does not mean that it can be processed (in the sense of "personal data
processing" ex Data Protection Directive).