NNSquad - Network Neutrality Squad

[ NNSquad ] Re: User sues AT&T after $5000+ bill for exceeding 5 GB, bandwidth cap (Brett Glass)


#1.  They're not doing HSPA+ yet.  It's HSDPA, with up to 14 Mbps downstream
over 5 MHz of spectrum.
#2.  42 Mbps HSPA+ assumes a MIMO dual-radio configuration on both the tower
and the client side, which is a substantial cost increase.
#3.  All wireless technologies decline in performance in weaker coverage
areas.  We're talking more than a 10-fold decline.  So you can't quote the
best-case data rates.
#4.  The towers cost far more than the spectrum.  A dual-purpose CDMA/GSM 3G
tower costs on average $850,000 (see the rough sketch below).
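
A rough sketch of what #3 and #4 do to the per-GB math.  The $850,000 tower
cost, the 14 Mbps peak rate, and the 10-fold decline in weak coverage come
from the points above; the sector count, amortization period, and utilization
are illustrative assumptions, not anyone's actual figures:

# Back-of-the-envelope per-GB tower cost for one cell sector.
# Inputs marked "assumed" are illustrative, not AT&T figures.
TOWER_COST = 850_000        # dual-purpose CDMA/GSM 3G tower (point #4)
SECTORS_PER_TOWER = 3       # assumed typical 3-sector site
AMORTIZATION_YEARS = 10     # assumed depreciation period
PEAK_RATE_MBPS = 14         # HSDPA best case over 5 MHz (point #1)
EDGE_OF_CELL_FACTOR = 10    # >10-fold decline in weak coverage (point #3)
UTILIZATION = 0.25          # assumed fraction of time the sector carries traffic

seconds_per_year = 365 * 24 * 3600
sector_cost_per_sec = TOWER_COST / SECTORS_PER_TOWER / (AMORTIZATION_YEARS * seconds_per_year)

for label, rate_mbps in [("best case", PEAK_RATE_MBPS),
                         ("weak coverage", PEAK_RATE_MBPS / EDGE_OF_CELL_FACTOR)]:
    gb_per_sec = rate_mbps * 1e6 * UTILIZATION / 8e9   # GB delivered per second
    print(f"{label}: ~${sector_cost_per_sec / gb_per_sec:.2f} of tower cost per GB")

Even with these generous assumptions, the tower cost per GB comes out in
dollars, not pennies, which is why the tower dominates the spectrum cost.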

-----Original Message-----
From: nnsquad-bounces+george_ou=lanarchitect.net@nnsquad.org
[mailto:nnsquad-bounces+george_ou=lanarchitect.net@nnsquad.org] On Behalf Of
David P. Reed
Sent: Wednesday, March 04, 2009 9:32 AM
To: nnsquad@nnsquad.org
Subject: [ NNSquad ] Re: User sues AT&T after $5000+ bill for exceeding 5
GB, bandwidth cap (Brett Glass)

Let me understand the calculation Brett is making.  If AT&T pays $3M for 
5 MHz of spectrum for its cellular data, one would have to determine how 
many bits can be transmitted per second.  Let's assume HSPA+, which is 
what ATT is deploying.  That gets 42 Mb/sec down or 22 Mb/sec up in the 
good case, which is what will happen when it deploys enough cells to 
provide good coverage.  (I'm not counting the cost of equipment here: 
Brett said *cost of spectrum*.)

Let's assume the license is for one year.   With roughly 30 megaseconds in 
a year, that works out to about $0.10 per second that ATT pays for the 
spectrum, FOR A WHOLE LICENSED REGION (say NYC).  
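
In Python, for concreteness (the $3M price and the one-year term are the 
assumptions stated above):

SPECTRUM_COST = 3_000_000          # assumed $3M price of the 5 MHz license
LICENSE_SECONDS = 365 * 24 * 3600  # one year, ~31.5 megaseconds (~30 Ms rounded)

print(SPECTRUM_COST / LICENSE_SECONDS)   # -> ~0.095 dollars per second, region-wide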

So let's assume ATT has a couple of thousand (2,000) cell site sectors in 
that region (sectors are directional swaths from a cell site, and they can 
build more).  Then the cost of spectrum per cell site sector would be 
$0.00005/sec.  So, assuming the user gets 42 Mb/sec, the spectrum cost of 
1 GByte is $0.00005/sec / 42,000,000 bits/sec * 8,000,000,000 bits, or 
roughly $0.01.  Yup, that's right: a Gigabyte should cost about a penny's 
worth of spectrum in NYC.  So 50 GB per month costs ATT roughly fifty 
cents' worth of spectrum.
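
The same arithmetic as a short Python sketch (the 2,000-sector count and 
the 42 Mb/sec rate are the assumptions above):

REGION_RATE = 0.10       # $/sec for the whole licensed region (from the step above)
SECTORS = 2_000          # assumed number of cell site sectors in the region
USER_RATE_BPS = 42e6     # assumed 42 Mb/sec HSPA+ downstream rate
GIGABYTE_BITS = 8e9      # bits in one (decimal) gigabyte

sector_rate = REGION_RATE / SECTORS              # $0.00005 per sector-second
seconds_per_gb = GIGABYTE_BITS / USER_RATE_BPS   # ~190 seconds to move 1 GB
print(sector_rate * seconds_per_gb)              # -> ~0.0095, about a penny per GB
print(50 * sector_rate * seconds_per_gb)         # -> ~0.48 for 50 GB per month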

In theory, if traffic got heavier, ATT could deploy more sites, further 
cutting the cost of spectrum per user, because the same licensed spectrum 
is reused across more sectors and so delivers more capacity to each user. 

The real point here is that Brett's response was too glib by half.  Not 
what one would expect from a businessperson or an engineer doing the 
math.  Of course there are lots of other costs that ATT incurs.  But it 
is JUST WRONG to say that ATT, which touts the best and most efficient 
bits/Hz wireless data network on the planet, pays more for spectrum than 
the user spends.