
[ NNSquad ] The Un-Internet or the Next Level Internet?


[I originally sent a slightly different version of this message just to Lauren, but Bob Frankston's thoughtful post made me reconsider and address this to a broader audience.]

I have very mixed feelings about the whole business of locked-up vs unlocked systems (to coin some terminology, since "closed" and "open" have too many overlaid meanings by now).  On the one hand, hell, I'm a hacker; I want to know how stuff works, I want to develop my own stuff, I don't want people telling me what I can and can't have on my system. On the other … I used Windows, and felt the incredible pain of trying to keep it working and reasonably secure.  In fact, when I finally bought a "PC" for the family - before that, I had VMS VAXes and a Unix box as my "home" machines - I got a Mac, entirely on the theory that I really didn't want to be a Windows system manager at home.  Linux, or any other Unix, was just a non-starter.  I have a Mac laptop and a Linux deskside at work, and use both all the time.  The Linux box has some of the tools I need, but the Mac is a much more pleasant experience.

The only iOS device I have is an iPod Touch, and I have also had a couple of Android phones.  For all the noise about openness, in practical terms there is little difference to me between the devices.  I have exactly one app that, at one time, existed on iOS but was then locked out (though it still exists on Android) - WiFi Scanner.  It does annoy me that Apple chose to lock out a whole class of useful applications, but having been on the other side, trying to keep some control over what undocumented features customers (internal and external) were tying themselves to and then insisting we continue to support forever, I can well understand Apple's insistence that only documented, supported interfaces be used.  Hell, that's exactly what I told *my* customers.  And there definitely is a payoff in usability and general quality.

I have Zittrain's "The Future of the Internet and How to Stop It".  It's on my list to read fully, but his basic refrain is that iOS (which at the time he wrote didn't even allow third-party apps to be written) was indicative of a trend that was killing the "generative" side of computers and the net.  But … you need only look at the App Store to see just how much "generativity" iOS allows.  Sure, there's tons of repetitive crud - but there are also a whole bunch of really clever, creative, useful apps.  (Recent clever idea: an alarm clock that you can set to wake you a bit earlier if it's snowed.)
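(The core of that alarm clock idea is just a small conditional on a weather feed.  Here's a minimal sketch in Kotlin; snowfallOvernightCm() is a hypothetical placeholder for whatever forecast service the real app would call, and the five-minutes-per-centimeter rule is invented purely for illustration.)

    import java.time.LocalTime

    // Hypothetical placeholder for a weather-service call; a real app
    // would query a forecast API for overnight snowfall at the user's location.
    fun snowfallOvernightCm(): Double = 3.0

    // Shift the alarm earlier in proportion to the snowfall, capped so a
    // blizzard doesn't wake you at 3 AM.  The 5-minutes-per-cm rate and
    // the 45-minute cap are invented for illustration.
    fun adjustedAlarm(base: LocalTime, minutesEarlierPerCm: Double = 5.0): LocalTime {
        val shift = (snowfallOvernightCm() * minutesEarlierPerCm).toLong().coerceAtMost(45L)
        return base.minusMinutes(shift)
    }

    fun main() {
        println(adjustedAlarm(LocalTime.of(7, 0)))  // prints 06:45 given 3 cm of snow
    }

Trivial, yes - but the point is that the interesting input isn't a keyboard; it's the weather outside your window.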

So … what's really happening is that generativity isn't going away; rather, it's moving elsewhere in the stack.  Prior to the late 1970s, machines were closed because they were too expensive to be anything else.  We then went through successive waves of "openness".  First we had the hardware hackers.  But those faded as standardized architectures took over, and as more and more stuff went on-chip you really couldn't build anything competitively interesting without a fab.  Then we had creativity in low-level system software - everyone wrote their own OS and compiler and tools.  But those got standardized, too - and the creativity moved to a higher level of abstraction.  This is a natural, inevitable process, and it just keeps happening.  No one will write the equivalent of sed or tar for Android, much less iOS - those are tools at entirely the wrong level of abstraction.  (Besides, if you *want* sed or tar, they already exist.)

Software for iOS (or Android) is about interacting with people and the web and the physical world around the user.  Just think about that clever alarm clock app!  This kind of software is just not comparable to what we hacked together to use from the command line on machines whose only tie to the real world was a clock.  (Hell, the first machine I programmed on - an IBM 1620 - didn't even have a clock!  It had no idea what the date or time was.  It was years before machines made the transition from needing you to tell them the date and time to serving as your timekeepers.)  For software at this level, the "openness" of the underlying platform is so many levels of abstraction down that it really makes no difference.  What matters most is that there are effective, usable APIs for the kinds of things that make up such software: displays, sounds, user inputs like touch and soon speech, real-world inputs of many sorts.  It's at this level that Android - and iOS - are both wide open, with Android perhaps a bit more so.  (That wireless scanner: Apple refuses to allow it because they haven't (yet?) published an API to get at the low-level "real world of visible networks".  In the Android world, you'd hack something together, as sketched below, and people would live with the fact that it might break if the APIs it was (perhaps improperly) using were changed.)
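(For the curious, here's roughly what such a hacked-together Android scanner reduces to - a minimal sketch, not the actual WiFi Scanner app.  It assumes the app has already been granted the location permission that recent Android versions demand before they'll hand back scan results.)

    import android.content.Context
    import android.net.wifi.WifiManager

    // Minimal sketch of an Android Wi-Fi scanner: the platform publishes
    // a documented API (WifiManager) for exactly the "real world of
    // visible networks" that iOS keeps private.  Assumes location
    // permission has already been granted, as recent Android requires
    // before it will return scan results.
    fun visibleNetworks(context: Context): List<String> {
        val wifi = context.applicationContext
            .getSystemService(Context.WIFI_SERVICE) as WifiManager
        // getScanResults() returns whatever the OS saw on its most recent scan.
        return wifi.scanResults.map { "${it.SSID}  (${it.level} dBm)" }
    }

A few lines against a published interface - which is precisely why, on iOS, no amount of cleverness gets you this until Apple decides to document an equivalent.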

The element that's new, and perhaps most worrying, is the set of limitations on the App Store, iTunes, and so on.  Winer talks about media, which are subject to some limitations, but really, he's exaggerating when he compares it to Disneyland.  Books, movies, music - Apple sells stuff like that under pretty much the same rules a mainstream book, DVD, or CD seller would have followed a couple of years ago.  Anything short of outright pornography - the kind "you know when you see it" - is available.  Besides, you can buy media from anywhere - even iOS imposes no limitations on what media files you can import.  The real issue is apps.

It's one of the curiosities of this whole debate that Apple claims - and the results prove - that they are providing what consumers want by limiting what can go into the App Store.  This is the same argument made in response to complaints that newspapers deliver too many sensational crime and celebrity stories instead of "hard news", that TV and movies are too full of sex and violence - or even that food makers supply "junk food" full of fat and sugar and salt.  The fact is that we want the market to decide, but we're often uncomfortable with the results.  There are no easy answers.

For apps, one thing I've always been surprised no one picks up on is that *anyone* can register as an Apple developer for $99/year.  Then you can build your own stuff and load it onto your iOS devices.  Of course, "your own stuff" is anything you compile from source - which, if you're into OSS, is a pretty open-ended class.  A bit less convenient than pre-compiled binaries - but really not so bad.  If you never try to get approval to ship a product, the approval policies are simply irrelevant.  Doing this may or may not technically be consistent with Apple's TOS, but I doubt they'd care.  As a developer, you've inherently bought into the idea that you're running flaky, under-development software - you're not going to expect finished Apple (or good 3rd-party) behavior.

If the complaint is that $99/year is too much - well then, as the G.B. Shaw story goes, we already know what you are; we're just haggling over price.  (Personally, I'd love to see Apple come up with a "hobbyist" registration: allow everything a developer is able to do except submit stuff to the App Store.  It would be reasonable to charge a small fee - say $10/year - to defray administrative costs.)

In sum: it can be difficult to separate the real issues from the complaints of us old-timers who long for the days when writing some simple C command-line program was the height of hackerdom.  That world is long gone…

                                                       -- Jerry

