The Persistent Stream

Today, Wired posted a story about AT&T proposing a general data cap on its DSL and U-verse fiber networks: 150 GB a month for DSL and 250 GB a month for U-verse.

Now, I could go on and on about how, once again, a telecommunications company is diluting its service in the name of preventing network congestion when it only seems to be lining its pockets. But I won’t; there will be plenty of consumer advocates and information activists doing that. Rather, I would like to argue that no cap can be justified, even one with a high threshold.

Most home network users, on seeing a 150 GB or especially a 250 GB data cap, will say, “That’s a ridiculously high cap. Even at my heaviest internet use I would never surpass that in a month.” And that may be true, at least today. But we are in the middle of a transition in which the bulk of both computing and media consumption is being done through the network.

Think about what you do on a daily basis with a connected device. You may be watching HD video via Hulu or Netflix, talking to people on Skype, or playing a multiplayer game online via Xbox, Wii, or PS3. Even work can involve a great deal of network access: e-mail, writing up paperwork, and most other traditional desktop tasks can now be done more or less entirely online, and the convenience of these web apps keeps growing. This type of internet access, in which every aspect of computing involves some amount of network traffic, is what I call “persistent streaming.”
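To put some rough numbers on that (the per-hour figure below is an assumption for illustration, not AT&T’s accounting), consider how quickly ordinary streaming alone eats into a 150 GB cap:

    # Back-of-envelope: how much HD streaming fits under a 150 GB monthly cap.
    # The GB-per-hour rate is a rough assumption for illustration only.
    CAP_GB = 150
    HD_VIDEO_GB_PER_HOUR = 2.0   # assumed rate for an HD Netflix/Hulu stream

    hours_per_month = CAP_GB / HD_VIDEO_GB_PER_HOUR
    print(f"{hours_per_month:.0f} hours of HD video per month")   # 75 hours
    print(f"about {hours_per_month / 30:.1f} hours per day")      # ~2.5 hours

Two and a half hours of HD video a day, before counting a single e-mail, game session, or backup, and the “ridiculously high” cap is gone.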

The persistent stream is John Gage’s dictum “the network is the computer” finally taking shape in our daily lives. None of us can argue that the majority of the time we spend in front of a screen isn’t dedicated to sending and receiving bits to and from the ether. Network access has become so intertwined with the way software operates that it is invisible to us. Not only that, the network has ceased to be merely a channel for data to be transferred and then processed on home desktops; the data is now processed remotely and presented to us through messages or fully interactive interfaces.

And just as computing power can be quantified by CPU clock speed, the power of the persistent stream is directly tied to the amount of bandwidth a connection has. The quality of a video stream, of a VoIP call, of a connection to some cloud-based application: all of these services improve over a faster connection. And just as desktop applications have evolved to demand more CPU power over time, so will network-based applications and services.
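As a rough illustration of that bandwidth-as-clock-speed analogy, here is a sketch (the bitrate thresholds are assumptions for illustration, not standards):

    # What stream quality a given downlink can sustain.
    # Threshold values (in Mbps) are rough assumptions, not standards.
    TIERS = [
        (0.5, "audio-only / low-res video"),
        (1.5, "SD video"),
        (5.0, "HD video"),
    ]

    def best_quality(downlink_mbps):
        quality = "below usable streaming"
        for required_mbps, label in TIERS:
            if downlink_mbps >= required_mbps:
                quality = label
        return quality

    print(best_quality(3.0))   # "SD video" -- roughly a DSL line (assumed)
    print(best_quality(12.0))  # "HD video" -- roughly a U-verse tier (assumed)

The same service that is merely usable at one speed becomes visibly better at a higher one.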

And that’s my point. An ISP telling you today that you may have only 150 GB a month is like Intel in 1998 telling you that you would only ever need a 450 MHz CPU, while the software written three years later required over 1 GHz of processing power just to function. Eventually the data and services on the network will demand more bandwidth, and that “huge” 150 GB a month will seem quite quaint in a couple of years. Unfortunately, unlike the PC hardware industry, there is no competitive incentive for ISPs like AT&T to build faster network infrastructure that keeps pace with our evolving use of the network. Just the opposite, in fact: ISPs are telling their users that their use of the persistent stream is something to be controlled, eliminating unlimited access and cloaking the change as some twisted conservation effort. But increasing bandwidth use is just another facet of Moore’s law playing out its geometric course.
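To put a hedged number on that geometric course, a quick sketch (both the starting usage and the doubling rate are assumptions for illustration, not measurements):

    # If a household's monthly traffic doubles roughly every two years
    # (an assumed Moore's-law-style growth rate, for illustration only),
    # an "ample" 150 GB cap does not stay ample for long.
    usage_gb = 30.0    # assumed starting point for a typical household
    cap_gb = 150.0
    years = 0
    while usage_gb < cap_gb:
        years += 2
        usage_gb *= 2  # doubling every two years (assumption)
    print(f"Cap exceeded within ~{years} years ({usage_gb:.0f} GB/month)")
    # => Cap exceeded within ~6 years (240 GB/month)

Tweak the assumptions however you like; as long as the growth is geometric, any fixed cap gets overtaken on a schedule.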
