Does faster broadband really matter?

Internet blogger Om Malik has written an interesting piece on the new, faster broadband connections that are now becoming available to US consumers. His premise is that the faster speeds are not that important, because they don’t translate into a significantly better experience for the end user.

The gist of his argument is that most online activities, like standard websurfing, are not significantly sped up by high-bandwidth connections, and the few that are, such as downloading, are not typically time-sensitive anyway:

Websurfing runs at only about a megabit per second, and nearly everything else except downloading is effectively throttled down at the source. Downloading turns out to have some natural limits as well; at 100 Mbps, you can download enough music for 24 hours of listening in only four minutes per day. The practical result, confirmed by high-speed leaders like Masayoshi Son of Yahoo BB in Japan, is that the faster speeds yield only an extremely modest increase in real traffic demand.
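
Malik’s four-minute figure holds up as rough arithmetic. Here is a minimal back-of-envelope sketch in Python, assuming 256 kbps MP3 encoding and a link that actually sustains its full 100 Mbps (neither assumption is specified in the quote):

    # Back-of-envelope check of the "four minutes per day" claim.
    # Assumptions (not from the quote): 256 kbps audio encoding,
    # and a 100 Mbps link sustaining its full rate.

    listening_seconds = 24 * 3600        # a full day of music
    music_bitrate_bps = 256_000          # assumed encoding rate
    link_rate_bps = 100_000_000          # 100 Mbps

    total_bits = listening_seconds * music_bitrate_bps
    download_seconds = total_bits / link_rate_bps

    print(f"{total_bits / 8e9:.1f} GB of audio")                 # ~2.8 GB
    print(f"downloads in {download_seconds / 60:.1f} minutes")   # ~3.7 minutes

At lower encoding rates the time only shrinks further, which is exactly Malik’s point: the pipe is far wider than a typical day’s demand.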

He goes on to suggest that the push for faster Internet connections comes from the network operators themselves. With the infrastructure already in place, it costs only marginally more to offer a higher-speed connection, but the consumer can be charged significantly more for “premium” access. This is a standard formula in many businesses; the “super-sizing” of fast food works in much the same fashion, for example.

The idea that those who want a faster connection (for example, people who like to watch streaming video from the BBC) should pay for faster service seems relatively benign at first. Look under the surface, however, and it may in fact be part of a larger plan by the network operators, one that could end up creating a “two-tier” Internet. That plan may be the beginning of the end for “network neutrality.”

Network neutrality is the principle that no particular type of traffic on the Internet should receive preferential treatment in speed or access. That neutral design is what made the Internet possible in the first place, and it is also what keeps ISPs from becoming liable for the content that passes through their systems.

However, an article by the BBC examines how many service providers are starting to prioritize their own content at the expense of their rivals’. Operators in many countries have started blocking, or are considering blocking, Voice-over-IP (VoIP) traffic in order to protect the phone companies from competition. For example, this summer reports from Germany indicated that Vodafone had begun to block VoIP traffic, treating the popular Skype program as “inappropriate content.” On the other side of the coin, Canadian cable provider Shaw now offers a premium VoIP service that promises to prioritize Internet telephony traffic for a monthly fee. In some countries the government itself is getting involved, putting forth legislation to restrict certain types of traffic.
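
To see what “prioritizing” traffic means mechanically, here is a minimal Python sketch of strict-priority (two-tier) packet scheduling, the basic technique behind premium tiers like Shaw’s; it is an illustration only, not any particular operator’s implementation:

    # Strict-priority scheduling sketch: the "premium" queue is always
    # drained before the "best-effort" queue. Illustrative only.
    from collections import deque

    PREMIUM, BEST_EFFORT = 0, 1

    class TwoTierScheduler:
        def __init__(self):
            self.queues = {PREMIUM: deque(), BEST_EFFORT: deque()}

        def enqueue(self, packet, tier):
            self.queues[tier].append(packet)

        def dequeue(self):
            # Best-effort traffic moves only when the premium queue is empty.
            for tier in (PREMIUM, BEST_EFFORT):
                if self.queues[tier]:
                    return self.queues[tier].popleft()
            return None

    sched = TwoTierScheduler()
    sched.enqueue("web page", BEST_EFFORT)
    sched.enqueue("VoIP frame", PREMIUM)
    print(sched.dequeue())  # the VoIP frame jumps the queue
    print(sched.dequeue())  # the web page waits its turn

The troubling property is visible even in this toy: under sustained premium load, best-effort traffic is starved indefinitely, so “prioritization” and outright “blocking” differ only in degree.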

The implications for the consumer of this Balkanization of the Internet are scary indeed. Under the guise of offering “premium” services, the networks are trying to set up a model in which they control what kind of content gets priority for distribution, or even whether it gets blocked completely. The implications for freedom of information are enormous.

By Jeremy Reimer, Ars Technica, on Om Malik

1 Comment

  1. JowieNeckBone said,

    February 18, 2006 @ 7:09 pm

    Anyone reading this ought to be made aware of some facts about Jeremy Reimer:

    Jeremy Reimer doesn’t have a single degree in computer science, or even a certification in a comp. sci. field (like an MCSE).

    He is utterly lacking in professional, in-the-trenches experience in computer science as well.

    Yet I see you have cited Jeremy Reimer here as some kind of expert in the field of comp. sci.?

    Research whom you are citing, for your own sake.

    Making a ‘sidewalk-surgeon/quack’ out to be an expert in a particular field is just bad business.

    I.e., Jeremy Reimer is just another wannabe poseur with no skills whatsoever in computer science, who merely scours Wikipedia and other sources and spits back already-known information.

    This is intelligence? This is being an expert??

    I think not: I could do the same myself, and so could any of you other readers.

    In other words, nothing fundamental or original exists in the lot of his ‘articles’ for Ars Technica. This makes complete sense: he is not qualified to do otherwise.

    (His ‘articles’ are mere high-school-level term papers at best.)

    They are written by a charlatan named Jeremy Reimer posing as a computer expert.

    (If this is hard to believe, ask Jeremy Reimer yourself whether he has a degree or certification in the field of comp. sci., and, more importantly, whether he has years of actual professional work experience in computer networking, programming, and so on.)

    You will see he is nothing more than a wannabe trying to pull the wool over your eyes and come off as some sort of authority in this field, when he is anything but that.

    Jeremy Reimer is just some dunce attempting to create the perception of being a noted authority in the field of computers, with no noted accomplishments in the field other than writing articles for the playpen Ars Technica because he is a forum member there.

    You don’t see the likes of IBM, Microsoft, or any other known and respected companies in the field of computers citing Jeremy Reimer, do you? This is because they know better.

    Reimer is completely lacking in education, professional experience, and skills, which is why he is only capable of being ‘published’ at Ars Technica. This ought to make you question their expertise and judgement as to who counts as an expert author as well!

    That alone should make you think twice before citing him as some kind of expert.
