Driving the Web: Internet2
Visionaries in the scientific, military and business communities contributed to the World Wide Web as we now know it. Certainly, many individuals harboured altruistic notions of a communications medium for mankind, but there is no denying that the United States and other national governments sponsored the research initiatives behind its initial development as part of their respective war and Cold War efforts, eventually recognising, perhaps reluctantly, the critical contributions of their academic and scientific communities.
The World Wide Web is not the Internet: it is a subset, designed specifically for the universal interchange and dissemination of information, although the terms have become synonymous to many users. To put it another way, all Internet users have access to the Web, but the reverse does not hold, since some areas of the Internet are access-restricted – many scientific, military, educational and business networks require privileged access to non-public areas, often dedicated to research and development.
Internet2 High Speed Networks
One such area is Internet2, a subscription-only, multidisciplinary consortium of high-speed networks connected (at least in the United States) by an ultra high-speed backbone, formed to investigate, create and deploy cutting-edge Internet technology. It links more than 200 United States universities, along with scientific institutions, government agencies and businesses, many of which pay some $30,000 in annual membership fees plus an annual connection fee – which may run to hundreds of thousands of dollars where a point-of-presence is available – to Abilene, the 10-gigabit fibre-optic backbone network whose high-speed routers provide the transmission infrastructure. It further extends to research centres in other countries via high-speed links.
One driver behind such ultra high-speed networks is scientific research: particle physics experiments, for instance, generate vast quantities of data – data that would require many months or even years to transmit at conventional speeds. A far less extreme driver is the transmission of broadcast-quality video and streaming multimedia files – big business, à la video on demand.
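To put those volumes in perspective, here is a back-of-the-envelope calculation in Python; the one-petabyte dataset and the two link speeds are illustrative assumptions, not figures from the consortium:

    # Rough transfer times for a hypothetical 1 PB physics dataset.
    size_bits = 1e15 * 8                       # one petabyte expressed in bits

    for label, bits_per_second in (("1.5 Mbit/s (T1 line)", 1.5e6),
                                   ("10 Gbit/s (Internet2 backbone)", 10e9)):
        days = size_bits / bits_per_second / 86_400   # 86,400 seconds per day
        print(f"{label}: about {days:,.0f} days")

At conventional leased-line speeds the transfer takes well over a century of continuous transmission; over a 10-gigabit backbone it takes around nine days.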
Faster Communications
Bandwidth, the capacity of a network to carry information, depends on a number of factors that are predetermined and usually limited by hardware. The transport protocol that manages the movement of data packets around the Web, TCP/IP, is one variable factor. A user may well enjoy a high-speed broadband link to their ISP, but beyond that point there is no guaranteeing the speed or capacity of the subsequent Internet connections leading to, say, the streaming video server they subscribe to.
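A toy illustration of that point: end-to-end throughput can never exceed the slowest link on the path, whatever the first hop offers. The capacities below are hypothetical.

    # End-to-end throughput is capped by the slowest link on the path.
    # Hypothetical capacities, in Mbit/s: broadband last mile, a congested
    # transit hop, and the video server's own connection.
    path_mbps = [8, 2, 100]
    print("best achievable rate:", min(path_mbps), "Mbit/s")   # prints 2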
As mentioned earlier, TCP/IP was developed in the mid-1970s and governs all Internet communications; it has remained largely unchanged since. Its strength – and weakness – lies in its ability to adjust data transmission to Internet conditions, namely congestion, transmission urgency and quality. It does this by re-requesting information when confirmation of receipt has not arrived within a certain time, doubling the wait after each re-request as its congestion-control algorithms interpret the silence as network congestion. This is why file downloads often begin with a burst of activity and then deteriorate to frustrating slowness.
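That doubling behaviour is easy to mimic. The Python sketch below shows retransmission with the exponential backoff timer described above; the send callable and the 40 per cent delivery rate are stand-ins for a real network, not part of any actual TCP implementation.

    import random

    def send_with_backoff(send, max_retries=6, initial_timeout=1.0):
        # Retry until an acknowledgement arrives within the current timeout,
        # doubling the wait after every failed attempt (exponential backoff).
        timeout = initial_timeout
        for _ in range(max_retries):
            if send(timeout):
                return True        # acknowledgement received in time
            timeout *= 2           # back off: double the wait before retrying
        return False               # give up after max_retries attempts

    # Simulated lossy link: each attempt succeeds with 40% probability.
    lossy_send = lambda timeout: random.random() < 0.4
    print("delivered:", send_with_backoff(lossy_send))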
In response, a new approach to using TCP has been developed: FAST, or Fast Active Queue Management Scalable TCP. FAST dynamically adjusts transmission speed according to how quickly it receives acknowledgements of successful packet delivery, and has achieved spectacular sustained transfer speeds. This is not to say FAST increases bandwidth, which remains largely fixed by physical hardware limitations (and, of course, capped by lease costs), but it does increase efficiency – from a typical 25% of available capacity to upwards of 95%.
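The published FAST window-update rule makes this delay-based idea concrete. The sketch below is a simplified, single-step rendering of that rule in Python; the parameter values are illustrative, and a real implementation lives inside the operating system's TCP stack.

    def fast_window_update(w, rtt, base_rtt, alpha=200.0, gamma=0.5):
        # w: current congestion window (packets); rtt: latest round-trip time;
        # base_rtt: smallest RTT observed (propagation delay estimate);
        # alpha: packets the sender aims to keep queued in the network;
        # gamma: smoothing factor between 0 and 1.
        target = (base_rtt / rtt) * w + alpha
        return min(2 * w, (1 - gamma) * w + gamma * target)

    # When the path is uncongested (rtt == base_rtt) the window grows; as
    # queueing delay builds (rtt > base_rtt) growth slows, settling where
    # roughly alpha packets sit queued in the network.
    w = 100.0
    for rtt in (0.010, 0.010, 0.015, 0.020):    # seconds; base RTT is 10 ms
        w = fast_window_update(w, rtt, base_rtt=0.010)
        print(round(w, 1))

Because the window reacts to rising round-trip times rather than to lost packets, FAST can hold a long, fat pipe near full without the boom-and-bust cycle of standard TCP.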
Big-business players like Microsoft and Disney have shown keen interest in its development, especially now that digital media has come of age. And the beauty of FAST is that its implementation requires no specialised client-side software or hardware upgrades; existing computers will be able to make use of it immediately upon its release (although current versions do not support Windows-based servers).
Article: Internet2 (Part.6) written by Vincent Zegna and Mike Pepper. Published 23rd November 2005.