Friday, February 19, 2010

How to Make the Internet a Lot Faster


Last week, Google announced its plans to build an experimental fiber network that would offer gigabit-per-second broadband speeds to up to 500,000 U.S. homes. Among other goals, the company said it wanted to "test new ways to build fiber networks, and to help inform and support deployments elsewhere."
Google hasn't released many details yet, but experts believe that the key to successful very-high-speed broadband doesn't lie in fiber alone. To really speed up the Internet, Google will have to work at many layers of its infrastructure, not just the physical link.

Gigabit-per-second speeds are far faster than those currently offered by high-speed services such as Verizon FiOS. However, Google's network won't be the first to reach such speeds. There are several such deployments internationally, including in Hong Kong, the Netherlands, and Australia. Internet2, a nonprofit advanced networking consortium in the United States, has been experimenting with very-high-speed Internet for more than a decade, routinely offering 10-gigabit connections to university researchers.

Existing applications for very-high-speed Internet include the transfer of very large files, streaming high-definition (and possibly 3-D) video, video conferencing, and gaming. Some experts speculate that accessing large data files and applications through the cloud may also require better broadband.
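To see why these applications push past today's broadband, a back-of-envelope calculation helps. The figures below are illustrative, not from the article, and ignore protocol overhead; the sketch simply converts file size and link speed into transfer time.

```python
# Back-of-envelope transfer times for a large file at different link speeds.
# File size and speed tiers are illustrative; real-world throughput is lower
# because of protocol overhead and congestion.

def transfer_time_seconds(file_size_gigabytes: float, link_speed_megabits: float) -> float:
    """Time to move a file of the given size over a link of the given speed."""
    bits_to_send = file_size_gigabytes * 8 * 1e9   # gigabytes -> bits
    return bits_to_send / (link_speed_megabits * 1e6)

if __name__ == "__main__":
    file_gb = 25.0  # e.g., a large HD video collection or data set (illustrative)
    for name, mbps in [("10 Mbps cable/DSL", 10), ("50 Mbps fiber tier", 50), ("1 Gbps fiber", 1000)]:
        minutes = transfer_time_seconds(file_gb, mbps) / 60
        print(f"{name:>18}: {minutes:6.1f} minutes")
```

On these assumed numbers, the same 25-gigabyte transfer drops from several hours on a 10-megabit connection to a few minutes at a gigabit per second.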

"Just big pipes alone to an end user does not necessarily guarantee that you can deliver high-end applications," says Gary Bachula, vice president of external relations for Internet2. There are many factors beyond raw bandwidth, Bachula says. For example, an improperly configured router or a university firewall can affect performance and end up acting as a network bottleneck.

"You need to have open networks, you need to publish your performance data, you need to have people troubleshoot your network remotely," says Bachula. In recent years, Internet2 has been researching tools and technologies that can help find and resolve the performance issues that occur on high-speed connections "in a systematic and seamless way." Ideally, he says, consumers as well as network managers would be able to use these tools to diagnose the network.
"If we're really going to realize the vision of some of these high-end applications, it does have to go beyond basic raw bandwidth," he adds.

It's also not enough to build a fast hardware infrastructure, says Steven Low, a professor of computer science and electrical engineering at Caltech, and cofounder of the network optimization technology company FastSoft, based in Pasadena, CA. Low believes the protocols that move traffic through the network will also need to be updated to make effective use of very-high-speed capabilities.
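The article doesn't spell out which protocol issues Low has in mind, but a standard illustration of the problem is the bandwidth-delay product: a sender can only keep a fast, long-distance link full if its TCP window covers all the data "in flight." The sketch below uses assumed round-trip-time and window values to show how a classic 64 KiB window starves a gigabit link.

```python
# Bandwidth-delay product: how much data must be "in flight" to fill a link,
# and what throughput a fixed TCP window actually achieves. Values illustrative.

def bdp_bytes(link_bps: float, rtt_seconds: float) -> float:
    """Bytes in flight needed to keep the link fully utilized."""
    return link_bps * rtt_seconds / 8

def window_limited_throughput_bps(window_bytes: float, rtt_seconds: float) -> float:
    """Best-case throughput when the sender is limited by its window size."""
    return window_bytes * 8 / rtt_seconds

if __name__ == "__main__":
    link = 1e9           # 1 Gbps access link
    rtt = 0.050          # 50 ms cross-country round trip (assumed)
    window = 64 * 1024   # classic 64 KiB window without window scaling

    print(f"Data in flight needed to fill the link: {bdp_bytes(link, rtt) / 1e6:.2f} MB")
    print(f"Throughput with a 64 KiB window: "
          f"{window_limited_throughput_bps(window, rtt) / 1e6:.1f} Mbps")
```

On these assumed numbers, the gigabit link delivers only about 10 megabits per second to a window-limited sender, which is the kind of mismatch between raw capacity and transport behavior that protocol work aims to close.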
