Aug 29, 2004 (12:08 PM EDT)
At 35, The Internet Remains A Work In Progress
Read the Original Article at InformationWeek
NEW YORK (AP) -- Thirty-five years after computer scientists at UCLA linked two bulky computers using a 15-foot gray cable, testing a new way of exchanging data over networks, what would ultimately become the Internet remains a work in progress.
University researchers are experimenting with ways to increase its capacity and speed. Programmers are trying to imbue Web pages with intelligence. And work is underway to re-engineer the network to reduce spam and security troubles.
All the while threats loom: Critics warn that commercial, legal and political pressures could hinder the types of innovations that made the Internet what it is today.
Stephen Crocker and Vinton Cerf were among the graduate students who joined UCLA professor Len Kleinrock in an engineering lab on Sept. 2, 1969, as bits of meaningless test data flowed silently between the two computers. By January, three other "nodes" joined the fledgling network.
Then came E-mail a few years later, a core communications protocol called TCP/IP in the late '70s, the domain name system in the '80s and the World Wide Web--now the second most popular application behind E-mail--in 1990. The Internet expanded beyond its initial military and educational domain into businesses and homes around the world.
Today, Crocker continues work on the Internet, designing better tools for collaboration. And as security chairman for the Internet's key oversight body, he is trying to defend the core addressing system from outside threats, including an attempt last year by a private search engine to grab Web surfers who mistype addresses.
He acknowledges the Internet he helped build is far from finished, and changes are in store to meet growing demands for multimedia. Network providers now make only "best efforts" at delivering data packets, and Crocker said better guarantees are needed to prevent the skips and stutters now common with video.
Cerf, now at MCI Inc., said he wished he could have designed the Internet with security built-in. Microsoft Corp., Yahoo Inc., and America Online Inc., among others, are currently trying to retrofit the network so E-mail senders can be authenticated--a way to cut down on junk messages sent using spoofed addresses.
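The retrofit the article describes amounts to publishing, in DNS, a policy listing which machines may send mail for a domain, so receivers can reject spoofed senders. A minimal sketch of that idea, using an invented policy record and Python's standard library (real schemes such as SPF involve DNS lookups and many more record types):

```python
import ipaddress

def check_sender(record: str, sender_ip: str) -> bool:
    """Simplified sender check: does sender_ip fall inside any
    ip4: network listed in the domain's published policy record?"""
    ip = ipaddress.ip_address(sender_ip)
    for term in record.split():
        if term.startswith("ip4:"):
            if ip in ipaddress.ip_network(term[4:]):
                return True
    return False  # "-all": anything unlisted is treated as spoofed

# Hypothetical policy a domain might publish in DNS
policy = "v=spf1 ip4:192.0.2.0/24 -all"
print(check_sender(policy, "192.0.2.77"))    # True: inside the listed block
print(check_sender(policy, "198.51.100.1"))  # False: likely spoofed source
```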
Among Cerf's other projects: a next-generation numbering system called IPv6 to accommodate the ever-growing armies of Internet-ready wireless devices, game consoles, even dog collars. Working with NASA, Cerf is also trying to extend the network into outer space to better communicate with spacecraft.
But many features being developed today wouldn't have been possible at birth given the slower computing speeds and narrower Internet pipes, or bandwidth, Cerf said.
"With the tools we had then, we did as much as we could reasonably have done," he said.
While engineers tinker with the Internet's core framework, some university researchers looking for more speed are developing separate systems that parallel the Internet. That way, data-intensive applications like videoconferencing, brain imaging, and global climate research won't have to compete with E-mail and E-commerce.
Think information highway with an express lane.
Some applications are so data-intensive, they are "simply impractical to do on the current Internet," said Tracy Futhey, chairwoman of the National LambdaRail. The project offers its members dedicated high-speed lines so data can "get from point A to point B and not have to contend with the other traffic."
LambdaRail recently completed its first optical connection from San Diego to Seattle to Pittsburgh to Jacksonville, Fla. Work on additional links is planned for next year.
Undersea explorer Robert Ballard has used another network, Internet2, to host live, interactive presentations with students and aquarium visitors from the wreck of the Titanic, which he found in 1985.
The Internet's bandwidth can carry only "lousy" video and "can't compete with looking out the window," Ballard said. But with Internet2, "high-definition zoom cameras can show them the eyelids."
Internet2, with speeds 100 times the typical broadband service at home, is now limited to selected universities, companies, and institutions, but researchers expect any breakthroughs to ultimately migrate to the main Internet.
While Internet2 and LambdaRail seek to move data faster and faster, researchers with the World Wide Web Consortium are trying to make information smarter and smarter. The Semantic Web is a next-generation Web designed to make more kinds of data easier for computers to locate and process.
Consider the separate teams of scientists who study genes, proteins and chemical pathways. With the Semantic Web, tags are added to information in databases describing gene and protein sequences. One group may use one scheme and another team something else; the Semantic Web could help link the two. Ultimately, software could be written to process the data and make inferences that previously required human intervention.
By the same principles, a search for automobiles for sale in Massachusetts would also turn up listings tagged as cars in Boston.
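A toy illustration of the linking idea described above, with invented tags and plain Python dictionaries standing in for the Semantic Web's formal vocabularies and ontologies:

```python
# Two hypothetical research groups tag the same gene data differently.
group_a = {"gene_id": "BRCA1", "organism": "human"}
group_b = {"locus": "BRCA1", "species": "Homo sapiens"}

# A shared mapping declares which tags and values mean the same thing,
# much as Semantic Web ontologies declare equivalences between vocabularies.
tag_map = {"locus": "gene_id", "species": "organism"}
value_map = {"Homo sapiens": "human"}

def normalize(record, tag_map, value_map):
    """Rewrite a record into the common vocabulary."""
    return {tag_map.get(k, k): value_map.get(v, v) for k, v in record.items()}

# After normalization, software can infer that both records
# describe the same gene, without a human comparing the schemes.
print(normalize(group_b, tag_map, value_map) == group_a)  # True
```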
Change doesn't come easily, however. For instance, the IPv6 numbering system was deemed an Internet standard about five years ago, but the vast majority of software and hardware today still runs on the older IPv4, which is rapidly running out of room.
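The pressure on IPv4 is plain arithmetic: its 32-bit addresses allow roughly 4.3 billion endpoints for the entire world, while IPv6's 128-bit addresses allow about 3.4 x 10^38. A quick check:

```python
# IPv4 uses 32-bit addresses; IPv6 uses 128-bit addresses.
ipv4_addresses = 2 ** 32
ipv6_addresses = 2 ** 128

print(f"IPv4: {ipv4_addresses:,} addresses")    # about 4.3 billion
print(f"IPv6: {ipv6_addresses:.3e} addresses")  # about 3.4e+38
```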
And the Internet faces general resistance from old-world forces that want to preserve their current ways of doing things: Companies that value profit over the greater good. Copyright holders who want to protect their music and movies. Governments that seek to censor information or spy on their citizens.
In early August, the Federal Communications Commission declared that Internet-based phone calls should be subject to the same type of law-enforcement surveillance as cell and landline phones. That means Internet service providers would have to design their systems to permit police wiretaps.
Jonathan Zittrain, a professor with Harvard's Berkman Center for Internet and Society, fears a slippery slope. As these outside pressures meddle with the Net's open architecture, he said, there's less opportunity for experimentation and for innovations like the World Wide Web, born out of an unauthorized project at a Swiss nuclear research lab.