35 Who Made a Difference: Tim Berners-Lee

First he wrote the code for the World Wide Web. Then he gave it away

[Photo: Tim Berners-Lee. Credit: Sam Ogden]

The origins of great inventions are generally more complicated than they appear. Thomas Edison did not make the first light bulb, nor did Samuel Morse build the first electric telegraph. Yet in the case of the British scientist Tim Berners-Lee, the story is unusually straightforward. In the fall of 1990, he wrote some software with the aim of making it easier for particle physicists to share their results by interlinking documents on different computers.

Of course, the idea of "hypertext"—linking a word or phrase in one document to another document—was not new. Commentaries on the Torah and even the notebooks of Leonardo da Vinci take the form of hypertexts. Much later, once the computer age began, visionaries including Vannevar Bush and Ted Nelson proposed elaborate hypertext systems. But Berners-Lee actually implemented his scheme in working software and then released it into the world. He considered calling it Information Mesh, or Mine of Information, but eventually settled on the name World Wide Web.

At the time, Berners-Lee was working at CERN, a physics laboratory in Geneva, Switzerland, and he first made his new software available to other physicists. A few months later, in the summer of 1991, he made it freely available on the Internet. And the rest is history: the Web became the accessible face of the Internet and now consists of billions of pages. Yet beneath modern adornments such as animations and video clips, all those pages still rely on conventions (such as "http," "html," and so on) that Berners-Lee came up with 15 years ago.

Like the Internet that underpins it, the Web has flourished because of its openness and its creator's deliberate decision not to predict or prejudge how it would be used. As the Web took off, there was a debate within CERN about whether to try to profit from it. Berners-Lee argued strongly against this idea: without an open standard, he reasoned, there would end up being several incompatible forms of Internet media, backed by Microsoft, AOL and others. Making the Web royalty-free made it more attractive than any proprietary alternative. "Without that, it never would have happened," he says.

While the benefits of openness were clear to him, Berners-Lee did not foresee the many ways in which the Web would be used. He first realized the extent of its potential in the summer of 1993, the day he began using a large color monitor. As he was browsing the Web, still in its infancy, he stumbled upon a Web-based exhibit of Renaissance art from the Vatican, based on images posted on-line by the Library of Congress, wrapped up in a few simple Web pages by a Dutch programmer. As a colorful illuminated manuscript unfurled on his screen, Berners-Lee says, it took his breath away. Not only was it beautiful; it also demonstrated the Web's power to promote international collaboration and sharing.

Berners-Lee, 50, is now based at the Massachusetts Institute of Technology, where he continues to defend the Web's founding principle of openness as head of the World Wide Web Consortium (W3C), the Web's standards body. Though modest and soft-spoken, he is also charming and persuasive, which makes him the ideal person to steer the Web's development and ensure it remains open.

To have changed the world once would be enough for most inventors, but Berners-Lee still regards the Web as a work in progress. "The Web is not done," he says. One area where there is room for improvement is in making the Web a two-way medium, as it was in its earliest days: the original Web browser was also an editor (it not only displayed pages, but also let the user alter them), but this feature was not included in subsequent browsers as the Web went mainstream. Berners-Lee regards the current mania for Weblogs (on-line journals) and wikis (pages anyone can edit) as a step in the right direction. "One of the things that makes wikis and blogs attractive is that everybody is able to express themselves," he says. But there is still room to make them easier to use, he believes.

Most of his effort is now devoted to creating a "semantic Web," in which documents on the Web make sense to machines as well as people. At the moment, a page containing a weather forecast, for example, can be understood by a human, but is merely numbers and letters to a machine.

The semantic Web involves labeling information on Web pages and in databases with "metadata"—data about data—saying what it is. This would make novel forms of searching possible and would even allow software to make deductions using retrieved information. The W3C approved the required standards last year.
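A rough sketch of the idea, using RDF (one of those standards) via the third-party Python library rdflib, which the article does not mention; the URIs, property names and values below are invented for illustration. The forecast's numbers are stored as labeled statements, so a program can query them by meaning rather than scrape them off a page.

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, XSD

# Hypothetical vocabulary for weather data; the example.org names are made up.
EX = Namespace("http://example.org/weather#")

g = Graph()
forecast = URIRef("http://example.org/forecasts/geneva-2005-07-14")

# Label each value with what it is ("metadata"), instead of burying it in prose.
g.add((forecast, RDF.type, EX.WeatherForecast))
g.add((forecast, EX.city, Literal("Geneva")))
g.add((forecast, EX.date, Literal("2005-07-14", datatype=XSD.date)))
g.add((forecast, EX.highCelsius, Literal(24, datatype=XSD.integer)))

print(g.serialize(format="turtle"))  # the machine-readable form of the page's data

# Because the values are labeled, software can draw simple conclusions from them,
# e.g. find every forecast warmer than 20 degrees C.
for row in g.query("""
    PREFIX ex: <http://example.org/weather#>
    SELECT ?f ?high WHERE {
        ?f ex:highCelsius ?high .
        FILTER(?high > 20)
    }
"""):
    print(row.f, row.high)
```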

Just as the Web was first adopted by particle physicists in 1991, the semantic Web seems to be taking root initially in the life sciences. In a field that faces daunting data-management challenges and where a lot of money is at stake, Berners-Lee says, the technology allows disparate databases of genomic information to be tied together seamlessly and searched in clever new ways. But it will be harder for the semantic Web to reach critical mass than it was for the Web, he admits, since it is difficult to demonstrate its benefits until the metadata is in place.
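To illustrate the kind of tying-together he describes, here is a small sketch, again with rdflib and invented identifiers: two separate "databases" describe the same gene, and because both label it with the same URI, their statements can simply be pooled and queried as one.

```python
from rdflib import Graph, Literal, Namespace, URIRef

GENE = Namespace("http://example.org/genes#")  # hypothetical vocabulary
brca1 = URIRef("http://example.org/genes/BRCA1")

# Hypothetical database A: what protein the gene encodes.
db_a = Graph()
db_a.add((brca1, GENE.encodesProtein, Literal("breast cancer type 1 susceptibility protein")))

# Hypothetical database B: a disease association for the same gene.
db_b = Graph()
db_b.add((brca1, GENE.associatedWith, Literal("hereditary breast and ovarian cancer")))

# Merging is just a union of statements; no schema translation is needed.
merged = db_a + db_b

# One query now spans both sources.
for row in merged.query("""
    PREFIX gene: <http://example.org/genes#>
    SELECT ?protein ?disease WHERE {
        ?g gene:encodesProtein ?protein .
        ?g gene:associatedWith ?disease .
    }
"""):
    print(row.protein, "|", row.disease)
```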

Won't that mean rejiggering all of today's Web pages? Not necessarily. Many Web pages are generated on the fly from databases, so adding metadata labels is simply a matter of changing the wrappers put around the data. And large software vendors, which have pooh-poohed the idea of the semantic Web for several years, have recently begun to change their view. "They have started to understand it," Berners-Lee says.
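A minimal sketch of that "change the wrapper" point, in plain Python with made-up names: the same database row can be wrapped either as a sentence for people or as labeled, machine-readable statements, without touching the underlying data.

```python
# A row pulled from a hypothetical weather database.
row = {"city": "Geneva", "date": "2005-07-14", "high_c": 24, "low_c": 15}

def html_wrapper(r):
    """Wrap the row for people: readable, but just letters and numbers to software."""
    return (f"<p>Forecast for {r['city']} on {r['date']}: "
            f"high {r['high_c']} C, low {r['low_c']} C.</p>")

def turtle_wrapper(r):
    """Wrap the same row for machines: every value is labeled with what it is."""
    return (f"@prefix ex: <http://example.org/weather#> .\n"
            f"<http://example.org/forecasts/{r['city']}-{r['date']}>\n"
            f"    ex:city \"{r['city']}\" ;\n"
            f"    ex:date \"{r['date']}\" ;\n"
            f"    ex:highCelsius {r['high_c']} ;\n"
            f"    ex:lowCelsius {r['low_c']} .\n")

print(html_wrapper(row))
print(turtle_wrapper(row))
```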

It is an enormously ambitious scheme: an attempt not just to make information available, but to organize it too. Back in 1991, however, the idea that the Web would become what it is today seemed just as implausible. So perhaps lightning will strike twice after all.
