You wouldn’t think something as unscientific as accident could have played much of a role in the life of Tim Berners-Lee, the brilliant British physicist and computer scientist who in 1991 invented the World Wide Web. He conceived it and still controls a lot of how it operates from his unimposing office at the Massachusetts Institute of Technology. In 1999, Time placed Berners-Lee on its list of the “100 Persons of the Century.” No fewer than seven universities have awarded him honorary degrees.
But the great breakthrough engineered by this icon of cyberspace did occur, in part, by chance. “There was an element of serendipity,” says Arthur Molella, director of the Lemelson Center for the Study of Invention and Innovation at the Smithsonian’s National Museum of American History. “At first, he was just noodling around, trying to find a way to organize his research files. So he began to develop a tool just for his own personal use.”
The “tool” was a software program that, as Berners-Lee puts it, was “really useful for keeping track of all the random associations one comes across in real life, and [which] brains are supposed to be so good at remembering—but sometimes mine wouldn’t.” He called it Enquire, and it worked so well, creating effective linkages between huge amounts of information, that it eventually became the basis for the revolution we now casually refer to as the Web. “It would be akin to a carpenter building a little cabinet for himself,” Molella says, “and suddenly discovering he could store the entire world inside the thing. There was quite a bit of luck in it.”
The element of chance has helped produce many of the most important innovations in modern life. Some are created by it; others become successful because of it; still others fail for the same reason. As Mark Twain, an inventor himself, once scribbled in his notebook: “Name the greatest of all the inventors. Accident.” If you don’t believe it, go into your kitchen and look around. There might be a Teflon pan on the stove, a microwave oven above it, Post-its sticking out of cookbooks, matches in a drawer; Coke, Popsicles and ketchup stashed in a refrigerator. Accident played a role in their invention.
Happenstance works in many ways. One is the observed event: the “invention” is the way the mind seizes upon an inconspicuous occurrence. The best known of these is Alexander Fleming’s role in the discovery of penicillin. One day in 1928 some mold drifted through an open window in a London hospital and landed in Fleming’s petri dish, where he’d placed a culture of staphylococcus bacteria. What Fleming did next got him and two colleagues a Nobel Prize in 1945: he looked through the microscope. What he saw was the mold efficiently destroying the germs. Presto! The creation of penicillin began with that unlikely turn of events.
But Robert Friedel, historian of technology at the University of Maryland, cautions that “serendipity is no accident.” What’s important about an unintended event, Friedel asserts, is the creative way it is used. As Louis Pasteur once said, “Chance favors only the prepared mind.”
Any of us might happen to see a cat pull feathers through a birdcage; but when Eli Whitney saw that, he got the idea of how to comb cotton mechanically. Hence the cotton gin. “Some people are just more likely to pay attention when they see something,” says Rini Paiva of the National Inventors Hall of Fame in Akron, Ohio. “If you have a certain type of brain, you might see something weird and say, ‘Hey, what can I do with this?’ ”
Take Percy Lebaron Spencer. A hero of World War II for his work in developing radar, Spencer obtained more than 120 patents in his lifetime. One day shortly after the war, he was walking through his lab at the Raytheon Company in Cambridge, Massachusetts, when he stopped briefly by a magnetron—the tube that produces the high-frequency microwaves that power radar. “He was working on things like missile-defense systems,” Paiva says. “But just that second he got a strange feeling. He realized that a candy bar in his jacket pocket had melted.” Odd, Spencer thought. Immediately, he performed a makeshift experiment: he put some popcorn kernels in front of the magnetron. Soon, popcorn was popping all over the place. “There’s actually a drawing of a bag of popcorn in one of Spencer’s patents,” Paiva says. “Other people might just make a note or two in a lab notebook and let it go. But right away Percy Spencer was thinking about what this could be used for—a microwave oven.”
It’s not just scientists hanging around high-tech labs whom accident favors. Hans Lippershey, a 17th-century Dutch eyeglass maker, simply happened—so the story goes—to look through two lenses one day and notice that objects at a distance were greatly magnified. When he put the lenses in a tube, he created the world’s first telescope. John Walker was a pharmacist, not a scientist. One day in 1826 he was mixing potassium chlorate and antimony sulfide together with a stick, but the mixture stuck to the stick. When he tried to scrape the stuff off against the stone floor, it burst into flames. Walker quickly produced for sale the first friction matches, or, to use his catchy name, “sulphuretted peroxide strikables.”
Inspiration can take a lot longer to strike than a match. Frank Epperson was an 11-year-old boy at the dawn of the 20th century when he accidentally left a mixture of soda powder and water out on the back porch one cold night. In it was the stick he’d used as a mixer. Next morning, Epperson found the soda water frozen around the stick. Nearly 20 years passed before he realized that by adding some flavoring, he could concoct a frosty treat, and with that he began to manufacture what he called “Eppsicles.” Eventually the name changed, and he earned royalties on more than 60 million Popsicles. (That success inspired the creation of the Fudgsicle, the Creamsicle and the Dreamsicle.)
Sometimes Lady Luck delivers the invention but not the fortune that should go with it. One day in 1839, a failed hardware salesman was tinkering at his boardinghouse in Woburn, Massachusetts. He’d been hauled off to debtors’ prison so often that he called it his “hotel.” Even there, he kept doing experiments, doggedly trying to make a useful material out of a substance from Brazil called rubber. People bought it for erasing—“rubbing” out mistakes. Because it became brittle in the cold and melted in high heat, that was about all it was good for. The amateur inventor tried mixing it with numerous chemicals, all without success, until that day in Woburn when he blended rubber with sulfur—and happened to drop the mixture onto a hot stove. After he cleaned it up, he realized that the rubber had suddenly become more solid, yet was still flexible.
Charles Goodyear had vulcanized rubber, a process that gives it useful properties, such as strength, elasticity and stability. (Today it is used in everything from automobile tires to golf balls.) But that practical discovery did little to help Goodyear himself. His many patents were regularly violated; when he died in 1860, he was more than $200,000 in debt.
In one common scenario, inventors are hard at work trying to make one thing when accident intervenes to create something else. The first practical synthetic dye was “invented” when an 18-year-old student in London was trying to synthesize an antimalarial drug; the material that led to throwaway tissues was first intended as a filter for gas masks.
In the late 1960s, 3M Company researcher Spence Silver was trying to create a superglue but ended up with the opposite—a glue that wouldn’t dry, wouldn’t melt and hardly stuck to anything. It could just barely hold two pieces of paper together. What the devil could he use the stuff for? Silver never did come up with a good answer, but five years later a fellow employee, Art Fry, began using the glue on small scraps of paper, making bookmarks for his church hymnal. It took another eight years before “Post-it” sticky notepaper became an overnight sensation.
Another everyday accessory we all take for granted, Teflon, has been called “the greatest accidental invention of the century.” In 1938, a 27-year-old chemist, Dr. Roy Plunkett, was working with technician Jack Rebok at DuPont’s Jackson Laboratory in Deepwater Point, New Jersey. Plunkett was trying to create a new kind of refrigerant by mixing a gas called tetrafluoroethylene (TFE) with hydrochloric acid, but one April morning something went wrong.
Plunkett had stored several canisters of TFE on dry ice, to prevent the gas from exploding. When he and Rebok opened the valve on one of the canisters, nothing came out. They removed the valve, turned the cylinder on its head and shook it. This time something did come out—a white waxy powder.
“What the hell is going on, Doc?” Rebok blurted out.
What was going on was this: the TFE gas had frozen and transformed into a solid, coating the insides of the canisters. TFE’s simple molecules had combined into long, elaborate chains, forming the giant molecules of a new substance with bizarre, almost unimaginable traits. It was inert to virtually all chemicals, and it was the most slippery material in existence. That slipperiness has proved tremendously useful. Teflon has been incorporated into bomb fuses, clothing, space capsules, heart valves and, of course, one conservative U.S. presidency.
At times, serendipity has provided the motivation for invention rather than the invention itself. The switching system that led to the dial telephone, for example, was invented in 1888 by an undertaker with a problem. Almon Strowger’s Kansas City funeral parlor was losing out to a competitor with an unfair advantage. The other undertaker’s wife was a telephone operator, and since every phone call had to be placed by an operator in those days, the other undertaker’s wife was usually one of the first people in town to hear about a death. Then her husband would phone the bereaved to offer his services. This unfair marketing advantage called for action, but the only solution Almon Strowger could come up with was to eliminate the problem—the operator. So to replace human intermediaries, he invented electromechanical switches to direct calls.
Some might argue that Strowger’s invention wasn’t really so serendipitous because the dial telephone was bound to come along sooner or later. But was it? Not according to Judith McGaw, a historian who specializes in American technology. “No reputable historian of technology would argue that inventions are somehow destined to happen,” she says.
Although the need for an invention can seem quite obvious, it usually doesn’t appear so until after the fact. Mark Twain, who patented such far-from-obvious devices as an “Improvement in adjustable and detachable straps for garments,” once put it this way: “The man with a new idea is a crank until the idea succeeds.”
Indeed, some of the most consequential inventions in history were dismissed as the brainchildren of cranks. Thomas Edison once thought that his own great coup, the phonograph, had little commercial value. In 1876, an executive with the Western Union Company declared that “this ‘telephone’ has too many shortcomings to be seriously considered as a means of communication. The device is inherently of no value to us.”
What about putting sound into movies? Harry Warner of Warner Brothers was epigrammatic in his misjudgment: “Who the hell wants to hear actors talk?” he said.
As for television’s prospects, Darryl F. Zanuck of 20th Century Fox claimed in 1946 that TV “won’t be able to hold any market after the first six months. People will soon get tired of staring at a plywood box every night.” Even as late as 1977, the president of Digital Equipment Corporation avowed, “There is no reason anyone would want a computer in their home.”
Or take Xerox machines. In 1938, Chester Carlson got tired of making copies with carbon paper. In his makeshift laboratory in Queens, the New York patent agent came up with a way to make copies automatically and took his invention to IBM. People wouldn’t want to use a “bulky machine,” the wise men at IBM said, when they could use carbon paper. Of course, the process Carlson invented, xerography, continues to churn out billions of copies yearly.
“There tends to be a cleaning-up of the record,” says Arthur Molella. “People like to say, ‘Yeah, we knew it all along.’ They put the best face on it, in hindsight. But the world is full of chance.”
In such a world, timing is everything. A great discovery can turn into a spectacular flop if it has the misfortune to come along at the wrong moment. Take the amazing VideoHarp, for example. Chris Patton, a 52-year-old composer and musician who lives in Silver Spring, Maryland, says he is “the first professional VideoHarp player in the world . . . and also the last.” Made of amber Plexiglas and black aluminum, the instrument straps over his shoulders like a futuristic accordion. It responds to the movement of Patton’s hands, using a system of mirrors and optical sensors to translate light and shadow into synthesized music.
The VideoHarp was created in the late 1980s by South Carolina inventor Paul McAvinney and his grad student Dean Rubine. Only eight instruments were ever made. “The main problem was a sudden scarcity of optical sensors,” McAvinney says. “Because of that, a VideoHarp ended up costing $9,000—too expensive for the market.” Today sensors are both plentiful and cheaper, so McAvinney could make a better VideoHarp for much less. “But by now my resources are pretty well drained,” he says with a sigh. Still, McAvinney has faith that the future may have an accident or two up its sleeve. “Who knows?” he says. “With a little luck, maybe someday they’ll be playing VideoHarps on the shores of a distant planet.”