Episode one of “Star Trek,” Stardate 1513.1. Chief medical officer Leonard “Bones” McCoy beams onto a desolate planet, M-113, with orders to perform a routine physical on Prof. Robert Crater, an ill-tempered archaeologist who wishes McCoy would just go away.
“Doubtless the good surgeon will enjoy prodding and poking us with his arcane machinery,” Crater snipes.
Think again, Crater: Prodding and poking is so last millennium.
Dr. McCoy packs a medical “tricorder.” Wand the body with this hand-held computer, and seconds later it coughs up the particulars of a patient’s condition.
“The machine is capable of almost anything,” McCoy says. As he sweeps the device across Crater’s chest and back, it purrs like a blissed-out electronic cat. In the 23rd century—as pictured by television writers in the late 1960s—that purr was a sign that a very sophisticated machine was working.
The tricorder-like devices in the UCLA engineering labs of Aydogan Ozcan don’t purr. Nor do they cause the shoulder strain of the cassette-recorder-size clunkers of Trekkie lore. But in other respects, they’re the closest thing yet to the real McCoy.
Ozcan’s sleek gizmos, which fit onto the back of a smartphone, count thousands of red and white blood cells in seconds; screen urine for signs of kidney disease; spot viruses like HIV and influenza in a smear of blood; and test water for bacteria, parasites and toxic chemicals. Another phone attachment, the iTube, scanned for microscopic specks of allergy-causing peanut in what one of Ozcan’s journal articles last year described as “3 different kinds of Mrs. Fields Cookies.”
When I visited Ozcan on the UCLA campus, a dozen of the devices were arrayed like museum pieces in an illuminated glass display case in a corner of his laboratory. The ones in the original “Star Trek” series resembled antediluvian Walkmen. Ozcan’s devices are the size of a lipstick case or matchbox.
“This is honestly one of our first hacks,” he told me with a touch of nostalgia, pulling out a six-year-old Nokia phone that he’d somehow retooled into a lens-free digital microscope. He says “hack” because he takes technology already in our pockets—the smartphone, another gadget anticipated by “Star Trek’s” inaugural episode—and cheaply reworks it into lightweight, automated versions of the bulky instruments found in medical laboratories.
At the rate he’s going, Ozcan, who at 35 already holds the title of UCLA chancellor’s professor, may soon hack the whole clinical lab. He wants nothing less than to make it small and cheap enough—and so idiot- and klutz-proof—that we can carry it in our pocket like loose change.
I’d visited Ozcan during a week in January when temperatures tripped into the 80s. So when one of his postdocs, Qingshan Wei, a 32-year-old with stylish clip-on shades, asked if I wanted to scope out the waves in Marina del Rey, I raised no objection.
Our “scope” was a Samsung Galaxy with an attachment that turned the phone’s camera into a mercury detection system. The toxic metal can build up in fish, and water tests can serve as an early warning system. “We want to detect mercury in water before it goes into the food chain,” Wei told me.
We splashed barefoot into shin-deep surf, and Wei pipetted seawater into a small plastic box on the back of the phone. Inside were a pair of LEDs that fired red and green beams of light through the water sample and onto the phone’s camera chip. An app scrutinized the subtle shifts in color intensity, and four seconds later, results flashed on the screen.
Two months earlier, mercury levels at this very spot had been worrisome. Today, the phone told us, the water was safe.
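Wei's description amounts to ratiometric colorimetry: measure how much red versus green light makes it through the sample, compare that ratio against clean water, and map the shift to a concentration. Here is a minimal sketch of that idea in Python; the function name, calibration slope, and intensity values are invented for illustration and are not Ozcan's actual algorithm or calibration data.

```python
# Toy sketch of ratiometric colorimetric sensing, loosely modeled on the
# setup described above: two LEDs (red and green) shine through a water
# sample, and software compares how strongly each color is absorbed.
# All numbers and names here are hypothetical, not the device's real code.

def mercury_estimate_ppb(red_intensity, green_intensity,
                         red_baseline=1.0, green_baseline=1.0,
                         slope=50.0):
    """Estimate mercury concentration (parts per billion) from the ratio
    of transmitted red to green light, relative to a clean-water
    baseline. `slope` stands in for a real calibration curve."""
    sample_ratio = red_intensity / green_intensity
    baseline_ratio = red_baseline / green_baseline
    # A mercury-sensitive reagent shifts the sample's color, changing how
    # much red vs. green light gets through; the deviation from the
    # clean-water ratio is mapped to a concentration estimate.
    return max(0.0, slope * abs(sample_ratio - baseline_ratio))

clean = mercury_estimate_ppb(0.80, 0.80)    # ratio matches clean water -> 0.0
tainted = mercury_estimate_ppb(0.60, 0.80)  # red absorbed more strongly
```

In the real device, the app would read `red_intensity` and `green_intensity` off the phone's camera chip and replace the single `slope` constant with a calibration curve built from known samples.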
Similar tests performed by a full-scale environmental laboratory are very expensive, Wei told me. They also require schlepping the sample to the lab for a complicated analysis called inductively coupled plasma mass spectrometry. “For this,” Wei said, nodding at the mercury tester, which cost $37 and was made by a 3-D printer, “we write a smart application. You just sample, click open the application, follow the instructions and click ‘analyze this.’”
The brains of the system are Ozcan’s algorithms, which turn the phone’s humdrum camera into a powerful optical instrument that sees what the eye can’t, then tells us how worried to be. His devices—because they piggyback on GPS-enabled smartphones—no sooner test a sample than they can send time- and location-stamped results to your doctor, an environmental agency or, say, Google Maps. Supply the technology to enough of the world’s three billion mobile subscribers, and you’ve got battalions of citizen scientists beaming up health and environmental data from across the globe in real time.
Ozcan’s software funnels the data into a continually updating map where epidemiologists, public health officials and your uncle Murray could follow the spread of a disease or chemical spill live, the way our smartphones already use our speed and location to crowd-source data for mobile traffic apps. Ozcan’s goal: to chart the world’s invisible threats—the pollutants in water, the allergens in food, the pathogens in air—as panoramically as traffic or weather.