The expression “a bird’s-eye view” has taken on a brand-new meaning this week, after an international team of researchers released a new video technology that allows humans to experience the world as birds and other animals might see it.
Advances in both camera hardware and software allowed the team to record clips that approximate how the world is perceived by myriad creatures. Birds, butterflies, honeybees and mice were among the first animals to have their color vision made accessible to human eyes. Their views of the world were recreated on camera and published Tuesday in the journal PLOS Biology.
“Traditional techniques for measuring these colors often told only part of the story,” senior author Daniel Hanley, a sensory ecologist at George Mason University, tells Popular Science’s Laura Baisas. “The scientific community lacked adequate tools for studying colors in motion… now, we can record color signals as they would appear to wild animals.”
Seeing beyond visible light
On the electromagnetic (EM) spectrum—the entire distribution of light in our universe, from gamma rays to radio waves—human beings can only see a small portion: a sliver of the spectrum called visible light that consists of our familiar rainbow of colors.
For a bird—which, unlike a human, can see ultraviolet rays—vision is quite different. A cloudless day, for example, doesn’t seem so blue to birds. “Their sky will be essentially an ultraviolet sky,” Hanley tells the New York Times’ Emily Anthes.
Should a rainbow appear, birds would see a much wider arc of color than humans do, extending past violet into the shorter wavelengths of the ultraviolet. For a mouse, a rainbow consists of only two bands: green and ultraviolet.
The new camera technology can capture these differently colored views of any scene: To honeybees, as one clip shows, our skin appears rather unremarkable until white sunscreen is applied—then, it absorbs more ultraviolet light and pops, to the insects, as a vibrant yellow.
Animals’ visual abilities are determined by their total number of photoreceptor channels, or the types of eye cells that capture light and send it to the brain, Jan Hemmi, an animal vision expert at the University of Western Australia and a Smithsonian Institution research associate, tells Smithsonian magazine in an email.
“Different photoreceptors have different sensitivity to different wavelengths of light, therefore it matters greatly how many channels we have to work with,” he says. “Humans have three; dogs have two; bees, many insects and birds have four; and some reptiles have five.”
With four photoreceptor channels, Hemmi says, “[birds’] color perception has three dimensions. What that actually looks like? We have no idea—but it must be amazing.”
Capturing color on video
The science behind the study combines new video technology that can isolate light of various wavelengths with existing biological knowledge of the types of light different animals are able to process.
To “see through” animal eyes, the team uses two cameras—one sensitive to ultraviolet light and one sensitive to visible light. Together, they capture light in four distinct wavelength bands: blue, green, red and ultraviolet. Using novel software the researchers wrote in Python, the videos are converted into data and broken down into what they call “perceptual units”—values that correspond to colors animals can see, based on what is already known about different species’ photoreceptors.
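The conversion step can be sketched in a few lines of Python. The weights below are illustrative placeholders, not the calibrated sensitivity values from the published pipeline; the species names and channel labels are likewise assumptions for the sake of the example. The idea is simply that each of an animal's photoreceptor types weights the camera's four bands differently, and the weighted sums become that animal's "perceptual units" for a pixel.

```python
# Sketch of mapping a four-band camera pixel to per-species photoreceptor
# "catches." All sensitivity weights here are invented for illustration.

# Each species maps its photoreceptor channels to a weighting over the
# camera's four bands, ordered (ultraviolet, blue, green, red).
SENSITIVITIES = {
    # A tetrachromatic bird: UV, short-, medium- and long-wave cones.
    "bird": {
        "uv": (0.9, 0.1, 0.0, 0.0),
        "sw": (0.1, 0.8, 0.1, 0.0),
        "mw": (0.0, 0.2, 0.7, 0.1),
        "lw": (0.0, 0.0, 0.2, 0.8),
    },
    # A dichromatic mouse: UV- and green-sensitive channels only.
    "mouse": {
        "uv": (0.9, 0.1, 0.0, 0.0),
        "mw": (0.0, 0.1, 0.8, 0.1),
    },
}

def perceptual_units(pixel, species):
    """Weight a (uv, blue, green, red) camera pixel by each of the
    species' photoreceptor sensitivities, then normalize so the
    channel values sum to 1 (a relative catch per receptor)."""
    receptors = SENSITIVITIES[species]
    catches = {
        name: sum(w * band for w, band in zip(weights, pixel))
        for name, weights in receptors.items()
    }
    total = sum(catches.values()) or 1.0
    return {name: c / total for name, c in catches.items()}
```

Under this toy model, a pixel that is bright only in the ultraviolet band stimulates a mouse's UV receptor almost exclusively, while a pixel with no ultraviolet component leaves that channel near zero, which is the kind of difference the sunscreen clip makes visible.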
One key breakthrough was developing camera hardware that could capture ultraviolet light, a challenge that had long plagued scientists. Capturing the perceived color of a moving object or scene had likewise never been reliably achieved. But the new software can predict the colors animals see with 92 percent accuracy, per the paper.
According to Hemmi, the researchers have done “an amazing job” in producing these videos. The catch, though, is that we are still viewing them through our human eyes. “There is no real solution to this, however, but the videos do a good job at highlighting which aspects of images and scenes appear different to the animals and which do not,” he says.
Opportunities for research
The team made their Python code open-source and built their cameras using commercially available parts, making it easier for scientists around the world to reproduce the technique. They hope others will continue to develop the technology and share footage in unique ways, such as in film, conservation and natural history projects. The research, partly funded by the National Geographic Society, may one day influence how nature documentaries are presented.
Some scientists are already dreaming of new research possibilities. “I can’t wait to get my hands on the video camera,” Eunice Jingmei Tan, an evolutionary biologist at the National University of Singapore who studies the color displays and signaling behaviors of spiders and insects, tells Scientific American’s Lauren Leffer.
In the future, researchers might be able to find out how animals use their perception of colors to make decisions and interact with the world. That kind of breakthrough, Hemmi says, would be a “game changer.”
“What the cameras can show is how the animal’s eyes see the stimuli,” he says. “The next step is to find out how the brain sees and uses them.”