When Anika Puri visited India with her family four years ago, she was surprised to come across a market in Bombay filled with rows of ivory jewelry and statues. Globally, ivory trade has been illegal for more than 30 years, and elephant hunting has been prohibited in India since the 1970s.
“I was quite taken aback,” the 17-year-old from Chappaqua, New York, recalls. “Because I always thought, ‘well, poaching is illegal, how come it really is still such a big issue?’”
Curious, Puri did some research and discovered a shocking statistic: Africa’s forest elephant population had declined by about 62 percent between 2002 and 2011. Years later, the numbers continue to drop. A wildlife lover, Puri wanted to do something to help protect the species and others still threatened by poaching.
Drones are currently used to detect and capture images of poachers, but they aren’t very accurate, the teenager explains. After watching videos of elephants and humans, she saw how the two differed vastly in the way they move—their speed, their turning patterns and other motions.
“I realized that we could use this disparity between these two movement patterns in order to actually increase the detection accuracy of potential poachers,” she says.
Over the course of two years, Puri created ElSa (short for “elephant savior”), a low-cost prototype of machine-learning-driven software that analyzes movement patterns in thermal infrared videos of humans and elephants. Puri says the software is four times more accurate than existing state-of-the-art detection methods. It also eliminates the need for expensive high-resolution thermal cameras, which can cost thousands of dollars, she says. ElSa uses a $250 FLIR ONE Pro thermal camera with 206x156-pixel resolution that plugs into an off-the-shelf iPhone 6. The camera and iPhone are attached to a drone, and as the system flies over parks it produces real-time inferences about whether objects below are human or elephant.

Puri submitted her project to this year’s Regeneron International Science and Engineering Fair, the world’s largest international pre-college STEM competition, where her work is in the company of other high schoolers’ novel designs, including an electric vehicle motor, an electronic-waste-sorting robot arm and a pipe-climbing robot. Her eloquence in describing her research and its potential impact on society earned her the Peggy Scripps Award for Science Communication, and she also won a top award in the competition’s earth and environmental sciences category.
“It's really amazing just to see all these kids coming together. And for the same purpose—enjoying science and doing research,” Puri says. “I was honored just to be on that stage.”
Puri first learned about the capabilities of artificial intelligence just after ninth grade, when she was selected to attend Stanford A.I. Lab’s summer program.
“Initially, my enthusiasm for artificial intelligence was based off of this limitless possibility for social good,” she says. But she soon discovered that because data is collected and analyzed by humans, it contains human biases, and so does A.I. as a result.
“It really has the capability to reinforce some of the worst aspects of our society,” she says. “What I really realized from this is how important it is that women, people of color, all sorts of minorities in the field of technology are at the forefront of this kind of groundbreaking technology.”
About a year later, Puri founded a nonprofit called mozAIrt, which inspires girls and other underrepresented groups to get involved in computer science using a combination of music, art and A.I.
At an A.I. conference where she held a workshop, Puri met Elizabeth Bondi-Kelly, a Harvard computer scientist who was working on a wildlife conservation project using drones and machine learning. Bondi-Kelly had also started a nonprofit, called Try AI, to increase diversity in the field.
Puri reached out to the computer scientist about her idea to catch elephant poachers using movement patterns, and Bondi-Kelly became her mentor for the project.
To create her model, Puri first found movement patterns of humans and elephants using the Benchmarking IR Dataset for Surveillance with Aerial Intelligence (BIRDSAI), a dataset collected by Bondi-Kelly and her colleagues using a thermal infrared camera attached to an unmanned aerial vehicle (UAV) in multiple protected areas in Africa. Sifting through the data, Puri identified 516 time series extracted from videos that captured humans or elephants in motion.
Puri used a machine learning algorithm to train a model to classify a figure as either an elephant or a human based on its speed, group size, turning radius, number of turns and other patterns. She trained it on 372 of the series—300 elephant movements and 72 human movements—and used the remaining 144 to test the model on data it hadn’t seen before. When tested on the BIRDSAI dataset, her model was able to detect humans with over 90 percent accuracy.
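The core idea—telling species apart by how they move rather than by how they look—can be sketched in a few lines. This is an illustrative toy, not Puri’s actual model: the feature choices (per-frame speed and turning angle) follow the article’s description, but the tracks, thresholds and classification rule here are invented for demonstration.

```python
import math

def track_features(track):
    """Compute simple movement features from a track of (x, y) positions
    sampled at a fixed interval: mean speed and mean absolute turning angle.
    (Illustrative features only; the real model uses more patterns.)"""
    speeds, turns = [], []
    for i in range(1, len(track)):
        (x0, y0), (x1, y1) = track[i - 1], track[i]
        speeds.append(math.hypot(x1 - x0, y1 - y0))
        if i >= 2:
            xp, yp = track[i - 2]
            h0 = math.atan2(y0 - yp, x0 - xp)   # previous heading
            h1 = math.atan2(y1 - y0, x1 - x0)   # current heading
            # wrap the heading change into [-pi, pi] before taking magnitude
            d = (h1 - h0 + math.pi) % (2 * math.pi) - math.pi
            turns.append(abs(d))
    return sum(speeds) / len(speeds), sum(turns) / max(len(turns), 1)

def classify(track, speed_threshold=1.5):
    """Toy decision rule: in these synthetic tracks, 'humans' cover more
    ground per frame. A real system learns the boundary from labeled
    time series instead of using a hand-picked threshold."""
    mean_speed, _ = track_features(track)
    return "human" if mean_speed > speed_threshold else "elephant"

# Synthetic tracks: a fast, straight walker and a slow, meandering grazer
human_track = [(i * 2.0, 0.0) for i in range(10)]
elephant_track = [(i * 0.5, math.sin(i)) for i in range(10)]

print(classify(human_track))      # classified as "human"
print(classify(elephant_track))   # classified as "elephant"
```

In a trained version, the threshold (or a full decision boundary over all features) would be fit on the labeled training series—372 in Puri’s case—and evaluated on the held-out 144.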
Puri's software is "quite commendable," says Jasper Eikelboom, an ecologist at Wageningen University in the Netherlands who is designing a system to detect poachers using GPS trackers on animals. “It's quite remarkable that a high school student has been able to do something like this,” he says. “Not only the research and the analysis, but also…being able to implement it in the prototypes.”
Eikelboom cautions that Puri’s model still needs to be tested on raw video footage to see how well it can detect poachers—the accuracy of Puri’s model was tested on figures already determined to be either human or elephant. He also says other barriers already exist to using drones in parks, such as the money and manpower needed to keep them flying.
ElSa, he notes, could also be used more broadly for other conservation goals, not just spotting poachers.
“In ecology in general, we like to track animals and see what they're doing and how it impacts the ecosystem,” he says. “And if we look, for example, on the satellite data, we can find a lot of moving patterns, but we don't know what species they are. I think it's a very smart move to look at these movement patterns themselves instead of only at the image—at the pixels—to determine what kind of species it is.”
In the fall, Puri will attend the Massachusetts Institute of Technology, where she wants to study electrical engineering and computer science. She has plans to expand her movement pattern research into other endangered animals. Next up is rhinos, she says. And she wants to begin implementing her software in national parks in Africa, including South Africa’s Kruger National Park. Covid-19 restrictions delayed some of her plans to travel to these parks to get her project off the ground, but she hopes to explore her options after she starts college. Because drones only have a battery life of a few hours, she is currently creating a path-planning algorithm to ensure maximum efficiency in the drone’s flight course.
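The article doesn’t describe how Puri’s path-planning algorithm works, but the underlying constraint—covering as many survey points as possible within a limited battery range—is a classic routing problem. A minimal greedy sketch, under the assumption that the drone simply visits the nearest unvisited point until its range runs out (any resemblance to her actual algorithm is coincidental):

```python
import math

def plan_route(start, waypoints, battery_range):
    """Greedy nearest-neighbor route sketch: from `start`, repeatedly fly
    to the closest unvisited waypoint until the next leg would exceed the
    remaining battery range. Illustrative only; a production planner would
    also reserve range for the return trip and optimize the tour order."""
    route, pos, used = [start], start, 0.0
    remaining = list(waypoints)
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(pos, p))
        leg = math.dist(pos, nxt)
        if used + leg > battery_range:
            break  # not enough battery left for this leg
        route.append(nxt)
        used += leg
        pos = nxt
        remaining.remove(nxt)
    return route

# With 3.0 units of range, the far point at (0, 10) is skipped
route = plan_route((0, 0), [(0, 1), (0, 2), (0, 10)], battery_range=3.0)
print(route)  # [(0, 0), (0, 1), (0, 2)]
```

Nearest-neighbor routing is a common baseline for this kind of problem; more sophisticated planners trade it for coverage-maximizing or tour-optimizing formulations.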
“Research isn't a straight line,” Puri says. “That has made me more resourceful. It also helped me develop into a more innovative thinker. You learn along the way.”