Is This Weed-Spotting, Yield-Predicting Rover the Future of Farming?

The robot, developed by Alphabet Inc.’s X, will make its public debut at the Smithsonian

Can a machine be taught to understand the plant world? X, the Moonshot Factory

By the year 2050, Earth's population is expected to reach nearly ten billion people. With this growth comes a staggering demand for food, particularly for drought-, heat-, pest- and disease-resistant crop varieties that give high yields in the face of climate change.

Enter X, Alphabet Inc.’s so-called “moonshot factory,” where innovators face the world’s biggest challenges head-on and develop groundbreaking technology at a startup pace. Project Mineral, one of X’s current efforts, is focused on finding an effective way to address the global food security crisis through “computational agriculture,” a term coined by X to describe new technologies that deepen understanding of the plant world.

“The agriculture industry has digitized,” says Project Mineral lead Elliot Grant. Farmers today use sensors, GPS and spreadsheets to collect data on crops and generate satellite imagery of their fields. “But it hasn't led to more understanding. So the next step beyond digitization is the science of making sense of this very complex plant world by combining multiple technologies such as robotics, sensors, data modeling, machine learning and simulation. The subtle difference is that computational agriculture is the sense-making of all the data,” Grant explains.

Since the project launched in 2016, Mineral team innovators have been focused on answering one critical question: Can a machine be taught to understand the plant world?

The sleek, four-wheeled plant rover is about as tall as a shipping container and as wide as a car. X, the Moonshot Factory

After years of tweaking, Grant and his team’s latest prototype—a plant-scanning, rover-like robot powered by artificial intelligence—will make its public debut at the Smithsonian’s “Futures” exhibition, an expansive exploration of the future through art, history, design and technology opening at the Arts & Industries Building in Washington, D.C. later this year. Capable of syncing up with satellite imagery, weather data and soil information, the sleek, four-wheeled plant rover, about as tall as a shipping container and as wide as a car, uses various cameras and machine-learning algorithms to monitor plants and spot potential issues. As it rolls through farmland, it can identify weeds, measure the ripeness of fruit and predict crop yields. The Mineral rover can also adjust its width, length and height to accommodate crops at numerous stages of development. For example, it can rise to image towering, mature wheat plants, or widen to scan a broad bed of lettuce.

But it didn’t start out quite so chic and impressive: The first prototype was made with two bikes, some scaffolding, a roll of duct tape and several Google Pixel phones. To put their Franken-machine to the test, Mineral’s diverse team, consisting of engineers, biologists, agronomists and more, whisked it away to a nearby strawberry field and pulled it through rows of red fruit to see if it could capture enough plant images to use for machine learning.

“So, after a few hours of pushing and pulling this contraption, through the mud and a bunch of squashed berries, we came back to the lab, looked at the imagery we had, and concluded that although there were a couple hundred things we still needed to improve, there was a glimmer of hope that this was going to work,” Grant explains.

The first prototype was made with two bikes, some scaffolding, a roll of duct tape and several Google Pixel phones. X, the Moonshot Factory

After their initial experiment, and discussions with farmers and plant breeders, the Mineral team built, scrapped and reimagined their rover. This burn-and-churn, momentum-building phase is part of X’s rapid iteration methodology. If an experiment is simply not working out, X project leaders learn from errors and move on. “The essence of a rapid iteration is to move quickly, take risks, take smart risks, but do it in a way that continually leads to learning,” says Grant.

In one experiment, Mineral used a machine learning model called CycleGAN, short for cycle-consistent generative adversarial network, to see if it could create simulated images of strawberry plants. CycleGAN generates realistic images, which Mineral can then use to diversify the rover’s image library. This way, when the rover encounters various scenarios out in the field, it can accurately identify specific crops, traits or ailments.

A.I. like this is useful for simulating plant diseases, pests or pathogens, especially when the robot needs to recognize a condition it has never seen before. (This approach avoids the detrimental alternative of purposefully inoculating fields with diseases.)

“We're able to create simulated images of plants that are so realistic we can use them for training a model [artificial neural network or computing system], even if it's never seen that plant in the real world,” explains Grant.
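
For readers curious about the mechanics, here is a minimal, hypothetical sketch in Python (using PyTorch) of the cycle-consistency idea at the heart of CycleGAN. The tiny networks and random tensors are stand-ins for illustration; this is not Mineral's actual model.

```python
# Sketch of CycleGAN's cycle-consistency term. Everything here is a
# simplified placeholder, not Mineral's production model.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyGenerator(nn.Module):
    """Stand-in for a real image-to-image generator network."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 3, 3, padding=1), nn.Tanh(),
        )

    def forward(self, x):
        return self.net(x)

G_ab = TinyGenerator()  # domain A (e.g., healthy leaf) -> domain B (e.g., diseased)
G_ba = TinyGenerator()  # domain B back to domain A

real_a = torch.randn(1, 3, 64, 64)  # placeholder for a real photo from domain A
fake_b = G_ab(real_a)               # simulated image in domain B
recon_a = G_ba(fake_b)              # cycle back to domain A

# Cycle-consistency loss: translating A -> B -> A should recover the
# original image, which lets the model learn from unpaired photos.
cycle_loss = F.l1_loss(recon_a, real_a)
print(cycle_loss.item())
```

A full CycleGAN also trains adversarial discriminators in both directions, but the cycle term above is what allows convincing translations, say from a healthy leaf to a diseased one, without paired before-and-after examples.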

The Mineral rover can distinguish weeds from crops, which, in turn, can help farmers use fewer chemicals to keep the weeds at bay. X, the Moonshot Factory

Eventually, the team built a rover so sophisticated it can detect rust and other fungal plant diseases. Mineral has partnered with a farmer in the Philippines who is helping the team develop ways to catch diseases in bananas. Images of diseased bananas will be used to teach the rover how to detect conditions detrimental to banana crops, like nitrogen deficiency, Panama disease and Sigatoka disease.

The robot also takes images of flowers and then uses its machine learning model to count a plant's flowering rate, which is essential to understanding how a plant responds to its environment and to predicting how much fruit it will produce. In this way, the rover can count individual buds on raspberry canes and estimate the number of soybeans in a field. So far, Mineral has experimented with imaging soybeans, strawberries, melons, oilseeds, lettuce, oats and barley—from early sprouts to fully grown produce.
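
As a rough illustration of how a detector's output could become a count and a yield estimate, here is a hypothetical Python sketch. The labels, confidence threshold and fruit-set rate are invented for the example and are not Mineral's real parameters.

```python
# Hypothetical sketch: turning object-detector output into a flower
# count and a naive yield estimate. All numbers are illustrative.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g., "flower", "bud", "weed"
    confidence: float  # detector score in [0, 1]

def count_flowers(detections, threshold=0.6):
    """Count detections labeled 'flower' above a confidence threshold."""
    return sum(1 for d in detections
               if d.label == "flower" and d.confidence >= threshold)

def estimate_fruit(flower_count, fruit_set_rate=0.7):
    """Naive yield proxy: assume a fixed fraction of flowers set fruit."""
    return flower_count * fruit_set_rate

# One imaginary frame from the rover's cameras
frame = [Detection("flower", 0.91), Detection("flower", 0.45),
         Detection("bud", 0.88), Detection("flower", 0.77)]
n = count_flowers(frame)
print(n, estimate_fruit(n))  # 2 flowers -> 1.4 expected fruit
```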

The rover can estimate the number of soybeans in a field. X, the Moonshot Factory

The robot can also measure various leaf sizes and greenness. Greenness can be indicative of healthy plant growth, and in some plants it is predictive of yield. However, it's difficult for people to measure, since color perception varies from person to person. The rover takes pictures of plants from numerous angles and converts each image pixel into data. It then uses RGB (red, green, blue) and HSV (hue, saturation, value) color encodings to objectively determine the color of a plant.
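
To make that concrete, here is a minimal, hypothetical sketch of one way a greenness score might be computed from pixel data, using only Python's standard library. The "green" hue band and thresholds are assumptions for illustration, not Mineral's calibration.

```python
# Hypothetical sketch of scoring greenness from RGB pixels via HSV.
# The hue band and thresholds below are illustrative assumptions.
import colorsys

def greenness_fraction(pixels, hue_band=(0.20, 0.45),
                       min_saturation=0.25, min_value=0.2):
    """Fraction of (r, g, b) pixels (0-255) whose hue falls in a green band."""
    green = 0
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        if hue_band[0] <= h <= hue_band[1] and s >= min_saturation and v >= min_value:
            green += 1
    return green / len(pixels)

# Four sample pixels: leafy green, light gray, soil brown, dark gray
sample = [(30, 180, 40), (200, 200, 200), (120, 60, 30), (90, 90, 90)]
print(greenness_fraction(sample))  # 0.25
```

Because hue is computed the same way for every pixel, two people (or two rovers) scoring the same photo will always agree, which is exactly the consistency that human color perception lacks.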

Moving beyond farmers managing their own crops, plant breeders spend many hours manually documenting the physical characteristics of thousands of plants across a field, a process known as phenotyping. But phenotype data collection relies on human perception—and human perception alone is not always accurate.

“Can we develop a technical set of tools to provide these breeders—to help them see the plant world in a new way, higher fidelity, more frequently, and more easily?” says Grant. “It's very tedious work going through the field and phenotyping plants.”

Here, the rover is counting flowers and buds on canola plants. X, the Moonshot Factory

Meanwhile, scientists are working rapidly to learn more about plants’ genes, or their genotype, and match these genetic traits with the plants’ physical traits, or their phenotype. In the world of agriculture, this missing information on how genes are linked to desired traits is known as the phenotyping bottleneck. Understanding how plant traits are expressed and combining them with available logs of genetic sequences might allow scientists to propagate more robust plants that are ready to face the challenges of climate change.

Bringing new strains of crops to market is time-consuming. With immense amounts of genetic and phenotype data to analyze, understanding how those genes express themselves through plant traits and environmental responses takes time.

“We can't really look at the genome and know which genes are responsible for drought tolerance, nitrogen deficiency or resistance to a particular disease, because we don't know what's happening in the field,” explains Chinmay Soman, co-founder and CEO of the agri-tech company EarthSense, which is working on similar rover technology. “So, it all starts with high throughput field phenotyping.”

More and more, computer vision is becoming a solution to the phenotyping bottleneck, because A.I. can derive plant information from a simple photograph. EarthSense's TerraSentia is a robust robot, small enough to fit in the trunk of a car and zip underneath a plant’s canopy, whereas Mineral's rover towers over crops, takes data from above, and needs a truck to transport it. Both are employing A.I. that could enable crop breeders to develop better varieties of crops more effectively and efficiently through capturing data on plant traits. Mineral’s rover takes thousands of photos every minute, which amounts to over a hundred million images in a single season.

Project Mineral’s rover has come a long way from its cobbled-together origins—but it is still a prototype. For all its tech, the Mineral team emphasizes that it is constantly improving the machine and working closely with agricultural experts to deepen its understanding of plants.

“You can think of the rover as being the current instantiation of that vision that we've designed for breeders, and we're learning with them,” says Grant.

In “Futures,” the prototype will be on display in the “Futures that Work” portion of the exhibit in the AIB's West Hall. This space was created to reflect on renewability and sustainability, and to showcase various innovations that may soon be available.

“We’re really pleased that we’re able to show something that’s still in a semi-finished prototypical phase,” says Ashley Molese, special projects curator for the Smithsonian’s Arts & Industries Building. “You know, it’s not necessarily like rolling out of machine factory floors just yet. But it’s beyond that stage of early prototyping, where there’s still a lot more kinks to work out.”

Behind the rover display, a video will show a fleet of Mineral rovers trundling through a field before cutting to footage of what the rover sees while it images strawberries, soybeans and cantaloupes.

“There's something that's slightly anthropomorphic about it in the ways that its cameras are sort of like eyes that look forward,” Molese says. “I'm very curious to see how visitors respond to it.”

Within the space, visitors can inspect Mineral's plant rover, imagine the future of food sustainability and security, and just like the Mineral team does, think about all the “what ifs.”

“What if that farmer could manage every single plant individually? What would that do to sustainability? What if you could detect disease, before it became visible? Or what if we could grow plants together in a way that was symbiotic and therefore needed fewer inputs, while having healthier plants? These are the things that get us up every day,” says Grant.
