Scientists Use AI to Decode the Ultrasonic Language of Rodents

The DeepSqueak software translates the high-pitched communication into sonograms, which can be analyzed to determine what mice and rats are saying

(Illustration: UW Medicine/Alice Gray)

Luckily for anyone with musophobia—a fear of mice and similar rodents—most of the sounds the little squeakers make while scurrying through the walls of our homes fall well outside the range of human hearing. But that’s not ideal for scientists, who would benefit from listening in on how lab rats and lab mice react to certain stimuli. Now they can. A new type of AI called DeepSqueak is able to decode mouse speak and help researchers match the vocalizations with behaviors.

Glenn McDonald at Seeker reports that scientists at the University of Washington came up with the software, which analyzes high-pitched, or ultrasonic, mouse vocalizations and turns them into sonograms, or visual representations of the sound. Machine-learning algorithms then analyze those sonograms for patterns that can be connected with behavior and emotion. The research appears in the journal Neuropsychopharmacology.
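For readers curious what turning sound into a sonogram looks like in practice, here is a minimal Python sketch using SciPy. The file name, frequency band and analysis settings are illustrative assumptions, not DeepSqueak’s actual code; it only shows the general idea of converting an ultrasonic recording into the kind of image a machine-learning model can then scan for call shapes.

```python
# A minimal sketch of the sonogram step, not DeepSqueak's actual implementation.
# The file name and analysis settings below are illustrative assumptions.
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

# Rodent ultrasonic calls sit roughly between 20 and 100 kHz, so the recording
# needs a sample rate of a few hundred kHz (hypothetical example file).
rate, audio = wavfile.read("usv_recording.wav")
if audio.ndim > 1:  # keep a single channel if the file is stereo
    audio = audio[:, 0]

# Build the sonogram: frequency bins down one axis, time frames along the
# other, with each cell holding the sound energy at that frequency and moment.
freqs, times, power = spectrogram(audio, fs=rate, nperseg=512, noverlap=256)

# Keep only the ultrasonic band where the calls live, and convert to decibels.
band = (freqs >= 20_000) & (freqs <= 100_000)
sonogram_db = 10 * np.log10(power[band] + 1e-12)

print(f"Sonogram shape: {sonogram_db.shape} (frequency bins x time frames)")
# A machine-learning model would then scan images like this one for the
# characteristic shapes of calls and sort them into categories.
```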

“As it turns out, rats and mice have this rich vocal communication, but it’s way, way above our hearing range…so it’s been really hard to detect and analyze these calls,” co-author Russell Marx says in a video. “So our software allows us to visualize all those calls, look at their shape and structure, play them back and categorize them.”

So far, the team has found the mice studied make about 20 different types of calls. But the aim of the mouse-translator isn’t just to understand what they are saying to each other. The lab studies the psychological aspects of drug addiction, and knowing which calls indicate positive and negative emotions can help the researchers understand what the animals are experiencing during experiments.

According to a press release, the animals make their happiest calls when they know they are going to be given a treat, like sugar, or when they are playing with one another. When male mice see or smell females, they begin to sing unique courtship songs. When the rodents are exposed to drugs that can be abused, they make both positive and negative calls, which, co-author Kevin Coffey says, demonstrates the complex nature of drug abuse. The software can also be used to understand the effects of medications.

“We are primarily interested in using DeepSqueak to improve our understanding of psychiatric disorders such as anxiety and depression. Vocalizations provide insight into the animal’s internal state that we can use to judge the efficacy of our treatments,” Coffey tells Elizabeth Doughman at Laboratory Equipment. “Basically, the animals can tell us directly how they are feeling. For another example, vocalizations might also be effective to track neurodegenerative disorders that affect speech, such as Alzheimer's disease.”

The team also realizes that listening in on lab rodents could be useful to other researchers and could help improve the well-being of the animals. That’s why they are releasing the software for free via GitHub.

DeepSqueak isn’t the first attempt to understand mouse speak, but it does make the process much more efficient. Just last month, Leslie Nemo at National Geographic reported that researchers studying the California mouse, Peromyscus californicus, which like most rodents communicates in the ultrasonic range, were able to determine which calls were angry and which were friendly, and even found that the monogamous mice “argue” after being separated and reunited.

Those researchers argue that studying mouse vocalizations is as important to understanding the little mammals as birdsong is to understanding our feathered friends, and that decoding their language will drastically change what we know about the social lives of the 1,300 rodent-like species on Earth. Perhaps DeepSqueak will also be a part of unlocking all those mousy mysteries.
