Listen Live to the Total Solar Eclipse, Transformed Into a Real-Time Musical Composition

A composer based at San Francisco’s Exploratorium museum will use data coming from the eclipsed sun to create an out-of-this-world “sonification” on April 8

Totality during a total solar eclipse, with the moon fully covering the sun and the corona visible
The solar eclipse’s path of totality stretches across North America in a roughly 115-mile-wide band, from Mexico to Canada. NASA / Nat Gopalswamy

On Monday, April 8, a total solar eclipse will darken skies in the contiguous United States for the last time until 2044. The moon’s shadow will sweep across nearly 32 million U.S. residents, and an additional 1.8 million to 7.4 million people will travel to the path of totality and look up.

But the Exploratorium—a San Francisco museum of science, technology and the arts—doesn’t just want you to watch the eclipse. It wants you to listen.

While the eclipse is underway, Wayne Grim, a musician and composer based at the museum, will create a live sonification, or translation into sound, of the spectacular celestial event. Grim and a team of Exploratorium scientists and media specialists will be in Junction, Texas, to capture footage of the eclipse through telescopes. The live feed will be relayed to Grim’s sonification software, where it will be transformed into music in real time, then played on a live stream.

“It’s just this beautiful and magical moment… I really think about the sonification as a wonderful way to experience the phenomenon of an eclipse with another one of our senses,” Julie Yu, the Exploratorium’s principal scientist, tells Smithsonian magazine.

The museum’s April 8 programming begins at 10:00 a.m. PDT and includes four streams: live coverage in both English and Spanish; a real-time telescope feed from Torreón, Mexico; and the musical sonification, which will accompany the visuals streamed from Junction. Yu and Desiré Whitmore, the museum’s senior physics educator, are set to host the English live coverage.

Sonification is the process of collecting data, perhaps from an image or video, and turning it into sound. The invention of the Geiger counter in 1908 marked one of the first sonifications in history—the device emits clicking sounds when it detects radioactivity. Since then, sonification technology has become a popular tool for making astronomy more accessible; NASA and the Smithsonian Astrophysical Observatory have an extensive online sonification library called A Universe of Sound.

“Data sonification is analogous to data visualization, but instead of perceptualizing data in the visual realm, it perceptualizes data in the sonic realm,” Ellwood Colahan, a music and performing arts reference librarian at the University of Denver, wrote in a 2023 article for Humanities Commons.
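To make that analogy concrete, here is a toy sketch, not anything from the Exploratorium’s software: the same handful of made-up numbers can be “perceptualized” either visually, as bars, or sonically, as pitches. All of the values and ranges below are invented for illustration.

    # A minimal sketch of the analogy: one data series, two ways to perceive it.
    # The data values and the 220-880 Hz pitch range are made up for illustration.
    data = [3, 7, 2, 9, 5]

    # Visual: scale each value to a bar of asterisks.
    for value in data:
        print("*" * value)

    # Sonic: scale each value to a frequency between 220 Hz and 880 Hz.
    lo, hi = min(data), max(data)
    pitches = [220 + (value - lo) / (hi - lo) * (880 - 220) for value in data]
    print([f"{p:.0f} Hz" for p in pitches])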

The Exploratorium also streamed live coverage and sonifications of the total solar eclipses in 2016 and 2017, as well as the annular eclipse in 2023. The 2017 sonification featured a live collaboration with the San Francisco-based Kronos Quartet, in which the string group performed a score composed by Grim that allowed the instrumentalists to improvise while they watched a telescope feed of the eclipse.

Live Telescope View of Annular Eclipse w/Sonification | Valley of the Gods, UT | 10/14/23 | 8am PDT

To create the sonifications, Grim uses custom-made computer software that analyzes the light in telescope video feeds and turns it into data points. Each bit of information, from the light’s color to its intensity, is mapped to a musical parameter such as pitch, volume, rhythm, tempo or an audio filter, then played using the sounds of 10 to 15 instruments. The parameters and instruments are pre-assigned to all possible data values the telescope could pick up.

“99.9 percent of the [sonification] work is in the building and programming. And then, when the eclipse happens, all I really do is mix in and mix out some of the sounds. … But it’s all being created live through these different synthesis modules or oscillators that I built,” Grim tells Smithsonian magazine.
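As a rough illustration of that kind of mapping, the short Python sketch below converts a video frame’s overall brightness and color into a pitch and a volume for a single sine-wave voice. It is not the Exploratorium’s code; the frame values, the mapping ranges and the one-voice “instrument” are stand-ins for the custom synthesis modules Grim describes.

    # Hypothetical sketch of mapping light data to musical parameters.
    # The frame source, ranges and sine-wave "instrument" are illustrative only.
    import numpy as np

    SAMPLE_RATE = 44_100  # audio samples per second

    def frame_to_parameters(frame: np.ndarray) -> tuple[float, float]:
        """Map an RGB video frame (H x W x 3, values 0-255) to pitch and volume."""
        brightness = frame.mean() / 255.0        # overall light intensity, 0-1
        redness = frame[..., 0].mean() / 255.0   # crude stand-in for color
        pitch_hz = 110.0 * 2 ** (4 * redness)    # map color onto a four-octave range
        volume = 0.1 + 0.9 * brightness          # brighter image -> louder voice
        return pitch_hz, volume

    def render_tone(pitch_hz: float, volume: float, seconds: float = 0.5) -> np.ndarray:
        """Synthesize a short sine tone for one 'instrument' voice."""
        t = np.linspace(0, seconds, int(SAMPLE_RATE * seconds), endpoint=False)
        return volume * np.sin(2 * np.pi * pitch_hz * t)

    # Example: a dark frame near totality versus a bright, partially eclipsed frame.
    dark_frame = np.full((480, 640, 3), 20, dtype=np.uint8)
    bright_frame = np.full((480, 640, 3), 200, dtype=np.uint8)
    for frame in (dark_frame, bright_frame):
        pitch, vol = frame_to_parameters(frame)
        print(f"pitch={pitch:.1f} Hz, volume={vol:.2f}, samples={render_tone(pitch, vol).size}")

In the real system, many such mappings run in parallel across the 10 to 15 instrument voices, and Grim mixes them in and out as the eclipse progresses.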

Monday’s real-time piece will use the five phases of the eclipse as compositional markers to create a more musical structure—guiding listeners through the start of the partial eclipse, the beginning of the total eclipse, the moment of totality or maximum eclipse, the end of the total eclipse and finally, the end of the partial eclipse.
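One simple way to picture those markers, again as an invented sketch rather than Grim’s actual score: each phase can switch the piece into a new section with its own settings. The contact times and section parameters below are placeholders, not the real timings for Junction, Texas.

    # Hypothetical use of the five eclipse phases as compositional markers.
    # Times and section settings are placeholders for illustration only.
    from datetime import datetime

    PHASES = [  # (phase name, start time, tempo in BPM, active voices)
        ("partial eclipse begins", datetime(2024, 4, 8, 12, 14), 60, 4),
        ("total eclipse begins",   datetime(2024, 4, 8, 13, 30), 48, 8),
        ("maximum eclipse",        datetime(2024, 4, 8, 13, 32), 40, 12),
        ("total eclipse ends",     datetime(2024, 4, 8, 13, 34), 48, 8),
        ("partial eclipse ends",   datetime(2024, 4, 8, 14, 50), 60, 4),
    ]

    def current_section(now: datetime):
        """Return the settings for whichever phase has most recently begun."""
        active = None
        for name, start, tempo, voices in PHASES:
            if now >= start:
                active = (name, tempo, voices)
        return active

    print(current_section(datetime(2024, 4, 8, 13, 31)))  # -> total-eclipse section settings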

While the sonification will primarily react to the moon covering the sun, it will also incorporate elements representing Earth and the other planets that can be seen during the eclipse. According to NASA, observers beneath clear skies in the path of totality will be able to spot Jupiter, Venus, Saturn and Mars as the sun is blocked.

2024 Eclipse Livestream | Telescope View from Junction, Texas with Sonification | Exploratorium

The ability to listen to the eclipse also creates an opportunity for those who are blind or visually impaired to experience the celestial event.

“It sort of creates a picture in your mind while you’re listening to it, even though it is more artistic than a scientific one-to-one,” Grim says. “I have had a few people who say that they aren’t able to see what’s happening, and this has been a valuable experience to have.”

On Monday, approximately 99 percent of Americans will be able to observe at least a partial eclipse, according to NASA. And the live sonification will bring the otherworldly experience to even more people.

“We really believe that experiencing the natural phenomenon of our world, and then trying to understand it, is the foundation of science learning. The ways that actually experiencing things helps you understand the world can’t be substituted with a textbook,” Yu says. “Total solar eclipses actually won’t happen half a billion years from now. So, we should catch them now, while they do.”
