Elaine Chew was sitting on her hospital bed, awaiting a cardiac procedure that would fix her heart’s irregular rhythm, when the doctor came to take down her details.
“What do you do for work?” he asked her.
She explained that she was a professor of digital media who focused on music cognition and music information research.
The doctor was pleased. He didn’t have a music background himself, he said, but at the previous year’s cardiologists’ holiday party he’d made a quiz pairing songs with different arrhythmias and had colleagues guess which arrhythmia each song represented.
He explained that he’d simply found music that matched the tempo—the speed—of different irregular heartbeat patterns. Tachycardia—a very fast heartbeat—was represented by super-speedy techno music.
“I was thinking, ‘oh, we can do a lot better than that,’” says Chew, who works at the Centre for Digital Music at Queen Mary University of London.
Her four-hour procedure—an ablation that used a catheter threaded from a blood vessel to her heart to freeze the electrical pathways that were causing the irregular rhythm—gave her plenty of time to think. She was awake the whole time.
“I would have loved to see the whole operation, except my view was blocked, so my head was free to wander,” Chew says. “When I could sit up again, I had this plan that I would do this research project. It was all mapped out.”
The project involved taking the recorded data of heartbeats—Chew’s own and others’—and turning them into music that goes far beyond simply matching the tempo. Looking at things like rhythmic patterns, the duration of beats and silences, and tempo modulations (the changing of beat rate from one value to another), she and her students transcribed the heartbeats into musical notation.
She describes the experience as being like an ethnomusicologist transcribing unfamiliar music in a way that can be shared.
“In music, the whole point of the notation is that you want to encode the information so that it can be reproduced,” she says.
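The core idea of that transcription step—mapping the time between heartbeats onto standard note durations relative to a reference tempo—can be sketched in a few lines of code. This is a hypothetical illustration, not Chew’s actual method: the reference tempo, the interval data, and the quantization grid here are all invented for the example.

```python
# Hypothetical sketch: mapping inter-beat (RR) intervals onto standard
# note values, in the spirit of notating a heartbeat's rhythm.
# The reference tempo and quantization grid are illustrative assumptions.

REFERENCE_BEAT_S = 0.8  # assume one quarter note lasts 0.8 s (75 BPM)

# Candidate note values, expressed as multiples of a quarter note.
NOTE_VALUES = {
    0.5: "eighth",
    0.75: "dotted eighth",
    1.0: "quarter",
    1.5: "dotted quarter",
    2.0: "half",
}

def transcribe(rr_intervals_s):
    """Map each inter-beat interval to the closest standard note value."""
    notes = []
    for rr in rr_intervals_s:
        beats = rr / REFERENCE_BEAT_S  # interval measured in quarter notes
        closest = min(NOTE_VALUES, key=lambda v: abs(v - beats))
        notes.append(NOTE_VALUES[closest])
    return notes

# An irregular rhythm: a steady beat, a long pause, then rapid beats.
print(transcribe([0.8, 1.6, 0.4, 0.4, 0.8]))
# → ['quarter', 'half', 'eighth', 'eighth', 'quarter']
```

A real transcription would also have to track tempo modulations over time rather than assume a fixed reference beat, which is part of what makes the notation Chew describes richer than a simple tempo match.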
Chew then made a set of piano pieces, which she calls “The Arrhythmia Suite,” by modifying existing music to match the heartbeat data. The pieces are prettier and happier than one might expect.
“The music shouldn’t be sad or ominous,” Chew says. “Apart from the symptoms [of arrhythmias]—not being able to walk or fainting—the rhythms themselves are actually very interesting.”
The music may also serve a medical purpose. Chew and the cardiologists and researchers she collaborated with hope that turning arrhythmias into music could be a way of understanding the condition more deeply. There are many types of arrhythmias, some of which have different subtypes that are not completely understood. One subtype of an arrhythmia may be more dangerous than another, one may be more amenable to ablation and so on.
Capturing the richer details of an arrhythmia, the way Chew did in her musical notation, could help doctors better understand different patterns and subtypes.
“This could enable us to target treatments in a more personalized manner by identifying patients at different stages of the disease to decide if they are better to be treated with drugs or a cardiac procedure,” says Pier Lambiase, a cardiologist specializing in heart rhythm disorders who helped Chew source her data.
Cardiac arrhythmias are a common medical problem, affecting millions of Americans every year. Some are merely annoying, while others can be deadly.
Chew also hopes music could be a way of explaining the sensation of arrhythmia to friends or family members who have never had the condition themselves.
“We have a way of conveying to people who may not have experienced this what it feels like,” she says.