Physicians promise in the Hippocratic oath to keep their patients from harm, so intentionally exposing people to a deadly disease would seem to run counter to that contract. But with human challenge studies, they do exactly that. In challenge studies, medical professionals purposefully expose participants to an illness so that they can study the symptoms and immune response it provokes. Such studies can also help physicians discover which vaccines will prevent the affliction. Historically, these experiments have sacrificed the health of individual patients, usually willing volunteers but at times, horrifically, not, for the sake of medical knowledge and future treatments.
Researchers are planning new human challenge trials as the race to develop vaccines against Covid-19 reaches a full sprint, with Pfizer’s vaccine receiving authorization in several countries and Moderna’s not far behind. But the end of the pandemic won’t come from these two pharmaceutical breakthroughs alone. Fully containing the spread of Covid-19 may require many treatments and vaccines to inoculate billions of people. And some experts say that the fastest way to test those second-generation vaccines is through human challenge trials.
Imperial College London intends to begin a human challenge study related to Covid-19 as soon as January. During the study, scientists would purposely infect up to 100 young, healthy volunteers with the coronavirus that causes Covid-19 in the hopes of accelerating the search for new vaccines.
Supporters of the controversial Covid-19 human challenge trial argue that, if it can be done safely, it provides a uniquely controlled environment to study factors that are difficult to unravel in longer, large-scale Phase III trials of thousands of people. Critics say that challenge studies are either unnecessary, given the vaccine successes so far, or should be put on pause until they can be run safely. Critics also point out that safety is a concern even for young volunteers: scientists do not yet know how to treat Covid-19 or what its long-term effects are, and evidence presented by the World Health Organization in September showed that at least a fifth of people between 18 and 34 who catch Covid-19 experience prolonged symptoms.
The debate over such a contentious experiment is nothing new. Human challenge trials are as old as inoculation itself. In 1796, English surgeon Edward Jenner tested the world’s first vaccine by exposing his gardener’s 8-year-old son to cowpox and then smallpox. Human challenge trials have since been used to study dozens of diseases from cholera to cancer, but early studies often put participants directly in harm’s way, not always with their knowledge.
Today, challenge studies undergo careful review by boards of experts before they can begin. A key requirement of an ethical study is that volunteers provide informed consent, proving that they understand the risks of joining a study. The first informed consent process was introduced more than a century after Jenner’s human challenge study.
In 1898, as the U.S. warred with Spain in Cuba, yellow fever—which can cause liver damage, nausea, high fever and bleeding—killed 13 times more soldiers than war wounds. So in 1900, the U.S. Army established a commission led by pathologist Walter Reed to figure out how yellow fever spread and how to stop it. Because only humans seemed to fall ill with the disease, Reed and three colleagues on the commission designed a human challenge study to test a leading theory of yellow fever transmission: mosquito bites.
Reed recognized that if he was correct, then the study itself would be incredibly risky. The need to expose volunteers to a deadly disease would have to be weighed against the responsibility to keep those volunteers safe.
“The general that created the commission told Walter Reed… that he had to be absolutely sure that no harm would be caused to the volunteers,” says Enrique Chaves-Carballo, a historian of medicine at the University of Kansas. “He was pretty specific about that.”
To balance his superior’s order with the study’s inherent risk, the commission came up with a novel solution: the first informed consent contract. The commission created a document for volunteers to sign, stating that they understood the study’s risks. However, the form suggested that abstaining from the study was risky, too. The contract stated:
“The undersigned understands perfectly well that in the case of the development of yellow fever in him, that he endangers his life to a certain extent but it being entirely impossible for him to avoid the infection during his stay in the island, he prefers to take the chance of contracting it intentionally in the belief that he will receive from the said Commission the greatest care and the most skillful medical service.”
During the experiment, the scientists first allowed mosquitoes to bite yellow fever patients so the insects would pick up the disease. Then, they brought the mosquitoes to healthy volunteers, and allowed the mosquitoes to bite them. When volunteers fell ill, Reed scoured blood samples for the microbe causing their illness.
Those with yellow fever were prescribed complete bed rest and fasting except for “a few sips of champagne” and some pain medication, says Chaves-Carballo. Volunteers received a hefty payment of $100 in gold per mosquito bite, and another $100 if they fell ill.
In the first round of experiments, 11 volunteers received mosquito bites. Two fell ill and survived. The third man to fall ill, Jesse W. Lazear, was one of the scientists running the study. He was bitten by accident and died of yellow fever 12 days later.
Though Reed considered ending the study after the death of his colleague, the commission instead named a sanitary station Camp Lazear in his honor. And by 1901, Reed and the commission had shown through their mosquito bite experiments that the insects transmit yellow fever. Inoculation of more volunteers with yellow fever patients’ filtered blood samples showed that a virus causes the disease—making yellow fever the first human virus scientists discovered.
With the disease-causing culprit identified, Reed returned to George Washington University (then Columbian University) to teach, and other scientists picked up the search for a yellow fever vaccine. U.S. army physician William Gorgas and Cuban-born physician Juan Guiteras established an inoculation station for a new round of human challenge studies in Havana. They hoped to learn how to induce light cases of yellow fever with mosquito bites in order to give people immunity. More than 20 volunteers signed up for the first experimental inoculations in 1901, including the only woman to participate in the study, a military nurse named Clara Maass.
Maass was bitten five times without developing yellow fever, and received $100 to send home to her mother and nine siblings in New Jersey—a huge sum compared to her monthly pay of $30.
Her sixth mosquito bite proved fatal. She and two other volunteers were infected with a particularly violent strain of the virus—the doctors didn’t know how to induce just light cases—and all three died in August of 1901.
“Some of the headlines of the newspapers are like, ‘Nurse Dies for a Hundred Dollars,’” says Chaves-Carballo. “People responded to the fact that she was a young nurse who was trying her best to help her family.”
Public outcry in the U.S. brought the Havana experiments to an end. Maass’ death brought the study’s exorbitant pay under fire, as such a large incentive may have interfered with the participants’ ability to accurately weigh the risk of joining the study. The fact that the study was run by the U.S. Army, and Reed’s participants were members of the military, also brought into question the participants’ ability to freely opt out of the study, says Monica McArthur, pediatrician and infectious disease specialist at the University of Maryland School of Medicine’s Center for Vaccine Development and Global Health.
“In a lot of the studies early on, the Walter Reed experiment and other studies, used what we would now consider vulnerable populations,” people who couldn’t freely agree to participate or make a fully informed decision, says McArthur. “Prisoners, for example, could be enrolled in studies.”
A classic example of a study that relied on a vulnerable population is the Tuskegee Syphilis Study. Beginning in 1932, the U.S. Public Health Service recruited about 600 poor African American men from around Tuskegee, Alabama, for a study of how syphilis worsens over time. About two-thirds of the men had syphilis, but the study doctors told them only that they had “bad blood.”
After receiving this phony diagnosis, the men were persuaded to join the study in exchange for free meals, hospital access and treatment for “bad blood” and other unrelated conditions. The scientists also provided participants a burial stipend that would be paid to their survivors after their deaths.
Only about half of the men with syphilis received a treatment that was usually prescribed in the 1930s: doses of toxic arsenic and mercury. The doctors subjected the participants to blood draws and spinal taps, and after they died of syphilis, autopsies, all in pursuit of more information about the natural course of the disease. The study lasted for decades, and even after the medical community established in the 1940s that penicillin could cure the disease, the men did not receive the medication.
In 1972, journalist Jean Heller of the Associated Press brought the Tuskegee Syphilis Study to light and shared how the doctors involved in the study had deceived the men participating. By then, only 74 of the men with syphilis still survived. Public outrage shut the study down three months after the report.
While the Tuskegee Syphilis Study relied on participants who were already ill, other studies exposed otherwise healthy people to deadly diseases. For example, from 1955 to 1970, a pediatrician exposed more than 50 children with mental disabilities to hepatitis in order to identify different strains of the disease and eventually develop vaccines. The trial took place at Willowbrook State School, a home for children and adults with developmental disabilities in Staten Island, New York.
The school was overcrowded and had a lengthy waitlist for new patients. But the study’s principal investigator, Saul Krugman, offered several parents the opportunity to cut the line if they agreed to enroll their children in the study. Krugman told them that their children were likely to catch the disease at the facility anyway, but by joining the study, they would have access to cleaner facilities and a chance at an eventual vaccine.
“I did feel coerced,” said Diana McCourt, who enrolled her daughter in the Willowbrook study, to Forbes’ Leah Rosenbaum. “I felt like I was denied help unless I took this [opportunity].”
The Willowbrook studies, which ended in 1970, revealed the existence of the A and B strains of hepatitis and sped up the development of a hepatitis B vaccine. But the studies progressed even as some in the medical community criticized Krugman’s methods. In 1966, anesthesiologist Henry K. Beecher published a landmark essay detailing 22 examples of ongoing unethical research on human subjects, including the Willowbrook hepatitis studies. His aim was to raise awareness and end unethical practices that persisted despite the creation of international guidelines for human experimentation: the Nuremberg Code in 1947 and the Declaration of Helsinki in 1964.
In addition to the Willowbrook study, Beecher highlighted one study in which melanoma, a serious form of skin cancer, was transferred from a woman to her mother “in the hope of gaining a little better understanding of cancer immunity.” The woman died on the same day that her mother was to receive the melanoma injection, so the doctors knew the cancer was deadly. Her mother died 451 days after receiving the injection.
Beecher concluded that an ethical approach to experimentation requires, first and foremost, the informed consent of study volunteers. “The difficulty of obtaining this is discussed in detail,” he writes, “But it is absolutely essential to strive for it for moral, sociologic and legal reasons. The statement that consent has been obtained has little meaning unless the subject or his guardian is capable of understanding what is to be undertaken and unless all hazards are made clear.”
Human challenge studies became less common after the 1970s, as the unethical experiments that shocked the public came to an end. Since then, the Declaration of Helsinki has been amended seven times to clarify ethical standards for human experiments, most recently in October of 2013. The current declaration states that “While the primary purpose of medical research is to generate new knowledge, this goal can never take precedence over the rights and interests of individual research subjects.”
When run well, challenge studies are still uniquely able to provide clear data about infectious diseases. “They are now coming back in favor with very rigorous ethical principles in place,” adds McArthur.
The University of Maryland used human challenge studies in 2012 and 2013 to develop a vaccine for cholera, which was approved by the FDA in 2016. Cholera was an ideal candidate for a safe human challenge study because it is well understood by scientists, is reliably treatable with fluids and antibiotics, and has no long-term effects after the infection is gone.
Informed consent procedures have come a long way since Reed’s contract. Volunteers can ask questions and seek outside guidance, and must pass an assessment designed by the researchers to prove that they understand the risks of a study. And the volunteers have the power to quit. “Every time there’s an encounter with the volunteer, it’s reaffirming that the volunteer is still willing and able to participate,” says McArthur.
Volunteers’ safety is the number one priority, according to a statement by Imperial College London, which still needs government regulators to approve its experimental plan before researchers can begin recruiting participants. “It would be nice to see exactly how [Imperial College London] explains the risks and benefits to those participating in this study,” says Chaves-Carballo.
Covid-19 is different from other challenge study diseases: Scientists have been studying it for less than a year, physicians have no approved treatments to intervene if a volunteer’s illness becomes severe, and early evidence suggests Covid-19 can cause long-term effects even in young, previously healthy people. The Imperial College London study aims to first identify the minimum dose of coronavirus necessary to cause disease. The study would use that dose of virus to study how vaccines work in the body to prevent Covid-19, to look at potential treatments and study the immune response. The biomedical community remains split on whether such a study should be run, given all of the unknowns around Covid-19.
When scientists develop second- and third-generation vaccines, a challenge study allows researchers to work with just 100 people instead of tens of thousands. That means fewer people are asked to go without the vaccine for the sake of research. And by waiting to conduct a challenge study on Covid-19 until a later date, researchers might get access to new information about risk factors for severe disease, which could help make the study safer.
“I am not a fan of SARS-CoV-2 challenge studies,” says McArthur. “But if I’m playing devil’s advocate against myself, some of the very reasons [not to do a challenge study] that I listed might be reasons that someone else might say that a challenge study is beneficial. Because we don’t know that much about a disease, so we could learn more about it.”