Mad Scientists: When the Scientist Becomes the Subject

Self-experimentation has a long and noble — and Nobel-winning — history in medical breakthroughs. Thanks to collaborative studies and improved technology, however, the practice has been largely relegated to late-night movies…just ask Dr. Jekyll!

Image copyright 20th Century Fox

As any scientist knows, getting a new drug, device, or procedure to the human trial stage requires a Herculean effort. It can take years — even decades — to overcome the associated regulatory hurdles. In the quest for discovery, some researchers decide to circumvent the rules and test their hypotheses on just one person: themselves.

While certainly more ethical than experimenting on unknowing or disenfranchised populations, making oneself the human subject can go wrong in many, many ways. Death, disease, and dismemberment are just a few of the obvious reasons that self-experimentation is a dangerous game.

However, numerous important medical advances over the centuries involved scientists experimenting on themselves. So where does the line fall between reckless insanity and daring genius?

Nobel Approach

Most Nobel Prize-winners in the category of Physiology or Medicine achieve their breakthroughs with research that involves lab animals — proving that self-experimentation is not needed to attain great discoveries. Only occasionally has a Nobel laureate turned to self-experimentation as a last resort.

Ralph Steinman, MD, felt he had nothing to lose after he was diagnosed with pancreatic cancer and told that he had only a 5% chance of living longer than a year. He plunged into eight different experimental therapies based on his research on dendritic cells.

With the help of colleagues and traditional chemotherapy, he lived four and a half more years. Steinman died three days before the Nobel Committee awarded him the 2011 prize for his discovery of the dendritic cell type.

Barry Marshall, AC, an Australian physician and 2005 Nobel Prize-winner, proved his theory to the scientific world through a self-inflicted infection that he was confident could be cured. He and a colleague, Robin Warren, AC, had strong evidence that a bacterial strain, which they dubbed Helicobacter pylori, was at the root of most peptic ulcers and gastritis.

Facing skepticism from the medical community and a lack of suitable animal models, Marshall chugged a cocktail of the bacteria. He was diagnosed with gastritis within a week, but eliminated his symptoms with a round of antibiotics — offering the first effective cure for stomach ulcers and gastritis.

The risks taken by Steinman and Marshall may seem rational, given their specific circumstances. But several decades earlier, Werner Forssmann, a German urologist, conducted a procedure on himself that could be called crazy.

Forssmann developed the first method of cardiac catheterization, which requires the insertion of a catheter into the heart to assess whether surgery is necessary. He wanted to replicate the procedure in a human after it had been successfully performed in a horse in 1861. After his request was denied because of the danger involved, Forssmann forged ahead with himself as the patient.

He anesthetized his own arm and pushed the catheter 30 cm into his vein, then walked up two flights of stairs to the X-ray room before inserting it the rest of the way into his heart. His work went relatively unrecognized until André Frédéric Cournand, MD, and Dickinson W. Richards, MD, refined his technique. They subsequently shared the Nobel Prize with him in 1956.

Changing Times

In addition to several Nobel Prizes, self-experimentation has played a role in numerous landmark findings, including the mosquito-borne nature of yellow fever, the development of ibuprofen and anesthesia medications, asthma remedies, the link between hygiene and diseases like cholera, and the ABO blood group system — just to name a few.

A review in the Texas Heart Institute Journal examined 465 documented cases of self-experimentation in medicine over the past two centuries to see what trends emerged. Most took place in the first half of the 20th century, and eight of them ended in the experimenter's death.

The author noted that quite a few of these scientists went on to enjoy successful careers. He highlighted the importance of their contributions to modern medicine, but found that self-experimentation has been in steady decline despite such accomplishments.

This decline is likely due to improved alternative research options, better technology, and the growing prevalence of large collaborative studies. With these resources, scientists are less likely to find themselves resorting to desperate measures like self-experimentation.

“The trend in recent years toward collaborative studies, often on a massive scale, makes self-experimentation by a single individual, tucked away in his laboratory, seem almost quaint, a relic of the past,” the author explains.

Governing bodies have been reluctant to take a firm stance for or against self-experimentation. The positive results yielded in the past make it difficult to condemn the practice. Simultaneously, the enormous risks involved make it impossible to endorse.

The U.S. Food and Drug Administration does not officially recognize a difference between self-experimentation and any other human trial. Institutional review board approval and other safeguards are required before a scientist can include themselves in their own research. This means that investigators can volunteer for their study like any other qualifying participant, provided that the research proposal gets the green light for human subjects.

Self-experimentation is likely to remain an ethical gray area in science. Some would argue that a researcher is ultimately responsible for whatever tests he or she chooses to endure, while others would say that there is an inherent bias when including oneself in an experiment and that the results are likely to be unreliable.

Although research is trending away from self-experimentation, medicine has been undeniably bolstered by the gutsy — and perhaps mad — undertakings of such scientists.

Mapes is a Washington D.C.-based freelance writer and a regular contributor to Endocrine News. She wrote about the human epigenome in the August issue.
