
Guest Essay

A Plea for Making Virus Research Safer


Dr. Bloom studies virus evolution at the Fred Hutchinson Cancer Center in Seattle.

Viruses far more devastating than the coronavirus that causes Covid-19 have plagued humankind. Smallpox, for example, killed up to 30 percent of people it infected. Thanks to science, it’s now a plague of the past, with the last natural infection occurring in 1977.

But the last cases of smallpox came from a lab, when a British medical photographer was accidentally infected at the University of Birmingham Medical School in 1978. She died after transmitting the virus to her mother, but fortunately, it did not spread further. The respected virologist who headed the lab died by suicide, with colleagues saying he was hounded to death by journalists looking for someone to blame.

The year before, the world was swept by the 1977 influenza pandemic, caused by a previously extinct strain of influenza. While some have suggested this pandemic was triggered by a lab accident, many scientists (including me) think it most likely resulted from a misguided vaccine trial.

Repeats of those disasters seem unlikely. Research on the smallpox virus is now highly restricted, nobody would run vaccine trials with extinct influenza strains today, and lab safety has greatly improved since the 1970s.

But new scientific breakthroughs make it increasingly easy to identify dangerous viruses in nature, manipulate them in the lab and synthetically create them from genetic sequences. In just the past few weeks, scientists at Boston University reported making hybrids of variants of SARS-CoV-2, the virus that causes Covid-19, while the media reported on a proliferation of labs studying dangerous viruses. And so a debate rages: Is virology making us more or less safe?

I am a virologist who studies how mutations enable viruses to escape antibodies, resist drugs and bind to cells. I know virology has done much to advance public health. But a few aspects of modern virology can be a double-edged sword, and we need to promote beneficial, lifesaving research without creating new risks in the lab.

Viruses caused outbreaks long before labs existed. Throughout history, animal viruses have jumped into humans. Once viruses start spreading, they can evolve to become more transmissible or evade immunity, as we’ve seen with SARS-CoV-2.

Scientific research on these natural threats has immense benefits. Understanding of avian influenza has informed efforts to stop transmission in poultry, possibly preventing a pandemic. Covid-19 vaccines were based on studies of the spike protein of other coronaviruses. Scientists track how antibodies and vaccines work against new Covid variants using viruses that don’t replicate, called pseudoviruses, which pose no risk to humans.

None of this research requires a dangerous virus in the lab. It’s done by surveillance, studying parts of the virus or using pseudoviruses. But some experiments do require an actual virus, such as when testing drugs like Paxlovid. What if there is an accident? There are documented cases of scientists being infected with coronaviruses even in modern Biosafety Level 3 and 4 labs, which are used to study dangerous pathogens.

Most of the time these risks are quite low. A researcher is much more likely to acquire a virus at a grocery store than in a modern lab. Of course there is some risk — but we allow people to drive even though each year they have about a 1 in 10,000 chance of dying in a car accident.

But it’s different when research involves a virus that could plausibly spark a pandemic, which is the case when a virus can transmit from person to person and most people lack immunity. I would not be allowed to drive if my car accident could kill millions of people and cost trillions of dollars in economic losses. But potential pandemic viruses pose that risk.

Scientists mostly study viruses that lack pandemic potential. Sometimes they use “safe” viruses that are unable to infect humans or have been weakened in the lab. Or they study viruses that already circulate in humans and are unlikely to cause a pandemic if there’s an accident. In my view, the much-discussed Boston University study falls in this category because it combines two SARS-CoV-2 variants that recently circulated in humans.

However, scientists sometimes study viruses that have made isolated jumps from animals to people. For example, a new strain of avian influenza is currently infecting many birds and some mammals, like seals and foxes, but so far only a few humans. Virologists worry it could adapt to transmit in humans and spark a pandemic. As scientists, we want to test how well such viruses can infect human cells or evade countermeasures. But a lab accident could expose a scientist to such a virus.

I think that such studies should use the safer methods described above whenever possible, but exceptions can be made. For instance, people are already coming into contact with influenza-infected birds, and judicious research in a high-level biosafety lab can help assess the threat.

But some scientists have taken it further, adding “gain of function” mutations that make potential pandemic viruses more transmissible. The National Institutes of Health funded two research groups to increase the transmissibility of an earlier strain of avian influenza that had killed hundreds of people but could not efficiently spread from person to person. Both groups created viral mutants that could transmit in ferrets. The Obama administration was so alarmed that it halted gain-of-function work on potential pandemic influenza viruses in 2014, but the N.I.H. allowed it to restart by 2019.

In my view, there is no justification for intentionally making potential pandemic viruses more transmissible. The consequences of an accident could be too horrific, and such engineered viruses are not needed for vaccines anyway.

Natural viruses that haven’t yet infected humans can also pose a risk if researchers try to find the most dangerous ones and bring them back to the lab for experiments.

Suspicions about a lab-accident origin of SARS-CoV-2 have been fueled by the fact that the Wuhan Institute of Virology was involved in Chinese and international efforts to find and experiment with new high-risk coronaviruses. The W.I.V. says it did not perform experiments with viruses similar to SARS-CoV-2 before the Covid-19 pandemic. Even so, the pandemic shows just how dangerous these viruses are. The risk of an accident isn't outweighed by any concomitant benefit, because no one has explained how the pandemic would have been prevented if W.I.V. scientists had managed to experiment on such viruses beforehand. It also would not have helped with vaccines: Moderna designed its vaccine just two days after the release of the SARS-CoV-2 genetic sequence, without access to the actual virus.

A final category of pandemic risk involves viruses that used to transmit in humans but became extinct long ago — like the 1918 influenza virus. That virus was synthetically reconstructed and is now studied by a number of labs to understand why it was so deadly. Although this research is scientifically fascinating, I’ve come to think that experimenting on extinct pandemic viruses just isn’t worth the risk.

There are also some gray zones that don’t directly involve pandemic viruses, but deserve further discussion.

One gray zone is mutants of current human viruses that escape antibodies or drugs. Studying such mutants is essential in designing vaccines and is even part of the Food and Drug Administration’s review process for antiviral drugs. But scientists should avoid generating more mutations than would be expected to evolve naturally within a few years.

Another gray zone involves information. Advances in sequencing, computation and safe experiments enable increasingly good predictions of the effects of viral mutations. This information helps track evolution, update vaccines and design drugs. But it has become easy to transform information into actual viruses. What if someone uses information to design a well-intentioned but risky experiment, or even worse a bioweapon?

Accidents in well-intentioned research can be addressed by regulating risky experiments, but nefarious actors can't be regulated. The cat may be mostly out of the bag, since information about how to create several dangerous viruses is already in the public domain. However, we should control the most high-risk information (like how to create smallpox from synthetic DNA) while not disrupting the free flow of data on which science depends.

Overall, most virology research is safe and often beneficial. But experiments that pose pandemic risks should be stopped, and other areas require continued careful assessment. Several groups are developing frameworks for oversight and regulation.

But who should ultimately decide?

Some virologists think we should have the final say, since we’re the ones with technical expertise. I only partially agree. I’m a scientist. My dad is a scientist. My wife is a scientist. Most of my friends are scientists. I obviously think scientists are great. But we’re susceptible to the same professional and personal biases as anyone else and can lack a holistic view.

The French statesman Georges Clemenceau said, “War is too important to be left to the generals.” When it comes to regulating high-risk research on potential pandemic viruses, we similarly need a transparent and independent approach that involves virologists and the broader public that both funds and is affected by their work.

Jesse Bloom is a professor at the Fred Hutchinson Cancer Center and an investigator at the Howard Hughes Medical Institute.


