This is a perfectly reasonable point to bring up, and I agree that we should critically consider whether policy and regulation in the field are adequate. I want to emphasize some ways that high-risk biological research differs from finance, nuclear weapons, and money laundering.
First, people don’t do gain of function research (or whatever we ought to call it) for profit, so imposing gigantic fines, the threat of jail time, and constant severe scrutiny would be tantamount to banning it outright. By contrast, private companies are pursuing profits when they build nuclear weapons. Medicine is, of course, heavily regulated, and once again it is the profit motive that allows the industry to thrive even in such a heavily regulated context.
Soldiers operating and maintaining nuclear weapons have given permission for the military to exert extremely intrusive control over their activities. Some of the best and brightest scientists worked for the military as an act of patriotic service to build the nuclear bomb during WWII. However, the Manhattan Project was aimed at a specific engineering outcome, while GoF research would be an ongoing effort with no “definition of done,” and it might be hard to convince an adequate number of high-quality scientists to accept such strict controls if they lasted for their entire careers.
Money laundering is a crime, so it is not “regulated” but policed. Nobody but terrorists would do gain of function research if it was illegal.
For a person who’d like to see gain of function research banned, any move to regulate it and punish violations would be a step in the right direction. However, those who’d like to enforce responsible behavior, perhaps by using regulations on par with those you describe, have to explain how they’d motivate already-beleaguered scientists to do GoF research when their proposal is “even more stick, still no carrot.”
I’m curious to know whether and to what extent we’ve considered ways to reward basic science researchers for making pandemic-mitigating discoveries in a public health context. Is there a way we can reward people for achieving the maximum public health benefit with the minimum risk in their research?
> Money laundering is a crime, so it is not “regulated” but policed. Nobody but terrorists would do gain of function research if it was illegal.
Money laundering is a crime, one which the government primarily combats using AML (Anti-Money Laundering) regulations. These apply to essentially all financial companies, and require them to evaluate their clients and counterparties for money laundering risk. If a bank facilitates money laundering it can be punished for this even if it didn’t want to do it; all that is required for an AML violation is that the bank didn’t try hard enough to avoid accidentally helping launder money. These regulations are a big deal, with a very large number of people employed in their enforcement and large fines for violations. It is much easier to fine a bank than it is to fine the underlying money launderer. AML rules are effectively a way for governments, which are not really capable of catching money laundering themselves, to outsource the expense and difficulty.
The equivalent here would be if medical equipment companies, reagent companies, and lab animal companies had to do due diligence on researchers, and were liable if they lacked sufficient processes to ensure they didn’t sell supplies to researchers who performed dangerous experiments. As with AML, this allows the government to impose large fines on for-profit companies (which it can easily force to pay) rather than on smaller, potentially judgment-proof labs, while still achieving much the same goal.
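To make the strict-liability mechanism concrete, here is a toy sketch of what supplier-side due diligence could look like. Every field name, rule, and threshold below is my own invention for illustration, not any real compliance regime:

```python
# Toy sketch of AML-style strict liability applied to lab suppliers.
# All fields, rules, and thresholds here are hypothetical illustrations.

def supplier_risk_score(customer: dict) -> int:
    """Score a would-be purchaser of lab supplies; higher = riskier."""
    score = 0
    if not customer.get("institutional_affiliation"):
        score += 2  # no verifiable home institution
    if customer.get("biosafety_level_certified", 0) < customer.get("biosafety_level_required", 0):
        score += 3  # facility can't contain what it is ordering for
    if customer.get("orders_dual_use_reagents"):
        score += 1  # dual-use materials attract extra scrutiny
    return score

def may_sell(customer: dict, threshold: int = 3) -> bool:
    # Under strict liability the supplier must refuse when in doubt:
    # it is punished for insufficient diligence, not for intent.
    return supplier_risk_score(customer) < threshold
```

The key design point, mirroring AML, is that `may_sell` never needs to prove wrongdoing; the supplier is on the hook simply for failing to run checks like these.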
That’s a helpful reframing, thank you. I think there is still a disconnect between the two cases, however. As money laundering is a crime, companies have a relatively simple task before them: to identify and eliminate money laundering.
By contrast, GoF research is not a crime, and the objective, from a “responsibly pro-GoF” point of view, is to improve the risk/reward ratio to an acceptable level. A company would likely be highly conservative in making these judgments, as it would capture none of the benefits of successful and informative GoF research, but would be punished for allowing overly risky or failed GoF research to go forward. In other words, companies would likely refuse to sell to GoF researchers entirely in order to minimize or eliminate their risk.
The problem is even more acute if the work of evaluating GoF research were foisted onto companies. Scientists might be motivated by curiosity, altruism, or a desire for scientific credit, so there is at least some reward to be had even if GoF research were much more stringently regulated. By contrast, regulating companies in the manner you propose would come with no incentive whatsoever for companies to sell to GoF researchers, thus effectively banning the research.
I think the analogy actually extends even here! What exactly constitutes money laundering is not always black and white, and financial firms do not have anything like certainty about whether any given person, entity, or transaction is guilty. Instead they adopt a series of rules that, on average, reduce money laundering, but not to zero, and there are false positives. These especially affect low-income people, immigrants, those with little documentation, and people with unusual situations. AML rules directly contribute to the problem of people being unbanked (lacking access to the formal financial system, being reliant on cheque cashers, etc.); the government knows this and accepts it as a necessary cost.
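The tradeoff described above can be sketched as a toy model: crude rules that catch some illicit activity on average while also blocking legitimate customers. The rules and transaction records below are invented for illustration:

```python
# Toy model of rule-based screening: rules reduce illicit activity on
# average, but also block some legitimate customers (false positives).
# The rules and transactions are invented purely for illustration.

def flagged(txn: dict) -> bool:
    """Hypothetical screening rules, deliberately crude."""
    return txn["amount"] >= 10_000 or not txn["documented"]

def screen(transactions: list) -> dict:
    caught = sum(1 for t in transactions if t["illicit"] and flagged(t))
    false_pos = sum(1 for t in transactions if not t["illicit"] and flagged(t))
    return {"caught": caught, "false_positives": false_pos}

txns = [
    {"amount": 15_000, "documented": True,  "illicit": True},   # caught by the amount rule
    {"amount": 4_000,  "documented": True,  "illicit": True},   # slips through
    {"amount": 2_000,  "documented": False, "illicit": False},  # poorly documented: false positive
    {"amount": 500,    "documented": True,  "illicit": False},  # passes cleanly
]
```

The rules reduce laundering (one of two illicit transactions is caught) but never to zero, and the false positive is exactly the under-documented customer, mirroring how AML burdens fall on people with unusual situations.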
Similarly, I would imagine that not all GoF research would be illegal—but some would, and governments could deputize firms to help to differentiate. This would disrupt some legitimate researchers but could be generally regarded by policymakers as an acceptable price to pay.
Clearly there are some disanalogies. There are many fewer biomedical researchers than money transfers, which makes in-depth evaluation of each one more viable. And as you noted, the benefit of research (financial and otherwise) is more distant from the people undertaking it. I’m not trying to make a strong claim that this is a particularly good model for GoF regulation; I’m just noting that I think researchers don’t realize quite how unregulated they are relative to other industries.
It’s important to keep in mind that while money laundering is typically carried out by profit-seeking criminals who take advantage of complex financial transactions to hide their illegal activities, GoF research is not driven by financial gain. Therefore, we need to consider the unique nature of GoF research when assessing the need for regulation.
It’s not just a matter of how much regulation is in place, but also about finding a balance between the pressures to engage in the research and a regulatory framework that effectively manages any potential risks. If there’s an inadequate regulatory apparatus in place relative to the pressures to participate, then the field is “underregulated.” Conversely, if there’s too much regulation, the field may be at risk of becoming “overregulated.”
Given the significant risks associated with GoF research, it requires a high level of regulation compared to other public service research areas that have similarly limited pressures to participate. However, because profit is not a driving force, the field can only tolerate a certain amount of regulation before participation becomes difficult.
Rather than focusing on increasing regulation dramatically or maintaining the status quo, we should look to refine and improve regulation for GoF research. While some scope exists to tighten regulations, excessive regulation could stifle the field altogether, which may or may not be desirable. If we wish the field to continue while enhancing the risk-benefit ratio, our focus should be on regulating the field proportionately to the pressures to participate.
It’s time to shift the discussion from “how regulated is the field” to “how regulated is the field relative to the pressures to participate.” By doing so, we can strike a balance between promoting the field’s progress and ensuring appropriate risk management.
The international community funded a database of coronaviruses that was held by the lab in Wuhan. In September 2019, the month the Chinese military took over the lab, that database was taken offline.
If that database had been important for pandemic prevention and vaccine development, I would have expected the virologists to write op-eds publicly calling on China to release the data. That they didn’t is a clear statement about how useful they think that data is for pandemic prevention, and about how afraid they are of people looking critically at the Wuhan Institute of Virology.
> I’m curious to know whether and to what extent we’ve considered ways to reward basic science researchers for making pandemic-mitigating discoveries in a public health context.
The virologists seemed to ignore the basic science questions such as “How do these viruses spread?” and “Are they airborne?” that actually mattered.
Asking those questions would mean doing more biomedical research that isn’t gain of function and loss of function.
> have to explain how they’d motivate already-beleaguered scientists to do GoF research when their proposal is “even more stick, still no carrot.”
That assumes it’s important to motivate them to do GoF research. It seems that GoF research served for them as a distraction from doing the research that was actually relevant.
> If that database had been important for pandemic prevention and vaccine development, I would have expected the virologists to write op-eds publicly calling on China to release the data. That they didn’t is a clear statement about how useful they think that data is for pandemic prevention, and about how afraid they are of people looking critically at the Wuhan Institute of Virology.
Are you sure that virologists didn’t write such op-eds?
> The virologists seemed to ignore the basic science questions such as “How do these viruses spread?” and “Are they airborne?” that actually mattered.
My understanding is that in the US, they actually studied these questions hard and knew about things like airborne transmission and asymptomatic spread pretty early on, but were suppressed by the Trump administration. That doesn’t excuse them (they ought to have grown a spine!), but it’s important to identify the cause of failure accurately so that we can work on the right problem.
> Are you sure that virologists didn’t write such op-eds?
Pretty much; when I googled the takedown of the database, I found no such op-eds. If you have any evidence to the contrary, I would love to see it.
Writing about how wrong it was to take down the database would point to the fact that the early lab-leak denial was bullshit, and the virologists didn’t want anybody finding out that the arguments they made were bullshit.
Jeremy Farrar describes in his book that one of the key arguments they used to reject the lab-leak theory was the huge distance from the openly published sequences to the COVID-19 sequence. That argument becomes a lot weaker once you factor in that the military took over the lab in September 2019 and the database was taken down that same month.
The virologists cared more about keeping the public uninformed about what happened at the Wuhan Institute of Virology than about the database being available to help fight the pandemic.
> My understanding is that in the US, they actually studied these questions hard and knew about things like airborne transmission and asymptomatic spread pretty early on, but were suppressed by the Trump administration.
Knowing that airborne transmission matters has consequences for what actions you want to take.
When the Japanese health authorities advised, at the beginning of the pandemic, avoiding closed spaces with poor ventilation, US and EU authorities gave no such advice.
I find it pretty unlikely that the reason Fauci et al. didn’t give the same advice to avoid closed spaces that the Japanese authorities gave is that the Trump administration didn’t want people told to avoid closed spaces but preferred the advice of telling people to wash their hands.
One of the corollaries of “avoid closed spaces with poor ventilation” is that forbidding people from meeting each other outside is bad policy.
The 1.5-meter distancing recommendation makes little sense given airborne spread, but it was quite central to pandemic guidance.
There’s some research suggesting that flu transmission in schools can be reduced by controlling humidity levels. There’s a good chance that controlling indoor humidity could also reduce COVID-19 transmission, but the virologists didn’t care enough to do the basic research needed to establish that and get a policy in place requiring humidity control in all public buildings.
There was no ramp-up of indoor ventilation equipment production at the start of the pandemic, but that would have been the reasonable step if the problem had been seen as one of airborne transmission.
The WHO took two years to acknowledge airborne transmission. If the virologist community had done its job, it would have explained to the WHO early on that it had to acknowledge airborne transmission or be branded by the virologists as science deniers.