Thanks very much for writing this. Unnecessarily alienating people for stupid reasons like misusing terminology or not understanding their perspective seems like a clear mistake and hopefully this can help people avoid these pitfalls.
I was curious about your suggestion that a lot of researchers think that basically all biomedical research is gain/loss of function. My impression is that one of the key defenses that the Fauci/NIH/EcoHealth/etc. offered for their research in Wuhan was that it was technically not Gain of Function, even if some parts of it might sound like Gain of Function to the layperson, which seems in tension with this claim. Do you think they were wrong about this?
Your point about many researchers in this area feeling defensive was a good one, and it’s important to take this into account when talking to people. It’s not fun to be blamed for causing a global crisis, especially if it’s possible you didn’t do it!
I do think there’s a risk of a missing mood, however. Speaking as an outsider, the amount of regulation on what you refer to as ePPP (adding functionality to make diseases more dangerous) seems shockingly low. The article you link to tries to make it sound like there are a lot of safeguards, but it seems to me like virtually all the steps only apply if you are seeking federal funding. This is not a standard we accept in other areas! If you are making a car, or building a nuclear power plant, or running a bank or airline, you have to accept extremely intrusive regulation regardless of your funding, and for many things—like nuclear weapons or money laundering—US regulation has world-wide reach.
For comparison, here are some of the regulatory approaches used in finance, another industry that was widely blamed for causing a global crisis:
Monitoring of all electronic communications.
Quarterly reporting of activities to regulators.
Regulators placed inside your organization.
Risk Management departments at every firm, making up a non-trivial fraction of their total size.
Compliance departments at every firm, making up a non-trivial fraction of their total size.
Mandatory compliance training for all employees.
Novel regulatory bodies with broad mandate to find and punish things that seem bad.
AML & KYC rules that impose stringent penalties for doing business with suspicious actors.
Mandatory insurance.
Direct government control over key areas.
‘Voluntary’ advice from regulators (100% adoption due to legal risks of disobedience).
Multi-million-dollar whistleblower awards.
Billion-dollar fines.
Decades long jail sentences.
Lifetime bans from the industry.
… and most importantly, if you mess up, you will probably lose a lot of money.
As far as I can tell, few if any of these approaches are imposed on Gain of Function research, despite its potential to directly cause the death of all of humanity.
Hi, thanks for the comments! Some broad thoughts in response:
Re
My impression is that one of the key defenses that the Fauci/NIH/EcoHealth/etc. offered for their research in Wuhan was that it was technically not Gain of Function, even if some parts of it might sound like Gain of Function to the layperson, which seems in tension with this claim. Do you think they were wrong about this?
It’s hard for me to go into detail on this on a public platform (just to be cautious about my job), but I can broadly say that there’s a difference between research that is a) gaining a function, b) gain-of-function as defined by informal norms in the biomedical community, and c) formally DURC/GoF research as defined by U.S. government policy. The EcoHealth grants confusingly do or do not count as GoF depending on how GoF is defined.
Re
Speaking as an outsider, the amount of regulation on what you refer to as ePPP (adding functionality to make diseases more dangerous) seems shockingly low. The article you link to tries to make it sound like there are a lot of safeguards, but it seems to me like virtually all the steps only apply if you are seeking federal funding. This is not a standard we accept in other areas! If you are making a car, or building a nuclear power plant, or running a bank or airline, you have to accept extremely intrusive regulation regardless of your funding, and for many things—like nuclear weapons or money laundering—US regulation has world-wide reach.
I fully agree! I think there are many concrete needs in this space, including legal regulation of DURC/ePPP/GoF research, in the U.S. particularly but also in every country that practices such research. Achieving such regulation requires a ton of work and consensus-building; crafting constructive regulation that captures the risk without alienating or shutting down an entire research field is tough, and that’s part of the nuance I think we as a community need to work through.
This is a perfectly reasonable point to bring up, and I agree that we should critically consider whether or not policy and regulation in the field are adequate. I want to emphasize some ways that high-risk biological research differs from finance, nuclear weapons, and money laundering.
First, people don’t do gain of function research (or whatever we ought to call it) for profit, so imposing gigantic fines, the threat of jail time, and constant severe scrutiny would be tantamount to banning it outright. By contrast, private companies are pursuing profits when they build nuclear weapons. Medicine is, of course, heavily regulated, and once again it is the profit motive that allows the industry to thrive even in such a heavily regulated context.
Soldiers operating and maintaining nuclear weapons have given permission for the military to exert extremely intrusive control over their activities. Some of the best and brightest scientists worked for the military as an act of patriotic service to build the nuclear bomb during WWII. However, the Manhattan Project was aimed at a specific engineering outcome, while GoF research would be an ongoing effort with no “definition of done,” and it might be hard to convince an adequate number of high-quality scientists to sign up for such strict controls if it was for their entire careers.
Money laundering is a crime, so it is not “regulated” but policed. Nobody but terrorists would do gain of function research if it was illegal.
For a person who’d like to see gain of function research banned, any move to regulate it and punish violations would be a step in the right direction. However, those who’d like to enforce responsible behavior, perhaps by using regulations on par with those you describe, have to explain how they’d motivate already-beleaguered scientists to do GoF research when their proposal is “even more stick, still no carrot.”
I’m curious to know whether and to what extent we’ve considered ways to reward basic science researchers for making pandemic-mitigating discoveries in a public health context. Is there a way we can reward people for achieving the maximum public health benefit with the minimum risk in their research?
Money laundering is a crime, so it is not “regulated” but policed. Nobody but terrorists would do gain of function research if it was illegal.
Money laundering is a crime, one which the government primarily combats using AML (Anti-Money Laundering) regulations. These apply to essentially all financial companies, and require them to evaluate their clients and counterparties for money-laundering risk. If a bank facilitates money laundering it can be punished for this even if it didn’t intend to; all that is required for an AML violation is that the bank didn’t try hard enough to avoid accidentally helping launder money. These regulations are a big deal, with a very large number of people employed in their enforcement and large fines for violations. It is much easier to fine a bank than it is to fine the underlying money launderer. AML rules are effectively a way for governments, which are not really capable of catching money laundering themselves, to outsource the expense and difficulty.
The equivalent here would be if medical equipment companies, reagent companies, and lab animal companies had to do due diligence on researchers, and were liable if they lacked sufficient processes to ensure they didn’t sell supplies to researchers who performed dangerous experiments. As with AML, this allows the government to impose large fines on for-profit companies (which it can easily force to pay) rather than on smaller, potentially judgment-proof labs, while still achieving much the same goal.
That’s a helpful reframing, thank you. I think there is still a disconnect between the two cases, however. As money laundering is a crime, companies have a relatively simple task before them: to identify and eliminate money laundering.
By contrast, GoF research is not a crime, and the objective, from a “responsibly pro-GoF” point of view, is to improve the risk/reward ratio to an acceptable level. A company would likely be highly conservative in making these judgments, as it would capture none of the benefits of successful and informative GoF research, but would be punished for allowing overly risky or failed GoF research to go forward. In other words, companies would likely refuse to sell to GoF researchers entirely in order to minimize or eliminate their risk.
The problem is even more acute if the work of evaluating GoF research were foisted onto companies. Scientists might be motivated by curiosity, altruism, or a desire for scientific credit, so there is at least some reward to be had even if GoF research were much more stringently regulated. By contrast, regulating companies in the manner you propose would come with no incentive whatsoever for companies to sell to GoF researchers, thus effectively banning it.
I think actually the analogy extends even here! What exactly counts as money laundering is not always black and white, and financial firms do not have anything like certainty about whether any given person, entity, or transaction is guilty. Instead they adopt a series of rules that, on average, reduce money laundering, though not to zero, and there are false positives. These especially affect low-income people, immigrants, those with little documentation, and people with unusual situations. AML rules directly contribute to the problem of people being unbanked (lacking access to the formal financial system, being reliant on cheque cashers, etc.); the government knows this and accepts it as a necessary cost.
Similarly, I would imagine that not all GoF research would be illegal—but some would, and governments could deputize firms to help to differentiate. This would disrupt some legitimate researchers but could be generally regarded by policymakers as an acceptable price to pay.
Clearly there are some dis-analogies. There are many fewer biomedical researchers than money transfers, which makes in-depth evaluation of each one more viable. And as you noted, the (financial and otherwise) benefit of research is more distant from the people undertaking it. I’m not trying to make a strong claim that this is a particularly good model for GoF regulation; I’m just noting that I think researchers don’t realize quite how unregulated they are relative to other industries.
It’s important to keep in mind that while money laundering is typically carried out by profit-seeking criminals who take advantage of complex financial transactions to hide their illegal activities, GoF research is not driven by financial gain. Therefore, we need to consider the unique nature of GoF research when assessing the need for regulation.
It’s not just a matter of how much regulation is in place, but also about finding a balance between the pressures to engage in the research and a regulatory framework that effectively manages any potential risks. If there’s an inadequate regulatory apparatus in place relative to the pressures to participate, then the field is “underregulated.” Conversely, if there’s too much regulation, the field may be at risk of becoming “overregulated.”
Given the significant risks associated with GoF research, it warrants a high level of regulation compared to other public-service research areas that have similarly limited pressures to participate. However, because profit is not a driving force, the field can only tolerate a certain amount of regulation before participation becomes difficult.
Rather than focusing on increasing regulation dramatically or maintaining the status quo, we should look to refine and improve regulation for GoF research. While some scope exists to tighten regulations, excessive regulation could stifle the field altogether, which may or may not be desirable. If we wish the field to continue while enhancing the risk-benefit ratio, our focus should be on regulating the field proportionately to the pressures to participate.
It’s time to shift the discussion from “how regulated is the field” to “how regulated is the field relative to the pressures to participate.” By doing so, we can strike a balance between promoting the field’s progress and ensuring appropriate risk management.
The international community funded a database of coronaviruses that was held by the lab in Wuhan. In September 2019, the month when the Chinese military took over the lab, that database was taken offline.
If that database had been important for pandemic prevention and vaccine development, I would have expected the virologists to write op-eds publicly calling on China to release the data. That they didn’t is a clear statement about how useful they think that data is for pandemic prevention, and about how afraid they are of people looking critically at the Wuhan Institute of Virology.
I’m curious to know whether and to what extent we’ve considered ways to reward basic science researchers for making pandemic-mitigating discoveries in a public health context.
The virologists seemed to ignore the basic science questions such as “How do these viruses spread?” and “Are they airborne?” that actually mattered.
Asking those questions would mean doing more biomedical research that isn’t gain of function and loss of function.
have to explain how they’d motivate already-beleaguered scientists to do GoF research when their proposal is “even more stick, still no carrot.”
That assumes that it’s important to motivate them to do GoF research. It seems that GoF research served for them as a distraction from doing the relevant research.
If that database had been important for pandemic prevention and vaccine development, I would have expected the virologists to write op-eds publicly calling on China to release the data. That they didn’t is a clear statement about how useful they think that data is for pandemic prevention, and about how afraid they are of people looking critically at the Wuhan Institute of Virology.
Are you sure that virologists didn’t write such OPs?
The virologists seemed to ignore the basic science questions such as “How do these viruses spread?” and “Are they airborne?” that actually mattered.
My understanding is that in the US, they actually studied these questions hard and knew about things like airborne transmission and asymptomatic spread pretty early on, but were suppressed by the Trump administration. That doesn’t excuse them (they ought to have grown a spine!), but it’s important to recognize the cause of failure accurately so that we can work on the right problem.
Are you sure that virologists didn’t write such OPs?
Pretty much: when I googled the fact that they took down the database, I found no such op-eds. If you have any evidence to the contrary, I would love to see it.
Talking about how it was wrong that they took down the database would point to the fact that the early lab leak denial was bullshit, and the virologists cared about nobody finding out that the arguments they made were bullshit.
Jeremy Farrar describes in his book that one of the key arguments they used to reject the lab leak theory was the huge distance between the openly published sequences and the COVID-19 sequence. That argument becomes a lot weaker when you factor in that the military took over the lab in September 2019, the same month the database was taken down.
The virologists cared more about keeping the public uninformed about what happened at the Wuhan Institute of Virology than they cared about the database being available to help fight the pandemic.
My understanding is that in the US, they actually studied these questions hard and knew about things like airborne transmission and asymptomatic spread pretty early on, but were suppressed by the Trump administration.
Knowing that airborne transmission matters has consequences for what actions you want to take.
When the Japanese health authorities advised at the beginning of the pandemic to avoid closed spaces with poor ventilation, US and EU authorities didn’t give that advice.
I find it pretty unlikely that the reason Fauci et al. didn’t give the same advice about avoiding closed spaces that the Japanese authorities gave was that the Trump administration didn’t want them to tell people to avoid closed spaces while preferring the advice of telling people to wash their hands.
One of the corollaries of “avoid closed spaces with poor ventilation” is that forbidding people from meeting each other outside is bad policy.
The 1.5 meter distance recommendation makes little sense given airborne spread, but it was quite central to pandemic guidance.
There’s some research suggesting that flu transmission in schools can be reduced by controlling humidity levels. There’s a good chance that you can also reduce COVID-19 transmission by controlling indoor humidity, but the virologists didn’t care enough to do the basic research that would establish this and support a policy of humidity control in all public buildings.
There was no ramp-up of indoor ventilation equipment production at the start of the pandemic, but that would have been the reasonable step had the problem been seen as one of airborne transmission.
The WHO took two years to acknowledge airborne transmission. If the virologist community had done its job, it would have explained to the WHO early on that it had to acknowledge airborne transmission or be branded by the virologists as science deniers.
I was curious about your suggestion that a lot of researchers think that basically all biomedical research is gain/loss of function.
Not completely clear on what context the researchers were speaking to, but a standard strategy for figuring out what genes do is to knock out (loss of function) the gene of interest in a model organism and observe what happens. Synthetic biology also involves a lot of ‘gain of function’ engineering, e.g. making microbes produce insulin.
My impression is that one of the key defenses that the Fauci/NIH/EcoHealth/etc. offered for their research in Wuhan was that it was technically not Gain of Function, even if some parts of it might sound like Gain of Function to the layperson, which seems in tension with this claim.
It doesn’t just sound that way to a layperson. The NIH stopped the EcoHealth grant that was partly paying for the research in Wuhan for a short time in 2016. When they renewed the grant, Peter Daszak of EcoHealth wrote back:
“This is terrific! We are very happy to hear that our Gain of Function research funding pause has been lifted.”
Fauci himself sent an email on 1 February 2020 with one of the studies attached under the file name “Baric, Shi et al—Nature medicine—SARS Gain of function”.
What Fauci/NIH/EcoHealth are saying seems to be something like: “when people say ‘gain of function’ they really mean ePPP, and the research we funded in Wuhan wasn’t ePPP because we never put it through the P3CO process that could have decided that it was ePPP.”
Thanks very much for writing this. Unnecessarily alienating people for stupid reasons like misusing terminology or not understanding their perspective seems like a clear mistake and hopefully this can help people avoid these pitfalls.
I was curious about your suggestion that a lot of researchers think that basically all biomedical research is gain/loss of function. My impression is that one of the key defenses that the Fauci/NIH/EcoHealth/etc. offered for their research in Wuhan was that it was technically not Gain of Function, even if some parts of it might sound like Gain of Function to the layperson, which seems in tension with this claim. Do you think they were wrong about this?
Your point about many researchers in this area feeling defensive was a good one, and it’s important to take this into account when talking to people. It’s not fun to be blamed for causing a global crisis, especially if its possible you didn’t do it!
I do think there’s a risk of a missing mood, however. Speaking as an outsider, the amount of regulation on what you refer to as ePPP (adding functionality to make diseases more dangerous) seems shockingly low. The article you link to tries to make it sound like there are a lot of safeguards, but it seems to me like virtually all the steps only apply if you are seeking federal funding. This is not a standard we accept in other areas! If you are making a car, or building a nuclear power plant, or running a bank or airline, you have to accept extremely intrusive regulation regardless of your funding, and for many things—like nuclear weapons or money laundering—US regulation has world-wide reach.
For comparison, here are some of the regulatory approaches used in finance, another industry that was widely blamed for causing a global crisis:
Monitoring of all electronic communications.
Quarterly reporting of activities to regulators.
Regulators placed inside your organization.
Risk Management departments at every firm, making up a non-trivial fraction of their total size.
Compliance departments at every firm, making up a non-trivial fraction of their total size.
Mandatory compliance training for all employees.
Novel regulatory bodies with broad mandate to find and punish things that seem bad.
AML & KYC rules that impose stringent penalties for doing business with suspisious actors.
Mandatory insurance.
Direct government control over key areas.
‘Voluntary’ advice from regulators (100% adoption due to legal risks of disobedience).
Multi-million-dollar whistleblower awards.
Billion-dollar fines.
Decades long jail sentences.
Lifetime bans from the industry.
… and most importantly, if you mess up, you will probably lose a lot of money.
As far as I can, few if any of these approaches are imposed on Gain of Function research, despite the potential to directly cause the death of all of humanity.
Hi, thanks for the comments! Some broad thoughts in response:
Re
It’s hard for me to go into detail on a public platform on this (just to be cautious to my job) but I can broadly say that there’s a difference between research that is a) gaining a function, b) gain-of-function as defined by informal norms in the biomedical community, and c) what is formally DURC / GoF research as defined by U.S. government policy. The EcoHealth grants fall confusingly as or as not GoF depending on how GoF is defined.
Re
I fully agree! I think there are many concrete needs in this space including legal regulation over DURC /ePPP/GoF research in the U.S. particularly but also every country that practices such research. To achieve such regulation requires a ton of work, consensus building, and thought into what constructive regulation that captures risk while not alienating / shutting down an entire research field is tough and part of the nuances that I think we as a community need to work towards
This is a perfectly reasonable point to bring up, and I agree that we should critically consider whether or not policy and regulation in the field is adequate. I want to emphasize some ways that high-risk biological research differs from finance, nuclear weapons, and money laundering.
First, people don’t do gain of function research (or whatever we ought to call it) for profit, so imposing gigantic fines, the threat of jail time, and constant severe scrutiny would be tantamount to banning it outright. Likewise, private companies are pursuing profits when they build nuclear weapons. Medicine is, of course, heavily regulated, and once again it is the profit motive that allows the industry to thrive even in such a heavily regulated context.
Soldiers operating and maintaining nuclear weapons have given permission for the military to exert extremely intrusive control over their activities. Some of the best and brightest scientists worked for the military as an act of patriotic service to build the nuclear bomb during WWII. However, the Manhattan Project was aimed at a specific engineering outcome, while GoF research would be an ongoing effort with no “definition of done,” and it might be hard to convince an adequate number of high-quality scientists to sign up for such strict controls if it was for their entire careers.
Money laundering is a crime, so it is not “regulated” but policed. Nobody but terrorists would do gain of function research if it was illegal.
For a person who’d like to see gain of function research banned, any move to regulate it and punish violations would be a step in the right direction. However, those who’d like to enforce responsible behavior, perhaps by using regulations on part with those you describe, have to explain how they’d motivate already-beleaguered scientists to do GoF research when their proposal is “even more stick, still no carrot.”
I’m curious to know whether and to what extent we’ve considered ways to reward basic science researchers for making pandemic-mitigating discoveries in a public health context. Is there a way we can reward people for achieving the maximum public health benefit with the minimum risk in their research?
Money laundering is a crime, one which the government primarily combats using AML (Anti Money Laundering) regulations. This apply to essentially all financial companies, and require them to evaluate their clients and counterparties for money laundering risk. If a bank facilitates money laundering it can be punished for this even if it didn’t want to do it; all that is required for AML violations is that the bank didn’t try hard enough to avoid accidentally helping launder money. These regulations are a big deal, with very large number of people employed in their enforcement and large fines for violations. It is much easier to fine a bank than it is to fine the underlying money launderer. AML rules are effectively a way for governments, which are not really capable of catching money laundering, to outsource the expense and difficulty.
The equivalent here would be if medical equipment companies, reagent companies, lab animal companies had to do due diligence on researchers and were liable if they lacked sufficient processes to ensure they didn’t sell supplies to researchers who performed dangerous experiments. As with AML, this allows the government to impose large fines on for-profit companies (which it can easily force to pay) over smaller, potentially judgement-proof labs, which still achieving much the same goal.
That’s a helpful reframing, thank you. I think there is still a disconnect between the two cases, however. As money laundering is a crime, companies have a relatively simple task before them: to identify and eliminate money laundering.
By contrast, GoF research is not a crime, and the objective, from a “responsibly pro-GoF” point of view, is to improve the risk/reward ratio to an acceptable level. A company would be likely to be highly conservative in making these judgments, as they would capture none of the benefits of successful and informative GoF research, but would be punished for allowing overly risky or failed GoF research to go forward. In other words, companies would likely refuse to sell to GoF research entirely in order to minimize or eliminate their risk.
The problem is even more acute if the work of evaluating GoF research was foisted onto companies. Scientists might be motivated by curiosity, altruism, or a desire for scientific credit, so there is at least some reward to be had even if GoF research were much more stringently regulated. By contrast, regulating companies in the manner you propose would come with no incentive whatsoever for companies to sell to GoF research, thus effectively banning it.
I think actually the analogy extends even here!
What exactly is money laundering is not always black and white, and financial firms do not have anything like certainty about whether any given person, entity or transaction is guilty. Instead they adopt a series of rules that, on average, reduce money laundering, but not to zero, and there are false positives. These especially effect low income people, immigrants, those with little documentation, and people with unusual situations. AML rules directly contribute to the problem of people being unbanked (lacking access to the formal financial system, being reliant on cheque cashers etc.) - the government knows this and accepts it as a necessary cost.
Similarly, I would imagine that not all GoF research would be illegal—but some would, and governments could deputize firms to help to differentiate. This would disrupt some legitimate researchers but could be generally regarded by policymakers as an acceptable price to pay.
Clearly there are some dis-analogies. There are many fewer biomedical researchers than money transfers, which makes in-depth evaluation of each one more viable. And as you noted the (financial and otherwise) benefit of research is more distant from the people undertaking it. I’m not trying to make a strong claim that this is a particularly good model for GoF regulation; just noting that I think researchers don’t realize quite how unregulated they are relative to other industries.
It’s important to keep in mind that while money laundering is typically carried out by profit-seeking criminals who take advantage of complex financial transactions to hide their illegal activities, GoF research is not driven by financial gain. Therefore, we need to consider the unique nature of GoF research when assessing the need for regulation.
It’s not just a matter of how much regulation is in place, but also about finding a balance between the pressures to engage in the research and a regulatory framework that effectively manages any potential risks. If there’s an inadequate regulatory apparatus in place relative to the pressures to participate, then the field is “underregulated.” Conversely, if there’s too much regulation, the field may be at risk of becoming “overregulated.”
Given the significant risks associated with GoF research, it requires a high level of regulation compared to other public service research areas that have similarly limited pressures to participate. However, because profit is not a driving force, the field can only tolerate a certain amount of regulation before participation becomes difficult.
Rather than focusing on increasing regulation dramatically or maintaining the status quo, we should look to refine and improve regulation for GoF research. While some scope exists to tighten regulations, excessive regulation could stifle the field altogether, which may or may not be desirable. If we wish the field to continue while enhancing the risk-benefit ratio, our focus should be on regulating the field proportionately to the pressures to participate.
It’s time to shift the discussion from “how regulated is the field” to “how regulated is the field relative to the pressures to participate.” By doing so, we can strike a balance between promoting the field’s progress and ensuring appropriate risk management.
The international community funded a database of Coronaviruses that was held by the lab in Wuhan. In September 2019, the month when the Chinese military overtook the lab, that database was taken offline.
If that database would have been important for pandemic prevention and vaccine development, I would have expected the virologists to write OPs publically calling on China to release the data. That they didn’t is a clear statement about what they think for how useful that data is for pandemic prevention and how afraid they are that people look critically at the Wuhan Institute of Virology.
The virologists seemed to ignore the basic science questions that actually mattered, such as “How do these viruses spread?” and “Are they airborne?”
Asking those questions would mean doing more biomedical research that is neither gain of function nor loss of function.
That assumes it’s important to motivate them to do GoF research. It seems that, for them, that research served as a distraction from doing the relevant research.
Are you sure that virologists didn’t write such OPs?
My understanding is that in the US, they actually studied these questions hard and knew about things like airborne transmission and asymptomatic spread pretty early on, but were suppressed by the Trump administration. That doesn’t excuse them (they ought to have grown a spine!), but it’s important to recognize the cause of failure accurately so that we can work on the right problem.
Pretty much. When I googled the fact that they took down the database, I found no such op-eds. If you have any evidence to the contrary, I would love to see it.
Talking about how it was wrong of them to take down the database points to the fact that the early lab leak denial was bullshit, and the virologists cared about nobody finding out that the arguments they made were bullshit.
Jeremy Farrar describes in his book that one of the key arguments they used to reject the lab leak theory was the huge distance from the openly published sequences to the COVID-19 sequence. That argument becomes a lot weaker when you factor in that the military took over the lab in September 2019, the same month the database was taken down.
The virologists cared more about keeping the public uninformed about what happened at the Wuhan Institute of Virology than about the database being available to help fight the pandemic.
Knowing that airborne transmission matters has consequences for what actions you want to take.
When the Japanese health authorities advised people at the beginning of the pandemic to avoid closed spaces with poor ventilation, US and EU authorities didn’t give that advice.
I find it pretty unlikely that the reason Fauci et al. didn’t give the same advice to avoid closed spaces that the Japanese authorities gave out is that the Trump administration didn’t want them to tell people to avoid closed spaces and instead preferred the advice of telling people to wash their hands.
One of the corollaries of “avoid closed spaces with poor ventilation” is that forbidding people from meeting each other outside is bad policy.
The 1.5 meter distance recommendation makes little sense given airborne spread, but it was quite central to pandemic guidance.
There’s some research suggesting that flu transmission in schools can be reduced by controlling humidity levels. There’s a good chance that COVID-19 transmission could also be reduced by controlling indoor humidity, but the virologists didn’t care enough about doing the basic research to establish that, which could have led to a policy of humidity control in all public buildings.
There was no ramp-up of indoor ventilation production at the start of the pandemic, but that would have been the reasonable step if the problem had been seen as one of airborne transmission.
The WHO took two years to acknowledge airborne transmission. If the virologist community had done its job, it would have explained to the WHO early on that it had to acknowledge airborne transmission or be branded by the virologists as science deniers.
I’m not completely clear on the context the researchers were speaking in, but a standard strategy for figuring out what a gene does is to knock it out (loss of function) in a model organism and observe what happens. Synthetic biology also involves a lot of “gain of function” engineering, e.g. making microbes produce insulin.
It doesn’t only sound that way to a layperson. The NIH briefly stopped the EcoHealth grant that was partly paying for the research in Wuhan in 2016. When they renewed the grant, Peter Daszak from EcoHealth wrote back:
“This is terrific! We are very happy to hear that our Gain of Function research funding pause has been lifted.”
Fauci himself sent an email on 1 February 2020 that had one of the studies in the attachment, with the file name “Baric, Shi et al—Nature medicine—SARS Gain of function”.
What Fauci/NIH/EcoHealth are saying seems to be something like: “when people say ‘gain of function’ they really mean ePPP, and the research we funded in Wuhan wasn’t ePPP because we never put it through the P3CO review process that could have decided it was ePPP”.