The barriers a “DNA registry” would impose on a terrorist (the only bad actor who’d be inconvenienced by it) would be trivial if they had the capability to do the other things necessary to produce a bioweapon. In fact, DNA synthesis and sequencing wouldn’t even be a necessary part of such an endeavor. I won’t describe the technological reasons why, but a basic familiarity with these technologies will make the reasons why clear. On the other hand, depending on execution, it could be rather annoying for legitimate researchers.
The idea of sealing off biological infohazards seems reasonable to me and like it might do some good. The world does not need public info on how to most effectively culture bird flu. This has precedent in the classification of military secrets and protection of patient data, would impact only a small number of researchers, and could be linked to the grant approval and IRB process for implementation.
As a key caveat, though, I would want such a system to be lightweight and specific. For example, you don’t want to make it harder to order COVID-19 genomic data or proteins, because then pandemic response becomes much harder. Thousands of scientists need that info, and if you tried to red tape it up, they’d slow to a crawl. Someone wiser than me would need to figure out the minimal information you need to hide to make it much harder to make a bioweapon while minimally inconveniencing the research community.
I don’t know much about the politics stuff here. However, my read is that those currently in power have a “nature is the ultimate bioterrorist” view. Convincing them to change their mind, replacing them with someone having a different viewpoint, or installing an alternative power center, seems hard.
I could imagine an approach involving a nonprofit staffed by people who know what they’re doing, reviewing grant applications or papers and putting a media spotlight on the most flagrantly risky and stupid stuff. But of course you’d have to find people willing to take on that role, and also get the media willing to go against the powers that be on this. I wouldn’t work there, and I don’t know if I’d listen to them either if they seemed unmeasured in their criticism. And who would choose to work at such an org but a bunch of alarmists? I think there would be perception issues.
Possibly you could “bribe” scientists by offering more and easier grant money for dangerous bioscience research provided they verifiably comply with enhanced safeguards during the research. That could allow an extremely well funded granting org like FTX to “pay for safety” rather than trying to gain control of the NIH purse strings to “enforce safety.” Think of it as harm reduction.
Thank you for your thoughts. I agree that this is tricky, but I believe we should at the very least have some discussions on this. The scenario I think about is based on the following reasoning (and targets not-yet-known pathogens):

a) We are conducting research to identify new potential pandemic pathogens.

b) DNA synthesis and the other molecular biology capabilities required to synthesise viruses are becoming more accessible, and we cannot count on all orders being properly screened.

c) Only a small number of labs (~20?) actually work on a given potential pandemic pathogen, plus some public health folks; definitely not more than thousands of people, and therefore at least one to two orders of magnitude fewer individuals than all those capable of synthesizing the pathogen. (This obviously changes once a potential pandemic pathogen enters humans and becomes an actual pandemic pathogen; then the genome definitely needs to be public.)

d) Can we have those few people apply for access to the genomes from established databases, similar to how people apply to access patient data?
In terms of needing such a system to be lightweight and specific: this also implies needing what is sometimes called “adaptive governance” (i.e., you have to be able to rapidly change your rules when new issues emerge).
For example, there were ambiguities about whether SARS-CoV-2 fell under Australia Group export controls on “SARS-like-coronaviruses” (related journal article). A more functional system would include triggers for removing export controls (e.g., once a pathogen passes a threshold of global transmission, public health needs will likely outweigh biosecurity concerns about pathogen access).