Science just released an article, with an accompanying technical report, about a neglected source of biological risk.
From the abstract of the technical report:
This report describes the technical feasibility of creating mirror bacteria and the potentially serious and wide-ranging risks that they could pose to humans, other animals, plants, and the environment…
In a mirror bacterium, all of the chiral molecules of existing bacteria—proteins, nucleic acids, and metabolites—are replaced by their mirror images. Mirror bacteria could not evolve from existing life, but their creation will become increasingly feasible as science advances. Interactions between organisms often depend on chirality, and so interactions between natural organisms and mirror bacteria would be profoundly different from those between natural organisms. Most importantly, immune defenses and predation typically rely on interactions between chiral molecules that could often fail to detect or kill mirror bacteria due to their reversed chirality. It therefore appears plausible, even likely, that sufficiently robust mirror bacteria could spread through the environment unchecked by natural biological controls and act as dangerous opportunistic pathogens in an unprecedentedly wide range of other multicellular organisms, including humans.
This report draws on expertise from synthetic biology, immunology, ecology, and related fields to provide the first comprehensive assessment of the risks from mirror bacteria.
Open Philanthropy helped to support this work and is now supporting the Mirror Biology Dialogues Fund (MBDF), along with the Sloan Foundation, the Packard Foundation, the Gordon and Betty Moore Foundation, and Patrick Collison. The Fund will coordinate scientific efforts to evaluate and address risks from mirror bacteria.
It was deeply concerning to learn about this risk, but gratifying to see how seriously the scientific community is taking the issue.
Given the potential infohazards inherent to a project like this, I imagine Forum readers might be interested in the rationale for public release. This question was discussed on page (iv) of the technical report.
The publications contain a lot more information about these risks and analysis from the scientists involved. If you have additional questions, I might be able to source an answer, but I can’t promise I’ll be able to respond and may take a while to do so. Thank you for understanding.
Copying over the rationale for publication here, for convenience:
IMO, one helpful side effect (albeit certainly not a main consideration) of making this work public is that it seems very useful to have at least one worst-case biorisk that can be publicly discussed in a reasonable amount of detail. Previously, the whole field / cause area of biosecurity could feel cloaked in secrecy, backed up only by experts with arcane biological knowledge. This situation, although unfortunate, is probably justified by the nature of the risks! But still, it makes it hard for anyone on the outside to tell how serious the risks are, or understand the problems in detail, or feel sufficiently motivated about the urgency of creating solutions.
By disclosing the risks of mirror bacteria, there is finally a concrete example to discuss, which could be helpful even for people who are more worried about, say, infohazardous-bioengineering-technique-#5 than they are about mirror life. Just being able to use mirror life as an example seems much healthier than having zero concrete examples and everything shrouded in secrecy.
Some of the cross-cutting things I am thinking about:
scientific norms about whether to fund / publish risky research
attempts to coordinate (on a national or international level) moratoriums against certain kinds of research
the desirability of things like metagenomic sequencing, DNA synthesis screening for harmful sequences, etc
research into broad-spectrum countermeasures like UVC light, super-PPE, pipelines for very quick vaccine development, etc
just emphasising the basic overall point that global catastrophic biorisk seems quite real and we should take it very seriously
and probably lots of other stuff!
So, I think it might be a kind of epistemic boon for all of biosecurity to have this public example, which will help clarify debates / advocacy / etc about the need for various proposed policies or investments.
I think it’s true that a lot of topics are not discussed because of infohazard concerns. But I do think we already had some concrete examples, such as the hotly debated gain-of-function cases, the hypothetical of a pathogen as infectious as measles but as fatal as rabies, or myxomatosis killing 99% of rabbits.
There is also this essay from Jason Crawford and this piece from Asimov Press, both of which give less technical descriptions of the Science article.
I appreciated these parts of Jason’s article, and am curious if others have a different take:
Thanks for sharing this, Aaron!
I agree the “Rationale for Public Release” section is interesting; I’ve copied it here:
When to work on risks in public vs. in private is a really tricky question, and it’s nice to see this discussion of how the group handled it in this case.
I was first! :P
They both show up as 2:23 pm to me; is there a way to get second-level precision?
You can sort by “oldest” and “newest” in the comment-sort order, and see that mine shows up earlier in the “oldest” order, and later in the “newest” order.
You can also right-click → inspect element on the time indicator:
I really appreciate that the comment section has rewarded you both precisely equally.
I’ve only upvoted Habryka, to reward good formatting.
But I was first! I demand the moderators transfer all of the karma of Jeff’s comment to mine :P
By tradition, accolades for intellectual achievements go to the person who published them first.
Sure, but surely we give it according to Shapley values? What if you had missed this? We should reward Jeff for that.
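To make the Shapley-value quip concrete, here's a minimal sketch, under my own toy assumption (not anything stated in the thread) that either commenter alone would have copied over the rationale. Each ordering asks how much the next player adds on top of whoever has already "posted"; averaging over the orderings splits the credit equally, which is roughly what the voting did.

```python
import math
from itertools import permutations

def shapley_values(players, value):
    """Exact Shapley values via the average marginal contribution of each
    player over all orderings (fine for a tiny two-player 'game' like this)."""
    totals = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = set()
        for p in order:
            before = value(frozenset(coalition))
            coalition.add(p)
            totals[p] += value(frozenset(coalition)) - before
    n_orderings = math.factorial(len(players))
    return {p: total / n_orderings for p, total in totals.items()}

# Toy assumption: either commenter alone would have posted the quote,
# so any non-empty coalition yields the full value of 1.
credit = shapley_values(
    ["Habryka", "Jeff"],
    lambda coalition: 1.0 if coalition else 0.0,
)
print(credit)  # {'Habryka': 0.5, 'Jeff': 0.5}
```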
Interesting that Robin Hanson brought this up 14 years ago.