What are some reasons people think GCBRs deserve less attention (relative to how Open Phil prioritizes this work)?
I’d be interested to learn more about reasons beyond “a diversity of perspectives and research focuses is good for the field”, or background on why diversifying outside of GCR might be really important for biosecurity in particular. (E.g., “demanding that biosecurity researchers demonstrate relevance to GCBR is likely to stunt more basic or early-stage research that’s also critical for GCBR, but at a greater temporal and causal remove”; or “GCBR is a bad way of thinking about the relationship between GCR and biosecurity, because the main GCR risks in this context are second-order effects from smaller-scale biosecurity incidents rather than e.g. global pandemics”.)
The main object-level argument in Lentzos’s article seems to be that GCBR is “extremely unlikely”:
Biosecurity covers a spectrum of risks, ranging from naturally occurring disease, through unintended consequences of research, lab accidents, negligence, and reckless behavior, to deliberate misuse of pathogens or technology by state and non-state actors. The scenarios all have different likelihoods of playing out—and risks with potential catastrophic consequences on a global scale are among the least likely. But Open Phil dollars are flooding into biosecurity and are absorbing much of the field’s experienced research capacity, focusing the attention of experts on this narrow, extremely unlikely, aspect of biosecurity risk.
If this argument can be made in a compelling way from a perspective that’s longtermist and focused on EV, I’d be really interested to learn more about it.
Lentzos has written elsewhere about why she thinks terrorist use of synthetic bioweapons is so unlikely. I quickly summarised her reasoning in this comment: https://forum.effectivealtruism.org/posts/Kkw8uDwGuNnBhiYHi/will-splashy-philanthropy-cause-the-biosecurity-field-to#QupzPSJLmjoF2A4pN