I agree with the last point. Based on Ben Todd's presentation at EAG, 18% of engaged EAs work on AI alignment, while 4% work on biosecurity. Based on Toby Ord's estimates in The Precipice, the risk of extinction in the next 100 years from unaligned artificial intelligence is ~1 in 10, while the risk from engineered pandemics is ~1 in 30.
So the stock of people working on AI is 4.5x that of biosecurity, while on these estimates AI is only 3x as important.
There is a lot of nuance missing here, but I'm moderately confident that this imbalance warrants more people moving into biosecurity, especially now that we're in a moment of high tractability concerning pandemic preparedness.
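The back-of-the-envelope comparison above can be made explicit. A minimal sketch, using only the figures already cited (Ben Todd's EAG percentages and Toby Ord's risk estimates from The Precipice):

```python
# Talent allocation vs. estimated extinction risk, AI alignment vs. biosecurity.
share_ai, share_bio = 0.18, 0.04    # fraction of engaged EAs per cause (Ben Todd, EAG)
risk_ai, risk_bio = 1 / 10, 1 / 30  # extinction risk over next 100 years (The Precipice)

people_ratio = share_ai / share_bio  # how many more people work on AI than biosecurity
risk_ratio = risk_ai / risk_bio      # how much riskier AI is on Ord's estimates

print(f"People ratio (AI/bio): {people_ratio:.1f}x")  # 4.5x
print(f"Risk ratio   (AI/bio): {risk_ratio:.1f}x")    # 3.0x
```

If people were allocated in proportion to risk alone, the two ratios would match; the gap between 4.5x and 3x is the imbalance the argument points at.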
I don't have much of an opinion yet, but I heard these claims at EAG that might be interesting to consider here:
- EA is overall too heavily invested in crypto and tech stocks
- EAs are neglecting EU policy paths compared to UK ones
- EAs are flocking too readily toward anything AI-related
Caveat: I work in Biosecurity.