I believe grantmakers and donors interested in decreasing extinction risk had better focus on artificial intelligence (AI) rather than nuclear war (more).
I would say the case for sometimes prioritising nuclear extinction risk over AI extinction risk is much weaker than that for sometimes prioritising natural extinction risk over nuclear extinction risk (more).
I get the sense that the extinction risk from nuclear war was massively overestimated in The Existential Risk Persuasion Tournament (XPT) (more).
I have the impression Toby Ord greatly overestimated tail risk in The Precipice (more).
I believe interventions to decrease deaths from nuclear war should be assessed based on standard cost-benefit analysis (more).
I think increasing calorie production via new food sectors is less cost-effective at saving lives than measures targeting food distribution (more).
Thanks for the update, Cullen! Relatedly, you may want to check out my post Nuclear war tail risk has been exaggerated?