[This comment is a tangential and clarifying question; I haven't yet read your post]

"Ord's book is a great restatement of the ethical case, though I disagree with his prioritisation of climate change, nuclear weapons and collapse"

If I didn't know anything about you, I'd assume this meant "Toby Ord suggests climate change, nuclear weapons, and collapse should be fairly high priorities. I disagree (while largely agreeing with Ord's other priorities)."

But I'm guessing you might actually mean "Toby Ord suggests climate change, nuclear weapons, and collapse should be much lower priorities than things like AI and biorisk (though they should still get substantial resources, and be much higher priorities than things like bednet distribution). I disagree; I think those things should be similarly high priorities to things like AI and biorisk."

Is that guess correct?

I'm not sure whether my guess is based on things I've read from you, vs just a general impression about what views seem common at CSER, so I could definitely be wrong.

That's right, I think they should be higher priorities. As you show in your very useful post, Ord has nuclear and climate change at 1/1,000 and AI at 1/10. I've got a draft book chapter on this, which I hope to be able to share a preprint of soon.
Is your preprint available now? I'd be curious to read your thoughts about why climate change and nuclear war should be prioritized more.