But people are already dying right now, no “mights” required. Why not focus on more immediate problems like global health and development?
A 10% chance of a million people dying is as bad as 100,000 people dying with certainty, if you’re risk-neutral. That’s essentially the main argument for working on a speculative cause like AGI: even if there’s only a small chance of the end of humanity, that still matters a great deal.
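To make the risk-neutral comparison concrete, here’s the expected-value arithmetic behind that sentence (the 10% and one-million figures are just the illustrative numbers used above):

$$\mathbb{E}[\text{deaths}] = 0.10 \times 1{,}000{,}000 = 100{,}000$$

which matches the certain scenario exactly, so a risk-neutral decision-maker treats the two outcomes as equally bad.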
As for “Won’t other people take care of this?”, well... you could make that same argument about global health and development, too. More people working on either cause increases its potential impact.
(Also worth noting: EA as a whole does devote a lot of resources to global health and development; you just don’t see as many posts about it because there’s less to discuss or argue about.)