On this view, why not work to increase extinction risk? (It would be odd if doing nothing was the best course of action when the stakes are so high either way.)
It’d be hard to do without breaking a lot of good heuristics (e.g. don’t lie, don’t kill people).
You could defend the idea that extinction risk reduction is net negative or highly ambiguous in value, even just within EA and adjacent communities. Convincing people to not work on things that are net negative by your lights seems not to break good heuristics or norms.
(I suspect this explains about half of Émile Torres’ deal.)