The far future, on our current trajectory, seems net negative on average. Reducing extinction risk just increases the probability that we realize that negative EV.
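A minimal sketch of the expected-value reasoning behind this claim, where (purely for illustration) $p$ is the probability that humanity avoids extinction and $V$ is the expected value of the far future conditional on survival:

$$
\mathbb{E}[\text{value}] = p \cdot V, \qquad V < 0 \;\Rightarrow\; \frac{\partial\, \mathbb{E}[\text{value}]}{\partial p} = V < 0,
$$

so on this view, raising $p$ (reducing extinction risk) makes the overall expected value more negative.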
Have you written elsewhere on why you think the far future seems net negative on average on our current trajectory?
On this view, why not work to increase extinction risk? (It would be odd if doing nothing was the best course of action when the stakes are so high either way.)
It’d be hard to do without breaking a lot of good heuristics (e.g. don’t lie, don’t kill people).
You could defend the idea that extinction risk reduction is net negative or highly ambiguous in value, even just within EA and adjacent communities. Convincing people not to work on things that are net negative by your lights doesn’t seem to break good heuristics or norms.
(I suspect this explains like half of Émile Torres’ deal.)