I agree that it’s totally plausible that, once all the considerations are properly analyzed, we’ll wind up vindicating the existential risk view as a simplification of “maximize utility”. But in the meantime, unless one is very confident or thinks doom is very near, “properly analyze the considerations” strikes me as a better simplification of “maximize utility”.
Even if you think doom may be near, you might want an intermediate simplification like "some people think about consequentialist philosophy while most mitigate catastrophes that would put this thinking process at risk".
Agreed.