But the expected value of existential risk reduction is extremely large, if not infinite (and I think it clearly is infinite in expectation).
I commented something similar on your blog, but as soon as you allow that one action is infinite in expectation, you have to allow that all of them are: whatever possibility of infinite value you assign given that action must still be present without it.
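To make the point concrete, here's a toy sketch (made-up numbers, standard real-valued expectations) of how any nonzero probability of an infinite payoff swamps the expectation, whichever action it's attached to:

```python
import math

def expected_value(p_inf: float, finite_payoff: float) -> float:
    """EV of a prospect with probability p_inf (> 0) of infinite value,
    and finite_payoff otherwise."""
    return p_inf * math.inf + (1 - p_inf) * finite_payoff

# Made-up numbers: an x-risk intervention vs. doing nothing. The finite
# stakes and the odds of the infinite outcome differ wildly...
print(expected_value(1e-10, 1e12))   # inf
print(expected_value(1e-40, -1e12))  # inf -- ...but both expectations are
                                     # infinite, so expectation can no
                                     # longer rank the two actions.
```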
If you think Bostrom's figure of 10^52 happy future people has a 0.01% chance of being right, then you get 10^48 expected future people conditional on not going extinct, meaning that reducing the odds of existential catastrophe by 1/10^20 creates 10^28 expected extra lives.
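To spell out that arithmetic (a quick sketch; the 10^52, 0.01%, and 1/10^20 figures are the ones from the paragraph above):

```python
future_people  = 10**52   # Bostrom's estimate of possible future lives
credence       = 1e-4     # 0.01% chance the estimate is right
risk_reduction = 1e-20    # reduction in the probability of extinction

expected_future = future_people * credence          # 10^48 expected people
extra_lives     = expected_future * risk_reduction  # 10^28 expected extra lives
print(f"{expected_future:.0e} expected future people, {extra_lives:.0e} extra lives")
```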
Reasoning like this seems scope insensitive to me. In the real world, it's common for credences to fall faster than offered rewards grow, so expected payoffs decline as the offers get larger, and I don't see any reason to think this pattern wouldn't generalise to most such prospects, even when the offer is astronomically large.
The odds are non-trivial that, if we get very advanced AI, we'll essentially eliminate any possibility of human extinction for billions of years.
I think the stronger case is just safety in numbers. Get a civilisation established around multiple star systems and capable of proliferating further, and the odds of its complete destruction rapidly become indistinguishable from 0.
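As a toy model (my assumptions: n settlements, each independently facing the same per-era destruction probability p), total destruction requires every settlement to fail, so the probability falls off exponentially in n:

```python
def p_total_destruction(p: float, n: int) -> float:
    """Probability that all n settlements are destroyed, assuming each
    is destroyed independently with probability p."""
    return p ** n

for n in (1, 5, 10, 50):
    print(n, p_total_destruction(0.1, n))
# prints roughly: 0.1, 1e-05, 1e-10, 1e-50 -- even with a generous 10%
# per-system risk, 50 systems push total destruction toward 0.
```

The independence assumption is doing real work here, of course; correlated risks (a hostile AI, say) wouldn't decay this way.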