I’m not sure I follow this exercise. Here’s how I’m thinking about it:
Option A: spend your career on malaria.
Cost: one career
Payoff: save 20k lives with probability 1.
Option B: spend your career on x-risk.
Cost: one career
Payoff: save 25B lives with probability p (=P(prevent extinction)), save 0 lives with probability 1-p.
Expected payoff: 25B*p.
Since the costs are the same, we can ignore them. Then you're indifferent between A and B when 25B*p = 20k, i.e. p = 20k/25B = 8x10^-7, and B is better if p > 8x10^-7.
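As a quick sanity check on that arithmetic, here's a minimal sketch in Python (the 20k and 25B figures are just the payoffs from the two options above, and p is the probability that the x-risk career actually prevents extinction; nothing else is assumed):

```python
# Break-even probability for Option A (malaria) vs Option B (x-risk).
lives_saved_malaria = 20_000            # Option A: certain payoff
lives_if_extinction_prevented = 25e9    # Option B: payoff with probability p

# Indifference point: 25B * p = 20k, so p = 20k / 25B.
break_even_p = lives_saved_malaria / lives_if_extinction_prevented
print(f"break-even p = {break_even_p:.0e}")  # 8e-07

# Any p above the break-even value makes Option B's expected payoff larger.
p = 1e-6
print(lives_if_extinction_prevented * p > lives_saved_malaria)  # True
```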
But I’m not sure how this maps to a reduction in P(extinction).
I think that the expected payoff and the reduction in P(extinction) are just equivalent. Like, a 1% chance of saving 25B is the same as reducing P(extinction) from 7% to 6%; that's what a “1% chance of saving” means, because:
P(extinction) = 1 - P(extinction reduction from me) - P(extinction reduction from all other causes)
So, if I had a 100% chance of saving 25B lives, then that'd be a 100% reduction in extinction risk.
Of course, what we care about is the counterfactual. If there's already only a 50% chance of extinction, you could say colloquially that I brought P(extinction) from 0.5 to 0, and so had a “100% chance of saving 25B lives,” but that's not quite right: I should only get credit for the reduction from 0.5 to 0. In that scenario it would be better to say I had a 50% chance of saving 25B, and that's as high as the number can get.
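To make that equivalence concrete, here's a minimal sketch using the same numbers as above (the 25B population, the 7%-to-6% example, and the 50% baseline are just the figures from this thread, not anything more):

```python
import math

population = 25e9  # the 25B lives at stake in the example above

# Framing 1: "a 1% chance of saving 25B lives"
expected_lives_chance_framing = 0.01 * population

# Framing 2: reducing P(extinction) by one percentage point, e.g. 7% -> 6%
expected_lives_risk_framing = (0.07 - 0.06) * population

# Both framings give the same expected number of lives saved (~250M).
print(expected_lives_chance_framing)  # 250000000.0
print(math.isclose(expected_lives_chance_framing, expected_lives_risk_framing))  # True

# The counterfactual cap: with a 50% baseline risk, the most credit available
# is a 0.5 reduction, i.e. at best a "50% chance of saving 25B".
baseline_risk = 0.5
max_expected_lives = baseline_risk * population  # 12.5B expected lives
```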