Thanks for writing this! I’m interested in brain preservation, and thought maybe this area would be competitive with existential risk stuff given certain philosophical pre-commitments, but now I’m actually not sure.
Even on a person-affecting view, I'm not sure this is cost competitive with x-risk. If there are ~8 billion people who would be alive at the time of a human extinction event, then reducing the risk of extinction by 0.01% would save 800,000 of them (in expectation). Doing this might cost $100M-$1B, for a cost of $125-$1,250/person.
This is cheaper, I think, than even fairly optimistic estimates of the per-person cost of brain preservation.
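To make the arithmetic explicit, here's a minimal sketch; the dollar figures and the 0.01% risk reduction are the illustrative assumptions from above, not real estimates:

```python
# Back-of-the-envelope cost per expected life saved from x-risk reduction.
# All inputs are the illustrative assumptions stated in the comment above.
population = 8e9                  # people alive at the time of a potential extinction event
risk_reduction = 1e-4             # 0.01% absolute reduction in extinction probability
cost_low, cost_high = 1e8, 1e9    # assumed program cost: $100M to $1B

expected_lives_saved = population * risk_reduction
print(expected_lives_saved)                 # 800,000 lives in expectation
print(cost_low / expected_lives_saved)      # $125 per expected life saved
print(cost_high / expected_lives_saved)     # $1,250 per expected life saved
```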
Caveats:
This analysis assumes that the people who survive an extinction event would go on to have a similar quantity and quality of life as those who have been preserved. I think this is a reasonable assumption if the extinction event is AI-singularity-shaped, but not if it’s something like a pandemic.
Certain methods of reducing extinction risk (e.g. civilizational refuges) still result in almost everyone dying, so the cost-effectiveness of x-risk reduction on person-affecting grounds is probably lower than what I'm assuming above.
These caveats might get you the extra 10-100x you need to become cost competitive, but I'm not sure, and even then you're only reaching cost competitiveness, not a clear advantage.
I think I basically agree that if someone can identify a way to reduce extinction risk by 0.01% for $100M-1B, then that would be a better use of marginal funds than the direct effects of brain preservation.