Nice—how about adding a GCR component? For instance, you could ask how much people value future lives as a fraction of current lives. Then you could say that even if the only GCR were asteroid impact, and even if you only thought humans would exist as long as the average mammal species, then if you didn't care when people were born, you could still save life-years at only $2.50 each.
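The shape of that estimate can be sketched as a back-of-envelope calculation. All inputs below are illustrative assumptions for the sake of showing the structure (program cost, risk reduction, and population are not figures from this thread); only the ~1-million-year average mammal species lifespan tracks the comment above.

```python
# Back-of-envelope cost per life-year from asteroid-risk reduction.
# All numeric inputs are illustrative assumptions, not established figures.
program_cost = 1e9            # assumed total cost of an asteroid-detection effort, USD
risk_reduction = 1e-6         # assumed reduction in extinction probability
population = 8e9              # approximate current world population
species_lifespan_years = 1e6  # ~average mammal species lifespan

# If future lives count the same as current ones, averting extinction
# preserves the expected stream of future life-years. Crude model:
# a constant population living out the species' remaining lifespan.
expected_life_years_saved = risk_reduction * population * species_lifespan_years
cost_per_life_year = program_cost / expected_life_years_saved
print(f"${cost_per_life_year:.3f} per life-year")
```

The point of the sketch is that the species-lifespan term is so large that even a tiny risk reduction drives the cost per life-year down to pocket change; the exact dollar figure swings with the assumed inputs.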
Good idea, I’ve had a similar discussion with someone else about this. I think that would be a good idea for a separate calculator more targeted at EAs, but since this one is targeted at those who are non-EA aligned, I don’t think that’s a good idea for the following reasons:
It would complicate the calculator: not to the point of being too complicated to be accurate, but to the point where it no longer resonates with, or feels meaningful to, someone from the vast non-EA population who doesn't find that kind of rigour compelling.
I think it would be easier to first get people thinking about their morals and effectiveness in terms of presently existing animals (human and non-human) than to go straight to x-risk.
There could even be a follow-up calculator after this simple one (e.g. “Feeling compelled? Click here for an even more shocking calculator”).