Good idea; I’ve had a similar discussion with someone else about this. I think that would work well as a separate calculator targeted more at EAs, but since this one is aimed at people who aren’t EA-aligned, I don’t think it’s a good fit here, for the following reasons:
It would complicate the calculator: not to the point of being too complicated to be accurate, but to the point of being too complicated to resonate with, or feel meaningful to, someone from the vast non-EA population who doesn’t find that kind of rigour compelling.
I think it’s easier to get people thinking about their morals and effectiveness in terms of present-day animals (human and non-human) first, rather than going straight to x-risk.
There could even be a follow-up calculator after this simple one (e.g. “Feeling compelled? Click here for an even more shocking calculator”).