I would like to humbly suggest that people not engage in active plots to destroy humanity based on their personal back of the envelope moral calculations.
I think that the other 8 billion of us might want a say, and I’d guess we’d not be particularly happy if we got collectively eviscerated because some random person made a math error.
So we get to use cold, hard rationality to tell most people that the work they do is relatively worthless compared to x-risk reduction, but when that same rationality argues that x-risk reduction is itself incredibly high variance, and may very well harm trillions of people in the future, we get to be humanists?