That’s certainly something worth worrying about. But we could also worry that even if we successfully eliminate x-risks, we still need to ensure that the far future contains lots of happiness and minimal suffering, and this might not happen by default. It’s not clear which is more important. I lean a little toward x-risk reduction, but it’s hard to say.