[Question] What is the best way to explain that s-risks are important, i.e., why existence is not inherently better than non-existence? I intend this for someone mostly unfamiliar with EA, such as a participant in an intro program.
Also, I think we could say: imagine you'd be thrown into a volcano for 10 minutes in exchange for X years of happiness. How many years would make that trade worth it? I think most of us wouldn't accept being thrown into a volcano even in exchange for 1,000 years of happiness, and that's why reducing extreme suffering is important.
I think some of Brian Tomasik's essays are quite persuasive: https://briantomasik.com/