I’m mostly concerned with S-risks, i.e. risks of astronomical suffering. I view this concern as a more rational form of Pascal’s Wager, and as a form of extreme longtermist self-interest. Since there is still a >0% chance that some form of afterlife, or a bad form of quantum immortality, exists, raising awareness of S-risks and donating to S-risk reduction organizations like the Center on Long-Term Risk and the Center for Reducing Suffering likely reduces my risk of going to “hell”. See The Dilemma of Worse Than Death Scenarios.
The dilemma is that it does not seem possible to continue living as normal once the prevention of worse-than-death scenarios is taken seriously. If it is agreed that anything should be done to prevent them, then Pascal’s Mugging seems inevitable. Suicide speaks for itself, and even the other two options, if taken seriously, would change your life. What I mean is that it would seem rational to devote your life entirely to these causes. It would be rational to do anything to obtain money to donate to AI safety, for example, and you would be obliged to sleep exactly nine hours a day to improve your mental condition, increasing the probability that you will find a way to prevent these scenarios. I would be interested in hearing your thoughts on this dilemma, and whether you think there are better ways of reducing the probability.