After thinking about this post (“Utilitarians Should Accept that Some Suffering Cannot be ‘Offset’”) some more, there’s an additional, weaker claim I want to emphasize: you should be very skeptical that it’s morally good to bring about worlds you wouldn’t personally want to experience all of.
We can imagine a society of committed utilitarians all working to bring about a very large universe full of great happiness and, in absolute terms, a great deal of extreme suffering. The catch is that these very utilitarians are the ones who are going to experience this grand future—not future people or digital people or anyone else.
Personally, they all desperately want to opt out of this: nobody wants to actually go through the extreme suffering, regardless of what benefits await. And yet they all work day in and day out to bring about this future anyway, condemning themselves to heaven and to hell.
Of course, it’s not impossible that their stated morals are right and their preferences morally wrong, but my claim is that we should be very skeptical that the “skin in the game” answer is the wrong one.
And of course, here I’m simply assuming that these utilitarians prefer to avoid this massive future altogether.
The point, again, is that you should probably not work to build a future that you yourself would not want to experience just because some abstract arguments say that doing so is morally good.
This is a strong hint (not definitive, but strong) that those abstract arguments are wrong.