I think mainstream longtermist EA is already on a path to help create the hedonium shockwave if and only if it’s the right thing to do. The “only if” part seems really important—turning 99.999% of the accessible universe into hedonium seems like quite a bad idea unless you’re extraordinarily confident in your ethical views. But it does seem like one theoretically possible outcome of the type of long reflection MacAskill advocates in WWOTF:
As an ideal, we could aim for what we can call the long reflection: a stable state of the world in which we are safe from calamity and we can reflect on and debate the nature of the good life, working out what the most flourishing society would be. I call this the “long” reflection not because of how long this period would last but because of how long it would be worth spending on it. It’s worth spending five minutes to decide where to spend two hours at dinner; it’s worth spending months to choose a profession for the rest of one’s life. But civilisation might last millions, billions, or even trillions of years. It would therefore be worth spending many centuries to ensure that we’ve really figured things out before we take irreversible actions like locking in values or spreading across the stars.
It’s really not clear to me that there’s a better path to the hedonium shockwave than what longtermists are already doing—trying to ensure humanity survives and prospers and makes it to a place where we have more hope of reaching a consensus about whether or not it’s the right course of action. Of course, if the shockwave really is the right thing to do, waiting to start it would lead to a great deal of astronomical waste. But this is a small price to pay relative to the risks of destroying ourselves or causing great harm if our moral views are wrong.