As you laid out in the post, your biggest concern about the long reflection is the likely outcome of a pause—is that roughly correct?
In other words, I understand your preferences are roughly: Extinction < Eternal Long Reflection < Unconstrained Age of Em < Century-long reflection followed by Constrained Age of Em < No reflection + Constrained Age of Em
(As an aside, I would assume that without changing the preference order, we could replace unconstrained versus constrained Age of Em with, say, indefinite robust totalitarianism versus “traditional” transhumanist future.)
I don’t have great confidence that the kinds of constraints that would be imposed on an age of em after a long reflection would actually improve that age and those that follow.
Yes, you’ve mentioned your skepticism of the efficacy of a long reflection, but conditional on it successfully reducing bad outcomes, you agree with the ordering?
You’ll also need to add increasing good outcomes, along with decreasing bad ones.