On Dichotomy:
Because you’ve picked a particularly strong form of Maxipok to argue against, you’re pushed into choosing a particularly strong form of Dichotomy that would be necessary to support it
But I think that this strong form of Dichotomy is relatively implausible to start with
And I would guess that Bostrom at the time of writing the article would not have supported it; certainly the passages you quote feel to me like they’re supporting something weaker
Here’s a weaker form of Dichotomy that I feel much more intuitive sympathy for:
Most things that could be “locked in” such that they have predictable long-term effects on the total value of our future civilization, and move us away from the best outcomes, actually constrain us to worlds which are <10% as good as the worlds without any such lock-in (and would therefore count as existential catastrophes in their own right)
The word “most” is doing work there, and I definitely don’t think it’s absolute (e.g. as you point out, the idea of dividing the universe up 50⁄50 between a civilization that will do good things with it and one that won’t); but it could plausibly still be enough to guide a lot of our actions