Yes, to emphasize, the post is meant to define the situation under consideration as: “something close to a 10x increase in growth; or death”. We’re interested in this scenario only because it’s the modal scenario in the particular world of LW/EA/AI safety.
The logic of the argument does not apply as forcefully to "smaller" changes (which could still be quite large), and would not apply at all if AI did not increase growth (i.e., did not decrease the marginal utility of consumption)!