I had another thought on why you might be underrating space settlement. Threats like engineered pandemics, nuclear war, etc. constitute a certain type of constantish risk per year per colony. So we can agree that colonies reduce the risk from nuclear war, and disagree for now on biorisk.
AIs seem like a separate class of one-off risk. At some point, we'll create (or become) a superhuman AI, and at that point it will either wipe us out or not. If it does, then I agree that even multiple colonies in the solar system, and perhaps even around other stars, wouldn't afford much protection—though they might afford some. But if it doesn't, it becomes much harder to envisage another AI coming along and doing what the first one didn't, since by then we presumably have an intelligence that can match it (and had a head start).
On this view, AI has its own built-in time-of-perilsness. And if, in the scenario where it doesn't wipe us all out, it doesn't also permanently fix all our other problems, then space colonisation reduces the risk from the remaining threats by a much larger proportion.