I think it’s a very hard sell to try and get people to sacrifice themselves (and the whole world) for the sake of preventing “fates worse than death”.
I’m not talking about people sacrificing themselves or the whole world. Even if we were to adopt a purely survivalist perspective, I think it’s still far from obvious that trying to slow things down is more effective than focusing on other aims. After all, the space of alternative aims that one could focus on is vast, and trying to slow things down comes with non-trivial risks of its own (e.g. risks of backlash from tech-accelerationists). Again, I’m not saying it’s clear; I’m saying that it seems to me unclear either way.
We should be doing all we can now to avoid having to face such a predicament!
But, as I see it, what’s at issue is precisely what the best way is to avoid such a predicament, or how to best navigate given our current all-too-risky predicament.
FWIW, I think that a lot of the discussion around this issue appears strongly fear-driven, to such an extent that it seems to get in the way of sober and helpful analysis. This is, to be sure, extremely understandable. But I also suspect that it is not the optimal way to figure out how to best achieve our aims, nor an effective way to persuade readers on this forum. Likewise, I suspect that rallying calls along the lines of “Global moratorium on AGI, now” might generally be received less well than, say, a deeper analysis of the reasons for and against attempts to institute that policy.
I feel like I’m one of the main characters in the film Don’t Look Up here.
the space of alternative aims that one could focus on is vast
Can you please name 10? The way I see it, either alignment is solved in time with business as usual[1], or we Pause to allow time for alignment to be solved (or to establish its impossibility). It is not a complicated situation. No need to be worrying about “fates worse than death” at this juncture.
[1] Seems highly unlikely, but please say if you think there are promising solutions here.