I’m confused why people think certainty is needed to characterize this as a game of chicken! It’s certainly not needed in order for the game theoretic dynamics to apply.
I can make a decision about whether to oppose something given that there is substantial uncertainty, and I have done so.
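For concreteness, here is a toy expected-payoff sketch of a chicken-style race game (every payoff value and doom probability below is made up purely for illustration, not an estimate of anything real). The point is just that the analysis runs on expected payoffs: the ordering that defines chicken can appear at doom probabilities far below certainty, and the same arithmetic also shows when restraint becomes the dominant move.

```python
from itertools import product

# All numbers here are made-up illustrations, not estimates of anything real.
WIN = 100.0      # assumed value of being the lab that builds and controls ASI
LOSE = -20.0     # assumed value of the rival lab controlling it instead
DOOM = -1000.0   # assumed value of extinction (or equivalent)
STATUS_QUO = 0.0 # nobody builds

def expected_payoffs(p_doom_solo, p_doom_race):
    """Expected payoff to player 1 for each (player 1, player 2) action pair.

    p_doom_solo: assumed chance of doom if exactly one lab builds ASI.
    p_doom_race: assumed chance of doom if both race (taken to be higher,
                 on the assumption that racing means more corner-cutting).
    """
    return {
        ("hold", "hold"): STATUS_QUO,
        ("race", "hold"): (1 - p_doom_solo) * WIN + p_doom_solo * DOOM,
        ("hold", "race"): (1 - p_doom_solo) * LOSE + p_doom_solo * DOOM,
        # If both race, assume a fair chance of being the one who "wins",
        # conditional on survival.
        ("race", "race"): (1 - p_doom_race) * 0.5 * (WIN + LOSE) + p_doom_race * DOOM,
    }

def describe(p_doom_solo, p_doom_race):
    u = expected_payoffs(p_doom_solo, p_doom_race)
    # Chicken-style ordering: defecting alone > mutual restraint
    #                         > restraining alone > mutual defection.
    chicken = (u[("race", "hold")] > u[("hold", "hold")]
               > u[("hold", "race")] > u[("race", "race")])
    # "Hold" is dominant when it beats "race" against either opposing move.
    hold_dominates = (u[("hold", "hold")] > u[("race", "hold")]
                      and u[("hold", "race")] > u[("race", "race")])
    print(f"p(doom | one builds) = {p_doom_solo:.2f}, p(doom | both race) = {p_doom_race:.2f}")
    for a1, a2 in product(("race", "hold"), repeat=2):
        print(f"  ({a1}, {a2}): E[payoff] = {u[(a1, a2)]:8.1f}")
    print(f"  chicken-style ordering: {chicken}; holding dominates: {hold_dominates}\n")

# Chicken-style dynamics already appear at doom probabilities far below 1...
describe(p_doom_solo=0.05, p_doom_race=0.15)
# ...while with the same payoffs, a coinflip-level risk makes holding dominant.
describe(p_doom_solo=0.50, p_doom_race=0.60)
```

With these illustrative numbers, the chicken structure holds at an assumed 5% doom probability, while at a coinflip-level risk holding is the dominant move for both players, which is exactly the “no one should flip the coin” case.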
I agree with this comment, but I interpreted your original comment as implying much more certainty that developing ASI leads to extinction than you might have intended. My disagree vote was meant to disagree with the implication that it’s near certain. If you don’t think extinction (or the equivalent) is near certain, then it does seem worth considering who might end up controlling ASI!
If developing it today gives “only” a coinflip chance of extinction, to be wildly optimistic, then I will again argue that debating who should flip the coin seems bad: the correct answer in that case is no one, and we should be incredibly clear on that!
Agreed, a coin flip is unacceptable! Even a risk much lower than a coin flip is still unacceptable.