I agree with this comment, but I interpreted your original comment as implying a much greater degree of certainty of extinction assuming ASI is developed than you might have intended. My disagree vote was meant to disagree with the implication that it’s near certain. If you think it’s not near certain it’d cause extinction or equivalent, then it does seem worth considering who might end up controlling ASI!
If, to be wildly optimistic, it’s “only” a coin flip whether developing it today causes extinction, then I will again argue that talking about who should flip the coin seems bad: the correct answer in that case is no one, and we should be incredibly clear on that!
Agreed, a coin flip is unacceptable! Even odds much better than a coin flip would still be unacceptable.