Who else is pushing for a global Pause/Stop/Moratorium/Non-Proliferation Treaty? Who else is doing that in a way such that PauseAI might be counterfactually harming their efforts? Again, no action on this, or waiting for others to do something “better”, are terrible choices when the consequences of insufficient global action are that we all die in the relatively near future.
Do you think it’s possible for you to be convinced that building ASI is a suicide race, short of an actual AI-mediated global catastrophe? What would it take?
Unrelated to my argument: I'm not sure what you mean by “high probability”, but I'd take a combination of these views as a reasonable prior: XPT.
By “high probability” I mean ~50%. I think XPT is a terrible prior. Much better to look at the most recent AI Impacts Survey, or the CAIS Statement on AI Risk.