If there’s an arms race dynamic, it’s probably a disaster no matter who wins. Having room to delay for late-stage alignment experiments is the barest minimum requirement in order for humanity to have any chance of survival. So the best case is to not have an arms race at all. The next-best thing is for the organization that wins to be the sort of organization that could stop at the brink for late-stage alignment research, if its leader decided to, and for it to have a stable leader who’s sane enough to make that decision. Then maximize the size of the gap to second place, to increase the probability and length of that delay.
Needing it to be possible to stop rules out all of government and academia in the US as the US exists now. Those organizations have their high-level decisions made by distant committees, which typically have strong incentives to maintain whatever superficially looks like the status quo, typically lack the prerequisites to understand alignment-related strategy, and don’t have the technical expertise to recognize when they’re at the brink.
I believe that, of all of the organizations that could plausibly win an AGI arms race, this uniquely identifies DeepMind. I do have some misgivings about DeepMind’s strategy, and I don’t have full confidence that Demis would recognize when we’re at the brink or stop there, but no other organization seems even vaguely plausible.