Slightly humorous answer: it should be the most pessimistic organization out there (I had MIRI in mind, but surely if we’re picking the winner in advance we can craft an organization that goes even further on that scale).
My point is the same as jimrandomh’s: if there’s an arms race that actually goes all the way up to AGI, safety measures will get in the way of speed, corners will be cut, and disaster will follow.
This assumes, of course, that any unaligned AGI system will cause a non-recoverable catastrophe, independently of the good intentions of its designers.
If this assumption proves wrong, then the winner of that race would still hold the most powerful and versatile technological artifact ever designed; the kind of organization that wields that kind of influence should be… careful.
I’m not sure which governance design best achieves the carefulness needed in that case.