There doesn’t seem to be any discussion of what would happen if there is no intelligence explosion. I could easily see a scenario where everyone is waiting around for a singularity that never comes, while neglecting the accumulated dangers of gradual AI progress. I could also see the “intelligence explosion” being declared prematurely because AI progress hits some benchmark, even though progress is clearly not exploding.
Going up one level: you may believe that an intelligence explosion is inevitable, but most people, including world leaders, do not. A lot of people are probably going to see a treaty like this as a waste of time and resources. On the other hand, some countries might be willing to make larger concessions precisely because they think they’ll never have to make good on them.