Hmm, I’m surprised to hear you say that about the second story, which I think is describing a fairly fast end to human civilization—“going out with a bang”. Example quote:
If influence-seeking patterns do appear and become entrenched, it can ultimately lead to a rapid phase transition from the world described in Part I to a much worse situation where humans totally lose control.
So I mostly see it as describing a hard takeoff, and am curious whether there’s a key feature of a fast / discontinuous takeoff that you consider central but missing there.
I think of hard takeoff as meaning that AI systems suddenly control far more resources. (Paul suggests the definition “there is a one year doubling of the world economy before there’s been a four year doubling”.)
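To make that definition concrete, here is a small worked example (my own illustration, not from Paul’s post) of the growth rates those doubling times imply:

```python
# Annual growth rate implied by a doubling time of T years: g = 2**(1/T) - 1.
for T in (4, 1):
    g = 2 ** (1 / T) - 1
    print(f"{T}-year doubling -> {g:.1%} annual growth")

# 4-year doubling -> 18.9% annual growth (already far above today's ~3%).
# 1-year doubling -> 100.0% annual growth.
# On this definition, hard takeoff means the economy jumps to ~100%/year
# growth without first passing through a sustained ~19%/year phase.
```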
Unless I’m very mistaken, the point Paul is making here is that in a world where AI systems in aggregate gradually become more powerful, there may come a turning point where the systems suddenly stop being controlled by humans. By analogy, imagine a country where the military wants to stage a coup against the president, and its power increases gradually day by day, until one day it decides it has enough power to stage the coup. The military’s power increased continuously and gradually, but the president’s control over the situation falls suddenly at that point.
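As a toy illustration of that analogy (my own sketch, with made-up numbers), power can grow smoothly while control flips discontinuously once a threshold is crossed:

```python
# Toy model: the military's relative power grows continuously,
# but the president's control is a step function of that power.
COUP_THRESHOLD = 0.5  # made-up value: power share at which a coup succeeds

def president_control(military_power: float) -> float:
    # Control collapses discontinuously once power crosses the threshold.
    return 1.0 if military_power < COUP_THRESHOLD else 0.0

for day in range(10):
    power = 0.1 + 0.05 * day  # smooth, gradual growth in power
    print(day, round(power, 2), president_control(power))
# Control stays at 1.0 until power reaches 0.5 (day 8), then drops to 0.0:
# a continuous input producing a discontinuous change in who is in charge.
```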