I'm not up on the literature, but it seems that "continuity of technical progress" isn't a meaningful question in the abstract: the same events can look continuous or discontinuous depending on the time scale, the information available, and the technical expertise of the observer.
To me, continuity is a way of operationalizing the question “will we be surprised by AI competence?”. If our alignment techniques are always sufficient, then the point is moot regardless of how fast AI improves.
P.S. An interesting intermediate notion is Lipschitz continuity, which bounds the rate of change (roughly, a bounded derivative), i.e. "what's the fastest rate at which technology can progress?"
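To make that concrete (my framing, not a claim about how anyone models this): if $C(t)$ is some capability metric as a function of time, Lipschitz continuity says there is a constant $K$ such that

$$|C(t_1) - C(t_2)| \le K\,|t_1 - t_2| \quad \text{for all } t_1, t_2,$$

so capability can never improve faster than rate $K$, even at moments where $C$ isn't differentiable at all.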