I see what you’re thinking. We break the symmetry not by thinking that the next step is going to be closer in time, but that the next step(s) are going to be more important to get right than either self-driving cars or earlier automation.
In a way, the two are interchangeable: if we define “steps” as changes of given magnitude then faster change means more densely spaced steps.
There is another effect to take into account: some progress in understanding how to adapt to automation might happen without any actual adoption, namely progress that comes from theoretical deliberation and from wider publicity for the relevant insights. Progress of this sort creates an incentive to push all adoption later in time.