Cool! I mostly like your decomposition/framing. A major nitpick is that robotics doesn’t matter so much: dispatching to human actuators is probably cheap and easy, like listing mturk jobs or persuasion/manipulation.
Agreed. AGI can have great influence in the world just by dispatching humans.
But by the definition of transformative AGI that we use—i.e., that AGI is able to do nearly all human jobs—I don’t think it’s fair to equate “doing a job” with “hiring someone else to do the job.” To me, it would be a little silly to say “all human work has been automated” and mean only “the CEO is an AGI, but yeah, everyone still has to go to work.”
Of course, if you don’t think robotics is necessary for transformative AGI, then you are welcome to remove the factor (or equivalently set it to 100%). In that case, our prediction would still be <1%.