Thanks for your reply. I think the biggest cruxes are about how quickly humans can adapt to change and how quickly AI capabilities can grow.
To my original point in (2), I’d also add something like “crossing the finish line” or “reaching the end”: within the next few decades, I expect AI to be capable of automating nearly all knowledge work. By “all knowledge work,” I mean all thinking-related tasks, including both 2022 jobs and post-2022 jobs. I worry that this capability level (or a level reasonably close to it) might arrive quickly, before we’re prepared to deal with the ensuing unemployment spike.
My half-baked theory is that there will always be jobs short of radical abundance, and once we reach radical abundance, jobs won’t be necessary anyway.
If AI automated all knowledge work WITHOUT delivering radical abundance, then there would still be jobs delivering goods/services that AI is, by definition, not delivering.
And if so, we have nothing to fear.