Yes. Though I have a higher opinion of how adaptable humans can be.
Using my own work as a benchmark (residential real estate pricing), AI automation would be a huge benefit, enabling me to spend my time on higher-level analysis. There’s a lot of AI that my role can absorb while my job still remains safe.
That’s especially true in my industry, where adverse selection is prominent. AI making me more effective will be necessary merely to keep up with our competition. It won’t replace us, because that will be the default starting position from which we as humans then need to compete at another level relative to the competition.
I’m not convinced that the AI revolution benefits high-tech roles relative to blue-collar roles. AI is a lot closer to writing code from a language prompt than it is to massaging someone’s back, or even to driving a truck, for that matter.
I hope this is right. I’d love the kinds of breakthroughs AI will provide, and whatever we can do to get closer to radical abundance, which I think is the holy grail.
I also don’t see a systemic reason that A) AI benefits won’t be broadly shared, or B) short of radical abundance, economic principles won’t continue to reward non-automated work. Automated activities will become cheap because they’re abundant, while activities that can’t be automated will rise in demand/price and be rewarded as a consequence.
Thanks for your reply. I think the biggest cruxes are about how quickly humans can adapt to change and how quickly AI capabilities can grow.
To my original point in (2), I’d also add something like “crossing the finish line” or “reaching the end”: within the next few decades, I expect AI to be capable of automating nearly all knowledge work. By “all knowledge work,” I mean all thinking-related tasks, including both 2022 jobs and post-2022 jobs. I worry that this capability level (or a level reasonably close to it) might arrive quickly, before we’re prepared to deal with the ensuing unemployment spike.
My half-baked theory is that, short of radical abundance, there will always be jobs; with radical abundance, jobs won’t be necessary.
If AI automated all knowledge work WITHOUT delivering radical abundance, then there would still be jobs delivering goods/services that AI is, by definition, not delivering.
And if so, we have nothing to fear.