Are you aware of hard data that supports this, or is this just a guess/general impression?
I’ve seen very little hard data on the use of LLMs to automate labour or enhance worker productivity. I have tried to find it.
One of the few pieces of high-quality evidence I’ve found on this topic is this study, which looked at the use of LLMs to aid people working in customer support: https://academic.oup.com/qje/article/140/2/889/7990658
The results are mixed, suggesting that in some cases LLMs may decrease productivity:

> These results are consistent with the idea that generative AI tools may function by exposing lower-skill workers to the best practices of higher-skill workers. Lower-skill workers benefit because AI assistance provides new solutions, whereas the best performers may see little benefit from being exposed to their own best practices. Indeed, the negative effects along measures of chat quality—RR and customer satisfaction—suggest that AI recommendations may distract top performers or lead them to choose the faster or less cognitively taxing option (following suggestions) rather than taking the time to come up with their own responses.
Anecdotally, what I’ve heard from people who code for a living is that AI does somewhat improve their productivity, but only about as much as (or less than) other tools that make writing code easier. They’ve said that having the LLM fill in the code saves them the time they would otherwise have spent going to Stack Overflow (or wherever) and copying and pasting a code block from there.
Based on this evidence, I am highly skeptical that software development is going to become significantly less expensive in the near term due to LLMs, let alone 10x or 100x less expensive.
Sorry—my post comes with the worldview/expectation that at some point, AI+software will be a major thing. I was flagging that, in that view, software should become much better.
The question of whether AI+software will be important soon is a background assumption, but a distinct topic. If you are very skeptical of that, then my post wouldn’t be relevant to you.
Some quick points on that topic, however:
1. I think there’s a decent coalition of researchers and programmers who do believe that AI+software will be a major deal very soon (if not already). Companies are investing substantially in it (e.g. Anthropic, OpenAI, Microsoft).
2. I’ve found AI programming tools to be a major help, and so have many other programmers I’ve spoken to.
3. I see the current tools as still very experimental and new, very much a proof of concept. I expect it to take a while for their abilities to ramp up and for them to scale, so the fact that the economic impact so far is limited doesn’t surprise me.
4. I’m not very set on extremely short timelines, but I think that 10-30 years would still be fairly soon, and it’s much more likely that big changes will happen on that time frame.