Would you pursue software engineering as a career today?
I recently left a career in copywriting. While my background is largely in the creative space (writing, designing, branding, marketing, etc.), I’m also a former business owner with an affinity for problem solving. I don’t have much tech experience, but being “creative” with code (either as a data scientist/architect or software engineer/developer) has interested me for some time.
Additionally, the programming path intrigues me for the following reasons:
Enjoy building and problem solving.
Useful and fulfilling to have end-to-end skills within a domain.
Aptitude is portable to different industries and cause areas.
Once proficient and experienced, could potentially allow me to contribute/upskill in AI safety.
Through a combination of soul-searching, 80,000 Hours’ articles, an advising session, Holden Karnofsky’s framework for building aptitudes, and taking a free coding lesson, I’ve decided to pursue software development (most likely, full-stack).
That all said, I’m on the wrong side of mid-career and would like to reduce my chances of entering a field where it would be difficult to get (and keep) a job. Of the professionals I’ve surveyed, none believe age is a barrier to entering the field. The U.S. Bureau of Labor Statistics also expects healthy growth for software developers (and related roles). But I remain unclear (worried) about AI’s future effects on the software development job market...
Many in my previous vocation have long dismissed the threat AI posed to their careers. However, the most recent iterations of ChatGPT (and the like) have started to change minds. At the last marketing agency where I worked, we were constantly testing new AI writing tools to help with copy production and efficiency. On its face, optimization seemed like a “good” goal. But the subtext was certainly about creating more output with fewer people.
As everyone on this forum is well aware, the idea that AI could become proficient at coding or eliminate jobs has also been debated for some time. While there’s no clear consensus, most things I’ve read suggest developers believe AI will assist with coding but that humans will still be needed for direction, oversight, debugging, more sophisticated work, etc. Moreover, AI software-makers claim that their tech will usher in even more opportunity for developers. But whether the world will need as many (or more) AI-assisted developers as unassisted ones inevitably gets lost in much of the rhetoric. And while the development of no-code and low-code tools could very much be about innovation, utility, and accessibility, these technologies will also be adopted by companies looking to save money on labor.
Holden Karnofsky recommends engineering as an aptitude in his “most important century” series and 80,000 Hours includes engineering on their “highest-impact career paths” page. (While these recommendations are specific to EA and longtermism-related work, I include them because the sources are particularly concerned with the implications of AI.)
When I’ve asked other engineers if they believe AI is a threat to their jobs, I’ve gotten a resounding (100%) “no,” often followed by an addendum like, “maybe in another 20-30 years.” But these answers haven’t completely satisfied me, and I’ve finally realized why...
We’ve seen the tech industry grow at an unprecedented rate over the last few decades. And it’s this business-as-usual growth that leads to overall bullish impressions. But I don’t think anyone would dispute that we’re on the verge of a new, uncharted, unpredictable landscape. To that point, maybe many (or most) things continue to go “up,” while other things (available jobs, salaries, career longevity) recede or even disappear.
To get a better sense of AI’s possible effects on the engineering job market, I searched for someone who understands both coding as a career (not just technical skills, but workflows and process) and the AI space. I found a post from December of last year by an AI developer reviewing ChatGPT’s ability to code. After testing the tool, the developer concluded that ChatGPT wasn’t a threat to programmers, offering various rationales. However, at least one of his reasons for dismissal has been overcome since the post was written less than four months ago. I also don’t see any reason why AI couldn’t (soon) “talk to the clients and gather requirements,” or why those tasks couldn’t be handled by a non-technical account manager or coordinator. But then I have to remind myself that I don’t work in this field and there are many subtleties I don’t understand. And so I should probably be comforted by the seemingly broad belief that AI won’t be taking coding jobs anytime soon. Yet the litany of objections I’ve seen superseded over the last year alone leaves me unconvinced.
No one can predict the future with certainty, but I’d invest more faith in answers where motives and biases could be better separated. For example, many of the articles I’ve read are primarily concerned with sensational titles to attract clicks, while most of the software engineers I’ve queried already have the skills, a job, and a network to (probably, possibly) ride out the rest of their careers without being negatively affected by AI. I suspect these POVs aren’t especially relevant or helpful for people looking to enter the field today.
So, I’d like to reframe the question for developers who, yes, already have the skills, a job, and a network, but also have a better-than-the-average-bear understanding of AI, and can imagine what it would be like to start their journey fresh in 2023:
Would you pursue software engineering as a career today?
Thanks in advance!
See also this LW question posted earlier this week, and the discussion there.
Thanks for the referral. Interesting post—even if much of the technical-speak is lost on me. What I gathered is that nobody really knows if/when software engineering will become an unskilled job (no surprise) but, a) many are confident that it won’t be anytime soon (at least, for the discipline as a whole), and b) junior developers are the ones that LLMs are likely to replace (est. 1-3 yrs.).
While much of the thread’s early sentiment echoes the replies here, there’s a divergence concerning newer engineers as the conversation continues. It’s these bearish predictions that worry me. I don’t need to make six figures, but I can’t invest time (6-12 mo.) and money (courses, bootcamp, etc.) in a career path where newbie “escape velocity” is unlikely. More to think about...
When it comes to LLMs, I often compare the complexity of writing a world-class software system to the difficulty of writing a world-class novel. Any fifth grader can string together a five-paragraph essay. But it’s a long way from that to writing an astute portrayal of modern society with compelling prose and gripping characters, like Balzac did.
When you hear professional software engineers say they’re not worried about AI, I think this is a large part of what’s going on. Most production code requires you to understand the world around you: you need to understand your customers, the existing technical constraints, and your operational budget (how often is the system allowed to crash? What’s the 95th-percentile allowed latency? What features can we trade off to reach this?). You often need to understand the viewpoints of your coworkers and/or managers and be able to argue with them when they’re wrong. You need to understand which coworkers or customers to talk to and which ones to ignore (not something I see in the skill set of any LLM so far). In most companies, the more senior a software engineer is, the more this becomes part of their job (instead of day-to-day coding).
That said, I think 80,000 Hours correctly notes that software engineering is a great early-stage career and a mediocre mid-to-late-stage career. Nowhere else can you find a cushy, well-defined, and fun job that easily pays $100-500k a year. But I think that ends up cutting against you 5-10 years in: many software engineers can get so far off of just coding and being a nerd that they never learn how to solve business problems (and companies are often built around the expectation that software engineers don’t want to learn about the business). That keeps them from moving into more impactful roles. It’s also not a job that lends itself to networking, which probably also affects long-term career prospects.
Since you’re a former business owner, it’s possible this won’t be a problem for you. In that case, working at a tech company is one of the best ways to learn how to code and how to design software systems, and it’s probably a great move for you. I’d just encourage you to keep in mind that coding, while fun, is more impactful as a means to an end than as an end in itself.
When it comes to refining AI-generated code, do you imagine this being done within organizations by the same number of programmers, or could LLMs be managed by fewer senior (or even lower-level) engineers? This question is inspired by my observations in marketing, where the stock of full-time writers appears to be going down. I totally get that LLMs can’t create their own prompts, debug every line of code, or approve products, but do you think they’ll start allowing orgs to complete product development cycles with fewer engineers?
Great point that coding isn’t an end in itself. In addition to seeming fun/interesting, I’m looking to learn this skill for greater domain range, technical building ability, and professional autonomy. Knowing how to code could eventually help me launch a startup or support an EA-related org. And yeah, earning to give while I ramp makes this path even more attractive. Many great points and thanks for the encouragement!
This might be an especially good time to enter the field. Instead of having to compete with more experienced SWEs in writing code the old fashioned way, you can be on a nearly level playing field with incorporating LLMs into your workflow. You’ll still need to learn a traditional language, at least for now, but you will be able to learn more quickly with the assistance of an LLM tutor. As the field increasingly adapts to a whole new way to write code, you can learn along with everybody else.
Very interesting point. I hadn’t seen this as super plausible given how AI is starting to be used in copywriting/marketing: 1) Copy editors can now give prompts to LLMs and refine from there. 2) Non-writing workers (e.g., marketing coordinators, account managers) can use LLMs to create “good enough” pieces for landing pages, social captions, SEO, etc. This kind of AI integration seems to be eliminating the need for copywriters, content writers, brand writers, and the like. But I should acknowledge that a lot of my worries are based on anecdotal evidence. I was the only full-time writer at my previous agency and, while I left of my own accord, it looks like they’re going to experiment without the position. I think their plan is to get non-writing account managers proficient with an LLM and contract with a lower-level writer for client edits.
According to the BLS, writers and authors (a very broad category) are expected to grow at 4% over the next 10 years, while editor roles are expected to decline by 5%. I do imagine that copy directors, technical writers, and scriptwriters (at various levels) will be among those spared near-future replacement, but these are very specific niches, and the ability of LLMs to craft slogans, taglines, and scripts is getting quite impressive...
Now, I understand content creation is quite different from software engineering, and perhaps the former’s positions and tasks don’t map well onto the latter. To your point, maybe the transformation in software is more analogous to physical engineering, where a newer professional who knows SOLIDWORKS, Fusion 360, FDM/3D printing, etc. is going to add value where someone more experienced who only works with legacy programs and traditional manufacturing can’t. Does that comparison feel appropriate?
I’ve used ChatGPT for writing landing pages for my own websites, and as you say, it does a “good enough” job. It’s the linguistic equivalent of a house decorated in knick knacks from Target. For whatever reason, we have had a cultural expectation that websites have to have this material in order to look respectable, but it’s not business-critical beyond that.
By contrast, software remains business-critical. One of the key points that’s being made again and again is that many business applications require extremely high levels of reliability. Traditional software and hardware engineering can accomplish that. For now, at least, large language models cannot, unless they are imitating existing high-reliability software solutions.
A large language model can provide me with reliable working code for an existing sorting algorithm, but when applications become large, dynamic, and integrated with the real world, it won’t be possible to build a whole application off a short, simple prompt. Instead, the work is going to be about using both human- and AI-generated code to put together these applications more efficiently, debug them, improve the features, and so on.
This is one reason why I think that LLMs are unlikely to replace software engineers, even though they are replacing copy editors, and even though they can write code: SWEs create business-critical high-reliability products, while copy editors create non-critical low-reliability products, which LLMs are eminently suitable for.
I’d say marketing is business-critical, and the difference between phone-it-in, good, great, and stellar content is important to bottom lines (depending on industry/product/service). That said, if the general point is that grammar issues on a site will have a lesser negative effect than buggy code that crashes that site, I agree. I’d also agree that unless you’re a marketing or content agency, marketing and content may be part of your business but they’re not the core of it. In contrast, almost every business in every industry runs on software today...
Still, I don’t know how long things like scale, complexity, and strategy will be meaningful hurdles for LLMs and other AI technology (nobody does), but it feels like we’re accelerating toward an end point. Regardless, software engineering seems like a good aptitude to add to the toolbox, and it’s good to hear that I may not be too late to the game.
I mean, it’s a better path than copywriting. Is that the other choice? Yeah, software engineering is not a bad choice.
No, I’ve already made the decision to leave copywriting (unless an opportunity to have an incredible impact comes my way).
Software engineering and data science were the two paths I was considering, but engineering won out because 1) it’s an end-to-end (idea to product) creation skill, and 2) it doesn’t require me to first become proficient in probability/statistics. The latter is something I eventually hope to do but, financially, I can’t afford to ramp up in math, then data science, then find a job. And while data science roles are estimated to grow at a faster rate than jobs in software engineering, there are far fewer overall spots available in data science. Being at the midpoint of my career, my ability to make a meaningful contribution somewhere as a software developer seems more likely than as a data scientist. Lastly, I’d assume data science is the type of skill that AI will replace before software engineering (but that’s a huge guess).
Ah, gotcha. That plan makes sense then.
If AI + a nontechnical person familiar with business needs can replace me in coding, I expect something resembling a singularity within 5 years.
I think that software engineering is a great career if you have an aptitude for it. It’s also way easier to tell if you are good at it relative to most other careers (i.e., Leetcode, Hackerrank, and other question repositories can help you understand your relative performance).
So my answer is that either AI can’t automate software engineers for a while, or they’ll automate every career quite soon after software engineering. Maybe 30% of my job is automating other people’s. As a result, software engineering is a pretty good bet as a career.
I’d be curious to hear from folks who can imagine worlds where software engineering is nearly fully automated, and we don’t automate all jobs that decade.
Thanks for that perspective. Given that I don’t have experience in the programming space, I couldn’t project a timeline between fully automated software production and AGI, but your estimate puts something on the map for me. It is disconcerting, though, as there are many different assumptions and perspectives about AGI, and a lot of uncertainty. But I also understand that certainty isn’t something I should expect on any topic, let alone this one. Moreover, career inaction isn’t an option I can afford, so I’ll likely be barreling down the software dev path very soon.