I think this is dead nuts on, and completely aligned with my experience as an executive at a mid-size software development agency. We've internally debated the exact same points you walked through here and came to similar conclusions. Definitely a remarkable time to be in the industry, and there will be massive amounts of opportunity available to the folks who roll with the changes and adjust to the new reality of building things.
My thoughts are from a limited perspective. First, I am not a trained software developer; my background is mechanical engineering. I'm retired, but I've always done a fair bit of coding, and I even had the pleasure of teaching programming to engineering and engineering design students. I'm partly involved in working on a piece of FOSS, where I'm trying really hard to deal with 25 years of technical debt and upgrade the codebase to Java 21 (then on to 25 and beyond). Recently I have been trying OpenRewrite and Cursor AI to help.
From this experience I would most definitely say that in the foreseeable future there is no way these tools can be used by, for want of a better term, muggles. As the user of AI you need continuous interaction between expert coder and AI: conversations, not chats. You need to be aware of what you want and of what the AI provides you. It's not an easy task. In the commercial world I expect the code may well get cheaper, not because trained professionals get dismissed, but simply because the AI can work so much faster than a human. This cannot be achieved without close, constant supervision, though. People must remember that AI is a tool which requires patience and understanding to use. Sure, there are pseudo-coders who say "aren't I wonderful, I created an app in a weekend with AI." Well, maybe, but you didn't generate nearly 38,000 Java files. Hope this is useful.
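For anyone curious what that migration work looks like, here is a minimal sketch (class and method names are invented for illustration) of the kind of mechanical before/after rewrite a Java 8-to-21 upgrade involves. Tools like OpenRewrite operate on the syntax tree rather than writing code like this by hand; the sketch just shows the shape of the change.

```java
// MigrationSketch.java - hypothetical before/after for a Java 21 upgrade.
public class MigrationSketch {

    sealed interface Shape permits Circle, Square {}
    record Circle(double radius) implements Shape {}
    record Square(double side) implements Shape {}

    // Before: pre-Java-16 style, instanceof checks plus explicit casts.
    static String describeLegacy(Object shape) {
        if (shape instanceof Circle) {
            Circle c = (Circle) shape;
            return "circle with radius " + c.radius();
        } else if (shape instanceof Square) {
            Square s = (Square) shape;
            return "square with side " + s.side();
        }
        return "unknown shape";
    }

    // After: Java 21 pattern matching for switch. The sealed hierarchy
    // makes the switch exhaustive, so no default branch is needed.
    static String describe(Shape shape) {
        return switch (shape) {
            case Circle c -> "circle with radius " + c.radius();
            case Square s -> "square with side " + s.side();
        };
    }

    public static void main(String[] args) {
        System.out.println(describeLegacy(new Circle(2.0)));
        System.out.println(describe(new Square(3.0)));
    }
}
```

The exhaustive switch is the payoff: the compiler now flags any case a rewrite missed, which is exactly the safety net you want when tooling is touching tens of thousands of files.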
I'd argue that whether writing code becomes cheaper depends on what kind of code we're talking about.
In my opinion, poor and low-quality code indeed becomes much cheaper. But quality, long-term solutions? Highly unlikely. What I see right now is that AI tools help you focus less on routine code and dedicate more time to business-specific code. This doesn't make good software cheaper—it makes it more reliable.
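To make that routine-versus-business split concrete, here is a hypothetical sketch; the billing domain and every name in it are invented for illustration. The mapping function is the plumbing an assistant can generate cheaply, while the discount rule is the business-specific part worth a human's attention.

```java
import java.math.BigDecimal;

public class InvoiceExample {

    record OrderDto(String id, BigDecimal amount) {}
    record Order(String id, BigDecimal amount) {}

    // Routine code: mechanical DTO-to-domain mapping. Plumbing like this
    // is what AI tools churn out with little supervision.
    static Order fromDto(OrderDto dto) {
        return new Order(dto.id(), dto.amount());
    }

    // Business-specific code: a made-up volume-discount rule. Getting this
    // right takes domain understanding and judgment, so this is where the
    // reclaimed time goes.
    static BigDecimal totalDue(Order order) {
        BigDecimal threshold = new BigDecimal("1000");
        if (order.amount().compareTo(threshold) > 0) {
            return order.amount().multiply(new BigDecimal("0.95")); // 5% volume discount
        }
        return order.amount();
    }

    public static void main(String[] args) {
        Order order = fromDto(new OrderDto("A-1", new BigDecimal("1500")));
        System.out.println(totalDue(order)); // prints 1425.00
    }
}
```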
Raise your hand if you've heard this story before.
- 99% offshore - "It'll save you millions."
- 4GLs - "You won't need CS/CE graduates anymore."
- "Simple" integration technologies like SOAP - "B2B integration will be nearly automatic."
Completely tangential, but there's something here that reads like an AI generated a chunk of it, and I'm wondering if I'm the only one who notices it?
Good thoughts overall, though.
Same here - when that happens, I find it difficult to read past the slop.
Yeah, it's more of an ick; I don't want to discount the author's ideas, though.
I definitely think some of this is prejudice on my end.
I almost thought this was a level-headed take until I got to the bit where AI is supposedly driving crazy deflation... I would agree that we have these dynamics in play, but they're just not new to the AI era in the slightest. As a software engineer, I always understood that my job was to eliminate my own job. Programmers have done this many times over, most often by climbing to a higher rung on the ladder of abstraction.
Similarly, there has always been value in waiting to write code. By shipping later you might have better tools or frameworks available, or you might not need to support some really ancient environment like IE6 that is fading away. That's just the way things go in software engineering: the dilemma has always been the same. Do you wait for more feet to walk the path, widening the way and hardening the dirt? Or do you try to lead and make your own path, hoping you might discover an exciting place which others will wish to follow you to?
Even with AI making coding easier, I don’t think we’re heading toward a world where everyone becomes a programmer. Out of ten people, maybe one or two will actually enjoy learning to code—and even then, interest isn’t enough. You still need to understand core concepts, build problem-solving skills, and stay committed. AI can help, sure, but becoming a truly skilled developer takes time and dedication. That’s why I don’t see a major drop in opportunities for experienced programmers anytime soon.
This is because being a software developer isn’t about coding. It is about domain expertise first of all.
Having used AI tools for a while and the coding tools for a little while: my company measured the "productivity" gain and did not really see much of one for most of our complex software.
If you want to build a quick prototype, they are OK for it. But once the models get upgraded and you try to go back into the code, it makes a mess.
I think these AI tools are probably gen 1; they probably have another 10 to 20 years before they become profitable. I think the biggest change is not using AI to write software, it is really software that is used by AI. The OS interface is long overdue for a total makeover.
Comparing this to the substitution principle and the Jevons paradox is a valid and nice take. I believe the substitution principle will apply to small- and medium-sized software where the architecture is usually more or less the same: think standard websites, CRUD web apps, etc. The Jevons paradox would instead play out in large software with complex architecture, which you can't really replace with machines (AI/LLMs). (For Jevons, the intuition is: if the cost per feature halves but demand for features more than doubles, total spending on software still grows.)
"Understanding, integration, and judgment" - at some point AIs will outperform humans at those skills too. And then what? Extrapolate the interesting trend.
Another thought: the IT department of tomorrow will be like the Finance department of today, where one CFO leads a small team. Tomorrow, one CTO/CIO will lead only a handful of IT staff.
Couldn't we summarize these points as "the difference between a programmer and a developer", a difference that is becoming more obvious and important? To me, what makes a person a professional software developer is only partly the skill of programming; LLM-based code-generation tools (aka Beck Genies) are being used to try to enhance that skill.
To paraphrase someone (whom I can't remember right now): News flash: the bottleneck and greatest cost were never writing the code.
“But the gap between commodity code and carefully crafted software widens. The middle disappears.”
If we think about it, this already happened with music creation. Having a studio that fits in your pocket and a supply chain able to put your music on streaming globally in no time flattened quality, increased quantity, and widened the gap between the poor and the extraordinary.
Programmers will probably go through the same pain that musicians experienced, and ChatGPT might only be Napster so far.
There's a big difference in my opinion, though: sadly, we are going macro and not micro with the computational needs to sustain all of this.
AI is the fast food of coding, and because of that I think other aspects to consider are:
- more junk code => more bugs => more bug fixing
- we have to find a way to confine that junk code into small, replaceable modules, and to work on understanding, integration, judgment, and DESIGN so that future optionality stays possible (see the sketch after this comment)
- consider possible long-term health effects for the applications.
And ultimately you'll still want to be able to replace your fast food meal with a slower and more sophisticated one when possible.
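To make the "small replaceable modules" point concrete, here is a minimal sketch; the ReceiptRenderer boundary and every name in it are invented for illustration, not taken from the article:

```java
// Checkout.java - hypothetical sketch of confining generated ("fast food")
// code behind a small, hand-designed boundary so it stays replaceable.

// The boundary: tiny, stable, reviewed by humans.
interface ReceiptRenderer {
    String render(String customer, double total);
}

// A quick AI-generated implementation lives behind the boundary. If it
// rots, it can be swapped out without touching any callers.
class QuickReceiptRenderer implements ReceiptRenderer {
    @Override
    public String render(String customer, double total) {
        return "Receipt for " + customer + ": $" + String.format("%.2f", total);
    }
}

public class Checkout {
    private final ReceiptRenderer renderer;

    Checkout(ReceiptRenderer renderer) {
        this.renderer = renderer; // callers depend only on the interface
    }

    String checkout(String customer, double total) {
        return renderer.render(customer, total);
    }

    public static void main(String[] args) {
        Checkout c = new Checkout(new QuickReceiptRenderer());
        System.out.println(c.checkout("Ada", 12.5));
    }
}
```

Keeping the boundary tiny and hand-reviewed is what preserves the optionality: the generated implementation can be thrown away later, which is exactly the "replace the fast food meal with a slower, more sophisticated one" move.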