Kent, did you fact-check bits of information provided by Claude that you relied upon whilst writing the article?
Amara's Law frames this well for me: "We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run."
I explain the larger game of A.I. Deception on my podcast here. The point is control. We either turn A.I. to our benefit or we will be consumed like the Borg. Let me explain:
https://open.substack.com/pub/soberchristiangentlemanpodcast/p/ai-deception-2025?utm_source=share&utm_medium=android&r=31s3eo
There is the teensy-weensy bit about balance and where technology levels out. As you say, technology levels out, but where it levels out is unclear. It may also dip after levelling out if it is replaced by something else or turns out to be ineffective or inefficient.
My favorite example is dairy farming and building antennas. In dairy farming, to make a fully automated milking station cost-effective, you need around 10,000+ cows. For a semi-automated milking station, around 1,000 cows are sufficient. Why? The technology is more expensive, but, far more importantly, you need better-educated staff, who cost more money and require longer training, to run and maintain the fully automated milking station. In contrast, staffing a semi-automated milking station costs less and places lower demands on education. Besides, the technology itself is cheaper.
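The trade-off above can be sketched as a toy cost model. All figures here are hypothetical, purely for illustration; the point is only the shape of the curve: higher fixed and staffing costs per unit only pay off once each staff unit covers enough cows.

```python
import math

def cost_per_cow(herd, equipment_yearly, staff_yearly_per_unit, cows_per_unit):
    """Yearly cost per cow: amortized equipment plus staffing.

    One 'unit' of staff is needed per `cows_per_unit` cows.
    All parameters are made-up illustrative numbers.
    """
    units = math.ceil(herd / cows_per_unit)
    return (equipment_yearly + units * staff_yearly_per_unit) / herd

# Hypothetical figures: fully automated gear costs more and needs
# pricier, better-educated staff, but each staff unit covers more cows.
semi = lambda herd: cost_per_cow(herd, equipment_yearly=50_000,
                                 staff_yearly_per_unit=40_000, cows_per_unit=250)
full = lambda herd: cost_per_cow(herd, equipment_yearly=400_000,
                                 staff_yearly_per_unit=80_000, cows_per_unit=2_000)

for herd in (1_000, 5_000, 10_000):
    print(herd, round(semi(herd)), round(full(herd)))
```

With these (invented) numbers, semi-automation wins at 1,000 cows and full automation wins at 10,000, mirroring the thresholds mentioned above.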
Concerning building antennas: up to a certain point, an antenna costs X dollars, but gain beyond that point costs exponentially more and also narrows the frequency band in which the antenna amplifies the signal. Basically, depending on your requirements, at some point there is a break-even point in the return on investment. Then you can add the dimensions of the market and mass-production effects to further refine the ROI calculation.
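A minimal sketch of that break-even logic, assuming a made-up cost curve that is roughly linear up to a "knee" and exponential past it: you keep buying gain only while the next dB is worth more than it costs.

```python
import math

def build_cost(gain_db, knee=12.0, base=100.0):
    """Dollar cost for a given antenna gain (hypothetical model)."""
    if gain_db <= knee:
        return base * gain_db                       # roughly linear region
    # beyond the knee, each extra dB costs exponentially more
    return base * knee + base * (math.exp(gain_db - knee) - 1)

value_per_db = 400.0  # hypothetical dollar value of each dB of gain

# Break-even: stop adding gain once the marginal cost of the next dB
# exceeds the value that dB delivers.
gain = 0.0
while build_cost(gain + 1) - build_cost(gain) < value_per_db:
    gain += 1
print(f"stop at ~{gain:.0f} dB, total cost ${build_cost(gain):,.0f}")
```

The exact numbers are invented; the takeaway is that the exponential tail guarantees a finite break-even point regardless of how valuable gain is.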
In terms of vibe coding, it is an attempt to automate development further. I personally believe the main gain is in the reuse of medium-granular components. As Szyperski expressed it, we are good at large-granular reuse (databases, operating systems) and small-scale reuse (snippets of code), but worse at medium-granular components. The learning curve for .NET is significant. By adding probabilistic LLM-based support for reuse of medium-sized components, I believe it can become more cost-effective.
Great article Kent! And I learned something about corn production :) You very briefly mentioned vibe coding in the article. I'm curious to know your thoughts on whether vibe coding could be integrated into a TDD workflow, or if that's even reasonable?
I'm wondering what VHS means in your text. I certainly missed something! 😉
Betamax was a technically superior format but it lost to VHS because Sony kept Betamax proprietary. Would it have been "more efficient" for the world to just have picked Betamax? Sure. But we can't be certain the world would have picked Betamax. They might have picked something much worse. We collectively have this "diffusion of innovation" process to weed out bad innovations before they do much damage.
Clearer?
For sure! And thank you! I thought it could be some other acronym that I hadn't caught!