Synthetic -> analytic skills
"Lots of folks asked what’s the 90 & what’s the 10." Probably, Art and culture as a field will gain its traction back. More money might be pumped into it. Live shows might get expensive. Just like, the arrival of synthetic diamonds will shoot up the prices of real diamonds.
My guess would be that one thing in the 10 is an understanding of how to figure out the 'right What'. For that, you need both an idea of the Why, as well as a picture of the impact of the proposed What. We are almost never able to directly manipulate the outcomes we want (say, 10% better 2nd-week customer retention, or 5% better conversion, or 8% better latency at the 75th percentile, or...), so we need to guess what impact a change will have.
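The metrics named above are all aggregates observed after the fact, which is part of why they can't be manipulated directly. As a minimal sketch (nearest-rank method, hypothetical numbers), here is how a 75th-percentile latency is derived from individual request samples:

```python
import math

def percentile(samples, pct):
    """Nearest-rank percentile: the value at rank ceil(pct/100 * n)."""
    ordered = sorted(samples)
    rank = math.ceil(pct / 100 * len(ordered))  # 1-indexed rank
    return ordered[rank - 1]

# Hypothetical per-request latencies in milliseconds.
latencies_ms = [120, 95, 210, 130, 180, 105, 160, 140]
print(percentile(latencies_ms, 75))  # prints 160
```

Any change we ship only shifts the underlying sample distribution; the p75 we actually care about moves as a side effect, which is why its response to a change has to be guessed at or measured, never set.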
Another thing that will likely remain hard is precision in communication. Just think of all the stories we've written about people failing to be precise in specifying the What in their deals with genies or devils - and in our case, it's not enough to just control what the genie will do; we also need it to be clear enough to be understood by the next human who will build on it.
Maybe the 10% is focusing on the Why. Why do we want the software to do that What?
That context is what enables developers to be autonomous while satisfying business needs. Maybe that context will better enable generative AI.
I like the idea of GPT taking away a lot of the boilerplate (we've been using frameworks for the same purpose for a long time now). I've been thinking about this for a while and realized something the other day: these young folk are jumping on this like it's going to make them great programmers... it's not! What it will do, however, is give contractors like myself (with 3 decades of experience) lots of work in the future, when these pretty green programmers are not able to fix the code they "built" because they didn't "write" it, and don't understand the mechanics of what's going on.
Ah, the future is full of opportunity again!
Fascinating- and will apply to a very wide range of jobs and occupations.
It seems to me that each shift allows the dev to care less about the details. The software is less optimised, but that doesn't seem to matter like it used to. I get the same feeling when people describe a solution purely in terms of gluing together cloud-native tech.
I've been thinking about that transition from old-style structured programming to OOP, and how a lot of older guys building stuff in COBOL, Fortran, etc. could not make the transition to OOP and stuff like C++, Visual Basic and Java. I worked with a bunch of these guys in the early 90s when I was just getting started. We brought in consultants to train us on the new OOP languages and concepts. For me it was a no-brainer at 22 years old. For these guys in their 50s it was just too mind-bending... I fear the same thing is happening now with all this LLM prompt engineering. I mean, I GET IT, but can I be truly creative with it, or is my brain too well trained in thinking in code, logical algorithms and OOP concepts to use it well enough?
My first thought was that the AI-generated code examples I saw were a real threat. Now, however, having digested some of the discussions and reflected on my development career, I've come to the conclusion this is an inevitable, natural progression of software development. We've been building applications by interpreting requirements and hand-coding solutions, and coding practices and tools have steadily been reducing the time spent typing code. Since the mid-80s, the proliferation of libraries, automated testing, build tools, integrated debuggers, collaborative IDEs, and coding by configuration have all led to a reduction in the time I spend writing code. One thing that's not changed much is the limited accuracy, breadth and clarity of the requirements I've had to work with. I think AI brings the focus on the *what* to the fore where, quite frankly, it should have been long before now. I think the 90pc will become our ability to use all we've learned about building scalable, secure, robust, productionized systems: to work with the requirements-gathering process and use this new tool to do what we've always done, deliver solutions.
My first job was all about assembler (and some COBOL). Then I shifted to C, then C++, then Java, then "the web", and somehow I'm now doing "Lisp" (Clojure) and functional programming -- the stuff that I learned at university (and spent three years of PhD research on) but couldn't find work doing, back then. The only constant in my career seems to have been change... thankfully!
Good essay, I enjoyed reading it. You didn't cover the "Why". More What, Less How, Same Why?