If you've been tracking the tech world at all this past year, you probably know about the hype surrounding what we're calling "generative AI". For now, "AI" as a concept is going through what I generally call its "blockchain" phase. To be honest, there are "AI" solutions out there that are actually Excel spreadsheets with embedded formulas. Automation they are, but AI they most certainly are not.
If we look at something that at least smells like AI, ChatGPT (and similar text generation platforms) have changed the industry radically for many professions. The Hollywood writers' strike ended only a few weeks ago, with no real resolution to the underlying problem: any old Joe can now go out and use a transformative tool to generate arbitrary text. The problem that I, as a technical person, have with this is that in the circles I tend to travel in, GenAI is understood to be, well, kinda meh for certain things.
The Editor Problem
In my many career excursions, I ended up in the media translation and simultaneous interpretation space. Even back then, there was a significant amount of chatter in the translation community about computers taking over the translation world. Babelfish (remember that?), Google Translate, and a myriad of other tools already existed, and computer-assisted translation was an industry standard. However, with that came the need for a strange niche of industry expertise: translators who were very good copy editors, or, vice versa, copy editors who were bilingual. The rise of the "new favorite junior programmer" via ChatGPT has forced this issue into the wider world, especially in tech.
There's no argument that ChatGPT and its ilk can generate output that looks pretty good. The catch is exactly that: it's only pretty good. Someone with real expertise still has to edit that output into something correct, which is the bilingual copy editor problem all over again. Junior engineers, by their nature, are not necessarily able to fit into this "ChatGPT + senior engineer" equation.
The DevOps Elephant in the Room
Over the past decade or so, the role of the server administrator and operator has become conflated with "programmer". For good or ill, in the modern landscape, most programmers need to think about systems... and most systems engineers need to be programmers at some basic level. As an industry, this conflation of roles is something we have been pushing for, without a clear definition of how to actually onboard new talent into the ecosystem. If a single senior engineer can create vast swaths of infrastructure in a very short period of time, and infrastructure paradigms are such that management overhead has been reduced, where does that leave everybody else? This leads us to the wonderful world of maintenance engineering.
Maintenance Engineering
There's a fantastic article by Greg Jorgensen where he writes about programmers learning how to program by doing maintenance programming. This translates to the infrastructure world wholesale. In a sense, the case is even simpler for infrastructure: unless you are in the startup space, or particularly lucky, the probability of you as an engineer tackling a fully greenfield project is relatively low. The traditional entrance pathway to infrastructure work, support, is nothing but maintenance engineering. The entire ecosystem of Managed Service Providers exists to service exactly that need for their clients' infrastructure. (Shoutout to our MSP partners, we love them dearly.)
This would seem to solve the problem of junior engineers in this AI-focused, AI-driven market. But not really. There is plenty of work being done out there that will allow AI models to tackle bug hunting and bug fixing. So does this invalidate the notion of the junior maintenance programmer?
The Paralegal Past and the Terminator Future
In Growing Talent as a Trade, I alluded to the era of the junior engineer as paralegal being over. In fact, I will go out on a limb and say that the era of the junior engineer as a whole is over. We'll be riding the sine wave of this market for a while longer, but it's really not all bleak. The junior engineer, though they remain just that, junior, now has significantly easier access to knowledge and tooling that will increase their productivity (provided it's used correctly). The paradigm is no longer the paralegal/gopher; it is the cyborg.
Think about it for a second: any junior engineer can now churn out code that, at first blush, will fit your use case relatively easily. (We're going to ignore the moral, ethical, cost, and legal problems of using AI-generated code in your projects. That's for a different, future article.) Like any good sci-fi story, this cyborg is well-armed, probably has a few hidden functions that even they don't know about, and has the capability to either create something massively beautiful or blow up all of your systems in such a fantastic way that you will spend months recovering from the fallout. Managing junior engineers has always been a balance between work output and the probability of them messing up. With years of experience, that probability goes down.
This can be expressed as a pseudo-mathematical formula:

[Inverse Risk Factor (a.k.a. time in industry)] × [Productivity] = Value
The trick is that in this new AI-mad world, the Productivity value is completely out of whack, and not much is being done (yet) to build the necessary risk management into the risk factor. Many industry veterans are just beginning to come to terms with the risks and requirements of working with this brand new tool, so there will be a lag between the time they integrate it into their own workflows and the time they can begin mentoring and coaching their juniors on using AI to actually increase their productivity in a risk-mitigated fashion.
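To make that imbalance concrete, here's a deliberately toy sketch in Python. The "years in industry" risk proxy, the productivity units, and every number in it are invented purely for illustration:

    # A toy model of the formula above. Everything here is invented:
    # the risk proxy, the productivity units, and the numbers themselves.

    def engineer_value(years_in_industry: float, productivity: float) -> float:
        """Value = [inverse risk factor] x [productivity].

        Time in industry crudely stands in for the inverse risk factor:
        more years, lower probability of blowing up production.
        """
        inverse_risk_factor = years_in_industry
        return inverse_risk_factor * productivity

    # Pre-GenAI baseline: modest junior output, larger senior output.
    print(engineer_value(years_in_industry=1, productivity=2))    # junior: 2
    print(engineer_value(years_in_industry=10, productivity=5))   # senior: 50

    # AI-mad world: the junior's raw output balloons, but the risk side
    # hasn't moved, so the "value" number looks deceptively healthy.
    print(engineer_value(years_in_industry=1, productivity=20))   # junior + GenAI: 20

The point of the toy numbers is that the multiplication hides the problem: inflate Productivity without touching the risk side, and a junior suddenly looks like half a senior on paper.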
Snake Oil as Lubrication
For now, there's a lot of snake oil out there in the AI space. We've all seen the "get rich quick using GenAI" classes (the only people getting rich there are the ones peddling the classes). Most of that industry hasn't quite pivoted to targeting us tech folks yet, but I'm starting to see some of it.
We aren't going to develop industry standards around this; this is just not the kind of industry that IT engineering is. But we are going to get quite good at mitigating the potential risk of AI-augmented engineers. It's just going to be a bumpy road for a bit.