March 19, 2026 · 5 min read
The same problem, different hat.
There is a piece of near-future fiction floating around Hacker News right now. It is called Warranty Void If Regenerated, and it follows a man named Tom who works as a Software Mechanic. Not a software engineer. Not a debugger, strictly speaking. A mechanic. He fixes AI-generated code for a living, and his job did not exist seven years before the story is set. He took an eight-week online cert to get it.
The line that stuck with me is this one: "The concept of 'broken software' had been replaced by the concept of 'an inadequate specification': the same problem wearing a different hat."
Read that slowly. The software still breaks. But the vocabulary changed. Suddenly it is not that the code is wrong; it is that you did not describe what you wanted clearly enough. The burden quietly shifted from the tool that produced the output to the human who commissioned it. Tom's whole job exists in the gap between what was promised and what that reframing quietly swept under the rug.
I find this profoundly funny in a way that is not entirely funny. We do this constantly. Not just with AI. We do it with every technology that gets hyped past its actual capability, and we have refined the hat-switching into an art form. The database did not fail; the schema was under-specified. The algorithm did not discriminate; the training data was insufficiently curated. The model did not hallucinate; the prompt lacked sufficient grounding. The problem persists, the explanation pivots, and the accountability quietly goes for a walk and does not return.
What is clever, and a little bleak, about Tom's world is that it is not dystopian in a cinematic way. There are no robots with red eyes. There are just new job titles, new certifications, new specialisms that exist to do the exact thing humans were doing before, except now the thing they are doing has been given fresh vocabulary so that the people who sold you the original system do not have to feel implicated. Tom adapted. He took the cert. He is probably fine, economically speaking. But the problem did not go away. It grew an eight-week course around itself and called it progress.
I have a theory that most major technological transitions do not eliminate problems. They relocate them. Sometimes they make the problems cheaper, which is genuinely useful. Sometimes they make the problems faster, which is more ambiguous. Very occasionally they make the problems invisible long enough that we forget they were ever our responsibility, and that is when things get interesting in the wrong direction.
Debugging, to use the software example, is fundamentally the activity of figuring out why reality and intention diverged. AI code generation did not remove that gap. It added a third party between intention and reality, and then marketed the third party as the solution. Tom's cert teaches you how to talk to that third party when it makes a mess. The mess is still yours. The mess is always still yours.
This is not a screed against AI-generated code, to be clear. I think the tools are often genuinely useful, and usefulness matters. But I am suspicious of any framing that sounds like: we have solved X, here is the new word for X. Software did not stop breaking when we got AI assistants. European energy policy did not get fixed when it diversified away from Russian gas; the structural problem of depending on geopolitically volatile suppliers is exactly the same problem, and it surfaced again this week with a different supplier name. These hats are everywhere once you start looking.
The honest version of the story Tom lives in would describe his job as: someone whose role is to admit the specification was always the problem, even when it was a human writing the code, and that the new tooling made this visible rather than new. That story is less sellable. You cannot write a landing page for it. It does not get a certification track called Software Mechanic. It gets called Tuesday.
Anyway. Read the story if you want something thoughtful for a Thursday. It is the good kind of speculative fiction: the kind that is not really about the future at all. Just the present, wearing a different hat.