The last generation of programmers
At the dawn of the industrial loom, few weavers realized they were witnessing the end of their world. The whirring of mechanized looms in distant mills must have sounded like progress, not obsolescence. When electricity began to light city blocks, it arrived first as a novelty—a flickering bulb in a wealthy home—before spreading like wildfire through factories, streets, and lives. History has taught us that technological revolutions do not announce themselves with clarity. They arrive in whispers, scattered breakthroughs, subtle shifts in how we work and what we value. Then, all at once, the ground gives way. In the moment, it never feels like a transformation. It feels like turbulence. Confusion. Even boredom. Only later, in hindsight, do we see the rupture for what it was.
Today, I can’t help but wonder: are we in software development the frogs in the pot, warming to a boil we refuse to notice? Is AI merely making us faster—or quietly making us irrelevant?
This post explores one extreme of the AI endgame: not collaboration, but complete human displacement from the act of writing software.
In 1900, 41% of Americans worked in agriculture. By 1950, that number had dropped to 12%. Today, it's less than 2%. The transformation wasn’t gradual—it was a cascade that began slowly, then accelerated with breathtaking speed once mechanization crossed a critical threshold.
We are now witnessing the opening act of an identical drama in software development. And the final curtain will fall within five years.
The analogy goes beyond job displacement. Just as the combine harvester didn’t merely help farmers work faster—it rendered most of them obsolete—AI isn’t simply making programmers more productive. It is redefining what programming is, who can do it, and whether humans should be involved at all.
Consider the printing press. Scribes didn’t evolve into “printing assistants.” Their profession vanished. Those who adapted became editors, publishers, or moved into entirely new domains. A handful remained as luxury artisans, hand-copying wedding invitations and illuminated manuscripts for the wealthy. This is the future awaiting software engineers: a tiny class serving niche markets, while machines handle the industrial production of code.
The signs are already here, and traditional software companies should be alarmed. GitHub Copilot now generates nearly half of the code in projects where it’s enabled. GPT-4 builds entire applications from natural language descriptions. Claude debugs complex systems and refactors legacy codebases. But these are just the opening moves. The real disruption comes as these capabilities compound.
By 2027, we will see the rise of what I call software synthesis systems—AI that doesn’t just autocomplete snippets, but architects entire applications from high-level business requirements. Imagine describing a banking platform to an AI that understands regulatory compliance, scalability, security frameworks, and UX best practices. Within hours, not months, it delivers production-ready code, complete with tests, documentation, and deployment pipelines. This isn’t science fiction. It’s the logical evolution of tools we already have.
The pattern of disruption will follow history: it begins at the bottom and moves up fast. Junior developers will disappear first. Their work—applying familiar patterns to well-scoped problems—is precisely what AI does best. We are already starting to see stress signals from the job market, especially for college graduates impacted by this “hollowing-out” effect:
At Chicago recruiting firm Hirewell, marketing agency clients have all but stopped requesting entry-level staff—young grads once in high demand but whose work is now a “home run” for AI, the firm’s chief growth officer said. Dating app Grindr is hiring more seasoned engineers, forgoing some junior coders straight out of school, and CEO George Arison said companies are “going to need less and less people at the bottom.” (source: WSJ)
But the cascade won’t stop there. AI systems are already outperforming senior engineers in narrow domains: optimizing queries, designing APIs, even making complex architectural trade-offs. The hollowing-out will move up the stack.
The most profound shift will be in who creates software. Just as desktop publishing democratized design and eliminated the typography profession, AI will make software creation accessible to anyone. Product managers will describe features and see them implemented “instantly.” Business analysts will prototype workflows without technical intermediaries. The barrier between idea and implementation will evaporate.
This will transform the very structure of companies. Today’s software organizations employ armies of engineers, architects, DevOps specialists, QA testers, and project managers. By 2030, a typical software company might consist of a handful of AI orchestrators, a few domain experts who deeply understand the business, and one or two humans to handle edge cases. The familiar engineering hierarchy—junior, senior, principal—will feel as anachronistic as the typing pools of the mid-20th century.
The economic implications are staggering. Software development today employs millions of highly paid professionals. These are not roles that can be easily absorbed elsewhere. They sit at the apex of the knowledge economy. When agriculture was automated, farmers became factory workers. When factories were automated, many moved into services. But where do programmers go when thinking itself is outsourced?
The answer lies in identifying what humans alone can contribute. Just as the rise of automobiles created new roles—traffic engineers, car designers, logistics planners—AI will open new frontiers. But those frontiers won’t be about typing code. They’ll be about directing AI systems toward valuable outcomes, integrating business and ethical context, and shaping the goals of machines that can build anything but understand nothing.
The most successful companies of 2030 will be those that adapt early. They’ll hire AI orchestrators instead of programmers, domain experts instead of technical specialists, and creative directors instead of engineering managers. Their edge won’t come from implementation talent, but from their ability to envision, constrain, and deploy machine intelligence with precision and imagination.
This will not be a slow, decades-long handoff between humans and machines. Technological disruption doesn’t wait politely. It gathers in the background, invisible at first, then arrives all at once. The companies still hiring traditional software engineers in 2029 will resemble newspapers employing typesetters in 1990—technically operational, but drifting toward irrelevance.
The last generation of programmers is already among us. Many don’t realize it yet. They write code, review pull requests, and plan their careers as if the ground beneath them isn’t shifting. But a few—the perceptive, the pragmatic, the quietly anxious—have begun preparing. They are learning how to direct AI rather than outpace it. They are developing deeper domain fluency, ethical intuition, and creative judgment. They are asking new questions.
Because something else is being lost. And the loss is not just economic.
It’s the late-night debugging session that ends in triumph and relief. The click of minds aligning in front of a whiteboard. The slow, quiet transformation from apprentice to master, measured not in promotions but in clarity of thought. That rhythm is fading. In its place: infinite ambiguity, frictionless generation, the disorienting pace of tools that invent faster than we can absorb. Everything is faster, but nothing settles. The future arrives before the present is done loading.
Still, history reminds us that the end of one practice is often the beginning of another.
When photography arrived, painting didn’t die—it turned inward and reimagined itself. When the synthesizer appeared, music fractured, recombined, and gave birth to entirely new genres. Software may do the same. Stripped of its syntax and tedium, it may evolve into something more expressive, more philosophical, more strange. We may stop writing code not because we’ve lost the will, but because we’ve found a better canvas.
And so the next revolution in software may not be in the writing at all, but in the choosing. In the asking. In the quiet, human work of deciding what should be built when anything can be.
The age of human code is ending.
What comes next is not ours to write—but to imagine.