I recently wrote about how AI doesn't create clarity; it exposes the lack of it. Weak decision-making, brittle processes, unclear ownership, and leadership gaps don't disappear when AI enters an organization. They become visible. That visibility is now being misread as damage.
In tech, AI is being blamed for the disappearance of junior developer roles. But what AI actually surfaced is something far more uncomfortable: a system optimized for short-term revenue while quietly deferring its human costs. The path into engineering didn't collapse overnight. It was dismantled gradually, one "efficient" decision at a time.
AI didn't create the problem. It delivered the invoice.
Revenue Pressure Came First
I've spent most of my career inside engineering teams that said the same thing in different ways: "We don't have time to train juniors." What that usually meant was that delivery timelines were tightly coupled to revenue expectations.
This pressure didn't come from bad intent. It came from competition, compressed go-to-market windows, and the belief that speed itself was a defensible advantage. AI arrived in organizations already optimized for throughput, not resilience.
It's also incomplete to put all of this on individual leaders. Boards, investors, and market expectations shape the incentive field leaders operate in. When success is measured primarily through short-term growth, efficiency ratios, and quarterly performance, human capacity becomes invisible by design. Leaders may execute those incentives, but they rarely create them alone.
Under that pressure, management ideas meant to protect learning were reinterpreted. "Fail fast" became permission to absorb risk without protection. "Fast iteration" turned into speed without reflection. "Agile" became constant urgency with no slack for teaching. "Lean" thinking removed redundancy that once made teams resilient. When speed becomes the goal instead of a constraint, learning is the first thing sacrificed.
Junior Roles Didn't Disappear. They Became Unviable.
Long before large language models entered the conversation, junior developer roles were already shrinking. Senior engineers stayed longer due to market instability, burnout avoidance, or the lack of safer transitions. Fewer openings appeared at the bottom while expectations for juniors kept rising.
At the same time, compensation stagnated. Junior roles demanded broader skill sets, faster ramp-up, and immediate impact without the structural support that once made learning possible. Entry-level work didn't vanish. It became economically and cognitively unsustainable.
This wasn't accidental. It was the downstream effect of optimizing revenue while externalizing growth.
Training Didn't Disappear. It Was Externalized.
As systems grew more complex, training didn't scale with them. Onboarding compressed. Documentation lagged behind reality. Knowledge concentrated in a shrinking group of people who became both indispensable and exhausted.
Over time, on-the-job training was quietly replaced with an expectation of readiness. Juniors weren't trained once they arrived. They were expected to arrive trained. And once inside, they were expected to move quickly, often without context, because teams no longer had slack to explain how or why things worked. Training didn't disappear. It moved outside the organization.
Junior developers were expected to learn quickly and quietly. Mistakes felt expensive because there was no buffer to absorb them. Learning shifted from an organizational responsibility to an individual liability. This wasn't just a technical failure. It was a financial one.
The Costs Are Already on Your P&L. They're Just Unlabeled.
If organizations were honest, their P&L would already include burnout-driven attrition, knowledge loss and relearning, and execution bottlenecks caused by overloading a few senior people. Burnout shows up as recruiting costs and lost productivity long before someone formally leaves. Knowledge loss appears as repeated mistakes and slower decisions. Bottlenecks surface when work stacks up behind the same few people and velocity quietly degrades.
The claim that AI made junior productivity too low relative to cost ignores where the cost actually moved. Junior output didn't become less valuable. It became less visible. The return was never just shipped work. It was learning, judgment, and future capacity. When that investment is removed, the cost doesn't disappear. It shows up later as senior overload and increasingly expensive hiring cycles. Eliminating juniors doesn't improve unit economics. It defers the bill.
Why AI Became the Scapegoat
AI entered this environment and became an easy explanation. It's true that AI can handle many of the tasks junior developers once relied on to learn. Boilerplate, scaffolding, and surface-level problem solving are faster now. But those tasks were never a growth strategy. They were placeholders for one.
AI reduced the need for busywork, not the need for human growth.
A common counterargument is that AI simply made junior developers economically obsolete. That their productivity no longer justifies their cost. But this assumes juniors were primarily valued for output rather than growth. Historically, they weren't hired because they shipped faster than seniors. They were hired because they became seniors. If a role collapses the moment busywork disappears, the problem isn't automation. It's that growth was never being intentionally supported.
If AI were truly the root cause, the solution would be technical. Better tooling, different models, new workflows. Instead, the failures showing up are human ones: fragile teams, knowledge bottlenecks, rising attrition, and stalled execution. The fact that the pain isn't technical is the clearest evidence that AI isn't the cause.
The Question Leaders Still Have to Answer
What's changed is not the need for junior developers, but the cost of ignoring how they grow. Juniors now need context, judgment, psychological safety, and feedback that explains reasoning, not just outcomes. Senior engineers are being asked to slow down enough to explain their thinking and transfer judgment.
For leaders who care primarily about revenue, the risk isn't prioritizing output. It's confusing short-term throughput with long-term capacity. Systems that extract performance without replenishing people eventually slow down, no matter how advanced the tools become. AI can surface these failures faster. It cannot fix them.
Revenue scales on resilience, not exhaustion.