There's been a lot of news recently about job layoffs, labor force reductions, pink slips for the sake of "efficiency." But a stat in the other direction recently came out that piqued my interest: job listings for forward-deployed engineers went up 800% between January and September this year. Not 8%. Not 80%. 800%. I'd call that a little bit more than a steady climb. More like a boom.

Before you leave this post to look up what the heck a forward-deployed engineer even does, I'll tell you. Think of them as the middleman between the AI company and the company that wants to use those services but doesn't know what the heck it needs to do to implement them. They're the ones who talk to the teams, understand what's actually broken or needs to be improved, translate that into something the AI can handle, and then somehow get everyone to trust the results enough to use them.

Call it half coding, half deployment training (with maybe a little bit of armchair therapy mixed in).

But what's even more intriguing is that design skills have overtaken technical skills in AI job postings, per a review of 3 million job postings by Autodesk. That's right. Design. Not machine learning. Not data science. Not whatever jobs those people have who try to explain in intricate detail what a neural network does during a cocktail party while you stare blankly into their eyes.

Design! You know, the people in charge of making AI feel like something humans might actually want to use. Those people!

On the surface, it feels a little backwards, right? We're in the middle of this massive tech revolution, so we should need more technical people. But the more I sit with it, the more I think it makes perfect sense.

We built the thing before we figured out how to use it

I think what happened is pretty straightforward. Every company spent the last two years buying all things AI. Some because they had real problems to solve. Some because their competitors were doing it and they experienced a bit of FOMO. And some because they got a really amazing sales pitch, were impressed by the demos, and bought the thing without knowing what they were going to do with it.

So now we've got a bunch of companies sitting on these incredibly powerful tools, with enough intuition to know they're probably only using them somewhere in the 10–20% range of their capabilities. Turns out owning the tool and knowing what to do with it are completely different problems.

It's like buying a professional camera because you want better photos. You unbox this beautiful piece of expensive equipment and then realize you don't know what bokeh, or aperture, or depth of field really mean. You're basically pointing an expensive tool at things and then wondering why people are telling you that your iPhone photos look about the same. Eventually, you realize you're never going to take the time to learn, and you shove the camera way back in the closet. Except with AI, your competitors aren't putting that fancy new tool in the closet. They're racing past you in market share while you get left behind. Suddenly, your entire business strategy is riding on getting the best photos.

So, predictably, companies are hiring people who can bridge that gap. Forward-deployed engineers who can sit in the middle of "this AI can theoretically do anything" and "this specific team needs it to do this specific thing based on their specific brand of chaos." Product managers who can figure out which problems are actually worth solving with AI and which ones are the digital equivalent of a shiny new bell. Strategists who understand which workflows should be automated and which absolutely should still have a human touch. Ethics and compliance officers who can see the disaster coming six months before it hits the news.

And designers. A lot of designers. Because it turns out making AI powerful is a technical problem, but making AI usable is a human problem. You need a human to figure out what confuses people. What they'll trust. What will make them feel in control instead of stuck in endless loops of fighting with a tool that likes to say "you're absolutely right!" after even the slightest pushback.

What these numbers actually reveal

LinkedIn's Chief Economist, Karin Kimbrough, said something very measured about "proactive and forward-thinking" people being better positioned in this economy. Which I think translates roughly to: if you can make AI work for people who will never understand how it works, you're suddenly very valuable.

The skill that matters isn't knowing how the model was trained or what algorithms it's running. Only a handful of people need to work on those. For the broader economy? It's understanding humans well enough to know what they actually need, why they need it, and how to give it to them in a way that doesn't make them give up on the whole roadmap.

Empathy. Judgment. Taste. The ability to read a room. The ability to push back and ask "should we?" before asking "can we?" The skills we basically gave up teaching while we went all in on STEM at every level of education and dropped the humanities deep into the ocean, tied tight to the heaviest of anchors.

Guess what? Turns out those skills may end up being the only ones that matter when everything else can be automated.

This moment is revealing a truth that I think deep down we have all known for a while as the AI whirlwind circles around us. The more powerful the technology gets, the more we need people who can translate it into something the rest of us can actually use.

And I don't think businesses will be able to automate that away anytime soon.

Greg Dessau writes about AI, work, and attention at Human Offset. If you're wrestling with how to actually implement AI in your organization, feel free to reach out.