Field note

What 95% of stalled AI projects have in common

May 2026 · 5 min read

A field note from Harpeth Labs.

The thing that kills most AI projects isn’t the model. It isn’t the vendor. It isn’t the budget.

It’s that the person who has to actually use the tool every day was never part of the decision to buy it.

A widely cited MIT study last year put the failure rate of enterprise GenAI pilots at 95%, meaning ninety-five out of every hundred pilots produce no measurable return. The number is provocative. The pattern behind it is not. Walk into any business that has quietly shelved an AI initiative in the last eighteen months and you’ll find a version of the same story.

A leader gets excited, usually after a conference, a board conversation, or a competitor’s press release. A pilot gets approved. IT buys the seats. A “champion” sends out a Loom video. Six weeks later, the dashboard says adoption is four percent, the champion is quietly back on the old workflow, and the renewal conversation is awkward.

What went wrong wasn’t the technology. It was the order of operations.

The five tells of a project that’s about to stall

After enough of these conversations, the pattern is hard to miss. Stalled AI projects almost always share most of the following:

The tool was chosen before the job was mapped. Someone fell in love with a vendor demo, then went looking for a problem it could solve. The reverse — starting from a specific workflow and asking what would meaningfully change it — is rare and feels slow, which is why most teams skip it.

There was no baseline. Nobody measured how long the work used to take, how often it produced errors, or what it cost to do once. So when the tool launched, success was invisible. You can’t prove ROI against a number that was never recorded.

Training was a launch event, not a practice. A 45-minute kickoff webinar, a recording in a shared drive, and that’s it. Three weeks later, half the team has forgotten the prompts that worked, and the other half never opened the tool a second time.

The executive sponsor never used the tool themselves. This is the loudest signal. If the VP who approved the spend can’t show you the last three things they did with it, the people below them aren’t using it either. Adoption is downstream of leadership behavior. Always.

There was no owner inside the line of business. A vendor PM and an IT contact aren’t owners. An owner is a person on the team whose job gets easier or harder based on whether this thing works. If you can’t name that person, the project is already drifting.

The fix is not another tool

If your AI project has stalled, or you can feel one starting to, the answer is almost never to switch vendors. The answer is to go back to the workflow layer that got skipped on the way in. Map the job. Set the baseline. Identify the owner. Build the training as a habit, not an event. Ask your executive sponsor to use the tool, on camera, once a week, for sixty days.

That’s unglamorous work. It’s also the work that separates the five percent of AI projects that compound from the ninety-five percent that quietly disappear.

It’s the work we do.


Harpeth Labs is an AI workforce consulting firm based in Franklin, Tennessee. We help small and mid-sized businesses turn AI from hype into measurable productivity. If a project of yours is stalled, or you’d rather not have one stall in the first place, get in touch.

Want this kind of thinking applied to your business?

A 25-minute discovery call. No pitch, no slides.