You can't drive AI adoption without looking at your workflows first

Some L&D leaders have accepted that AI will change how work gets done. Others are still arguing about whether it matters. The leaders who are ready have a bigger problem: where do we start?
A very small number of forward-thinking organizations reach for workflow mapping. Before you redesign anything with AI, you need to understand how work happens today. That's sound logic. So leaders ask employees to map their workflows.
And then something unexpected happens.
People resist. They push back. They invent reasons why it isn't their job to do it. They debate which version of the process is "real." Or they hand in a tidy map that everyone knows is fiction—the kind that looks great on a slide and has nothing to do with what happens on a Tuesday afternoon.
The workflow mapping exercise stalls. Leadership loses patience. The AI initiative gets parked.
This is what happens when an organization tries to make the invisible visible for the first time—and learns it was invisible for a reason.
The problem is not the workflow. It's what the workflow reveals.
When a team tries to map how work actually gets done, three things show up at the same time.
First: nobody owns the full story. In big-company workflows, everyone knows their piece. The claims handler knows steps three to seven. The team lead knows the edge cases. The admin knows what the tool is supposed to do. Nobody can say what happens end to end. Put the pieces together and you see the gaps, the conflicts, and the handoffs everyone has learned to ignore.
Second: the “official” process rarely matches the real one. Every team has workarounds. Someone exports data to a spreadsheet because the system is too slow. Someone skips an approval step because the approver never replies. Someone has built a personal shortcut that nobody else knows about. The workaround is often the only reason the workflow functions at all. Writing them down puts them on record—and now someone has to answer for them.
Third: the map turns political the moment it goes on the wall. Workflow mapping, done honestly, is an act of organizational transparency. And not everyone in the room is equally comfortable with that.
The resistance falls into four familiar patterns: two about what's coming, two about what's already here.
Fear and self-protection.
“This will make me look incompetent.” If someone has been doing a step wrong, slow, or off-book, the map exposes it — in front of peers, including their manager. And underneath all of this is the question nobody says out loud: if AI can do this, will I still have a job? Mapping a workflow as part of an AI initiative is not a neutral act. People know what it’s for.
Uncertainty and overwhelm.
"I don't even know where this process starts." In most enterprise environments, workflows are not neat sequences owned by one team. The version the London team runs is not the one in Singapore. The version the senior handler uses is not the one the junior analyst uses. The most common defense here is "it depends" — which is true, and which also functions as a way to avoid the discomfort of making the mess concrete.
Power and politics.
"My expertise is my job security. Mapping it gives it away." In many organizations, knowing how things really work is power. Being the only person who understands step seven is not just useful — it is a form of protection. There is also a political dimension that runs at team level. Workflow mapping almost always surfaces where handoffs fail. Which means it almost always points at someone. And not everyone wants that conversation.
Organizational cynicism.
"We've done this before. Nothing happened." In organizations where previous mapping exercises led to a deck that sat on a shelf, people have learned not to invest. The cynicism is not irrational - it is accumulated evidence, it is earned. The result is that teams often map what the process they wish they had, not the one they live with. Everyone in the room knows the difference. Nobody says it.
Why this lands on HR and L&D
If you’re trying to build AI capability in a large organization, this hits you directly.
You’ve probably been asked to do some version of this: design an AI learning program, build an upskilling roadmap, prove L&D is driving adoption. The pressure is real and the timeline is tight. So you start designing.
But there’s a question to ask before you write a single module: do you know which steps in the workflow will change? Which ones will be automated? Which ones will still need human judgment—just in a different form?
In most organizations, the honest answer is no. Not because someone hasn't done their job — but because the workflows themselves have never been examined closely enough to answer those questions. The operational teams haven’t mapped the real process. The AI strategy hasn’t spelled out the future one. And the people whose jobs will shift haven’t been in the room when any of it gets decided.
This is the issue. Training gets built on guesses. You teach skills that sound right, not skills tied to how the redesigned work will run. Then the program falls flat—low adoption, no behavior change, the CHRO asking about ROI—and the blame lands on change management, messaging, or culture.
It's rarely any of those things. It's that the learning was designed before the work was understood.
One of the strongest moves an L&D leader can make right now is to hold the line: no training until the workflow has been examined. That isn’t passive. It’s leadership. It says: we’ll build what works, once the ground is solid.
To do that, someone has to pull the right people together, ask the hard questions, and create a structured way to look at how work actually happens before deciding how it should change. L&D shouldn’t do that alone. But L&D can insist it happens—and then turn what comes out of it into training that fits reality.
So where does this leave us?
Workflows are broken and people know it. They don't map them because transparency feels dangerous. L&D is being asked to build training for a future state nobody has agreed on. And AI is being layered on top of all of it.
So what?
When you introduce AI into these workflows, you’re not just adding software. You’re putting a spotlight on what’s already shaky. Teams can’t agree on the current process, so they can’t agree on where AI should fit. That’s why so many pilots stall before the tech even matters: they stall in the room, at the human level, before anyone writes a line of code.
Now what?
The organizations that move don’t win because they picked the best AI tool. They win because they make time to face the work as it is, then decide where AI helps. That takes a designed space: the right people, the right questions, and an AI facilitator who can surface the real process without turning it into a blame session. For us, at the Design Sprint Academy, that space is the AI Workflow Sprint.
What this looks like in practice
Think about how people behave when they suspect something is wrong with their health. They know. They feel it. But they don't go to the doctor, or they delay the appointment. Not because they are in denial—but because getting the diagnosis makes it real. And what if the news is bad? What if it's something they can't come back from? So they wait. They manage around it. They hope it resolves on its own.
Teams do exactly the same thing with their workflows.
They know something is broken. They feel it every day — in the workarounds, the delays, the handoffs that never quite land. But nobody calls it. Nobody draws it on a wall. Because making it visible means having to deal with it. And dealing with it means confronting questions nobody is ready to answer yet: what will change, who will be affected, and what happens to the people who built their expertise around the way things work today.
The AI Workflow Sprint creates a different kind of space.
Not an audit. Not a restructuring exercise. A structured, facilitated session where the team maps the real workflow together — and then gets to be part of deciding what comes next. That difference matters. People are not being told what will change. They are part of figuring it out. They are helping shape it.
In that room, something shifts. People who walked in guarding their part of the process start seeing the whole picture — sometimes for the first time. And they realize the messy bits aren’t proof of incompetence. They’re proof that a complicated system has been held together by people patching gaps with judgment, favors, and memory.
And then the conversation changes.
Instead of anxiety about what AI might take away, the room starts asking different questions. What will I need to learn? What should I get ready for? What skills will actually matter in the new version of this workflow? What do I want my role to look like on the other side of this?
That shift is not a small thing. It is people moving from passive fear to active preparation. From feeling like something is being done to them, to feeling like they have a say in how it unfolds.
This is where L&D can have real weight—but only after the workflow is on the table. You can’t build an upskilling program for a future nobody has agreed on. And you can’t prepare people for change they haven’t been allowed to see.
The sprint gives them the diagnosis. But it also gives them the agency to respond to it.
The strategic takeaway
If your AI learning efforts aren’t landing, the problem probably isn’t the content, the platform, or the comms plan.
More often, it’s this: the workflows people are being trained to work with aren’t clearly understood—by them, by their managers, or by the L&D team building the program. Until someone creates space to look at the work honestly, that won’t change.
The teams that move fastest on AI aren’t the ones with the biggest budgets or the fanciest learning setup. They’re the ones willing to look straight at how work actually happens before deciding how it should work next.
And that clarity doesn’t come from a training needs analysis. It comes from getting the right people in a room, mapping the real workflow, and letting the conversation go where it has to go.
L&D doesn’t need to run that room. But L&D should be the function that insists it happens—because without it, everything downstream is built on sand.
If you want to see how this works in practice, join our next live session.
AI Workflow Redesign: How to move from "What to do" to "How to do it" is a free webinar where we walk through the AI Workflow Sprint methodology — what happens in the room, how teams map their real workflows, and how organizations move from scattered AI initiatives to a process that actually produces results.