Many AI discussions stop at AI + Human. That was the dominant narrative in 2025: "AI is an assistant" that complements human capabilities. In practice, this usually means companies buying Copilot seats and giving everyone a short training session delivered by a Big 4 consultant.
The real shift in 2026 starts with AI × Human. This is not addition; it is multiplication. Roles change. Decision rights move. Human judgment becomes the scarce asset. The system behaves differently, not just faster.
Most AI programs fail because leaders design for AI + Human, while the organization experiences AI × Human effects. That gap creates fear, resistance, and silent disengagement.
Adoption is not a training issue. It is a system design issue.
Yes, the empathy gap is the real AI bottleneck, not the tech itself. That MIT stat is telling: 95% of AI pilots flop because the rollout skips the very people meant to use it. Your framework nails it - especially the “human why” piece. It’s not enough to plug in tools; people need to feel why it matters.
What’s been the most surprising emotional reaction you’ve seen from teams during early-stage AI adoption?
Very well said! I would say there’s generally fear of being replaced
Love that step #2, "acknowledge," is itself acknowledged as important in the process lol. Great read. Not super applicable to me at this moment, but I'll pass it along to some of our clients in the corporate space.
Love it, Phil! Colette did a great job presenting it. Have a great new year, man!
Just Wow! What a great approach. The cyclical review of friction points feels like a great way to get teams to see how AI is not a 'set it and forget it' tool. This is not a 'tool' in the usual way we think of that word at all!
Love the EASE framework. I personally think the last step of review is the most important step!