Discussion about this post

Adam #EmployerBranding

Many AI discussions stop at AI + Human. That was the dominant narrative in 2025: "AI is an assistant" that complements human capabilities. In practice, this usually means companies buying Copilot seats and giving everyone a short training session delivered by a Big Four consultant.

The real shift in 2026 starts with AI × Human. This is not addition, it is multiplication. Roles change. Decision rights move. Human judgment becomes the scarce asset. The system behaves differently, not just faster.

Most AI programs fail because leaders design for AI + Human, while the organization experiences AI × Human effects. That gap creates fear, resistance, and silent disengagement.

Adoption is not a training issue. It is a system design issue.

Peter Jansen

Most leaders confuse 'Empathy' with 'Sentimentality.' They are wrong.

In the context of AI implementation, Empathy is an engineering constraint. It is the understanding of your layer-1 infrastructure: The Humans (Wetware).

If you deploy high-velocity silicon (AI) onto low-trust wetware without an interface strategy, the system rejects the transplant. The 'Gap' you describe isn't an emotional failure; it is an architectural failure.

You cannot upgrade the code if you despise the compiler.

What we need to do is fix the interface first, and then upgrade the code.

38 more comments...
