Your Employees Are More Afraid of AI Than You Think—And That’s a Leadership Problem

I’m not an AI expert. But I am an expert in people — and that’s the angle I want to bring to a conversation that’s getting louder every week.

Somewhere between the breathless headlines and the boardroom optimism, there’s a group of people who aren’t being heard: your employees. And what I’m seeing — in the data, in conversations with clients, and in my own company — is that what leaders assume their teams feel about AI and what those employees actually feel are two very different things.

That gap is a leadership problem. And it’s one we can fix.

The Stat That Should Stop You in Your Tracks

A recent study published in Harvard Business Review found that 76% of executives believe their employees are excited about AI. When those same employees were surveyed independently, the number of people who said they were actually excited? Closer to 30%.

That’s not a rounding error. That’s a fundamental disconnect in how leaders and employees are experiencing the same moment.

"Leaders are paying attention to the opportunity AI presents. They're paying less attention to the real fear and anxiety it may be creating in their workforce."

Why the gap? I think it comes down to what people are incentivized to pay attention to. Leaders are tasked with finding opportunities, building strategy, and moving the business forward. Employees are tasked with taking care of their families and making a living. When a headline says ‘AI will eliminate 40% of jobs,’ a leader reads that and thinks about market dynamics. An employee reads it and thinks about their mortgage.

That’s not irrational. That’s human.


The Real Story on Job Replacement

Here’s what the research actually shows — not the clickbait headlines, but the peer-reviewed studies from universities and serious journalists doing the work:

  • Job replacement rates from AI are currently estimated between 2.5% and 7% of the U.S. workforce, according to Goldman Sachs Research—and that higher end only applies if AI adoption becomes wide and deep across the entire economy.
  • In the fields most likely to be affected (legal, accounting, and some other white-collar roles), displacement risk does climb toward that upper range. But even there, we’re not seeing mass permanent displacement.
  • Early data suggests displaced workers are finding roles in adjacent fields, though outcomes vary significantly by age, income level, and skill transferability. Workers with stronger adaptive capacity and transferable skills tend to navigate transitions more successfully, while those in clerical and administrative roles face a steeper road.

The pattern isn’t new. We’ve seen it before. Think about what happened with navigation apps and taxi drivers. Before Waze and Google Maps, a taxi driver’s primary value was the map in their head—knowing the fastest routes through a city. When that knowledge became universally available on every smartphone, that specific value disappeared.

But here’s what actually happened: the market expanded. Suddenly, anyone could drive someone somewhere without needing years of local knowledge. Uber and Lyft were born. People who would never have called a taxi now book rides constantly. There are more drivers now than ever; they just get paid differently, and the competitive landscape has changed.

AI will do something similar across many fields. Some tasks will be automated. The humans who did those tasks will shift, sometimes laterally, sometimes upward. The mistake is conflating ‘my tasks are changing’ with ‘I’m being replaced.’

"We're not talking about people being replaced. We're talking about tasks that people do being replaced."

Why Employees and Leaders Can Hold Contradictory Views Simultaneously

Here’s something that surprised me when I started looking at this more closely: employees can be genuinely enthusiastic about a specific AI tool that helps them do their job better while simultaneously being afraid that AI will eventually cost them their job.

These aren’t contradictory positions. They make complete sense when you understand the psychology underneath.

When an employee chooses to use an AI tool—say, a writing assistant or a data summarizer—they’re in control. They’re using it because it makes them look better, work faster, or produce something they’re proud of. They own that decision.

When they imagine leadership using AI to restructure the workforce? They’re not in control of that. They’re the subject of a decision being made above them, possibly without their input, probably without full transparency. That’s a completely different emotional experience.

The control variable is everything. Leaders who understand this can design communication strategies that give employees more agency—and in doing so, dramatically reduce the ambient anxiety around AI adoption.

What Separates Curious Organizations from Resistant Ones

I’ll be direct: the organizations avoiding AI right now are accumulating a debt that can’t be easily paid later. Not a technology debt. A wisdom debt.

Wisdom is experience. It’s what happens when your teams try something, fail at it, learn from it, adjust, and try again. You cannot shortcut that cycle. You cannot buy it. And 10 years from now, an organization that spent those years on the sidelines won’t be able to catch up just by purchasing the newest tools. The institutions that started learning in 2024 and 2025 will have a compounding advantage that’s almost impossible to replicate.

The Opportunity Cost of AI Avoidance

It’s not productivity you’re losing. It’s wisdom—the kind that only comes from doing the work, making mistakes, and iterating. That’s not something you can buy or fast-forward later.

That said, I want to be clear: experimenting doesn’t mean flailing. In my own company right now, we’re in what I’d honestly describe as the wild west. Marketing is using AI one way, sales another, our client team another. Nobody has a full picture. There’s no unified strategy.

That’s fine as a starting point — but it’s not a destination. The organizations that will pull ahead are the ones that move from ad hoc experimentation to intentional, communicated plans.

What I’m Actually Doing About It

I’ll share something that isn’t abstract strategy; it’s what I’m doing right now in my own company.

I’m sitting down to map our current AI usage across every team. Where are we? What’s working? What isn’t? What are our people actually feeling about it? Once I have a clear picture, I’m going to build a plan. Not just an AI strategy, but a communication plan so that every person in the organization knows what we’re doing, why we’re doing it, and what it means for their role.

Because here’s the thing: when employees don’t have information, they fill that vacuum with their worst fears. Every silence from leadership becomes confirmation that the robots are coming for their jobs.

You don’t eliminate fear by avoiding the conversation. You eliminate it by having it. With data, with transparency, and with the acknowledgment that yes, things will change, and here is what we know and here is what we’re figuring out together.

The Bottom Line

If you’re a leader trying to figure out your AI strategy, here’s where I’d start: before you think about tools, before you think about productivity gains, before you think about cost savings, talk to your people.

Find out what they actually think. Not what you assume they think. Not what you hope they think. What they actually think.

You might be surprised by the gap. And closing that gap might be the most important AI work you do this year.
