AI can’t feel love, joy or disappointment – and that’s exactly why HR must tread carefully, says CHRO
Technology should make us more human, not less. That's the realization Christine Vigna came to—not in a boardroom, but during an unexpected moment of reflection at an AI masterclass for HR leaders.
Across industries, companies are rapidly adopting artificial intelligence to streamline operations, boost efficiency, and automate decision-making. A 2024 IBM report found that 42 per cent of enterprise-scale businesses have already deployed AI in some form.
However, as automation becomes the norm, concerns about its cost to human connection, especially in people-centered functions, are mounting.
Vigna, chief people officer at Dejero, arrived at a masterclass focused on implementation. She left rethinking her intent. A comment from Joseph Quan, CEO of Knoetic, sparked a shift in perspective: "AI can never experience what it's like to be human. It can never love, it can never feel joy, it can never feel disappointment."
Now, Vigna is working to ensure that every tech decision inside Dejero strengthens—not strips away—the moments where human presence matters most. From vetting vendors to shaping policy, her north star is clear: culture first, efficiency second, humanity always.
“I didn’t go in expecting a philosophical awakening,” Vigna says. “But during the final half hour, we had space to reflect on what AI really means for humanity.”
That moment sparked a deeper question: “How do we hyper-personalize the moments that matter—so when someone truly needs a human to show up, they do?”
On a practical level, Vigna sees real potential for AI to strengthen HR's role inside organizations.
"If we know that AI tools can come in and support driving efficiencies with a number [like] 10 per cent, 15 per cent, 25 per cent, we're effectively saying that our workforce within our organizations now has that much more capacity," she says.
But Vigna is not starry-eyed about automation for its own sake. It's about cutting the tedious, repetitive work to free people up for stretch projects, strategic thinking, and real skill development.
"We've now got an opportunity for folks to stretch themselves, to get involved in bigger strategic opportunities, to grow their skill sets a little bit more," she says.
Choosing the right AI tools, though, demands more than excitement. It demands discipline. Vigna cautions against jumping on shiny trends without a sober look at fit and purpose.
"Do we have an evident vision and understanding of what we're trying to accomplish?" she asks her team, insisting that every new piece of technology must solve a well-defined problem, not just check a box for the sake of being able to say you're 'using AI.'
There's another layer too: culture. "Does this tool allow us to work in a way that fits our culture and values?" she says.
A coldly efficient hiring platform might seem powerful on paper but could clash badly if it strips away moments that matter, like the personal touch during offer presentations. And Vigna doesn't just scrutinize the tool. She scrutinizes its creators.
"I'm always a little bit hesitant about tools being built by folks who haven't necessarily walked a mile in your shoes," she says. When developers have lived the problems they're trying to solve, she adds, the conversations, and the solutions, are much more real.
Compliance remains a thorny area, especially as AI creeps into the sensitive territory of performance management. Right now, there's a regulatory gray zone. Vigna is clear that transparency is key.
"Is that something you transparently share with your employees? As of right now, there aren't a lot of regulations around, so [it’s about asking] ‘Can we share this? Can we not?’" she says.
Companies need to sort out data security, ethics, and organizational comfort levels internally because external rules haven't caught up. And it's not just about data storage. It's about real human consequences.
"Are we comfortable with a tool making big decisions for us as it relates to performance management, which we know more than likely feeds into discussions about salary and merit increases, et cetera?" Vigna adds.
The fear that AI could depersonalize the employee experience looms large, but Vigna isn't buying into the doomsday narrative. The antidote, she says, is vigilance about those critical human touchpoints.
"If you're having an employee relations issue, if someone needs some coaching, they're probably not going to want to do that with an AI tool," she says.
Similarly, customers facing significant challenges with a product aren't going to appreciate being shuffled off to a chatbot. Organizations, Vigna emphasizes, need to rigorously map their journeys—employee and customer alike—and identify where human empathy must show up.
"Where would I want to talk to someone, versus where would I want to interact with a machine or tool if there are deep challenges?" she asks.
For Vigna, this isn't about resisting technology. It's about remembering the heartbeat of any organization—its people—and ensuring that when it counts, the response doesn't come from an algorithm but a human being who cares.