As AI continues to be embedded into workflows, HR leaders need to focus more on how teams collaborate, build trust and keep judgement sharp, writes Carmen von Rohr
For many employees, AI is no longer hypothetical – it’s now impacting critical functions and daily workflows. Gartner research found that 56% of employees are already using AI for a core responsibility at work.
This rapid adoption is often framed as a pure performance opportunity. The technology promises faster outputs, broader analysis and scalable efficiency, but this view overlooks a critical reality. Human interactions with AI may change behaviour in ways that can quietly undermine performance if left unmanaged.
As employees come to depend more on AI, three behavioural risks arise: the erosion of critical skills such as judgement through overreliance on the technology; default trust in AI outputs without adequate review; and diminished connection to team, colleagues, culture and strategy as employee isolation increases.
Individually, these risks may appear subtle. However, they can create ripple effects across culture, capability and performance. Over time, they can threaten the very outcomes AI is meant to improve, including employee contribution and the ability to deliver against organisational KPIs.
For HR leaders, this creates a new mandate. The challenge is no longer whether to adopt the technology, but how to protect and strengthen human connection alongside it. In an AI-enabled workplace, culture becomes a performance lever rather than a soft concern.
Identifying and addressing skills erosion
One of the least visible impacts of AI adoption is what happens to human capability when judgement is used less often.
Over time, an overreliance on AI can lead to skill erosion across the business. This may include the atrophy of abilities held by more experienced employees who are no longer using specific skills, or deficits among more junior colleagues who lose the opportunity to build competencies in the first place.
To safeguard critical skills in an AI-enabled environment, organisations should embed learning opportunities directly into the workflow, such as through peer learning channels and structured knowledge transfer between experienced and early career employees.
Practices like periodic manual task rotations, where employees perform core tasks without AI assistance, can help to maintain and refresh essential capabilities that could otherwise stagnate. Generative AI simulators can also be used to close skill deficits for more junior employees and to test and refresh skills already held by more experienced colleagues.
Building a culture of contingent trust
One of the most common risks of employees using AI is default trust: accepting its outputs without scrutiny, even when they are inaccurate. AI can appear confident and authoritative, so employees may take what it produces at face value regardless of how correct it is. Time pressure further increases this risk.
When employees stop questioning outputs, critical thinking declines, errors can propagate and decision quality deteriorates. Over time, this can weaken organisational capability.
HR leaders can counter this by developing and normalising contingent trust. This means including human review, quality checks and clear team norms around verification.
Managers play a crucial role here. By setting clear expectations that AI-assisted work will be questioned and discussed, teams build shared responsibility for accuracy. This protects performance while reinforcing human judgement as essential, not optional.
Organisations should also track indicators of contingent trust, such as the presence of agreements for AI review and whether teams and individuals are consistently using them.
Strengthening capability and connection
As human interactions with AI evolve, technology can shift from being a simple tool or assistant to something more like a collaborator, or even a companion.
As exchanges become more complex, employees may spend more time engaging with AI, and less time building relationships with colleagues. This risk is especially critical in remote-first environments or roles with limited day-to-day team contact.
If human connections deteriorate, culture becomes harder to sustain, making it difficult to maintain a collaborative and engaged workplace. HR leaders and their teams should develop strategies to revitalise and protect organisational culture as AI disrupts traditional ways of working.
One practical approach is to actively ask employees what is working, what isn’t and what support they need. Managers can also role-model new habits and set team norms for AI-assisted work, signalling which behaviours are valued and making adoption more consistent across teams.
Another strategy is to identify and reinforce key cultural moments by building a framework that clarifies which events and tasks are best done with other people, so employees can live the culture and work effectively together.
A human advantage in an AI world
AI will continue to reshape the way we work, but performance isn’t delivered by technology alone. Human connection remains a decisive factor in how effectively organisations learn, adapt and execute. By intentionally strengthening connection, safeguarding skills and normalising thoughtful trust in AI, HR leaders can protect culture and performance together.
In an AI-enabled future, the organisations that win will not be those that automate the most, but those that remain deeply human, by design.
Carmen von Rohr is a senior principal analyst at Gartner HR practice