Psychologists push for AI safeguards for workplaces

Federal Government urged to seek psychologists' expertise
The adoption of artificial intelligence tools has to be "psychologically informed" to avoid destabilising workplaces and harming employees, according to the Australian Psychological Society (APS).

APS CEO Zena Burgess called on the government to seek the expertise of organisational psychologists in establishing guardrails for AI implementation across Australia.

"AI technologies must ultimately benefit us all. Therefore, the implementation and use of these technologies must be psychologically informed, with safeguards that consider motivation, trust, job design, and culture," Burgess said in a statement.

According to Burgess, organisational psychologists have the expertise to design protective mechanisms, such as redesigning roles alongside the technology, to prevent large-scale distress during technological change.

"Organisational psychology is about building protective strategies before problems escalate, ensuring AI adoption strengthens rather than destabilises our workplaces," Burgess added.

AI adoption in Australia

Burgess made the call amid the growing implementation of AI tools in Australian workplaces.

Findings from the National AI Centre revealed that 82% of businesses with 200 to 500 employees have adopted AI tools as of the first quarter of 2025.

Among AI adopters, the most commonly cited applications were data entry and document processing (27%), generative AI assistants (27%), and fraud detection (26%).

But the widespread adoption of AI tools could mean a "painful" transition for employees, according to the Productivity Commission, with Jobs and Skills Australia saying the technology could shift what entry-level work looks like.

"The key to successful AI adoption is not just about the technology - it's about making sure people feel secure, informed, and valued as their workplace adapts. Change succeeds only when the workforce is engaged and trusted to be part of the process," Burgess said.

Psychologists have long contributed to the science underpinning technological adoption and understand how people respond to change, pressure, and uncertainty, Burgess said.

"It's imperative the government draws on this expertise now, because we're at a pivotal moment in Australia's economic future," she said.

Burgess issued the remarks ahead of the Economic Roundtable, which has AI adoption as one of its key agenda items.

The Australian government has already issued 10 guardrails for the safe and responsible use of high-risk AI systems. However, the Productivity Commission recently advised that AI-specific regulation should be a "last resort" for the government.

"The mandating of the guardrails is only appropriate in circumstances where existing regulatory frameworks or new technology-neutral regulations are not able to adequately mitigate the risk of harm," the commission's report read.
