Are robots better than HR at 'reading' emotions?

Leaders may well see a new wave of emotionally intelligent technology

With its deep grounding in psychology and sociology, the HR profession has long depended on the social sciences to analyse workplace behaviour and sentiment.

Compared with finance or IT – which see hard data as their lifeblood – HR teams have traditionally had fewer opportunities to run the numbers through business intelligence software, let alone software that decodes workplace psychology.

In today’s data-driven environment, however, HR analytics is fast becoming more sophisticated thanks to the rise of artificial intelligence. To be exact, AI programmed with ‘emotional’ intelligence.

Reading between the lines
AI, particularly machine learning, is simultaneously giving leaders better data analytics tools and uncovering the intricacies of human emotions more scientifically.

“AI/ML is great at narrow repetitive tasks and tasks that require the processing of vast amounts of data, repetitive patterns, and complex calculations,” said Martin de Martini, co-founder and CIO of software group Y Soft and an advocate of tech literacy in the workplace.

HR professionals will thus be able to draw on massive amounts of people data from different sources – employee surveys, 360-degree performance reviews, even the text of emails – to build a big-picture view of employee culture and engagement.

And crunching massive datasets is one of the areas “where AI excels and will excel over any human,” de Martini said.

Using technology to gain insights into emotions and sentiments is nothing new. A variety of tools is already available, from digital team-building games and personality quizzes to robotic interviewing software and wearable gadgets that “read” emotions.

But the biggest difference in the age of AI is how analytics tools will see beyond what is visible to the naked eye. Take the EQ-Radio, for instance.

In 2016, MIT researchers broke new ground in the field of emotionally intelligent machines. They unveiled a device that purportedly measures a person’s heart rate and breathing to determine their emotional state – all without physical contact.

The EQ-Radio bounces wireless signals off a person’s body and uses algorithms to decode their vital signs from the reflections. From those readings, it can tell with 87% accuracy whether a person in the room is happy, sad, or angry.
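MIT has not released the EQ-Radio pipeline as reusable code, but the underlying idea – mapping measured vital signs to an emotion label – can be sketched with an ordinary classifier. The Python snippet below is a minimal illustration using scikit-learn; the heart-rate and breathing features, values, and labels are invented for the example and are not the researchers’ actual method.

```python
# Illustrative only: predict an emotion label from vital-sign features.
# The numbers and labels are made up; EQ-Radio's real pipeline first has to
# extract individual heartbeats from reflected wireless signals.
from sklearn.ensemble import RandomForestClassifier

# Each row: [heart rate (bpm), heart-rate variability (ms), breaths per minute]
X_train = [
    [72, 55, 14],   # relaxed, content
    [68, 60, 13],   # relaxed, content
    [95, 25, 22],   # agitated
    [102, 20, 24],  # agitated
    [60, 40, 11],   # low, withdrawn
    [58, 38, 10],   # low, withdrawn
]
y_train = ["happy", "happy", "angry", "angry", "sad", "sad"]

model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(X_train, y_train)

# A new, contact-free reading taken from the reflected signal
print(model.predict([[98, 22, 23]]))  # -> ['angry'] on this toy data
```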

One application of EQ-Radio is keeping the workplace safe during crisis situations, such as when an active shooter enters the premises. The device can pick up a raised heartbeat and ragged breathing from wherever the person is standing and, more importantly, before any harm occurs, giving security personnel better visibility of erratic behaviour.

Another use-case the developers of EQ-Radio are looking at is monitoring depression and anxiety. Because the device relies on actual physical factors, such as heartbeat and breathing, it can illustrate patterns and signal when something is amiss, especially when a person is finding it difficult to express their emotions.

Of course, using wireless signals to detect emotions in the workplace is far from being the most common example of AI-powered sentiment analysis.

Something as commonplace as the language in an email, memo, or job advertisement can indicate whether someone’s tone is positive or negative, or if their views are biased for or against a certain group of people.

In the workplace, our words readily give us away – and AI tools can read between the lines.

Natural language processing – a field of AI closely tied to machine learning – learns from examples of everyday language and, through continual training, interprets words and phrases according to their meaning and context.

Services such as Textio, Boost Editor, and IBM’s Watson Tone Analyzer are just a few of the tools that use AI to listen closely to words and the emotions they indicate.

Textio, for instance, is an augmented writing tool designed to pick out problematic phrases and advise workers against insensitive, derogatory, and harsh language.
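None of these vendors publish their underlying models, but the flavour of this kind of tone analysis can be reproduced with open-source tools. The sketch below uses NLTK’s freely available VADER sentiment analyser – a simple lexicon-based scorer, not the technology behind Textio or Watson – to flag workplace messages whose tone skews negative.

```python
# Lexicon-based tone check on short workplace messages (illustrative only).
# Requires: pip install nltk, plus a one-time download of the VADER lexicon.
import nltk
nltk.download("vader_lexicon", quiet=True)

from nltk.sentiment import SentimentIntensityAnalyzer

messages = [
    "Great work on the quarterly report - the team really pulled together.",
    "This is the third time the deadline has slipped. Completely unacceptable.",
]

sia = SentimentIntensityAnalyzer()
for text in messages:
    scores = sia.polarity_scores(text)  # returns neg/neu/pos plus a compound score
    tone = "negative" if scores["compound"] < -0.3 else "positive/neutral"
    print(f"{tone:16} {scores['compound']:+.2f}  {text}")
```

A commercial augmented-writing tool goes much further – suggesting alternative phrasing, checking for biased language, learning from outcomes – but the basic step of scoring the emotional charge of a sentence looks broadly like this.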

“Computer algorithms, with all the personal data that we are willingly sharing, will be better in recognising emotions,” said de Martini.

“Within a few decades, these computer algorithms will enable communications with humans better than people can do among themselves, as the emotion is almost always represented by facial, body language, and speech modulation patterns,” he said.

Emotional intelligence
Will machines be better than HR professionals at reading and processing emotions and sentiments? The short answer is yes.

Thanks to statistical modelling, analytics tools designed to describe and predict human behavioural patterns can treat a person’s words, intonation, gestures, and facial expressions as data points.

But the future of HR is not so much a question of who is better – humans or machines? – but rather, how well humans can adopt different forms of AI to make work more meaningful and productive.

There are three levels of intelligent digitalisation, according to PwC:

  • Assisted intelligence refers to the way routine, low-level, and time-consuming tasks are automated and made accessible, as in the use of chatbots to improve HR service delivery.
  • Augmented intelligence refers to the way humans and machines “make decisions together,” such as using machine learning to reduce uncertainty and bias when deciding whom to promote or how much of a raise to give.
  • Autonomous intelligence refers to the most advanced form of AI, in which machines adapt to new situations and make decisions without human intervention.

At this point, emotional intelligence tools are still in their infancy; they assist HR practitioners and augment day-to-day HR processes but still heavily depend on humans to “teach” them how to detect and analyse emotions.

Thus, the capability of machines to process emotions and sentiments is constantly refined through hundreds of hours of training – that is, by feeding the system data about what humans say and feel as part of the machine learning process.
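At its simplest, that training is supervised learning: labelled examples of what people wrote and how they felt are fed to a model until it starts to generalise. The sketch below is a minimal, hypothetical illustration in Python with scikit-learn; the example sentences and labels are invented.

```python
# Minimal supervised-learning sketch: invented, labelled snippets of employee
# feedback are "fed" to a model, which learns to score new text as positive
# or negative sentiment.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "I feel really supported by my manager this quarter",
    "Thrilled about the new flexible working policy",
    "I'm exhausted and nobody seems to notice the workload",
    "Another reorganisation with zero communication - morale is low",
]
labels = ["positive", "positive", "negative", "negative"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)  # the "teaching" step that HR data would drive

print(model.predict(["The new review process feels fair and transparent"]))
```

Production systems differ mainly in scale – millions of labelled examples and far richer models – but the principle of learning from examples of human expression is the same.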

In time, emotionally intelligent AI systems will not only become accustomed to typical human reactions in specific scenarios – such as when a worker is frustrated or motivated – but will also learn to anticipate how a human might react before certain situations arise.
