Should HR drop ‘gut feel’ from decision-making?

HC talks to CEB's Aaron McEwan and academic Uri Gal about how HR can make the most accurate decisions

While Aaron McEwan of CEB advocates the use of HR data analytics, he invokes the saying ‘garbage in, garbage out’ to highlight the risks.

“About 54% of heads of HR or heads of data analytics are concerned about data quality,” he said.

It’s also critical to understand what decisions HR is trying to shape with the data. Where is the organisation going and what data is going to help inform decisions about that direction?

“That’s the most important thing: if you are asking the wrong questions you are going to be collecting data that doesn’t actually support the decisions you are trying to make,” McEwan said.

As to the ‘human element’ involved in data-based decision-making, McEwan agrees there is a genuine risk of poor choices being made.

“There is always a human element both in the selection of what data we choose to analyse and then also in the analysis of that data,” he said.

However, the alternative – and the traditional means by which decisions were made – is not ideal.

“Traditionally, you would have had a group of leaders with their own personal biases sitting around making subjective decisions about people’s future in terms of pay rises, promotions and so on. The addition of objective data, even if it’s not perfectly objective, I think increases the accuracy of these decisions.”

The key, he added, is evidence-based data collection and analysis practices, so that subjectivity is kept to a minimum.

Associate Professor Uri Gal, from the Discipline of Business Information Systems at the University of Sydney Business School, suggests algorithms might be rejecting candidates without HR being aware of it.

“Ostensibly, the process is meant to be objective and rational,” Gal said.

“But the problem is that algorithms are not really objective in the sense that collecting big data actually involves a significant amount of human judgment.”

Gal cited an example: consider a manager who wants to assess the performance of his or her employees over the previous year.

There are a number of ways in which the manager can do that, including collecting data on how much revenue they have generated for the company over the last 12 months, how many clients they have interacted with, or how many leads they have generated.

“So we have three data points to assess what we call ‘performance’, but why these three? We could have used any other combination of data points,” said Gal.

He suggested other alternatives could be equally valid, such as feedback from customers, feedback from colleagues, how much time they spent on email, or how many days they were absent from work.

Further, when algorithms rely on inaccurate, biased or unrepresentative data, they may systematically disadvantage racial and ethnic minorities, women and other historically disadvantaged groups.

“There are a variety of different data points we could have used and the choice of data points inevitably involves human judgment,” said Gal.

“So it’s not really objective in the sense that people often think when they are talking about algorithmic decision-making.”
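To make the point concrete, here is a minimal sketch in Python. All names, numbers and metrics are hypothetical and invented purely for illustration; it simply shows how the same three employees can rank differently depending on which data points are chosen to stand in for ‘performance’.

# Hypothetical illustration: the "performance" an algorithm reports depends
# on which data points are chosen. All names and figures are invented.

employees = {
    "Alex":  {"revenue": 120_000, "clients": 14, "leads": 40, "customer_feedback": 3.2, "days_absent": 2},
    "Priya": {"revenue":  90_000, "clients": 25, "leads": 22, "customer_feedback": 4.6, "days_absent": 6},
    "Sam":   {"revenue": 150_000, "clients":  9, "leads": 15, "customer_feedback": 3.9, "days_absent": 1},
}

def score(person, metrics, higher_is_better):
    """Average of min-max normalised metrics: a deliberately simple composite."""
    total = 0.0
    for m in metrics:
        values = [e[m] for e in employees.values()]
        lo, hi = min(values), max(values)
        norm = (person[m] - lo) / (hi - lo) if hi > lo else 0.5
        # Invert metrics where a lower raw value is better (e.g. days absent).
        total += norm if higher_is_better.get(m, True) else 1 - norm
    return total / len(metrics)

def rank(metrics, higher_is_better=None):
    higher_is_better = higher_is_better or {}
    return sorted(employees, key=lambda n: score(employees[n], metrics, higher_is_better), reverse=True)

# Choice 1: the three data points from Gal's example (revenue, clients, leads).
print(rank(["revenue", "clients", "leads"]))

# Choice 2: an equally defensible alternative (customer feedback, absences).
print(rank(["customer_feedback", "days_absent"], higher_is_better={"days_absent": False}))

The two rankings come from the same underlying records; only the choice of data points changes, which is precisely the human judgment Gal describes.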

Gal does not dismiss analytics, and said there is “room” for certain technologies to assist people in making decisions. However, he urges a “more aware approach” to what these technologies are used for.

“It’s fine to have certain tools in place to help go through large amounts of data because computers and algorithms are more effective at that than human beings. But I think it is probably wise to maintain some sort of human oversight over this process.”
