It’s easy to centre the HR automation conversation exclusively on the commercial (or government) sector, but that’s pretty short-sighted: non-profits and the wider third sector face the same pressures around efficiency, cost and maximising organisational delivery.
That’s certainly the impetus for USAID, the US Agency for International Development, which has been building its own technology to automate mundane transactions and free up employees to tackle more complex issues, according to an interesting write-up in a US publication that covers federal government activity.
Until as recently as two or three years ago, tasks like assigning Foreign Service officers to their posts were done via email and paper, according to the agency’s chief human capital officer, Bob Leavitt.
“Having the most basic technologies was a critical requirement for us to mitigate liabilities and financial risk due to the errors that we were generating,” Leavitt tells the publication, FCW. “We had to reprocess how we designed our workflows and our processes to streamline them as much as we could.”
Now, 95% of routine HR tasks, such as an employee getting married and needing to change her name, are automated right up to the point where the HR professional overseeing the process simply approves or rejects the change.
Sounds great. But what might happen if you put ‘robots’ so much in charge that they get heavily involved in recruitment itself? The possible dilemmas are intriguingly pulled apart by freelance HR journalist Cath Everett in a late December piece in diginomica.
On the one hand, we may not have too much to worry about as yet: there has been a lot of hype (surprise) and perhaps not that much practical implementation. James Wright, a consultant at executive search firm Carmichael Fisher (which has just gone into administration), concurs that the market is ‘very much in its infancy’ and that adoption levels are currently low.
But then again, maybe we do, and not just because of the infamous case of the “unconscious bias” Amazon recruiting algorithm that was screening out women candidates. Everett reports that the US-based Electronic Privacy Information Center (EPIC) has just filed an official complaint against recruitment software supplier HireVue, arguing that the latter’s use of “AI-driven assessments” to evaluate job applicants’ skills and personality characteristics, such as their emotional stability and ability to learn, constitutes a “wide-ranging threat to US workers”.
How so? The allegation is that HireVue had been training its software on thousands of data points relating to a candidate’s voice, word selection and facial movements. “While individuals are not told their scores, these ratings are meant to help guide employers as to which job-seekers they should hire,” says the article. “As a result, the point of the complaint is to stop HireVue from automatically scoring applicants and to make public the algorithms and criteria it uses for analysing people’s behaviour, with the aim of protecting job-seekers from unknown and unacknowledged bias.”
Pretty scary if you’re excluded from a great job opportunity by software? Well, if you are scared, you may want to think fast about what countermeasures you’d like to see, as a final 2019 HR automation story in Computer Weekly on the seemingly irresistible rise of RPA suggests. There, we read this doubtless true but still slightly chilling quote: “It has become popular for organisations to say that the goal [of RPA deployment] is to ‘release people to perform higher-value tasks’, but most organisations have no idea what those higher-value tasks might be.
“And, in any case, the cost to automate usually must be justified by a cost reduction — which typically means a reduction in staff.”
Well. It looks like there are no easy HR automation answers… but the debate will only continue in this brand-new year, and indeed decade.
thedmcollaborators editor