When algorithms undermine equality of opportunity

Employers harnessing social media or artificial intelligence to reach new recruits and remove bias from their processes may find themselves inadvertently discriminating against the very people they are trying to reach.

Figures from the Office for National Statistics show there are almost 1.3 million vacancies currently on offer, and businesses are having to try harder than ever to get their jobs in front of candidates.

In this challenging market, more employers are embracing social media advertising to put their vacancies in front of a wider audience. But algorithms used by platforms such as Facebook may mean users do not see job adverts because of their age or gender, even when an employer believes it is running an unrestricted advertising campaign.

Research by Global Witness found that adverts for mechanics' jobs were shown almost entirely to male users, while nursery nurse positions were delivered to a predominantly female audience. Other research has shown that job advertisements may not be delivered to older users at all.

Age and gender are just two of the characteristics protected under the Equality Act 2010, which was designed to safeguard against discrimination, harassment and victimisation. Other protected characteristics under the Act include race, sexual orientation, religion or belief, disability and gender reassignment. In the workplace, the law requires employers to guard against discrimination from the very start of the recruitment process, through employment, to termination.

But even though an employer may act with the best intentions, they may find themselves breaking the law if their advertising does not reach all groups, as indirect discrimination could be taking place through social media’s automated ad targeting.

Another potential source of recruitment bias lies in the increasing use of machine learning and artificial intelligence (AI) for recruitment purposes.  Whether used to sift applications or to conduct initial interviews, there are many challenges for both HR and data management compliance.

Bias may stem from the underlying programming and data sources, whether through the data used to train machine learning software, which may not fully represent all genders or ethnicities, or through the assumptions of those writing the algorithms.

And when tech solutions are designed with the aim of automatically learning a recruiter’s preferences and using those predictions to identify similar applicants in future, bias can become reinforced rather than removed.

Employers may be trying to improve their processes or make decision-making more objective.  But when machine learning is making predictions based on past screening decisions, it may reinforce the very pattern those employers are trying to change, and so undermine equality and diversity intentions.
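The reinforcement effect described above can be sketched in a few lines of Python. This is a deliberately simplified, hypothetical model (the candidate records, universities and scoring rule are invented for illustration; real screening tools are far more complex), but it shows the underlying mechanism: a system that learns from past shortlisting decisions will reproduce whatever skew those decisions contained.

```python
from collections import Counter

# Hypothetical past screening decisions. The historical shortlist happens
# to skew toward candidates from "University A" -- an artefact of past
# human choices, not of candidate ability.
past_decisions = [
    {"university": "A", "experience": 5, "shortlisted": True},
    {"university": "A", "experience": 2, "shortlisted": True},
    {"university": "A", "experience": 3, "shortlisted": True},
    {"university": "B", "experience": 6, "shortlisted": False},
    {"university": "B", "experience": 4, "shortlisted": False},
]

def learn_preferences(history):
    """Naively 'learn' recruiter preferences by counting how often each
    university appears among previously shortlisted candidates."""
    return Counter(r["university"] for r in history if r["shortlisted"])

def score(candidate, preferences):
    """Score a new candidate by similarity to past shortlisted candidates."""
    return preferences.get(candidate["university"], 0)

prefs = learn_preferences(past_decisions)

# Two equally experienced candidates from different universities:
candidate_a = {"university": "A", "experience": 4}
candidate_b = {"university": "B", "experience": 4}

# The model reproduces the historical skew: candidate A outscores
# candidate B purely because past recruiters favoured "University A".
print(score(candidate_a, prefs))  # 3
print(score(candidate_b, prefs))  # 0
```

Nothing in the scoring rule mentions a protected characteristic directly, yet if university attended correlates with, say, ethnicity or social background, the learned preference quietly encodes that correlation — which is exactly how indirect discrimination can arise from an apparently neutral system.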

Automated decision-making is restricted under the UK GDPR: where a decision has a legal or similarly significant effect, as an employment decision may, it cannot be based solely on automated processing. The problem is also being addressed in draft EU legislation on the use of AI, which sets out requirements for human oversight.

There are many aspects to diversity in the workplace and making sure that the recruitment process and ongoing opportunities are accessible to all is clearly an important component of that.  A discrimination tribunal, with uncapped compensation for successful claims, is reputationally damaging as well as potentially costly.  Also, failing to comply with data handling requirements can give rise to large penalties.

And for the future, with empathic AI on the horizon – and its potential to detect human emotions during an interview – there will be further challenges for both HR and data protection.

Empathic technology will use aspects such as heart rate, pupil dilation, blood pressure, flushed cheeks or changes in tone of voice to assess an individual’s emotional state.  But while it may be useful to know how an applicant is feeling in an interview, use of such techniques will take employers into uncharted territory.

HR professionals will have to demonstrate how they are interpreting the information they collect in this way and human oversight is likely to remain essential in any recruitment decision.  Data controllers will have to address whether such processing is truly necessary to achieve objectives and demonstrate how empathic AI in recruitment is proportionate.

There’s every expectation that algorithmic bias will become commonplace in future discrimination claims, while our employment and data protection legislation tries to catch up.

To discuss this, or any other related matter, please contact Jane directly on 01483 887766, email info@hartbrown.co.uk or start a live chat today.

*This is not legal advice; it is intended to provide information of general interest about current legal issues.


Jane Crosby

Partner, Head of Dispute Resolution & Accredited Mediator


Jane is a Partner based in the Guildford office and she is also Head of the Dispute Resolution team here at Hart Brown. Jane specialises in employment law and commercial litigation and brings more than 15 years' experience to her role.

Prior to entering the legal profession, Jane was employed in the aviation industry. This experience is appreciated by many of Jane's clients who note that she is able to take a commercial and pragmatic approach to any legal issue that they face.

Jane acts for a wide range of individuals and businesses and her areas of specialism include aviation, property related industries and IT. Jane regularly advises on aspects of employment law, such as settlement agreements, employment contracts, policies and procedures, redundancies, equal pay, data protection, issues arising from TUPE and reorganisations, the calculation of holiday pay, bonus and commission payments, disciplinary and grievance issues, dismissal and termination issues, the protection of confidential information and the enforcement of restrictive covenants. Jane gets involved in GDPR training for her clients and she is able to deliver tailored employment law training sessions upon request.

As a commercial litigation lawyer, Jane also deals with shareholder and directors' disputes, commercial contract disputes and the enforcement of restrictive covenants.

Jane has been involved in successful high-value commercial litigation for clients in the High Court; she is an accredited mediator and a member of the Employment Lawyers Association.

Jane is often asked to write for a number of well-known publications, including The Daily Mail, The Telegraph and The Week, and she has been interviewed on BBC Radio 4.

Here is a small selection of the feedback that Jane has received:

“Jane, I cannot sincerely thank you enough for your wise counsel and am delighted to have made your acquaintance. If I am blessed with a new position somewhere I will hand over my contract in the first instance to you. Likewise, any of my friends, peers, romans and countrymen wanting advice, I will point them in your direction.”

“Jane, you have been most resilient on my behalf for which I sincerely thank you for all your endeavours. I have a tremendous working relationship with Hart Brown and you have undoubtedly compounded this further."

“I appreciated the clarity of advice given at a stressful time”.

“A sensitive and highly professional approach and efficient work in the interests of the client”.

“Your advice, conduct and assistance have been indeed outstanding and very professional but also – and most importantly – very humane”.