norm. data protection bulletin: 7 May 2024


New Guidance on Responsible Use of AI in HR and Recruitment Processes

The Department for Science, Innovation & Technology, alongside a number of other organisations including the ICO, has published guidance for the responsible use of AI in HR and recruitment processes.

The guidance sets out how organisations can identify and mitigate risks associated with the use of AI in recruitment and hiring processes. These risks include unfair bias and discrimination against applicants. Examples provided include:

  • job description review software that may be discriminatory;
  • chatbots used to engage with candidates which are trained on irrelevant or insufficient data;
  • headhunting software and CV matching tools that perpetuate existing biases; and
  • video interviewing tools that may result in discriminatory outcomes.

The guidance advises the implementation of Algorithmic Impact Assessments, Equality Impact Assessments, Data Protection Impact Assessments, and the development of an effective AI Governance Framework.

Read the full guidance here.

Comment: This provides useful guidance to HR professionals and emphasises the need to exercise particular care when using AI. If you would like to learn more about best practice in the use of AI and AI tools, please do not hesitate to contact us.


 

Insight: Data Breaches from Communications Sent to Wrong Addresses

Background 

The Defendant sent, by post, a financial statement to several hundred people, all current or former police officers. Unfortunately, in many cases the statements, which contained each intended recipient's name, date of birth, national insurance number, salary and pension information, were sent to out-of-date addresses.

 

The claim

474 of the intended recipients of the letters issued proceedings seeking damages for breach of the GDPR and/or misuse of private information. In the main, the claims were for non-material damage: anxiety, alarm, distress, embarrassment and loss of control over their data. The Claimants had positive evidence that the letter was opened in only 14 cases, and in 11 of those instances the letters were opened by a family member. In only 2 cases were the letters opened by someone other than a family member or colleague of the intended recipient. Where the letters had not been returned, the Claimants relied upon an inference that the letters had been opened and read by a third party.

The Defendant issued an application seeking to strike out the Claimants’ claims and/or for summary judgment on multiple grounds, including that any damage or distress suffered by the Claimants was not serious enough and that the claims constituted an abuse of process.

The key issue for the court was the damage caused by the posting of a letter containing personal information/data to a third party.

 

The court’s findings and decision

The court said that, in order to have a viable claim for misuse of private information and/or data protection, each Claimant was required to show that they had a real prospect of demonstrating that the letter intended for them was opened and read by a third party. Absent that, the Claimants had “no real prospect of demonstrating that there had been ‘misuse’, an essential element of the tort of misuse of private information”.

Further, a danger or risk to personal data did not give grounds for a data protection claim. To be entitled to a remedy, a claimant must demonstrate that they are the victim of a wrongdoing. The judge said: “A near miss, even if it causes significant distress, is not sufficient”.

Therefore, the claims in which the letter was returned unopened failed to disclose reasonable grounds for bringing a claim for misuse of private information and/or data protection and were struck out (or, alternatively, were summarily dismissed as showing no real prospect of success). Even when the letter had not been returned unopened, unless the Claimant could plead a viable case that their letter was actually opened and read by a third party, he/she had no real prospect of success, and the claim would be struck out and/or dismissed.

There were only 14 claims in which the Claimants had provided evidence that their letter had been opened. Those were allowed to proceed to trial – all others were struck out. Even of those 14 cases, the judge said they were “very far from being serious”, and that some aspects of the claims were “hopeless” and “exaggerated”. For example, one Claimant expressed fears that the information might be used to identify and target him, his family or his current home, when the envelope had been opened by his father only; the judge described this fear as “completely unreal”.

 

The Costs 

The claim was issued in the High Court. By the time of issue of the claim, the Claimants’ pre-action costs were around £1.2M. The Claimants’ estimated costs to trial were £2.549M and, in addition, the Claimants sought to recover an ATE insurance premium incurred in respect of the misuse of private information claims. The Defendant’s estimated costs to trial were £2.7M.

The damages sought by each of the Claimants were £1,250 to £1,500.

On allocation, the view of the judge was that, had the Defendant admitted liability, it would be appropriate to transfer the remaining 14 cases to the County Court to be allocated to the small claims track, where very limited costs are recoverable.

The judge appeared critical of the costs incurred in this case. Had all 474 original claimants continued with their claims, the total damages sought would have been up to £711,000 (474 × £1,500), and yet the combined costs of the parties were estimated to be £5.2M excluding the ATE premium.

 

The Takeaways

  1. The Court will not draw the inference that a letter addressed to a named recipient, clearly marked “private and confidential”, will be opened by a third party who is not the named recipient or authorised by them to open correspondence addressed to the named recipient.
  2. Claimants must consider carefully the extent to which they can prove damage caused by a data breach or misuse of information – it is not sufficient to make a bare assertion, especially one that does not tie into the facts of the case.
  3. Although this decision is about misdirected postal mail, it should (in my opinion) also apply to misdirected/wrongly delivered emails.
  4. The referral of these types of claims to the small claims track, where only very limited costs are recoverable, is in line with previous decisions. This should be a deterrent to many potential claimants and a comfort to all those who send communications to a wrong address.

 

Exposing Personal Information on Online Customer Portals

The ICO has issued a reprimand to a housing association after personal information was accessible to other residents on its online customer portal.

On the very first day the portal was launched a resident discovered they could access documents related to anti-social behaviour cases and view personal information about other residents, including names, addresses and dates of birth. The resident called a customer service advisor at the housing association to flag the breach, but their concerns were not escalated, and the personal information remained accessible.

Following a mass email to residents promoting the portal, four more residents reported the same breach, and the portal was suspended.

The ICO investigation found that the housing association failed to test the portal appropriately before it went live and staff were not clear on the procedure to escalate a data breach.

Comment: New digital products and services aimed at improving a customer’s experience must not come at the cost of the security of personal information. This breach was the result of a clear oversight by the housing association when preparing to launch its new customer portal. The ICO will expect all organisations to ensure they have appropriate security measures in place when launching new products and to have tested them thoroughly with data protection in mind, as well as ensuring staff are appropriately trained.


 

TUC Calls for AI Legislation

The UK government has so far, in contrast to the EU, chosen to take a light-touch, or ‘pro-innovation’, approach to regulating AI, rather than introducing legislation. However, the government anticipates that legislative action will ultimately be necessary.

On 18 April the TUC published its Artificial Intelligence (Regulation and Employment Rights) Bill. This aims to regulate the use of AI systems in the workplace in order to protect the rights and interests of employees. It also provides for trade union rights in relation to the use of AI systems in the workplace (including consultation requirements) and for a “right to disconnect”.

Of course, whether the TUC’s AI Bill stands any chance of becoming law depends, to a large extent, on which political party forms the government after the next general election…


Get norm.’s data protection bulletin direct to your inbox

norm. tracks and monitors the latest data protection developments and collates these into a monthly data protection bulletin.

You can receive this bulletin for free, every month, by entering your business email address below: