Live Facial Recognition: avoiding the pitfalls

Author: John Goss, barrister at 5 Essex Court


Live facial recognition (“LFR”) technology has been in the news for the wrong reasons, with concerns raised about use by both police forces and those operating spaces such as King’s Cross Station. But the furore should not blind us to the potential benefits of LFR to the medical sector. It does, however, act as a reminder that a complex legal framework governs this technology, and that care is required when deciding where, when and how it should be used. The Information Commissioner’s Office is taking an active interest in this area, and the recent media attention shows how well-intentioned use of LFR may backfire.

The possible uses of LFR in medicine are extensive. At a simple level, it can enhance existing CCTV systems for hospital security, whether by adding another layer to access control, or by allowing people to be quickly tracked across CCTV footage. This could significantly reduce the time taken to locate an absconding patient (perhaps lacking capacity) or an intruder.

As a clinical tool, one potential use for LFR, already the subject of studies in the USA, is in rapid early identification of genetic conditions, such as Cornelia de Lange, Angelman and Noonan syndromes. This works by using a trained neural network to compare the patient’s face against a pre-existing database of images. While genetic diseases will always be confirmed via genetic testing, use of facial recognition can speed up diagnosis, narrow the possibilities and remove the need for expensive and time-consuming multi-gene panel tests.
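To make the matching step concrete, here is a minimal sketch, assuming the common design in which a neural network has already reduced each face to a numeric embedding, and candidate syndromes are ranked by similarity to reference embeddings. The embeddings, names, dimensions and threshold below are illustrative placeholders, not taken from any real clinical system.

```python
# Minimal sketch: rank candidate syndromes by how closely a patient's
# face embedding resembles reference embeddings. Purely illustrative;
# real systems use embeddings produced by a trained neural network.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_candidate_syndromes(patient, reference_db, threshold=0.6):
    """Return syndromes resembling the patient, most similar first.
    A hit narrows which genetic test to run; it is never a diagnosis."""
    scored = ((name, cosine_similarity(patient, ref))
              for name, ref in reference_db.items())
    return sorted((item for item in scored if item[1] >= threshold),
                  key=lambda item: item[1], reverse=True)

# Toy 4-dimensional embeddings; production embeddings are far larger.
rng = np.random.default_rng(0)
reference_db = {name: rng.normal(size=4)
                for name in ("Cornelia de Lange", "Angelman", "Noonan")}
patient = reference_db["Noonan"] + rng.normal(scale=0.1, size=4)
print(rank_candidate_syndromes(patient, reference_db))
```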

It is likely that as LFR develops and available neural networks become more sophisticated, they will be able to recognise sensations such as pain or discomfort on a patient’s face, or to spot the signs of absence seizures or micro-strokes that might otherwise be missed. An obvious early area for its use would be in sleep studies. That sort of intensive monitoring would allow highly efficient distribution of scarce clinical time. It would also ensure that significant symptoms were not missed while a patient suffers in silence, whether through misplaced fortitude or unawareness of what is happening. Though LFR monitoring for every patient may not be practicable, it would be a valuable addition to the diagnostic armoury.

LFR could therefore make a real difference to patient care and outcomes. But caution is needed about how the data produced and used by these systems is managed, to stay compliant with data protection laws. Facial recognition data is biometric data under the General Data Protection Regulation (GDPR), and receives a higher level of protection. Unsurprisingly, processing such data for medical purposes can be justified – after all, medical teams routinely process other biometric and health-related data. But there are many traps for the unwary within the GDPR, and it would be wise for anyone considering LFR to have a carefully considered policy in place, and to conduct a robust data protection impact assessment. Three examples show some of the issues.

The first is that the use of LFR must not cross into automated decision-making about individuals. Under Article 22 of the GDPR, individuals have a right not to be subject to a decision based solely on automated processing that produces legal effects or similarly ‘significantly affects’ them, which medical decisions are likely to do. The narrow exemptions to that rule are not available for decisions based on biometric data processed for medical purposes. As a result, while LFR may inform proper clinical judgement, it must never be a substitute for it.
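In system-design terms, one way to respect that boundary is to hard-wire the human into the loop: the software can surface a match but cannot act on it. The sketch below uses hypothetical class and function names to show that pattern; an LFR match only ever creates a review task, and the decision field is filled in by a clinician alone.

```python
# Minimal sketch of the "inform, never decide" pattern: an LFR match
# creates a review task for a clinician; no clinical action is taken
# automatically, however confident the algorithm is. All names here
# are hypothetical, not drawn from any real system.
from dataclasses import dataclass
from typing import Optional

@dataclass
class LfrMatch:
    patient_ref: str
    suggestion: str    # e.g. a candidate syndrome or an alert
    confidence: float  # the algorithm's score, shown to the clinician

@dataclass
class ReviewTask:
    match: LfrMatch
    clinician_decision: Optional[str] = None  # set by a human only

review_queue: list[ReviewTask] = []

def on_lfr_match(match: LfrMatch) -> None:
    """Queue the match for review; deliberately takes no action itself."""
    review_queue.append(ReviewTask(match))

def record_clinician_decision(task: ReviewTask, decision: str) -> None:
    """The decision that affects the patient is made and recorded by
    the clinician, with the LFR output as one input among others."""
    task.clinician_decision = decision

on_lfr_match(LfrMatch("patient-001", "possible Noonan syndrome", 0.91))
record_clinician_decision(review_queue[0], "refer for genetic testing")
```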

Secondly, since biometric data is personal data, all of the data subject’s rights will apply, including the right of access and the rights to erasure, rectification and restriction of processing. The technology will need to be capable of giving effect to those rights when they are exercised. Data controllers will also need to be proactive about informing anyone subject to LFR that their personal data is being processed in this way – one of the areas where other early adopters may have fallen short. If LFR is being used on a more widespread basis – in conjunction with hospital CCTV, for example – the risks are commensurately higher. Moreover, the processing must be necessary and proportionate: blanket use of LFR without a thoroughly justified purpose will only store up further issues.
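In practical terms, that points to a biometric template store whose interface maps directly onto those rights. The following sketch, with an illustrative in-memory store and hypothetical method names, shows the shape such support might take; the relevant GDPR Articles are noted in comments.

```python
# Minimal sketch of a biometric template store that supports data
# subject rights by construction. In-memory and illustrative only.
from dataclasses import dataclass

@dataclass
class BiometricRecord:
    subject_id: str
    template: bytes           # the stored face template
    restricted: bool = False  # retained but excluded from processing

class TemplateStore:
    def __init__(self) -> None:
        self._records: dict[str, BiometricRecord] = {}

    def enrol(self, subject_id: str, template: bytes) -> None:
        self._records[subject_id] = BiometricRecord(subject_id, template)

    def access(self, subject_id: str):            # Art. 15: right of access
        return self._records.get(subject_id)

    def rectify(self, subject_id: str, template: bytes) -> None:  # Art. 16
        self._records[subject_id].template = template

    def erase(self, subject_id: str) -> None:     # Art. 17: erasure
        self._records.pop(subject_id, None)

    def restrict(self, subject_id: str) -> None:  # Art. 18: restriction
        self._records[subject_id].restricted = True

    def match(self, probe: bytes) -> list[str]:
        """Matching must skip restricted records entirely."""
        return [r.subject_id for r in self._records.values()
                if not r.restricted and r.template == probe]

store = TemplateStore()
store.enrol("patient-001", b"template-bytes")
store.restrict("patient-001")
assert store.match(b"template-bytes") == []  # restricted, so skipped
store.erase("patient-001")
```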

Thirdly, one of the criticisms levelled at LFR is its potential to be discriminatory, if the algorithms do not perform equally well for individuals of all races and sexes. To some extent, this issue may resolve as the neural networks that underlie LFR ‘learn’ from more data. That in turn raises difficult issues about the processing of personal data within those networks themselves, although this is a general issue rather than one specific to medicine. In any event, anyone considering procuring LFR technology in a medical context would be well advised to ask probing questions of their supplier about these issues, and equality impact assessments may be required.
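One concrete question to put to a supplier is whether error rates differ materially across demographic groups. The sketch below, run on synthetic placeholder data, computes the false-match rate per group from a labelled evaluation set – the kind of figure an equality impact assessment would need.

```python
# Minimal sketch of a per-group error audit: from a labelled evaluation
# set, compute the false-match rate for each demographic group. The
# records below are synthetic placeholders, not real evaluation data.
from collections import defaultdict

def false_match_rate_by_group(results):
    """results: iterable of (group, algorithm_matched, should_match).
    Returns the false-match rate per group over non-mated comparisons."""
    false_matches = defaultdict(int)
    non_mated_trials = defaultdict(int)
    for group, algorithm_matched, should_match in results:
        if not should_match:              # faces that should NOT match
            non_mated_trials[group] += 1
            if algorithm_matched:         # ...but the algorithm matched
                false_matches[group] += 1
    return {g: false_matches[g] / n for g, n in non_mated_trials.items()}

# Synthetic records: group B is wrongly matched more than three times
# as often as group A, the kind of disparity an audit should surface.
sample = ([("A", False, False)] * 97 + [("A", True, False)] * 3
          + [("B", False, False)] * 90 + [("B", True, False)] * 10)
print(false_match_rate_by_group(sample))  # {'A': 0.03, 'B': 0.10}
```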

The High Court has recently ruled that South Wales Police’s trial use of LFR was lawful, though doubts were expressed about the adequacy of some of its policies. In the same way, as medical uses of LFR multiply and become more widely recognised, appropriate policies, procedures and frameworks will have to be developed, so that its potential can be unlocked without breaching the data protection regime.
