
12 August 2021

The end of facial recognition technology?

The Automated Facial Recognition Technology Bill was drafted in February 2019 in response to public concern about the use of such technologies and is currently undergoing its second reading before progressing to committee stage. The Bill echoes legislative moves in the European Union towards regulating the public use of artificial intelligence and biometrics.

The introduction of the Bill follows public outrage and a Court of Appeal ruling that UK pilot schemes using automated facial recognition (AFR) in public places were unlawful and breached data protection and privacy laws.

Most famously, South Wales Police used AFR overtly as part of one such pilot scheme, deploying it on more than 50 occasions at public events between May 2017 and April 2019. The Court of Appeal held that this use was unlawful owing to the interference with Article 8 of the European Convention on Human Rights (ECHR) and the wide-ranging discretion the police force had to add individuals to a watchlist.

 

What is AFR?

AFR has become a part of everyday life: we use it to unlock smartphones and verify bank details, and some even use it to log in to their work IT systems. At its most basic, it is simply the process of identifying and/or verifying an individual using their face. The technology captures a facial image, analyses its distinguishing features and compares the resulting pattern against a database of facial images.
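To picture the "compare against a database" step in the description above, the matching logic can be sketched roughly as follows. This is a minimal, hypothetical Python sketch rather than how any particular police or commercial deployment works: it assumes faces have already been converted into numerical embeddings by an upstream model, and the watchlist, threshold and function names are invented purely for illustration.

```python
# Minimal sketch of the "compare patterns with a database" step of facial
# recognition. It assumes faces have already been detected and encoded into
# fixed-length numerical embeddings by some model (not shown here).
# The names, threshold and sample data are illustrative only.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (closer to 1.0 = more alike)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(probe: np.ndarray,
                            watchlist: dict[str, np.ndarray],
                            threshold: float = 0.8) -> str | None:
    """Return the closest watchlist identity above the threshold, or None
    if the probe face does not match anyone closely enough."""
    best_name, best_score = None, threshold
    for name, enrolled in watchlist.items():
        score = cosine_similarity(probe, enrolled)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Illustrative usage with random 128-dimensional embeddings.
rng = np.random.default_rng(0)
watchlist = {"person_a": rng.normal(size=128), "person_b": rng.normal(size=128)}
probe = watchlist["person_a"] + rng.normal(scale=0.05, size=128)  # noisy capture
print(match_against_watchlist(probe, watchlist))  # likely "person_a"
```

In practice the threshold chosen here is what determines how often the system wrongly flags an innocent person as a watchlist match, which is precisely the kind of discretion the Court of Appeal scrutinised.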

AFR has also become prominent in digital public spaces such as social media, although its use there is caught by data privacy and human rights legislation. Organisations like Facebook have invested in programmes such as DeepFace, which can recognise faces with a reported accuracy rate of 97.25%.

 

What does the Bill mean for AFR?

It is important to note that the prohibition on the use of AFR applies only to public places; it will not impact employers’ use of such technology for security monitoring, nor will the facial authentication features on mobile phones or other security systems be made illegal.

However, it is likely to frustrate research in defence and security organisations, which have been developing AFR for a wide range of uses. From target identification to crowd control and crime prevention in public spaces, AFR has become increasingly accurate.

Importantly, the Bill in its current form does not contain express exemptions for law enforcement or national security; this is likely in response to the public concern surrounding police pilot schemes. We welcome the regulation of emerging technologies, but nuanced legislation is needed to ensure they can be safely deployed where they will contribute to national security or crime prevention.


About the Author
Richard Morgan, Partner

