
https://rtau.blog.gov.uk/2020/06/01/facial-recognition-technology-a-guide-for-the-dazed-and-confused/

Facial recognition technology: A guide for the dazed and confused

Categories: Artificial intelligence, Biometrics, Facial recognition technology

Few innovations have divided opinion more than facial recognition technology (FRT). Some claim that it will make our streets safer, our bank accounts more secure and our public services more accessible. Yet others argue that it will violate our right to privacy, among a host of other rights, and must be banned immediately. All the while, police forces continue to trial and deploy live facial recognition, and industry continues to increase the scope of its use in retail and public spaces.

Amidst the heated exchanges between proponents and critics, it can be difficult to understand the true impact of this technology. Do systems frequently mistake people’s identity? Could they radically enhance digital security? Are they well regulated? The onset of the coronavirus pandemic has only served to make these questions more important, with FRT proving integral to social distancing measures being used or developed in a number of countries, as well as to proposals for ‘health certificates’.

The CDEI’s new Snapshot paper sets out to highlight some implications of FRT for society, and to review how it is governed in the UK. Key messages include:

FRT comes in many forms

FRT is a versatile technology that can serve a variety of purposes. One way of classifying systems is by whether they verify or identify individuals (a short illustrative sketch follows the list below):

  • Facial verification (one-to-one matching): These systems try to determine whether a face matches a single facial template, often saved on a device. Many phones and laptops now include facial verification technology, allowing users to log on to their devices more securely.
  • Facial identification (one-to-many matching): These systems try to determine whether a face matches any facial template in a database of individuals. This is used by Facebook to suggest friends to tag in photos, as well as by the police and private security to locate people of interest in a crowd.
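
To make the one-to-one versus one-to-many distinction concrete, here is a minimal sketch. It is not taken from the Snapshot paper: it assumes faces have already been converted into numeric ‘embedding’ vectors by a separate model, and the vectors, names and similarity threshold are invented purely for illustration.

```python
# Illustrative only: real FRT systems derive face embeddings from a trained
# neural network; here random vectors stand in for those embeddings.
import numpy as np

THRESHOLD = 0.8  # hypothetical similarity threshold; real systems tune this value


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (closer to 1.0 means more alike)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def verify(probe: np.ndarray, template: np.ndarray) -> bool:
    """One-to-one matching: does the captured face match a single stored template?"""
    return cosine_similarity(probe, template) >= THRESHOLD


def identify(probe: np.ndarray, watchlist: dict) -> str | None:
    """One-to-many matching: return the best watchlist match above the threshold, if any."""
    best_name, best_score = None, THRESHOLD
    for name, template in watchlist.items():
        score = cosine_similarity(probe, template)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    enrolled = rng.normal(size=128)                      # e.g. the template stored on a phone
    probe = enrolled + rng.normal(scale=0.1, size=128)   # a fresh capture of the same face

    print("Verification (one-to-one):", verify(probe, enrolled))

    watchlist = {"person_a": rng.normal(size=128), "person_b": enrolled}
    print("Identification (one-to-many):", identify(probe, watchlist))
```

Verification compares a face against exactly one template, while identification searches an entire database, which is one reason the composition of a watchlist matters so much for one-to-many systems.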

FRT systems also differ in the kinds of images they analyse and when they analyse them: some are used retrospectively (analysing previously collected images), others in live settings (processing faces in real time). The public debate has thus far centred on the use of live facial identification for law enforcement purposes. One reason is that live uses of FRT often require quick judgements about whether to act on a system’s results, which may present unique risks.
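
As a further illustrative sketch, again using invented embeddings and a hypothetical threshold, the snippet below contrasts live use, where each alert arrives while the person is still present and so demands an immediate judgement, with retrospective search, whose matches can be reviewed before anyone acts on them.

```python
# Illustrative only: invented embeddings and threshold, as in the sketch above.
import numpy as np

THRESHOLD = 0.8  # hypothetical alert threshold


def similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def process_live_frame(face_embedding: np.ndarray, watchlist: dict) -> str | None:
    """Live use: an alert here prompts an on-the-spot decision about whether to act."""
    for name, template in watchlist.items():
        if similarity(face_embedding, template) >= THRESHOLD:
            return name  # the operator must judge, in the moment, whether to intervene
    return None


def search_stored_images(stored_faces: list, watchlist: dict) -> list:
    """Retrospective use: search previously collected images; results can be reviewed later."""
    return [
        (image_id, name)
        for image_id, embedding in stored_faces
        for name, template in watchlist.items()
        if similarity(embedding, template) >= THRESHOLD
    ]
```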

Each deployment must be judged according to its own merits

Proponents of facial recognition focus on its potential to enhance security and efficiency. Critics, meanwhile, point to threats to privacy and to problems arising from inaccuracy and bias. Our paper examines these claims closely, distinguishing between risks and benefits that are already playing out, and those that have yet to be evidenced. For example, we highlight that FRT systems have significantly improved in accuracy in recent years, and that the facial data collected in live FRT systems used by UK police forces is not retained unless there is a match. We also argue that context is critical. The degree of risk or benefit that FRT poses to society depends on multiple factors, including how a system has been trained and who has been included on a watchlist (for one-to-many systems).

FRT is regulated by several laws – but not a standalone code of practice

While there have been claims that FRT is unregulated, it is in fact subject to a number of laws and associated regulators, including the Data Protection Act and Human Rights Act (the latter for public sector use only). However, we see potential benefit in more purposeful guidance to ensure the safe and consistent use of the technology. In particular, there may be a need for greater oversight of FRT’s use in the private sector, where it is being deployed to identify suspected shoplifters and track customer footfall in retail stores, among other uses.

Regulators, civil society groups and political leaders will continue to scrutinise the governance of FRT over the coming months. In doing so, they will ask not just whether existing laws are clear, but whether they are being properly enforced, and indeed whether they are sufficient.

Our paper proposes a number of questions for further investigation and discussion:

  • How can the public be meaningfully engaged in debate about where FRT should be used, and under what conditions?
  • Is there a case for a new law regulating FRT use by the police?
  • Should the powers of existing regulators be reviewed and clarified?
  • Should private sector use of FRT be subject to stronger regulation?
  • What role is there for industry self-regulation?

What next?

While debate about the role of FRT in our society will take time to play out, there are early opportunities for progress. For example, greater consistency in how FRT is used in law enforcement, including putting common safeguards in place before each rollout is confirmed, would help to address some concerns from civil society. At a minimum, we would expect police forces to be transparent about how they use live FRT, including where it is deployed, the basis by which people are included on watchlists, and how deployments are approved.

Over the coming months, the CDEI will continue to examine the implications of FRT for society. We are particularly interested in how it is being deployed in the private sector, which has had relatively little attention in comparison to the use of FRT in law enforcement. We will also be monitoring how this technology is being applied as part of COVID-19 response efforts, from encouraging social distancing to powering the digital ID technology behind COVID-19 health certificates. 

FRT promises significant benefits for society - but only if it is deployed on our own terms. The CDEI is committed to ensuring that those terms are met and upheld, and that ethics remain front and centre in the minds of the creators, users and regulators of this technology.

If you would like to find out more about the CDEI’s work on FRT, please contact cdei@cdei.gov.uk.
