What is facial recognition technology?

and where could it go from here?

What is facial recognition technology?

While we all, as humans, have the innate ability to distinguish between faces, computers and AI have only recently begun to do the same, through facial recognition technology.

You might most commonly encounter facial recognition technology when unlocking your smartphone or at airport immigration. These are voluntary examples, where facial recognition is used as a biometric identifier – like your fingerprint or eyes – to access something. Automated Facial Recognition technology, trialled in London since 2016, has been green-lit for national rollout across the UK and has been used on a regular basis by the Metropolitan Police since early 2020.

The non-profit organisation Big Brother Watch is calling for UK authorities to stop using Automated Facial Recognition software with surveillance cameras. In 2018, Big Brother Watch reported that these live facial recognition systems were up to 98% inaccurate, and that there was no clear legal basis, overseeing organisation, or governmental strategy for using the technology. The government has since released its Biometrics Strategy, which states that as of February 2018 the Police National Database (PND) held 12.5 million images searchable by facial recognition technology, some of which may be duplicates. These form part of the 21 million images stored on the PND in total, which also include images of scars, marks, and tattoos.

We've done a little digging to see how it works, and the issues that facial recognition technology raises.

How does it work?

There are multiple ways that facial recognition technology can work, but the two used in the UK are:

1) Facial Matching:

This matches a single image – for example, a still taken from surveillance camera footage, or an image from any other source – against a large database of images.

2) Automated Facial Recognition:

A much newer technology, facial recognition is embedded within the surveillance cameras themselves, matching people in real-time against a database. Each face is individually scanned and analysed.

Both methods measure facial characteristics through distinguishing landmarks – around 80 ‘nodal’ points on the face – which together are unique to each person. These measurements are translated into a numerical code known as a 'faceprint'. This code is then compared against a database of photographs and 'faceprints' to attempt to find a match. Crucially, a match is based on a percentage of corresponding features, not a binary ‘yes’ or ‘no’ result.
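The matching step above can be sketched in a few lines of Python. This is a simplified illustration only, not the software police forces actually use: the tiny three-number 'faceprints', the cosine-similarity measure, and the 90% threshold are all assumptions made for demonstration. Real systems derive far longer feature vectors from the nodal-point measurements, but the principle – a percentage score compared against a threshold, rather than a yes/no answer – is the same.

```python
import math

def similarity(faceprint_a, faceprint_b):
    """Score how alike two faceprints are, as a percentage (0-100).

    Each faceprint is a list of numerical measurements (here, stand-ins
    for nodal-point distances). Uses cosine similarity; assumes neither
    faceprint is all zeros.
    """
    dot = sum(a * b for a, b in zip(faceprint_a, faceprint_b))
    norm_a = math.sqrt(sum(a * a for a in faceprint_a))
    norm_b = math.sqrt(sum(b * b for b in faceprint_b))
    return 100 * dot / (norm_a * norm_b)

def best_match(probe, database, threshold=90.0):
    """Compare a probe faceprint against a database of named faceprints.

    Returns (name, score) for the highest-scoring entry if it clears the
    threshold, otherwise None - i.e. a graded match, not a binary one.
    """
    scored = [(name, similarity(probe, fp)) for name, fp in database.items()]
    name, score = max(scored, key=lambda pair: pair[1])
    return (name, score) if score >= threshold else None

# Hypothetical database of enrolled faceprints.
database = {
    "person_a": [1.0, 2.0, 3.0],
    "person_b": [3.0, 1.0, 0.5],
}

# A probe close to person_a's measurements scores above the threshold.
print(best_match([1.1, 2.0, 2.9], database))
```

The threshold is the operationally important knob: set it low and the system flags more people incorrectly (false positives); set it high and it misses genuine matches. The accuracy criticisms discussed below largely turn on how these scores behave across different faces.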

What are the issues with facial recognition technology? 🤔

Big Brother Watch has identified the following key issues with facial recognition technology as of today.

  • Lack of legal basis
    • In 2017, Nick Hurd, the Minister for Policing, stated that there ‘is no legislation regulating the use of CCTV cameras with facial recognition.’ The danger is that facial recognition technology has come into use silently and without public knowledge. As Big Brother Watch puts it, this suggests the ‘silent erosion of human rights.’
  • Lack of oversight
    • Big Brother Watch has outlined that there is no formal, independent body to manage the police’s use of Automated Facial Recognition. This was raised previously in 2016, in the Review on the Surveillance Camera Code of Practice. The government allows police bodies to take almost total control over their use of facial recognition technology. Baroness Williams, in 2018, stated that the government will strive to ‘improve independent oversight and governance’ with the creation of a ‘board.’ The board is meant to monitor how the police are complying with ‘guidance,’ but the guidance they are observing is unclear and inaccessible. Big Brother Watch notes that ‘[i]t is unprecedented for government to provide a board to provide ‘guidance’ on the use of a policing power that is being deployed ultra vires.’ 'Ultra vires' refers to an act which requires legal authority but is done without it.

Facial recognition technology in police surveillance may also reinforce racism within law enforcement bodies. Fabio Bacchini and Ludovica Lorusso, in their study ‘Race, again: how face recognition technology reinforces racial discrimination’ (2019), report that facial recognition technology reinforces racial disproportion in the following ways:

  • Black people are overrepresented in many of the databases faces are matched against. Because of this, black people are more often found as a match for a suspect, which ‘in turn entails that black people are more often stopped, investigated, arrested, incarcerated and sentenced as a consequence of face recognition technology.’
  • Faces with darker skin colours may be more difficult to identify because of the importance given to colour contrast when characterising facial features and identifying nodal points.
  • Software across the board shows drastic differences in accuracy across races. The primary reason for this appears to be racial disproportion in training data: for example, the Klare et al. (2012) study found accuracy improved when systems were trained exclusively on a single racial group.
  • Neither the companies producing this software nor the law enforcement agencies using it run accuracy tests for racially biased error rates.

In June 2020, IBM, Amazon, and Microsoft halted the sales of facial recognition technology to law enforcement in the US.

Are my images stored on the Police National Database (PND)?

The retention of images of individuals who have been acquitted has been declared unlawful. When uploading images to the PND, police must consider whether the image constitutes ‘useful intelligence’ to the force: for example, if an older image of the individual already exists on the database, it should be removed before the up-to-date one is added.

If a custody image has been taken and the individual is found to be innocent, they can apply to have their image removed from the database. The Review of the Use and Retention of Custody Images document outlines that ‘there should be a presumption in favour of deletion.’ This presumption should be ‘even stronger’ if the image was taken of an individual under the age of 18, and those convicted under the age of 18 also have the right to request deletion. Those convicted of a recordable offence (one the police are permitted to keep a record of – generally speaking, an imprisonable offence) are also able to apply for deletion.

The Custody Images Review also notes that ‘[r]etention of the convicted persons’ images is reviewed at specified intervals, which depend on the seriousness of the offence.’ There is no explicit timeframe given for respective offences.

Where could it go from here? 💭

The facial recognition industry was valued at USD $3.97 billion in 2018 and is predicted to grow to $10.15 billion by 2025. However, following the Black Lives Matter protests in 2020 and the subsequent pausing of facial recognition technology sales by leading tech giants in June, there is hope that there will be greater caution over mass use of the technology in surveillance. Governments may develop more detailed guidance for police to follow when using the technology, and provide greater transparency over what information is stored on individuals and how individuals can retrieve or delete this data.

As yet, no such guidance or transparency exists in the UK, and organisations like Big Brother Watch continue to call for the police to stop using facial recognition technology.