The Privacy Concerns of Live Facial Recognition Technology

Jun 24, 2021

Facial recognition technology has proliferated in recent years as part of the wider trend towards biometric security.  It can make many aspects of our lives easier and more secure, and many will be familiar with it as a common means of unlocking phones and other personal electronic devices.  As the unique composition of a face is scanned and recognised, a large amount of data is collected, producing a unique biometric profile and the increased security that comes with it.

But concerns have recently been raised regarding what is known as live facial recognition (LFR): a system that scans, recognises and stores a person’s biometric facial data in real time.  It works by feeding the captured data into a network of interconnected data centres, where algorithms recognise and analyse it.  With such personal data collected automatically and on a mass scale, privacy and security concerns become pressing.
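The recognition step described above, in which captured facial data is checked against stored profiles, typically reduces to comparing numerical face embeddings produced by a trained model.  A minimal sketch in Python of that comparison, assuming hypothetical embedding vectors, invented watchlist labels, and an arbitrary similarity threshold (a real LFR system would derive embeddings from camera frames with a neural network):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_against_watchlist(probe, watchlist, threshold=0.9):
    """Return the label of the closest watchlist entry whose similarity
    exceeds the threshold, or None if no entry is close enough."""
    best_label, best_score = None, threshold
    for label, reference in watchlist.items():
        score = cosine_similarity(probe, reference)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Hypothetical embeddings for illustration only; real systems compute
# these from images and store far higher-dimensional vectors.
watchlist = {
    "person-A": [0.9, 0.1, 0.2],
    "person-B": [0.1, 0.8, 0.3],
}
print(match_against_watchlist([0.88, 0.12, 0.21], watchlist))  # close to person-A
print(match_against_watchlist([0.0, 0.0, 1.0], watchlist))     # no entry matches
```

The privacy concern in the article follows directly from this mechanics: once such a comparison runs automatically against large stored databases, identification happens without any consent step.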

The concern is not that people are exposed to unwanted surveillance – CCTV systems do that already – but that identities are automatically inferred from biometric details and potentially sensitive data is retrieved in order to profile the individual.  This information can be stored on the system or retrieved from the internet.  In future, there is serious potential to overlay CCTV systems with LFR and have facial details checked against collected data, which could include social media.

This means that an individual could be checked against their online activity and identified in a public place through a series of fully automated processes, with no permission granted at any point.

Potential Uses for LFR

The potential commercial benefits of installing such technology are exactly what could ultimately encourage its proliferation.  If people can be profiled in real time, LFR systems could feed that data into other automated responses.  The result could be personalised adverts served to people in public places, or market data collected from the profiles of customers attending an event or entering particular premises.

There are also the obvious security applications of LFR that come with their own ethical concerns.  Naturally, LFR could automatically check facial data against a database of known offenders in order to provide security for any sensitive locations.  Yet, while such security screening is already common practice among official and legal institutions, the danger lies in the ability of private companies or even individuals to use this technology.  LFR operating in a retail environment, for example, could check all collected biometric facial data against the images of known shoplifters or other undesirable persons who have otherwise run afoul of the law.  This could lead to a common practice of private entities collecting facial data for security applications.

Data Protection

The clear goal in regulating this technology should be to ensure that it does not proliferate into the private sphere without due regard for data protection, and without strict limits on the rights of private entities to store the facial biometric data of members of the public.

It is telling that, in the US, the technology has met resistance because the governing rules are not clear enough.  Some American cities have banned the use of LFR, and several companies have discontinued its use until the rules are clarified.

The UK Information Commissioner Elizabeth Denham recently published an official Commissioner’s Opinion specifically on the use of LFR in public places with the aim of outlining the conditions under which its use would be justified.  By ultimately enforcing such criteria, the hope is that concern for privacy will always be at the heart of any decision to deploy LFR.

By clearly marking a threshold for its use, the many great benefits and useful applications of LFR – for example in the search for missing persons – can be fully realised without the danger of abuse or serious privacy violation.