Is Big Brother watching you?

Automated facial recognition technology (AFR) recently fell under the scrutiny of the Court of Appeal. 

South Wales Police had initiated a pilot scheme openly deploying AFR on around 50 occasions at various public events between May 2017 and April 2019. 

The technology works by taking images from a live feed at a rate of 50 faces per second and automatically comparing individuals’ faces with a ‘watchlist’ of people who had escaped from custody, were wanted on warrants, or were suspected of criminality.

If there is no match, the image is deleted from the live feed; the bulk of the images scanned fell into this category. If, however, there is a match, an alert is raised for human review to determine whether to act.
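In outline, that pipeline can be sketched in a few lines of Python. This is purely illustrative: the matching algorithm actually used is not public, so the 128-dimension face embeddings, the cosine-similarity measure, and the 0.90 alert threshold below are all invented stand-ins.

```python
import numpy as np

MATCH_THRESHOLD = 0.90  # hypothetical cut-off for raising an alert

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def screen_face(embedding, watchlist):
    """Compare one scanned face against every watchlist entry.

    Returns the watchlist identifier of a possible match (to be
    confirmed by a human reviewer), or None, in which case the
    image is discarded and nothing is retained.
    """
    best_id, best_score = None, 0.0
    for person_id, reference in watchlist.items():
        score = cosine_similarity(embedding, reference)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id if best_score >= MATCH_THRESHOLD else None

# Illustrative run with random stand-in embeddings.
rng = np.random.default_rng(0)
watchlist = {"warrant-123": rng.normal(size=128)}
near_copy = watchlist["warrant-123"] + rng.normal(scale=0.05, size=128)
print(screen_face(near_copy, watchlist))             # "warrant-123" -> human review
print(screen_face(rng.normal(size=128), watchlist))  # None -> image deleted
```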

The claimant had been in proximity to two deployments of AFR in Cardiff, in December 2017 and March 2018. He contended that his face would have been among the c.500,000 scanned during the pilot and that, although he was not on the watchlist, his image would still have been recorded, even if it was immediately deleted.

The Police did not dispute this.  

The claimant sought judicial review on the grounds that the deployment of AFR was not compatible with:

  • the right to respect for private life (Article 8 of the European Convention on Human Rights) 
  • data protection legislation 
  • the ‘Public Sector Equality Duty’ under the Equality Act 2010 (in England & Wales)

The claim was dismissed by the Divisional Court on all grounds. The claimant appealed. 

The Court of Appeal held:

  • There were no clear guidelines on where AFR could be used or who could be put on a watchlist. This afforded the Police too broad a discretion to meet the standard required by Article 8(2).  
  • That said, weighing the benefits of AFR against the impact on the claimant, the Court concluded that the deployment was proportionate. 
  • The Police data protection impact assessment had been written on the footing that the right to respect for private life was not infringed, and the Court accordingly found it deficient. 
  • As the two deployments in question had occurred before the Data Protection Act 2018 came into force, the Court did not need to decide a further data protection point arising under that Act. 
  • Finally, it was held that the Police had not taken reasonable steps to discharge the Public Sector Equality Duty by investigating whether the software might be biased on grounds of race or sex.

A spokesman for the Information Commissioner’s Office said:

“Facial recognition relies on sensitive personal data and balancing people’s right to privacy with the surveillance technology the police need to carry out their role effectively is challenging. But for the public to have trust and confidence in the police and their actions there needs to be a clear legal framework. Today’s judgment is a useful step towards providing that.”

The Police have confirmed that they do not intend to appeal.

For more information about this article, or any other aspect of our business and personal legal solutions, get in touch. There is no charge for initial, informal advice.

Cary Agos Fc Law
R. (Bridges) v South Wales Police [2020] EWCA Civ 1058
