
Police use of facial recognition tech ruled unlawful

By PA
August 12, 2020

LONDON: The use of facial recognition technology by police breached privacy rights and data protection laws, the Court of Appeal has ruled.

Civil rights campaigner Ed Bridges brought a legal challenge against South Wales Police, arguing that its use of automatic facial recognition (AFR) had caused him “distress”. The 37-year-old had his face scanned while he was Christmas shopping in Cardiff in 2017 and at a peaceful anti-arms protest outside the city’s Motorpoint Arena in 2018. In a ruling on Tuesday, three Court of Appeal judges found the use of the technology unlawful and allowed Bridges’ appeal on three of the five grounds he raised in his case. Bridges took his case — believed to be the world’s first over police use of such technology — to the Court of Appeal after his action was rejected by the High Court.

In the judgment, the judges said the High Court erred when it concluded that the force’s interference with Bridges’ right to a private life was “in accordance with the law” under human rights legislation.

They ruled that there was no clear guidance on where AFR Locate — the system being used by South Wales Police in an ongoing trial — could be used and who could be put on a watchlist, and this left too much discretion to police officers.

The ruling said: “The fundamental deficiencies, as we see it, in the legal framework currently in place relate to two areas of concern. The first is what was called the ‘who question’ at the hearing before us. The second is the ‘where question’.

“In relation to both of those questions too much discretion is currently left to individual police officers. It is not clear who can be placed on the watchlist, nor is it clear that there are any criteria for determining where AFR can be deployed.” It added that the court is satisfied “that the current policies do not sufficiently set out the terms on which discretionary powers can be exercised by the police and for that reason do not have the necessary quality of law”.

In their ruling, the Master of the Rolls Sir Terence Etherton, the President of the Queen’s Bench Division Dame Victoria Sharp and Lord Justice Singh did find that the use of AFR was proportionate under human rights law, as its potential benefits outweighed the impact on Bridges.

The court also concluded that a data protection impact assessment of the scheme was deficient and that the force had not done all it could to verify that the AFR software “does not have an unacceptable bias on grounds of race or sex”.

The judgment noted that there was no clear evidence that the software was biased on grounds of race or sex. The judges said they hoped that, “as AFR is a novel and controversial technology, all police forces that intend to use it in the future would wish to satisfy themselves that everything reasonable which could be done had been done in order to make sure that the software used does not have a racial or gender bias”.

In a statement after the ruling, Bridges said he was “delighted” the court had found that “facial recognition clearly threatens our rights”.

South Wales Police said the testing of its “ground-breaking use of this technology” by the courts had been a “welcome and important step in its development”, and that the force would give the findings “serious attention”. The force does not intend to appeal against the judgment.