Laws on police facial recognition aren’t tough enough, UK data watchdog barrister tells Court of Appeal

London, UK - March 2018. Police officers patrolling Leicester Square and Piccadilly Circus in central London. Pic: Paolo Paradiso

A top judge told a barrister for the UK Information Commissioner’s Office (ICO) today that his legal arguments against police facial-recognition technology face “a great difficulty” as he wondered whether they were even relevant to the case.

Sir Terence Etherton, the Master of the Rolls and president of the Court of Appeal, stopped Gerry Facenna QC at the beginning of his legal submissions this morning to question their relevance.

“I think that this line of submissions faces a great difficulty,” the Master of the Rolls told the ICO’s barrister. “Effectively, as I understand these submissions on behalf of the Information Commissioner, it’s not addressing the question… You’re talking about compliance with the legal framework whereas what was in issue before, and the essence of ground 1, is not compliance with it but whether the framework is sufficient.”

Facenna replied: “My primary submission now is that the legal framework which the Divisional Court set out in the annex to its judgment does not meet the law requirement under Article 8 or that requirement as it’s set out in section 35 of the Act.”

In plain English, Facenna was saying that South Wales Police’s legal justification for deploying facial-recognition tech, as detailed yesterday, didn’t comply with the Human Rights Act-guaranteed right to privacy – nor the Data Protection Act 2018 section, which states: “The processing of personal data for any of the law enforcement purposes is lawful only if and to the extent that it is based on law.”

The court is hearing an appeal against a Divisional Court judgment which OK’d police use of the creepy surveillance technology. Human rights pressure group Liberty is backing a South Wales man who tried and failed to prove that the tech was used unlawfully. On his side is the ICO, while against him are South Wales Police, the Home Office and the South Wales Police and Crime Commissioner.

The ICO has made no secret [PDF] that it is against routine police use of facial-recognition tech. Yet Facenna’s arguments this morning seemed to be falling on stony ground.

Despite the barrister’s efforts, the Master of the Rolls remained “in some confusion” about the legal submissions as he told the barrister: “Your Item 2 is not part of this appeal. Item 1 may be but your Item 2 is not part of Ground 1,” referring to the detailed grounds of appeal with which Liberty hopes to overturn the earlier judgment. Facenna had raised a line of argument nobody else was looking at, the judge was saying.

“It’s not part of Ground 1,” conceded Facenna, “but Ground 1 does relate to the lawfulness of the deployments that have taken place in the past. As I understand it, the purpose of this litigation has been to ascertain whether the overall legal framework under which facial recognition is continuing to be deployed is sufficient, and whether its continuing deployment is therefore lawful.”

The Master of the Rolls pushed his glasses up his nose at this point. Taking the hint, Facenna cut short his detailed exposition and said: “Why don’t I crack on?”

“Yes, thank you,” said the president of the court.

In written submissions to the court, Facenna set out the Information Commissioner's objections to South Wales Police's deployment of the technology.

Although the force filled out a data protection impact assessment for its camera deployment, the ICO said it was not good enough, pointing out that the Divisional Court found it was a previous document that had been "revised and retitled". The data protection regulator added in legal submissions: "Although it made passing reference to the possibility that members of the public might be affected by the measures, it contained no assessment of that impact on the protection of their personal data, nor any assessment of the risks to their rights and freedoms."

Building on this, Facenna urged the judges to rule that facial recognition should be better regulated, saying: “In a democratic society like ours, when you have a new sophisticated technology or tool, undoubtedly of use to the state, whether in the public interest, prevention of crime, tax evasion or whatever it is, is [this type of unregulated deployment] right?”

The barrister went on: “Its use involves an interference – maybe not very large but interference nonetheless – with the fundamental rights of tens of thousands, millions of citizens, depending how it is employed. Is it consistent with the law that that can be rolled out, even on a pilot basis, by individual police forces or public authorities using to a large extent their own discretion, without there being some kind of legal framework?”

A thoughtful Master of the Rolls asked later: “Would the ICO have power to issue a bespoke code of practice that might provide a specific framework on AFR [automated facial recognition]?”

Facenna said he thought it did not but would check, adding: “My point is it can’t be left to individual police forces or deployments to develop impact assessments or policy documents.”

This afternoon the court began hearing from Jason Beer QC, barrister for South Wales Police. Tomorrow it will hear from counsel for the Home Office and the Surveillance Camera Commissioner, and The Register will be reporting their arguments.

The judges are: the Master of the Rolls, Sir Terence Etherton, who is president of the Civil Division of the Court of Appeal; Lady Justice Sharp, president of the Queen's Bench Division of the High Court; and Lord Justice Singh, president of the controversial Investigatory Powers Tribunal. ®
