Ensuring the lawfulness of automated facial recognition surveillance in the UK – Oxford Human Rights Hub

‘In R (Bridges) v South Wales Police, the England and Wales Court of Appeal reviewed the lawfulness of the use of live automated facial recognition technology (‘AFR’) by the South Wales Police Force. CCTV cameras capture images of the public, which are then compared with digital images of persons on a watchlist.’

Full Story

Oxford Human Rights Hub, 3rd September 2020

Source: ohrh.law.ox.ac.uk

Policing Our Privacy – Where Does the Law Lie? – 39 Essex Chambers

‘Last Tuesday the Court of Appeal (Sir Terence Etherton MR, Dame Victoria Sharp PQBD and Singh LJ) allowed the appeal of the civil liberties campaigner, Edward Bridges, against the decision of the Divisional Court which had dismissed his claim for judicial review of South Wales Police Force’s use of live automated facial recognition technology (“AFR”).’

Full Story

39 Essex Chambers, 17th August 2020

Source: www.39essex.com

Facial Recognition Technology not “In Accordance with Law” – UK Human Rights Blog

‘The Court of Appeal, overturning a Divisional Court decision, has found the use of a facial recognition surveillance tool used by South Wales Police to be in breach of Article 8 of the European Convention on Human Rights (ECHR). The case was brought by Liberty on behalf of privacy and civil liberties campaigner Ed Bridges. The appeal was upheld on the basis that the interference with Article 8 of the ECHR, which guarantees a right to privacy and family life, was not “in accordance with law” due to an insufficient legal framework. However, the court found that, had it been in accordance with law, the interference caused by the use of facial recognition technology would not have been disproportionate to the goal of preventing crime. The court also found that the Data Protection Impact Assessment (DPIA) was deficient, and that the South Wales Police (SWP), who operated the technology, had not fulfilled their Public Sector Equality Duty.’

Full Story

UK Human Rights Blog, 13th August 2020

Source: ukhumanrightsblog.com

Police’s Automated Facial Recognition Deployments Ruled Unlawful by the Court of Appeal – Doughty Street Chambers

‘R. (Bridges) v Chief Constable of South Wales [2020] EWCA Civ 1058 [2020] 8 WLUK 64 is thought to be the first case in the world to consider the use of facial recognition technology by law enforcement agencies. In this short article, we explore the judgment and its implications for the deployment of these and similar technologies in future.’

Full Story

Doughty Street Chambers, 12th August 2020

Source: insights.doughtystreet.co.uk

Let’s face it: use of automated facial recognition technology by the police – UK Police Law Blog

‘The case of R (Bridges) v Chief Constable of South Wales Police & Information Commissioner [2020] EWCA Civ 1058 (handed down on 11 August 2020) was an appeal from what is said to have been the first claim brought before a court anywhere on planet earth concerning the use by police of automated facial recognition (“AFR”) technology. There could be nothing wrong with posting scores of police officers with eidetic memories to look out for up to 800 wanted persons at public gatherings. So why not use a powerful computer, capable of matching 50 faces a second with a database of (under) 800 suspects, to do this job much more cheaply and instantaneously, flagging any matches to a human operator for final assessment? According to the Court of Appeal in Bridges, this system constitutes an interference with Article 8 rights which is not “in accordance with the law”, but which (critically) would be proportionate if a sufficiently narrow local policy were framed.’

Full Story

UK Police Law Blog, 11th August 2020

Source: ukpolicelawblog.com

South Wales police lose landmark facial recognition case – The Guardian

‘Campaigners are calling for South Wales police and other forces to stop using facial recognition technology after the court of appeal ruled that its use breached privacy rights and broke equalities law.’

Full Story

The Guardian, 11th August 2020

Source: www.theguardian.com

‘Deepfake’ warning over online courts – Legal Futures

‘Video manipulation software, including ‘deepfake’ technology, poses problems for remote courts in verifying evidence and that litigants or witnesses are who they say they are, a report has warned.’

Full Story

Legal Futures, 29th July 2020

Source: www.legalfutures.co.uk

UK’s facial recognition technology ‘breaches privacy rights’ – The Guardian

‘Automated facial recognition technology that searches for people in public places breaches privacy rights and will “radically” alter the way Britain is policed, the court of appeal has been told.’

Full Story

The Guardian, 23rd June 2020

Source: www.theguardian.com

Equality watchdog demands suspension of use of automated facial recognition and predictive algorithms in policing – Local Government Lawyer

‘The Equality and Human Rights Commission (EHRC) has called for the suspension of the use of automated facial recognition (AFR) and predictive algorithms in policing in England and Wales, “until their impact has been independently scrutinised and laws are improved”.’

Full Story

Local Government Lawyer, 13th March 2020

Source: www.localgovernmentlawyer.co.uk

Let’s face it: use of automated facial recognition technology by the police – UK Police Law Blog

‘The case of R (Bridges) v Chief Constable of South Wales Police & Information Commissioner [2019] EWHC 2341 (Admin); [2020] 1 WLR 672 is said to have been the first claim brought before a court anywhere on planet earth concerning the use by police of automated facial recognition (“AFR”) technology. There could be nothing wrong with posting scores of police officers with eidetic memories to look out for up to 800 wanted persons at public gatherings. So why not use a powerful computer, capable of matching 50 faces a second with a database of (under) 800 suspects, to do this job much more cheaply and instantaneously, flagging any matches to a human operator for final assessment? According to the Divisional Court in Bridges, this may, depending on the facts of each particular deployment, be lawful.’

Full Story

UK Police Law Blog, 21st February 2020

Source: ukpolicelawblog.com

Rules urgently needed to oversee police use of data and AI – report – The Guardian

‘National guidance is urgently needed to oversee the police’s use of data-driven technology amid concerns that it could lead to discrimination, a report has said.’

Full Story

The Guardian, 23rd February 2020

Source: www.theguardian.com

Watchdog rejects Met’s claim that he supported facial recognition – The Guardian

‘The official biometrics commissioner has rebuked the Metropolitan police after it falsely claimed that he supported its use of facial recognition CCTV in an equalities impact assessment published as the force made its first operational use of the controversial technology.’

Full Story

The Guardian, 12th February 2020

Source: www.theguardian.com

Met police deploy live facial recognition technology – The Guardian

‘The Metropolitan police have been accused of defying the warnings of its own watchdogs by beginning operational use of facial recognition CCTV, despite a scathing assessment of its effectiveness from the expert hired to scrutinise its trials.’

Full Story

The Guardian, 11th February 2020

Source: www.theguardian.com

Meadowhall facial recognition scheme troubles watchdog – BBC News

‘Police involvement in a private landlord’s facial recognition trial has led a regulator to call for government intervention.’

Full Story

BBC News, 28th January 2020

Source: www.bbc.co.uk

Facial recognition could be ‘spectacular own goal’, police warned amid accuracy concerns – The Independent

‘Facial recognition could be a “spectacular own goal” for police if it fails to be accurate and effective, the government has been warned. MPs raised concerns about the technology after the Metropolitan Police announced the start of live deployments in London.’

Full Story

The Independent, 28th January 2020

Source: www.independent.co.uk

10 cases that defined 2019 – UK Human Rights Blog

‘And so, we reach the end of another year. And what a year it has been. As well as perhaps the most tumultuous period in British politics for decades, this year saw the first ever image taken of a black hole, a victory for the England men’s cricket team at the World Cup, the discovery of a new species of prehistoric small-bodied human in the Philippines and signs that humpback whale numbers in the South Atlantic have bounced back thanks to intensive conservation efforts. And the law? Well, rather a lot has happened really. As the festive season draws near, what better way is there to celebrate than to rewind the clock and relive the 10 cases which have defined 2019?’

Full Story

UK Human Rights Blog, 19th December 2019

Source: ukhumanrightsblog.com

Court of Appeal to hear facial recognition technology challenge – Law Society’s Gazette

‘A Cardiff resident who lost a High Court challenge over police deployment of automated facial recognition technology has been given permission to take his case to the Court of Appeal.’

Full Story

Law Society’s Gazette, 20th November 2019

Source: www.lawgazette.co.uk

Rise of the algorithms – UK Human Rights Blog

‘The use of algorithms in public sector decision making has broken through as a hot topic in recent weeks. The Guardian recently ran the “Automating Poverty” series on the use of algorithms in the welfare state. And on 29 October 2019 it was reported that the first known legal challenge to the use of algorithms in the UK, this time by the Home Office, had been launched. It was timely, then, that the Public Law Project’s annual conference on judicial review trends and forecasts was themed “Public law and technology”.’

Full Story

UK Human Rights Blog, 4th November 2019

Source: ukhumanrightsblog.com

Police may have used ‘dangerous’ facial recognition unlawfully in UK, watchdog says – The Independent

‘Facial recognition technology may have been used unlawfully by police, a watchdog has warned while calling for urgent government regulation.’

Full Story

The Independent, 1st November 2019

Source: www.independent.co.uk

Facial Recognition Technology: High Court gives judgment – UK Human Rights Blog

‘R (Bridges) v Chief Constable of South Wales Police and Secretary of State for the Home Department [2019] EWHC 2341 (Admin). The High Court has dismissed an application for judicial review regarding the use of Automated Facial Recognition Technology (AFR) and its implications for privacy rights and data protection.’

Full Story

UK Human Rights Blog, 12th September 2019

Source: ukhumanrightsblog.com