Civil liberties groups say facial recognition cameras should be BANNED as they accuse ministers of quietly approving their use despite court rulings against invasive filming
- College of Policing published new guidance on facial recognition technology
- Guidance was published last week despite court rulings against invasive filming
- More than 30 civil liberties groups have signed an open letter calling for a ban
Civil liberties groups have called for a ban on facial recognition cameras, while accusing ministers of quietly approving the technology despite a 2020 Court of Appeal ruling against invasive filming.
Thirty-one organisations including Amnesty International, Liberty and Privacy International have signed an open letter alleging that guidance allowing police, local councils and enforcement agencies to use facial recognition across England and Wales has been issued in defiance of court rulings against invasive filming.
The guidance was published last week by the College of Policing during the parliamentary recess and without any announcement by it or the Government, according to The Daily Telegraph.
It comes despite a Court of Appeal ruling in 2020 that the use of facial recognition cameras by South Wales Police as a pilot scheme ahead of a nationwide rollout breached privacy rights and broke equalities law.
Civil liberties groups say guidance on facial recognition cameras has been published – despite a court ruling against invasive filming involving South Wales Police (stock photo)
Civil rights campaigner Ed Bridges, 37, brought a legal challenge against South Wales Police arguing their use of automatic facial recognition (AFR) had caused him ‘distress’.
He had his face scanned while he was Christmas shopping in Cardiff in 2017 and at a peaceful anti-arms protest outside the city’s Motorpoint Arena in 2018.
Three Court of Appeal judges ruled the force's use of AFR was unlawful, allowing Mr Bridges' appeal on three of the five grounds he raised in his case.
In the judgment, the judges said that there was no clear guidance on where AFR Locate – the system trialled by South Wales Police – could be used and who could be put on a watchlist.
It ruled that ‘too much discretion is currently left to individual police officers’.
However, despite the ruling, civil liberties groups have now claimed that facial recognition technology has been approved by stealth.
‘In a democratic society, it is imperative that intrusive technologies are subject to effective scrutiny,’ the groups said in a letter.
Ed Bridges won a case against South Wales Police in 2020, arguing their use of automatic facial recognition had caused him ‘distress’
‘Police and the Home Office have, so far, completely bypassed Parliament on the matter of LFRT (live facial recognition technology). We are not aware of any intention to subject LFRT plans to parliamentary consideration, despite the intrusiveness of this technology, its highly controversial use over a number of years, and the dangers associated with its use.’
The group added it was ‘calling on Parliament and relevant stakeholders to halt and ban the use of live facial recognition technology by the police and private companies entirely, as it poses significant and unmitigable risks to our society.
‘We do not believe that LFRT can ever be safely deployed in public spaces and for mass surveillance purposes.’
The letter said the use of facial recognition technology ‘represents a huge shift in the relationship between the individual and the State’.
‘The implications come not solely from privacy and data protection perspectives, but from the larger ethical question for a democratic society permitting and seemingly condoning the rollout of such intrusive technology,’ it said.
‘LFRT also raises significant problems for our human rights, such as freedom of expression and freedom of assembly.’
The groups said they were concerned LFRT ‘may be used in a broad range of public gatherings’ such as sporting events, music concerts, and protests, threatening protected rights.
‘Further, deployments of this surveillance technology could mirror and exacerbate existing disproportionate policing practices towards minority communities,’ the letter said.
Last year, Mr Bridges took his case – believed to be the world’s first over police use of such technology – to the Court of Appeal after his case was previously rejected by the High Court.
In a statement after the ruling, Mr Bridges said he was 'delighted' the court had found that 'facial recognition clearly threatens our rights'.
The steps behind facial recognition technology
The Metropolitan Police uses facial recognition technology called NeoFace, developed by Japanese IT firm NEC, which matches faces against a so-called watch list of offenders wanted by the police and courts for existing offences.
Cameras scan the faces in their view, measuring the structure of each face and creating a digital version that is searched against the watch list.
If a match is detected, an officer on the scene is alerted and can compare the camera image with the watch list image before deciding whether to stop the individual.
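The matching step described above can be sketched in code. This is a hypothetical illustration only: NEC's NeoFace uses proprietary deep-learning models, whereas this toy version represents each face as a short feature vector and flags a watch-list entry whose cosine similarity to the scanned face exceeds a threshold. All names and values here are illustrative assumptions, not details from the article.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_against_watchlist(probe, watchlist, threshold=0.9):
    """Return (name, score) of the best watch-list match above threshold, else None."""
    best = None
    for name, template in watchlist.items():
        score = cosine_similarity(probe, template)
        if score >= threshold and (best is None or score > best[1]):
            best = (name, score)
    return best

# Illustrative watch list: each entry is a stored face template.
watchlist = {
    "suspect_a": [0.9, 0.1, 0.3],
    "suspect_b": [0.2, 0.8, 0.5],
}

# Template produced from a face scanned by the camera.
probe = [0.88, 0.12, 0.31]
print(match_against_watchlist(probe, watchlist))
```

In a deployed system the threshold trades false alarms against missed matches, which is why a human officer reviews the two images before any stop is made.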
South Wales Police said the test of their ‘ground-breaking use of this technology’ by the courts had been a ‘welcome and important step in its development’.
Chief Constable Matt Jukes said: ‘The Court of Appeal’s judgment helpfully points to a limited number of policy areas that require this attention.
‘Our policies have already evolved since the trials in 2017 and 2018 were considered by the courts, and we are now in discussions with the Home Office and Surveillance Camera Commissioner about the further adjustments we should make and any other interventions that are required.’
Mr Jukes added: ‘We are pleased that the court has acknowledged that there was no evidence of bias or discrimination in our use of the technology.
‘But questions of public confidence, fairness and transparency are vitally important, and the Court of Appeal is clear that further work is needed to ensure that there is no risk of us breaching our duties around equality.’
At a hearing, lawyers for Mr Bridges argued the facial recognition technology interferes with privacy and data protection laws and is potentially discriminatory.
They said the technology, which is being trialled by the force with a view to rolling it out nationally, captures the facial biometrics of large numbers of people in real time and compares them with people on a 'watchlist'.
The force does not retain the facial biometric data of anyone whose image is captured on CCTV but does not generate a match, the court heard.
Mr Bridges’ case was dismissed at the High Court in September 2019 by two senior judges, who concluded the use of the technology was not unlawful.
Lord Justice Haddon-Cave and Mr Justice Swift said they were ‘satisfied’ the current legal regime is adequate to ‘ensure appropriate and non-arbitrary use of AFR’ and that the force’s use to date of the technology has been ‘consistent’ with human rights and data protection laws.
Mr Bridges, who the force confirmed was not a person of interest and has never been on a watchlist, crowdfunded his legal action and is supported by civil rights organisation Liberty, which is campaigning for a ban on the technology.
AFR technology maps faces in a crowd by measuring the distance between features then compares results with a ‘watchlist’ of images – which can include suspects, missing people and persons of interest.
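The distance-based comparison described above can be illustrated with a minimal sketch. This assumes, purely for illustration, that a face is reduced to a few measurements between landmarks (for example eye spacing and eye-to-nose distance) and that two faces are compared by the Euclidean distance between those measurement vectors; real AFR systems are far more sophisticated, and the function names and tolerance here are invented for the example.

```python
import math

def euclidean(a, b):
    """Euclidean distance between two equal-length measurement vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def is_candidate_match(probe, watchlist_entry, tolerance=0.05):
    """Flag a possible match when the measurements differ by less than tolerance."""
    return euclidean(probe, watchlist_entry) < tolerance

# Illustrative normalised measurements: eye spacing, eye-to-nose, nose-to-mouth.
probe = [0.42, 0.31, 0.27]
entry = [0.43, 0.30, 0.28]
print(is_candidate_match(probe, entry))  # → True
```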
South Wales Police has been conducting a trial of the technology since 2017.
The force added that it is not intending to appeal against the judgment.