Activist loses UK court case on police facial recognition
Court News | 2019/08/30 14:47

A British court ruled Wednesday that a police force's use of automated facial recognition technology is lawful, dealing a blow to an activist concerned about its implications for privacy.
Existing laws adequately cover the South Wales police force's deployment of the technology in a trial, two judges said, in what's believed to be the world's first legal case on how a law enforcement agency uses the new technology.
The decision comes amid a broader global debate about the rising use of facial recognition technology. Recent advances in artificial intelligence make it easier for police to automatically scan faces and instantly match them to "watchlists" of suspects, missing people and persons of interest, but it also raises concerns about mass surveillance.
"The algorithms of the law must keep pace with new and emerging technologies," Judges Charles Haddon-Cave and Jonathan Swift said.
Ed Bridges, a Cardiff resident and human rights campaigner who filed the judicial review, said South Wales police scanned his face twice as it tested the technology - once while he was Christmas shopping in 2017 and again when he was at a peaceful protest against a defense expo in 2018.
"This sinister technology undermines our privacy and I will continue to fight against its unlawful use to ensure our rights are protected and we are free from disproportionate government surveillance," he said in a statement released by Liberty, a rights group that worked on his case.
His legal team argued that he suffered "distress" and his privacy and data protection rights were violated when South Wales police processed an image taken of him in public.
But the judges said that the police force's use of the technology was in line with British human rights and data privacy legislation. They said that all images and biometric data of anyone who wasn't a match on the "watchlist" of suspects were deleted immediately.