UK police could expand their use of facial recognition software despite widespread concerns over the ethical implications of the technology. Policing minister Chris Philp is reportedly keen for all officers across the country to have access to the technology, and wants to incorporate facial recognition into body-worn cameras. It is a move that runs counter to plans in the EU, where use of such cameras in public spaces is set to be banned.
<img decoding="async" width="500" height="334" src="https://techmonitor.ai/wp-content/uploads/sites/4/2023/05/shutterstock_473262604.jpg" alt="Controversial CCTV cameras">
The Home Office is said to have briefed the biometrics and surveillance camera commissioner, Professor Fraser Sampson, on the planned expansion to more UK police forces, according to a report in the FT. It is a divisive subject, with previous studies finding its use unethical.
The technology takes footage from a live CCTV camera feed, looks for faces and compares the features to those in a pre-determined watchlist of “people of interest” in real time. When one is spotted it generates an alert that officers can then investigate.
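The matching step described above can be sketched in a few lines. This is an illustrative toy pipeline only, not any vendor's or police force's actual system: the embeddings, threshold and watchlist names below are invented, and real deployments use neural-network face encoders rather than hand-written vectors.

```python
# Toy sketch of the live matching loop: compare each detected face
# against a watchlist and raise an alert when one is close enough.
# All values here are invented for illustration.
from math import dist  # Euclidean distance between two points

WATCHLIST = {
    "person-of-interest-1": (0.12, 0.80, 0.33),
    "person-of-interest-2": (0.90, 0.10, 0.55),
}
MATCH_THRESHOLD = 0.25  # smaller distance means a closer match


def check_frame(face_embeddings):
    """Return watchlist names matched by any face in this camera frame."""
    alerts = []
    for embedding in face_embeddings:
        for name, reference in WATCHLIST.items():
            if dist(embedding, reference) < MATCH_THRESHOLD:
                alerts.append(name)
    return alerts


# One frame containing two detected faces: one near a watchlist
# entry, one not.
frame = [(0.13, 0.79, 0.35), (0.50, 0.50, 0.50)]
print(check_frame(frame))  # → ['person-of-interest-1']
```

The threshold is the crux in practice: set it too loose and false matches rise, too tight and genuine matches are missed, which is why accuracy claims such as the Met's one-in-6,000 false-match figure depend heavily on how the system is tuned and where the cameras are placed.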
Philp is said to have “expressed his desire to embed facial recognition technology in policing”, which includes considering how the government can support police forces to use it. One idea he is reportedly exploring is whether the technology could also be incorporated into body-worn cameras, the FT report says.
Philp’s plans are said to form part of a newly commissioned report. The technology is already in use by some forces, and privacy campaigners are opposed to police deploying it on the grounds that it risks misidentification and racial bias. The Met Police denies any bias, saying in a review of its use of the technology to date that it found “no statistically significant bias in relation to race and gender”, and adding that the chance of a false match was one in 6,000 people who pass the camera. Whether this would still be the case with body-worn cameras is unclear.
A Home Office spokesperson said: “The government is committed to empower the police to use new technologies like facial recognition in a fair and proportionate way. Facial recognition plays a crucial role in helping the police tackle serious offences including murder, knife crime, rape, child sexual exploitation and terrorism.”
In October last year, a review of the Met and South Wales police forces’ use of the technology was published by researchers at the Minderoo Centre. The researchers created “minimum ethical and legal standards” that should govern any use of facial recognition technology and tested those standards against how UK police forces are using it, finding that all failed to meet the minimum.
Professor Gina Neff, executive director of the centre, said her team compiled a list of all the relevant ethical guidelines, legal frameworks and current legislation to create the measures used in the tests. These are not legal requirements, but rather what the researchers say should serve as a benchmark: protecting privacy and human rights, meeting transparency and bias requirements, ensuring accountability, and providing oversight of how personal information is used and stored.
All the current police use cases for live facial recognition failed the test, Professor Neff says. “These are complex technologies, they are hard to use and hard to regulate with the laws we have on the books,” she told Tech Monitor. “The level of accuracy achieved does not warrant the level of invasiveness required to make them work.”
In the EU, lawmakers last week voted to adopt an amendment to the upcoming EU AI Act that would ban the use of facial recognition in public spaces.
Read more: Police live facial recognition technology ‘unethical’