The Information Commissioner’s Office has conducted its inaugural audit of facial recognition technology used in policing, examining the implementation of this contentious tool by South Wales Police and Gwent Police.
The audit is part of a broader initiative to assess the use of facial recognition technology throughout England and Wales, with an emphasis on governance frameworks, deployment practices, and compliance with data protection regulations. It examined governance protocols, staff training, data retention policies, and the implementation of Data Protection Impact Assessments, and determined that both forces have mechanisms in place to guarantee human oversight, with every deployment evaluated for necessity and proportionality.
The Information Commissioner’s Office (ICO) says facial recognition technology (FRT) is not exempt from legal regulation: it falls under data protection law, which requires any handling of personal data, including facial images, to be lawful, fair, and proportionate. An ICO spokesperson said, “Facial Recognition Technology (FRT) does not operate in a legal vacuum. It is covered by data protection law, which requires any use of personal data, including biometric data, to be lawful, fair and proportionate. When used by the police, FRT must be deployed in a way that respects people’s rights and freedoms, with appropriate safeguards in place.” As FRT continues to expand, it is crucial that police forces implement the safeguards necessary to protect individuals’ rights.
Deputy Commissioner Emily Keaney said the technology presents a mix of opportunities and risks. “Few technologies are having as great an impact on modern policing as facial recognition technology. It can bring clear benefits in helping to prevent and detect crime, but it comes with real risks to people’s rights and freedoms such as the potential for discrimination and misidentification if not used responsibly.” The ICO highlighted the importance of ensuring that the management of biometric data complies with UK data protection law, maintaining standards of lawfulness, fairness, and proportionality. The audit coincides with the expanding use of live facial recognition technology across the nation: as noted in my previous article, the Home Office has introduced 10 new vans equipped with facial recognition technology, now in operation across seven police forces.
This expansion has raised concerns regarding transparency, safeguards, and legal oversight. “It's a really intrusive new power, absent of any democratic scrutiny. There are no specific laws on the use of facial recognition, they're really writing their own rules on how they use it. Shaun's legal challenge is such an important opportunity for the government and the police to take stock of how this technology is spreading across London in a really unaccountable fashion,” said senior advocacy officer Madeleine Stone. The ICO emphasized that the recent audit offers a limited view of practices within two police forces and does not imply general approval for all deployments of facial recognition technology.
Additional audits are in the pipeline, including assessments of the Metropolitan Police, Essex Police, and Leicestershire Police. These evaluations form part of the ICO’s broader initiative on artificial intelligence and biometrics, aimed at establishing consistent standards and enhancing public confidence. According to surveys referenced by the ICO, a segment of the public is willing to endorse police use of facial recognition, though their trust hinges on the technology's accuracy, impartiality, and the presence of robust safeguards. The regulator’s continuing oversight seeks to ensure that advancements in policing remain aligned with fundamental rights.
The audit summary indicates that both the live and retrospective versions of the technology demonstrated a high level of assurance regarding the effectiveness of processes and procedures in achieving data protection compliance. South Wales Police and Gwent Police agreed to a consensual audit of their data protection practices. As part of its AI and biometrics strategy, the ICO has committed to supporting and ensuring the proportionate and rights-respecting use of facial recognition technology by the police, by auditing police forces using FRT and publishing the findings, and by securing assurance that deployments are well governed and people’s rights are protected.
However, in 2020 South Wales Police lost a facial recognition case after a court ruled that its use of the technology breached privacy rights and broke equalities law. Liberty's demand came in response to that pivotal ruling in a case initiated by civil liberties campaigner Ed Bridges, who contended that the Welsh force's collection of thousands of facial images was indiscriminate and disproportionate. Louise Whitfield, Liberty’s head of legal casework, said, “The implications of this ruling also stretch much further than Wales and send a clear message to forces like the Met that their use of this tech could also now be unlawful and must stop.”