Facial recognition technology has been heralded as a groundbreaking advance in crime fighting, comparable to the introduction of fingerprinting. However, its potential deployment by Police Scotland has sparked concern among activists, who warn that it could exacerbate racial profiling and erode privacy.
Police Scotland is currently evaluating the implementation of live facial recognition (LFR) technology, although it has not yet put it into practice. While the force does employ retrospective image search technology, the use of LFR, which allows for real-time facial recognition in public areas, is still being assessed. Additionally, a "National Conversation" is under way to gather public opinions on the possible integration of LFR into law enforcement. Responding to a Freedom of Information request in 2023, Police Scotland said, “As technology advances and we all spend more time online we can see that the need to embrace new ways of working and harness technology need to be considered. We are committed to our duty to keep people safe, and this may necessitate us moving with the times and looking to technology to help us to do so in the future.” However, in a publication called Discussion Paper on the Potential Adoption of Live Facial Recognition, the Scottish Police Authority acknowledges both a legal and an ethical duty to notify the public when LFR is in operation.
The deployment of live facial recognition technology by law enforcement has sparked significant debate over privacy, civil liberties, and ethics. The privacy advocacy group Big Brother Watch is actively campaigning against the use of LFR in the UK, emphasizing the absence of a legal framework governing its application; it contends that police forces lack a definitive lawful basis for employing LFR. The civil rights organization Liberty has also opposed police use of LFR, providing legal backing in the notable Bridges v South Wales Police case. “We acted as solicitors for Ed Bridges, who challenged South Wales Police’s use of live facial recognition in public. In the world’s first legal challenge to police use of this tech, Ed argued the force was breaching rights to privacy, data protection laws, and equality laws. South Wales Police has used the tech on more than 60 occasions since May 2017 and may have taken sensitive facial biometric data from 500,000 people without their consent. In September 2019, the High Court decided that while facial recognition does interfere with the privacy rights of everyone scanned, the current legal framework provides sufficient safeguards. We disagreed, and appealed against the judgment,” said Liberty.
London's Metropolitan Police have already adopted the technology, prompting officers in Scotland to consider a similar system. However, advocacy groups are raising alarms about potential "Big Brother" privacy issues, warning that it could exacerbate racial profiling. Madeleine Stone of Big Brother Watch said of facial recognition technology: “It treats everyone like a potential suspect. I’ve personally seen children wrongly flagged as criminals. It reverses the presumption of innocence. Scotland risks sleepwalking into a surveillance state. This kind of biometric scanning belongs in an episode of Black Mirror, not on our high streets.” Big Brother Watch has also warned Police Scotland that facial recognition technology would put Scottish people's civil rights at risk. “At a time when liberal democracies around the world are banning and scaling back the use of this Orwellian technology, exploring intrusive AI-powered surveillance would be a step backwards for rights and would put the privacy of millions of Scots at risk,” said Stone.
Neil Cowan of Amnesty International said, “I think there is always a balance to be struck in terms of rights and policing, but we believe this technology goes way beyond that. It has high levels of misidentification, and these misidentifications disproportionately impact particular groups such as people of colour, who already face systemic racism, discrimination and over-policing. What we are talking about here is really the mass surveillance of people who are going about their everyday lives.”
In 2024, Angela Daly, a professor at the University of Dundee, raised significant concerns about the suitability of the AI-driven facial recognition that Scotland's national police force is contemplating. She argues that the technology is inadequate for catching serious offenders and is fraught with ethical issues that could jeopardize public trust in law enforcement. “It is not fit for purpose in its deployments. For example, not catching the kinds of serious criminals the police claim, and instead identifying people who have committed lesser and non-violent crimes. When it is used it is not done in a proportionate and necessary manner, and it is generally unethical in its development and deployment,” said Daly.