Sainsbury's has begun testing facial recognition technology in its UK stores, claiming it will address concerns over shoplifting.
Sainsbury's [1] has informed employees at two locations, a store in Sydenham, south-east London, and a convenience store in Oldfield Park, Bath, about the launch of an eight-week trial of the technology, with the possibility of a nationwide rollout. Sainsbury's says the technology is intended to identify shoplifters and to address the significant rise in retail crime seen in recent years. Privacy advocates, however, have criticized the initiative. The use of facial recognition in retail environments has drawn significant criticism in recent years: Asda, a competitor in the sector, [2] received thousands of complaints after introducing a similar trial earlier in 2025.
In August, a proposal by the Metropolitan Police to expand the use of live facial recognition technology [3] at large public events faced a backlash from the equalities regulator. The Equality and Human Rights Commission (EHRC) criticized the plan as unlawful, citing research showing that black men were disproportionately likely to trigger alerts. The EHRC described the technology's usage as intrusive and capable of creating a “chilling effect” on individuals' rights. John Kirkpatrick, chief executive of the EHRC, [4] said: “There must be clear rules which guarantee that live facial recognition technology is used only where necessary, proportionate and constrained by appropriate safeguards. We believe that the Metropolitan Police's current policy falls short of this standard.”
Privacy rights organization Big Brother Watch has described the initiative as “deeply disproportionate and chilling,” [5] urging Sainsbury's to halt the trial. Madeleine Stone, [6] senior advocacy officer at the organization, said: “Sainsbury’s decision to trial Orwellian facial recognition technology in its shops is deeply disproportionate and chilling. Sainsbury’s should abandon this trial and the government must urgently step in to prevent the unchecked spread of this invasive technology.” Byron Long, a shopper in Cardiff, [7] was incorrectly identified as a theft suspect by a B&M store using live facial recognition technology; he was confronted in front of other customers and placed on a watchlist for a theft he did not commit. “This technology turns shoppers into walking barcodes and makes us a nation of suspects, with devastating consequences for people’s lives when it inevitably makes mistakes. We are regularly hearing from and supporting distressed people who have been caught up in a confusing net of privatised surveillance, despite being entirely innocent,” Stone said.
Following the incident, Big Brother Watch contacted [8] the Information Commissioner's Office (ICO) on behalf of Long. Jasleen Chaggar, [9] the group's legal and policy officer, presented the ICO with new evidence of potential data protection violations by Facewatch and its clients. In her letter, she raised concerns about the accuracy of the technology and the measures currently implemented by the companies in question. “Mr Long’s personal data was processed by Facewatch and Home Bargains as joint controllers. He visited a B&M store on 9th April and was falsely accused of stealing £75 worth of goods. On the basis of this accusation, his biometric data was added to a watchlist of people alleged by the joint controllers to have committed crimes at locations operated by Facewatch clients,” said Chaggar.