UK Police Forces Lobbied to Use Biased Facial Recognition Technology Despite Known Issues
Revealed: Facial recognition system more likely to misidentify women and Black people, raising concerns over fairness and operational priorities
Documents reviewed by The Guardian and Liberty Investigates show that UK police forces actively lobbied to keep using a facial recognition system that disproportionately misidentifies women, young people, and ethnic minorities, despite having been aware of the bias for over a year.
Background of Facial Recognition Use in UK Policing
UK police use a retrospective facial recognition tool linked to the Police National Database (PND), which holds over 19 million custody images. The system takes a “probe image” of a suspect and matches it against the database to generate potential identifications or investigative leads.
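The precise matching pipeline behind the PND tool has not been published. The sketch below is therefore only a minimal illustration of how retrospective search of this kind typically works, assuming a common embedding-and-threshold design; the names (retrospective_search, gallery) and all values are hypothetical.

```python
# Illustrative sketch only: the PND's real pipeline is not public. The
# generic pattern is to embed a probe image, score it against a gallery of
# stored embeddings, and return records above a confidence threshold.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def retrospective_search(probe: np.ndarray,
                         gallery: dict[str, np.ndarray],
                         threshold: float) -> list[tuple[str, float]]:
    """Return (record_id, score) pairs at or above the threshold, best first.
    The results are investigative leads, not confirmed identifications."""
    scored = [(record_id, cosine_similarity(probe, embedding))
              for record_id, embedding in gallery.items()]
    return sorted((s for s in scored if s[1] >= threshold),
                  key=lambda s: s[1], reverse=True)
```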
Evidence of Bias and Initial Attempts to Mitigate
In September 2024, a Home Office-commissioned study by the National Physical Laboratory (NPL) sounded the alarm over the technology’s fairness. The review found that the system was significantly more prone to false-positive matches for Black and Asian people, women, and people aged 40 or under than for white men.
To address the bias, the National Police Chiefs’ Council (NPCC) raised the algorithm’s confidence threshold for flagging matches, tightening the criteria for treating a potential match as valid. This adjustment markedly reduced the rate of biased outcomes.
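In engineering terms, the mitigation was a configuration change rather than a new model. Continuing the hypothetical sketch above (the real threshold values are undisclosed, so the figures here are placeholders chosen for the synthetic data), raising the threshold simply shrinks the candidate list to the matches the system is most confident about:

```python
# Continues the sketch above, reusing retrospective_search. A synthetic
# gallery of random 128-dimensional "embeddings" stands in for the real
# database; the thresholds are placeholders, not the actual settings.
import numpy as np

rng = np.random.default_rng(seed=0)
gallery = {f"record_{i}": rng.normal(size=128) for i in range(10_000)}
probe = rng.normal(size=128)

leads_low = retrospective_search(probe, gallery, threshold=0.15)
leads_high = retrospective_search(probe, gallery, threshold=0.30)

# Raising the threshold only ever removes candidates: every lead that
# clears the higher bar also cleared the lower one, and the discarded
# matches are the lowest-confidence ones.
assert {i for i, _ in leads_high} <= {i for i, _ in leads_low}
print(len(leads_low), len(leads_high))  # lead volume drops sharply
```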
Police Lobbying Reverses Mitigation Measures
However, the higher threshold sharply curtailed the flow of “investigative leads”: the share of searches returning a possible match fell from 56% to just 14%. Police forces complained that, under the stricter settings, the system was no longer operationally effective and appealed for a rollback.
One month later, the NPCC reversed the decision, reverting to the lower confidence threshold despite the clear evidence that this increased the risk of biased misidentifications. Current threshold levels have not been publicly disclosed by the Home Office or NPCC.
Consequences and Official Responses
A more recent NPL study echoed these troubling findings, indicating that, under certain system settings, false positives could occur nearly 100 times as often for Black women as for white women.
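To make the scale of that disparity concrete, a back-of-the-envelope calculation helps; note that the baseline rate and search volume below are invented for illustration, as the exact figures from the NPL report are not quoted here.

```python
# Hypothetical arithmetic only: the baseline rate and search volume are
# assumptions; the reported finding is the roughly 100x ratio between groups.
fpr_white_women = 1 / 100_000   # assumed: 1 false positive per 100,000 searches
disparity_ratio = 100           # "nearly 100 times as often" (NPL finding)
fpr_black_women = fpr_white_women * disparity_ratio

searches = 1_000_000            # assumed number of searches of each group
print(f"Expected false positives: {searches * fpr_white_women:.0f} "
      f"(white women) vs {searches * fpr_black_women:.0f} (Black women)")
# -> 10 vs 1000 under these assumptions
```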
The Home Office acknowledged the bias, stating that the algorithm “is more likely to incorrectly include some demographic groups in its search results”, but underscored its commitment to addressing the issue. The department said it intends to roll out a newly procured, independently tested algorithm early next year, with improved fairness and no statistically significant bias.
NPCC’s Chief Constable Amanda Blakeman defended the decision to revert thresholds, emphasizing the delicate balance between operational effectiveness and fairness. She noted that police had reissued training and guidance to users of the system to help mitigate potential harms from bias.
Criticism from Experts and Advocacy Groups
Experts and civil rights advocates voiced deep concern that convenience and the volume of leads had been prioritized above fundamental rights and fairness.
Professor Pete Fussey, an independent expert on facial recognition, questioned whether police acceptance of biased technology undermines legal and ethical standards, stating, “Convenience is a weak argument for overriding fundamental rights.”
Abimbola Johnson, chair of the independent scrutiny and oversight board for the police race action plan, criticized the absence of concerted discussion of facial recognition within anti-racism initiatives. Johnson warned that new surveillance technologies are being introduced into a context already marked by racial disparity and insufficient oversight.
Government’s Path Forward
The Home Office has initiated a ten-week public consultation on expanding facial recognition technology use, with Policing Minister Sarah Jones hailing the tool as the “biggest breakthrough since DNA matching.”
Meanwhile, the government promises ongoing evaluation and strict safeguards to ensure usage meets national standards, including independent scrutiny to prevent the reinforcement of racial disparities.
The unfolding situation highlights urgent tensions between policing effectiveness and the protection of civil rights, as the UK grapples with the challenges posed by emerging technologies prone to bias. As the government moves toward wider deployment of facial recognition, public scrutiny and robust safeguards will be crucial to maintaining trust and fairness in law enforcement.