Lab-based facial recognition outperforms street-level implementation, researchers reveal
Facial Recognition Technology Falls Short in Real-World Applications
Facial recognition technology (FRT) has been touted for near-perfect performance in controlled laboratory settings, with reported accuracy as high as 99.95%[1][2]. These benchmarks, however, fail to reflect the challenges of real-world deployment.
In the field, FRT struggles with poor lighting, low image quality, crowding, and environmental variability[2][5]. These conditions drive up error rates, especially for marginalized demographic groups[1][2].
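The base-rate arithmetic behind this lab-versus-street gap can be sketched with hypothetical numbers. The crowd size and the 20x field-degradation factor below are illustrative assumptions, not figures from the studies cited in this article:

```python
# Illustrative base-rate arithmetic: why a 99.95% lab accuracy figure can
# still mean many false matches per day in a live street deployment.
# The crowd size and 20x degradation factor are hypothetical assumptions.

def expected_false_matches(crowd_size: int, false_match_rate: float) -> float:
    """Expected number of people incorrectly flagged when every face in a
    crowd is compared against a watchlist."""
    return crowd_size * false_match_rate

lab_error_rate = 1 - 0.9995              # 0.05% error per comparison (lab benchmark)
street_error_rate = lab_error_rate * 20  # hypothetical 20x degradation in the field

daily_crowd = 100_000  # hypothetical faces scanned per day at a busy station

print(f"lab-rate false matches/day:    {expected_false_matches(daily_crowd, lab_error_rate):.0f}")
print(f"street-rate false matches/day: {expected_false_matches(daily_crowd, street_error_rate):.0f}")
```

Even at the advertised lab rate, tens of innocent people per day would be flagged at a single busy location; any field degradation multiplies that figure.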
These shortcomings have surfaced in wrongful stops, arrests, and misidentifications in cities such as Detroit and London[1][3]. In one case, a Black man was wrongfully arrested on the basis of a poor-quality surveillance image; in another, a London activist was aggressively stopped after being misidentified by a live facial recognition system[3].
These errors fall disproportionately on women, people of color, and older adults, because performance degrades unevenly under non-ideal conditions[1][2]. Although FRT is generally more accurate than traditional forensic methods such as fingerprinting or firearm comparison[2], concerns about its operational use remain.
Issues beyond algorithmic accuracy include inadequate training for law enforcement, a lack of civil rights policies, and racial and gender biases[2].
Experts and researchers argue for a shift from reliance on lab benchmarks towards rigorous, independent, real-world evaluations[5]. They propose:
- Developing benchmarks that mimic operational contexts and diverse demographics
- Setting legally binding accuracy thresholds for high-stakes applications
- Facilitating transparent research access to real-world deployment data
- Conducting independent oversight and continuous performance assessment[5]
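The disaggregated evaluation these proposals call for can be sketched simply: given match outcomes labeled by demographic group, compare false match rates per group rather than reporting a single aggregate accuracy. The records below are synthetic and purely illustrative:

```python
from collections import defaultdict

# Synthetic, illustrative records: (demographic_group, truly_a_match, system_said_match)
records = [
    ("group_a", False, False), ("group_a", False, False), ("group_a", False, True),
    ("group_a", True, True),   ("group_a", False, False), ("group_a", True, True),
    ("group_b", False, True),  ("group_b", False, True),  ("group_b", False, False),
    ("group_b", True, True),   ("group_b", False, True),  ("group_b", True, False),
]

def false_match_rate_by_group(rows):
    """False match rate (non-matching faces incorrectly flagged) per demographic group."""
    non_matches = defaultdict(int)
    false_matches = defaultdict(int)
    for group, truth, predicted in rows:
        if not truth:  # only non-matching faces can produce a false match
            non_matches[group] += 1
            if predicted:
                false_matches[group] += 1
    return {g: false_matches[g] / non_matches[g] for g in non_matches}

rates = false_match_rate_by_group(records)
print(rates)
```

A single headline accuracy number would hide exactly the kind of between-group disparity this per-group breakdown exposes, which is why continuous, disaggregated assessment matters for oversight.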
Without such efforts, deployment decisions will continue to be based on overly optimistic lab results that fail to capture the reality of FRT's risks and limitations in practice[5][1].
Concerns extend beyond policing to areas such as travel: two-thirds of travelers report hostile treatment by TSA officers when they attempt to opt out of FRT scans[4].
Moreover, a 2023 report by the US Government Accountability Office detailed law enforcement use of FRT without adequate training or civil rights policies[7]. The US National Institute of Standards and Technology (NIST) has also published guidelines on detecting face morphing, a technique that could be used to deceive FRT-based authentication systems[8].
Reflecting these concerns, the Electronic Frontier Foundation argues that face recognition is too dangerous for police use and should be banned[9]. The Algorithmic Justice League's "Comply To Fly?" report found that the US Transportation Security Administration (TSA) has been using FRT without travelers' informed consent[4].
University of Oxford academics Teo Canmetin, Juliette Zaccour, and Luc Rocher have raised similar concerns about FRT's real-world performance[2], and University of Pennsylvania researchers found that performance degrades under poor image conditions, with the degradation unevenly distributed across demographic groups[5].
Benchmark datasets also fail to reflect real-world demographics, a discrepancy highlighted in a February 2024 report for the Innocence Project by Alyxaundria Sanford, which listed at least seven confirmed cases of misidentification involving Black individuals[6].
References:
[1] Electronic Frontier Foundation. (2025). Christopher Galtin and Jason Vernau added to the list of individuals wrongly arrested due to flawed FRT identification.
[2] Canmetin, T., Zaccour, J., & Rocher, L., University of Oxford. (2025). Commentary on the real-world performance of facial recognition technology.
[3] University of Essex. (2024). Study finding that live facial recognition correctly identified only eight of 42 flagged faces.
[4] Algorithmic Justice League. (2024). "Comply To Fly?" Report finding that the US Transportation Security Administration (TSA) has used FRT without travelers' informed consent.
[5] University of Pennsylvania. (2025). May 2025 research paper finding that FRT performance degrades under poor image conditions and that the degradation is not evenly distributed across demographic groups.
[6] Sanford, A. (2024). February 2024 report for the Innocence Project listing at least seven confirmed cases of misidentification involving Black individuals.
[7] US Government Accountability Office. (2023). Report on law enforcement use of FRT without adequate training or civil rights policies.
[8] US National Institute of Standards and Technology. (2025). Guidelines on detecting face morphing.
[9] Electronic Frontier Foundation. (2025). Position that face recognition is too dangerous for police use and should be banned.