A Washington Post investigation into police use of facial recognition software found that law enforcement agencies across the nation are using the artificial intelligence tools in a way the tools were never intended to be used: as a shortcut to finding and arresting suspects without other evidence.
Most police departments are not required to report that they use facial recognition, and few keep records of their use of the technology. The Post reviewed documents from 23 police departments where detailed records about facial recognition use are available and found that 15 departments spanning 12 states arrested suspects identified through AI matches without any independent evidence connecting them to the crime — in most cases contradicting their own internal policies requiring officers to corroborate all leads found through AI.
Some law enforcement officers using the technology appeared to abandon traditional policing standards and treat software suggestions as facts, The Post found. One police report referred to an uncorroborated AI result as a “100% match.” Another said police used the software to “immediately and unquestionably” identify a suspected thief.
Hundreds of police departments in Michigan and Florida can run images through statewide facial recognition programs, but the number that do so is unknown. One leading maker of facial recognition software, Clearview AI, has said in a pitch to potential investors that 3,100 police departments use its tools — more than one-sixth of all U.S. law enforcement agencies. The company does not publicly identify most of its customers.
Through a review of government contracts, media reports, and public records requests, The Post identified 75 departments that use facial recognition, 40 of which shared records on cases in which it led to arrests. Of those, 17 failed to provide enough detail to discern whether officers made an attempt to corroborate AI matches. Among the remaining 23 agencies, The Post found that nearly two-thirds had arrested suspects identified through AI matches without independent evidence.