Why It Matters
A Tennessee grandmother spent five months in custody after an artificial intelligence-based facial recognition system incorrectly identified her as a suspect in a bank fraud case in a state she had never visited. The case involving Angela Lipps, 50, is drawing renewed attention to the legal risks posed by AI-assisted identification tools used in law enforcement and the procedural gaps that can allow wrongful arrests to persist for extended periods.
Tennessee residents and civil liberties advocates across the country are watching the case closely as it raises fundamental questions about due process protections when algorithmic tools are introduced as evidence in criminal investigations.
What Happened
Angela Lipps was arrested at her rental home in Tennessee in July 2025 after the West Fargo Police Department in North Dakota used facial recognition technology that flagged her as a potential suspect in a local bank fraud case. Lipps had no documented connection to North Dakota and had never visited the state, according to a GoFundMe campaign established on her behalf.
At the end of October 2025, Lipps was extradited to Fargo, North Dakota — more than 1,000 miles from her home in Tennessee. In all, she spent approximately five months in custody before the case against her was resolved.
Fargo Police Department Chief Dave Zibolski confirmed at a press conference on Tuesday that West Fargo police had used facial recognition technology as part of their investigation. He stated that his department subsequently took “additional investigative steps independent of AI to assist in identification” before formally naming Lipps as a suspect. Zibolski acknowledged that the West Fargo police’s facial recognition system was “part of the issue” in the wrongful arrest.
By the Numbers
- 5 months — the total length of time Angela Lipps was held in custody before the case was resolved
- 1,000+ miles — the distance between Lipps’ Tennessee home and Fargo, North Dakota, where she was extradited
- 50 years old — Lipps’ age at the time of her arrest
- At least 10 — documented wrongful arrests in the United States attributed to facial recognition misidentification since 2020, according to published civil liberties research
- 0 — the number of states with comprehensive laws restricting law enforcement use of facial recognition as of early 2026, though several states have introduced legislation
Zoom Out
Lipps’ case is not an isolated incident. Researchers and civil liberties organizations have documented a growing pattern of wrongful arrests tied to facial recognition errors across the United States. High-profile cases have occurred in Michigan, Louisiana, and Georgia, predominantly affecting individuals from communities of color, though errors have been recorded across demographic groups.
Studies have found that many commercially available facial recognition systems demonstrate higher error rates when identifying women, older individuals, and people with darker skin tones. Law enforcement agencies have increasingly adopted these tools without uniform standards for how AI-generated matches should be verified before being used as a basis for arrest.
The case highlights a critical procedural question: whether independent corroborating evidence should be legally required before facial recognition output can support an arrest warrant. Several state legislatures, including those in California and Massachusetts, have considered or passed partial restrictions on government use of facial recognition, but no federal standard currently governs its use in criminal investigations.
What’s Next
At Tuesday’s press conference, Police Chief Zibolski indicated that the department was reviewing its procedures for the use of facial recognition technology. It remains unclear whether Lipps will pursue civil litigation against the West Fargo Police Department, the Fargo Police Department, or other parties involved in her arrest and extradition.
Advocates are calling on Tennessee and North Dakota lawmakers to examine the circumstances of her case and consider statutory guardrails on AI-assisted identification in criminal proceedings. Federal lawmakers on the House Judiciary Committee have previously introduced legislation that would restrict facial recognition use by federal agencies, though no comprehensive bill has passed either chamber.
The Lipps case is expected to remain in the public spotlight as discussions over AI accountability in law enforcement intensify heading into the 2026 legislative sessions across multiple states.