The only suspect in a Cleveland, Ohio, murder case is likely to walk because the police relied on Clearview AI’s facial recognition to get a search warrant, despite Clearview specifically warning that its results are not admissible in court: [Cleveland; Register]

These search results are not intended or permitted to be used as admissible evidence in a court of law or any court filing.

Clearview’s terms of service say: “Clearview makes no guarantees as to the accuracy of its search-identification software.” [Clearview]

Blake Story was shot on February 14, 2024, and died shortly after. Surveillance camera footage of the shooting was unclear. Another camera caught someone the police thought looked like the shooter.

Cleveland police asked the Northeast Ohio Regional Fusion Center (NEORFC) to identify the person in the second video. The fusion center ran the footage through Clearview AI and identified a likely suspect.

The police got a search warrant — but did not mention the use of Clearview to the judge, nor that NEORFC had identified other possible suspects. Police searched the suspect’s girlfriend’s house and found a gun and “other evidence.”

The defense argued that the warrant was granted on misleading claims, so evidence from it should be thrown out. Judge Richard McMonagle agreed.

Without this evidence, the case is likely dead — the police don’t seem to have anything else. Prosecutors have appealed the ruling.

The Ohio Bureau of Criminal Investigation has a policy against sole use of facial recognition — but the Cleveland police do not. Though they may soon.
