Cops responding to ShotSpotter's AI alerts rarely find evidence of gun crime, says Chicago watchdog

Police responding to ShotSpotter's AI-generated alerts of gunfire find evidence of actual gun-related crime only about one time in ten, a Chicago public watchdog has found.

The California biz uses machine-learning algorithms to determine whether loud bangs caught by microphones deployed across more than 100 US cities are gunshots or not. If a shot is identified, the location of the noise is triangulated and sent to the police as an immediate, real-time alert, and reports are later compiled for prosecutors for use in court cases.
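The article doesn't detail ShotSpotter's proprietary method, but the triangulation step it describes is commonly done by comparing arrival times of the same sound at several microphones. Below is a minimal, hypothetical sketch of that idea: sensor positions, the grid size, and the brute-force search are all illustrative assumptions, not ShotSpotter's actual algorithm.

```python
import math

SPEED_OF_SOUND = 343.0  # metres per second in air at ~20 C

# Hypothetical sensor layout in metres; real deployments differ.
SENSORS = [(0.0, 0.0), (500.0, 0.0), (0.0, 500.0), (500.0, 500.0)]

def arrival_times(source, t0=0.0):
    """Time at which each sensor would hear a bang emitted at `source`."""
    return [t0 + math.dist(source, s) / SPEED_OF_SOUND for s in SENSORS]

def locate(times, step=5.0, extent=500.0):
    """Brute-force grid search (for clarity, not efficiency).

    At the true source, every sensor's implied emission time
    (arrival time minus travel time) agrees, so we pick the grid
    point that minimises the spread of those implied times.
    """
    best, best_spread = None, float("inf")
    steps = int(extent / step) + 1
    for i in range(steps):
        for j in range(steps):
            p = (i * step, j * step)
            emitted = [t - math.dist(p, s) / SPEED_OF_SOUND
                       for t, s in zip(times, SENSORS)]
            spread = max(emitted) - min(emitted)
            if spread < best_spread:
                best, best_spread = p, spread
    return best

# Simulate a bang at (240, 135) and recover its position.
print(locate(arrival_times((240.0, 135.0))))  # -> (240.0, 135.0)
```

Production systems replace the grid search with a least-squares solve over time differences of arrival, and the hard part — deciding whether the bang was a gunshot at all — is the classification stage the watchdog's numbers call into question.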

ShotSpotter is under the microscope right now because a 65-year-old man spent almost a year behind bars awaiting trial for murder – and the primary evidence against him was a disputed ShotSpotter report of a gunshot.

Michael Williams was last year accused of shooting and killing 25-year-old Safarian Herring; he denied any wrongdoing. Prosecutors said ShotSpotter picked up the sound of gunfire right where and when Williams was seen in his car in Chicago, giving Herring a ride. Williams said Herring was hit in a drive-by shooting.

Crucially, Williams' lawyers asked the trial judge to probe the ShotSpotter evidence after it emerged the AI software had initially classified the sound as a firework a mile away, a finding ShotSpotter staff later revised. In response, the prosecution withdrew the ShotSpotter report, and last month asked for the case to be dismissed as it no longer had sufficient evidence. The judge agreed, and Williams walked free.
