
Seattle is one of a handful of cities where police departments are using an artificial intelligence platform to sift through thousands of hours of police body-cam recordings, looking for patterns of officer behavior that can be addressed through training.

Seattle is an “anchor customer” for the software, according to Chicago-based Truleo, a startup founded in 2021 that grew out of a platform first developed to analyze phone calls and text messages between Wall Street bankers.

The Seattle Police Department has been using the technology since 2021, the company says.

SPD declined to answer questions about the department’s use of Truleo. A spokeswoman said “it’s too early in the process to speak to measurable outcomes.”

“However, we look forward to the possible insights Truleo may provide in the future, and we continue to be committed to data- and evidence-based policing,” said SPD public information officer Valerie Carson.

Police departments generate thousands of hours of body-cam footage. The idea behind Truleo is to use artificial intelligence to scan the recordings, looking for audio cues that can help identify patterns, such as problems patrol officers may have interacting with the public. Then departments can address issues through training before problems escalate.

“The final product of Truleo’s Audio Analysis is a rich analysis of thousands of conversations, enabling departments to quickly identify at-risk incidents and department-wide trends,” the company said in a blog post explaining how the AI platform works.

Axios reported on the technology last week, noting that it could be more widely adopted after the death of Tyre Nichols in Memphis, where police “unleashed a barrage of commands that were confusing, conflicting, and sometimes even impossible to obey,” according to The New York Times.

“As we mourn the loss of life and collapse in trust in Memphis tonight, be hopeful there is a solution,” Truleo CEO Anthony Tassone wrote on LinkedIn this week. “Like most departments, Memphis PD reviews less than 1% of their body camera videos because there is simply too much data for humans to watch. However, automated analytics exist and the technology helps to ensure a culture of accountability and professionalism, especially at a time when departments are so young and full of new recruits.”

Truleo’s software can label “professional” and “risky” officer language. (Truleo Image)

Truleo identifies California police departments in Alameda, Atwater and Vallejo as current customers, along with departments in Florida, Alabama and Pennsylvania. Atwater has used the data it gleaned from Truleo’s analysis of recordings to create performance metrics that are “searchable baseball card stats for cops,” Truleo said.

A case study conducted by the company claims that the police department in Alameda saw a 36% reduction in the use of force by officers after implementing a Truleo review of body-cam recordings and using those findings to focus training.

The study, which compared data from a six-month period in 2021 to a six-month period in 2022, also cited a 30% drop in unprofessional language used by officers, and a 12% increase in compliance from people they were interacting with.

Prior to using the Truleo AI, the Alameda department had sergeants randomly auditing about 1% of their officers’ body-cam recordings, the study said.

However, there are potential issues with this kind of software, said Os Keyes, a doctoral student in Human Centered Design and Engineering at the University of Washington who has spoken out against the use of AI with body-cams.

At a basic level, body-cams record every interaction an officer has with other people, and can record images and audio of nearby conversations or acts the officer isn’t involved with. A body-cam on a police officer becomes a sort of roving digital surveillance tool, Keyes said.

There’s reason to fear that the AI will misinterpret what it hears in body-cam audio recordings, Keyes said. Even the best AI struggles to correctly identify sarcasm, and a number of natural-language processing models have exhibited bias along racial, ethnic and gender lines.

A Washington Post-commissioned study determined that Amazon and Google smart speakers were 30% less likely to understand humans who didn’t speak English with American accents. The accuracy rate for speakers with Chinese, Indian or Spanish accents was about 80%.

Combine the roving surveillance camera issue with AI’s struggle to understand the full spectrum of human speech, and you’ve got a problem, Keyes said.

Analyzing body-cam recordings “can be a really useful practice, not only for training but for detecting things like unreported brutality or harassment,” Keyes said.

But because it’s so important, “it’s completely inappropriate” to use a system that hasn’t been independently reviewed, Keyes noted.

