September 2, 2023

How LAPD is using artificial intelligence to study policing

 

PERF members,

In last week’s column I highlighted a recent Los Angeles Times article describing a new study that will use artificial intelligence to analyze LAPD officers’ tone and language during traffic stops. According to the article, “Over three years, researchers will review body camera footage from roughly 1,000 traffic stops, then develop criteria on what constitutes an appropriate interaction based on public and officer feedback and a review of the department’s policies. . . . These criteria will then be fed into a machine learning program, which will ‘learn’ how to review videos on its own and flag instances where officers cross the line.”
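To make the “machine learning program” step a bit more concrete, here is a minimal, purely illustrative sketch of how such a pipeline might look once stops have been transcribed and reviewers have labeled examples against the agreed-upon criteria. This is not the researchers’ actual method; the library choice (scikit-learn), the sample transcripts, the labels, and the 0.5 flagging threshold are all assumptions made for illustration.

```python
# Illustrative sketch only -- not the USC/LAPD system.
# Assumes stops have been transcribed to text and hand-labeled against the criteria.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: transcript snippets labeled by human reviewers
# (1 = flag for review under the criteria, 0 = appropriate interaction).
transcripts = [
    "good afternoon, the reason I stopped you is that your brake light is out",
    "step out of the car now, I am not going to ask you again",
    "do you have any questions about the citation I am issuing today",
    "I do not care what you think, keep your mouth shut",
]
labels = [0, 1, 0, 1]

# A simple text-classification pipeline: TF-IDF features plus logistic regression.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(transcripts, labels)

# "Reviewing" a new stop: score its transcript and flag it for a human reviewer
# if the predicted probability of a problematic interaction crosses a threshold.
new_stop = ["license and registration, please; is there any emergency today?"]
prob_flag = model.predict_proba(new_stop)[0][1]
print(f"flag for human review: {prob_flag >= 0.5} (score = {prob_flag:.2f})")
```

In practice the study would presumably draw on far richer signals (tone of voice, interruptions, full transcripts) and far more labeled examples, but the basic shape is the same: human-defined criteria become training labels, and the trained model then scores new footage for human review rather than rendering judgments on its own.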

This struck me as quite significant. PERF recently convened a national meeting on trends in the use of body-worn cameras, ten years after PERF developed guidelines for the Justice Department on implementing a body-worn camera program. A major issue in our meeting was how to manage the massive amount of video a department generates every day. The fact that LAPD, one of America’s largest police departments, is going to use artificial intelligence to help address this issue could provide invaluable insights for other agencies, with implications for both policy and training. I wanted to find out more, so I spoke this week with LAPD Chief Michel Moore.

An edited version of our conversation is below. The discussion is also available as a podcast. 

 

Chief Moore at a recent recruit graduation ceremony (LAPD)

Chuck Wexler: I don’t know of another major department doing this. Tell me about it — how did this study come about?

Chief Michel Moore: Thank you, Chuck, for the opportunity to talk more about this study we’re engaged in with the University of Southern California. They approached us a little while ago. Recognizing that LAPD’s entire uniformed workforce wears body-worn cameras and that we collect some 14,000 video clips of interactions a day, they wanted to see what we can learn from those video clips and audio interactions.

What can we learn to improve the interaction between officers and persons being detained for traffic offenses or other detentions? And are there ways to improve the sense of procedural justice that occurs when people are given a voice and are listened to, so people have a sense that they’re being treated fairly?

It’s an area that interests us a great deal, but it also came with some substantial hurdles with regard to confidentiality. Are the officers involved going to be subject to some type of discipline or criticism because of what might be found during the reviews of these video recordings?

Wexler: How many videos did you say you collect each day?

Chief Moore: Fourteen thousand video clips a day. We have hundreds of officers who work every day. So we have, say, 700 to 800 people working a day, and each officer may have 10, 15, or 20 interactions during the course of the day that are captured on video.

Wexler: Managing all of that video has to be almost impossible. So you say, Okay, let me open myself up to researchers to come in here and try to figure out how my officers are engaging with the public. Is that fair?

Chief Moore: That’s fair. We currently monitor videos; certainly in critical instances we review them. And we do periodic audits and inspections on a randomized basis of video interactions, so that’s already in place. But we know that the vast majority of the interactions are not being analyzed, and we’re not learning from them. So this was an opportunity.

We see it as an experiment: researchers want to see if they can use artificial intelligence to analyze large groups of these recordings and learn from them. Are there opportunities for law enforcement to learn from how this interaction went, versus these other interactions? How can training and guidance for our officers improve the outcomes for all parties involved?

Wexler: Some may think this is Big Brother checking up on officers. But this isn’t about individual officers, is it? This is about identifying patterns in the ways that your officers are communicating with citizens, as opposed to going after individual officers?

Chief Moore: Exactly. This is not a gotcha. We’re not looking to explore this universe of recordings to identify and investigate individuals for particular conduct. We believe that the existing systems do that sufficiently for serious misconduct and for anything that should get our attention.

But we don’t know what we don’t know. And that’s what this research is meant to do. I’m grateful for the leadership of the [Los Angeles] Police Protective League; we have ensured that everyone’s identity in this remains anonymous. The researchers will not know their personal identifying information. We’ve walled that off. To enforce those walls, the researchers and anyone else involved will have to sign confidentiality agreements and undergo background checks.

Lastly, the University of Southern California is an institution of substantial standing and prestige. I know that they’re dedicated to finding out what we can learn from this for the betterment of law enforcement.

Wexler: This is a huge experiment, and it’s a huge leap of faith. What you’re really trying to do is learn how cops are doing with traffic stops and how they can get better, right?

Chief Moore: It is, but I want to also explore what the outer limits are. Because we know that body-worn video has limits. It does not tell the entire story. It’s a two-dimensional story, not a human experience.

I don’t know what similar limitations this study will have. But we intend to explore that because frankly, that’s where innovation comes from. When you find out what the current limitations are, that’s when you go back to the scientists and say, Okay, these are the limitations of this current study; can you overcome them?

We see this with linguistics. When you look at voice-to-text technology over the last 20 years, and the ability of machines to interpret and translate language immediately and accurately, we’ve had generational leaps in progress. I think the same will happen here.

Wexler: There’s a lot of talk about how artificial intelligence could revolutionize professions. Policing could be one of them, couldn’t it? A lot of people are pretty worried about this; they think it’s like Big Brother somehow and could have an adverse impact.

Chief Moore: Artificial intelligence is a scary term, right? It suggests autonomy, it’s isolated, and it’s going to create outcomes beyond the control of the human element. And it could raise systemic issues of racism, bias, and other discrimination because we could bring those biases into the machine learning programs we create.

So it can be Orwellian; it can be very scary in a sense. Which is why I think it’s so critical that the Department of Justice has established the importance of ethics in the use of technology, recognizing the boundaries and the limitations that should be imposed on the use of technology.

When the study was announced, it created concerns among the rank and file about what it is and how it could negatively impact them. So we took steps to safeguard against that. But I also recognize that we’re taking a risk as to what findings this research will suggest or identify that we may or may not agree with. Because you should read every research report with a skeptical eye. Does it support its findings?

Wexler: It feels like you’ve got enough people involved and enough safeguards. You’ve got police. You’ve worked with the unions. It really sounds like you’re trying to take the LAPD to the next level, aren’t you?

Chief Moore: We are, but I also want to give credit where credit is due. When Police Commissioner William Briggs learned of this outreach by USC, he took it on personally as one of his legacy efforts as Police Commission president. So this is not just the department itself, but the board of police commissioners challenging the organization in an effort of continuous improvement.

It took that kind of support because, as you can imagine, the naysayers were out there saying that no good can come from this, and this is going to embarrass the organization. If we can’t guarantee it’s going to be laudatory to the department, why are we engaged in it?

I don't believe that's how policing advances. I think you’ve got to take risks. We’ve got to show vulnerability. And we cannot be afraid of things that we don’t know.

Our first report back will be six to eight months from now. It’ll be 18 to 24 months before we’ll have the final report. The outcome for me will be identifying some avenues for training, some observations regarding our behaviors, and some strategies to improve the performance of our work as we go to protect the people of Los Angeles.

Chief Moore at a parade in Los Angeles’s Chinatown neighborhood (LAPD)

To listen to the interview, please check out the podcast linked here.

Thanks to Chief Moore for speaking with me. When I think of all that’s going on in policing, and how good police agencies are always looking for ways to improve, this effort could well break new ground. Technology, in this case, could be a force multiplier in efforts to better understand a very controversial aspect of policing: the traffic stop. And in so doing, it could help us take policing to the next level.

Have a wonderful holiday weekend!

Best,

Chuck