Kathleen O’Toole is one of the grandees of policing. Her career spanned continents and stretched from front-line officer to commissioner. She is also a member of Axon’s AI Ethics Board. Here she explains why Sir Robert Peel’s Nine Principles of Policing are as valid today as when they were formulated in 1829.
Peel’s principles have stood the test of time and remain the foundation of modern, democratic policing. The core message of ‘policing by consent’ is that, by and large, the police can succeed only if they earn and maintain the trust of the community. Today, perhaps more than at any other time, these principles face a serious test. New technologies, such as facial recognition and predictive modelling, have vast potential to anticipate and prevent crime. But the use of some of these technologies, facial recognition in particular, raises dystopian concerns about surveillance. What’s more, the emergence of bias in the AI systems central to crime prediction, bias that seems to mimic the way humans stereotype, is undermining the public’s confidence in these innovations. With these issues in mind, how can we strike a balance between making use of these powerful technologies and continuing to police with consent and the buy-in of our communities? Kathleen recommends four disciplines that can help:
- Capturing and measuring important data
- Making the most of digital tools
- Being transparent
- Engaging the community and building trust
Capturing and measuring important data
Soon after my appointment as the police chief in Seattle, I asked the precinct captains how we were doing on crime, and they replied, ‘Good... we think’. In reality, they had no idea, because they had no measurement. Within weeks, we launched SeaStat, our version of CompStat (computer statistics program), to capture timely and accurate data to map crime.
The data we captured went well beyond crime statistics and showed that we were deeply involved in our community. In fact, as studies of other major cities have also found, law enforcement accounted for only 10-15% of the work we were doing. The rest was providing service and assisting vulnerable people. Our greatest challenges sat at the intersection of public health and public safety: homelessness, addiction and mental health crises. We were constantly under pressure to address people’s concerns about quality-of-life issues in their neighbourhoods.
We also discovered that officers used force in less than 2% of cases and most of it was minimal – officers only had to use serious force in a fraction of 1% of cases. We were able to use this data to address concerns and perceptions about officers’ performance and to have more constructive conversations with the community. Showing that our day-to-day operations were very different from the police shows most people see on TV also helped to enhance trust and build better relationships.
Without timely data and reliable metrics, it’s difficult to engage communities and focus on the issues that matter to them. When it comes to building trust and policing with consent, improved data collection and analysis can shine a light on police work. This moves the tone of debate away from speculation and perception towards facts, enabling a better understanding of the way police work and of the often unseen effort that officers put into helping their communities, looking after the vulnerable and keeping us safe.
Making the most of digital tools
One of the biggest changes I’ve seen in policing is the availability of digital tools to capture data, analyze crime and understand the broader demands on the police. In Seattle, we used our operations data to establish community policing micro-plans: unique strategies to address concerns in specific neighbourhoods.
Of course, we always focused on serious ‘headline’ crimes. But we also developed a grassroots approach, engaging front-line police officers and community members in the development of neighbourhood strategies. They had first-hand knowledge of the challenges they faced, and often presented the best solutions. For each neighbourhood, we distilled three to five issues to focus on. We partnered with a local university to develop quantitative and qualitative metrics to gauge results, measure community satisfaction, and inform the evolution of each micro-plan.
At the same time, we acquired and developed additional tools to build trust and enhance transparency. Body cameras were issued to all patrol officers. Public-facing dashboards allowed the community to access timely data and reports. We even worked with Code for America and mental health service providers to develop an app to assist officers when encountering those in mental health crises. It enhanced safety for officers and vulnerable members of the community.
Being transparent
The SeaStat system we set up in Seattle helped us define priorities and reframe our discussion with communities about the work we were doing and, indeed, the work they wanted us to do. With a view to being as transparent as possible, we published maps and data online. Anyone with an interest in policing could see data and reports on crime, other calls for service, community policing micro-plans, use of force, hate crimes, mental health crisis intervention and a variety of other topics. Clearly, it’s not possible to publish all operational data: we could not undermine ongoing investigations and certainly had to be mindful of privacy issues. However, we shared as much as possible. That may not always be easy, particularly if the results are not flattering, but honesty and authentic engagement are essential to building trust. Independent surveys showed that our relationship with the community improved significantly as a result.
Engaging the community and building trust
As we look forward, at a time when budgets will, no doubt, be tight and police will face ever-growing demands for service, it’s clear that robust technology tools will be essential to the delivery of effective policing.
However, there is a ‘but’: technology is developing at an incredibly fast pace, so fast that the capabilities available to police services could undermine important democratic principles. There are growing community concerns around privacy and data retention. License plate readers are a good example. People are not necessarily concerned that their license plates are captured by a tolling system on the highway, but they would probably want to know whether police have access to data from such systems, how long that data is retained, and with whom it is shared.
Most people are reasonable and understand that we have to strike a balance. But it is incumbent on us as law enforcers and service providers to present them with certain scenarios and potential solutions, and to ask if they are willing to sacrifice a bit of privacy for a safer neighborhood. The more we communicate, and the more transparent we are, the more the community will trust our judgment. Of course, consulting the public isn’t always a smooth process, but a bit of healthy tension is a key element of democratic governance.
Failure to consult, in my view, is deeply problematic. If the community or their elected leaders discover that a police department has purchased, without consultation, technology that violates democratic principles, then I believe we contravene both the letter and the spirit of Peel’s ideals. We have already seen facial recognition trials become contentious, and there is concern about new AI applications designed to anticipate crime that appear to reproduce human racial biases in their results. These developments, in particular, must be handled with care and a great deal of openness.
I have interacted over the years with lots of vendors and appreciate companies, like Axon, that engage in meaningful conversations about the impact new technologies may have. In a democracy, we all – including the manufacturers – have a responsibility and need to pull together to succeed. Or, as Peel put it, “The police are the people, and the people are the police.”
How is Axon joining the conversation?
I think Axon is unique in creating an AI Ethics Board. As a manufacturer, Axon is taking responsibility and addressing the issues before their products are released.
For example, when Axon was contemplating integrating facial recognition into its cameras, the company consulted us. The board consists of some incredible researchers and civil liberties experts, as well as a couple of us with police experience. After a thoughtful process and some spirited discussions, we unanimously recommended against using the technology until it improves, because we were troubled by data on misidentifications. I thought it was a bold and admirable decision by the CEO to follow our recommendation.
It’s great to think that there is a vendor out there who is actually taking responsibility at the front end, rather than flooding the market with products that are attractive but could have significant privacy implications.
As a police chief, I had some really smart people on my team, but the more people we have contemplating these issues and potential risks associated with these products, the better. More companies should be thoughtful and transparent in their processes.