Benchmark Analytics was invited to participate in a roundtable discussion on March 15, 2023, hosted by the White House and the Office of Science and Technology Policy (OSTP). The topic of conversation was President Biden’s Executive Order of May 2022 on Advancing Effective, Accountable Policing and Criminal Justice Practices to Enhance Public Trust and Public Safety. Specifically, the group sought to assess the current state of data collection, use, and transparency practices in law enforcement agencies across the country.

Denice Ross, the U.S. Chief Data Scientist, opened the session by laying out the key principles of accountable and equitable data, including:

  • Ability to Disaggregate
    Basically, how can you slice and dice the data? Can you metaphorically pick up the information, reposition it, and look at it from another angle – all while balancing data privacy and transparency?
  • Maximum Utilization
    There is a tremendous amount of underutilized data that exists in the world today. How can we leverage information already being captured to gain new insights?
  • Analytic Capacity
    Is there the ability to dig into the data? That includes data quality, systems/tools for analysis, and the human capital/skill to understand the data and make it actionable.
  • Open and Shared
    Successful data initiatives should be available through diverse partnerships for various perspectives on content, analyses, and outcomes.
  • Accountable and Transparent
    Last, and far from least, the data has to be accountable and transparent. Full stop.
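The “ability to disaggregate” principle above can be sketched in a few lines of code. The incident records and field names here are hypothetical, purely for illustration:

```python
# A minimal sketch of disaggregation: the same records viewed from
# different angles. All data below is hypothetical.
from collections import defaultdict

incidents = [
    {"district": "North", "shift": "day",   "force_used": False},
    {"district": "North", "shift": "night", "force_used": True},
    {"district": "South", "shift": "day",   "force_used": False},
    {"district": "South", "shift": "day",   "force_used": True},
    {"district": "South", "shift": "night", "force_used": True},
]

def force_rate_by(key, records):
    """Share of incidents involving force, grouped by any dimension."""
    groups = defaultdict(list)
    for r in records:
        groups[r[key]].append(r["force_used"])
    return {k: sum(v) / len(v) for k, v in groups.items()}

print(force_rate_by("district", incidents))  # sliced by district
print(force_rate_by("shift", incidents))     # the same data, sliced by shift
```

The point is that one dataset supports many views – provided the underlying records retain the dimensions (district, shift, and so on) needed to regroup them.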

From there, the discussion transitioned to law enforcement data transparency and collection. It was a vibrant group dialogue reflecting a collective and clear passion for service. The conversation covered several interesting challenges around data collection. As one example, an agency might be subject to several different reporting standards for the exact same data points: use-of-force data reported to the FBI is, in many cases, dramatically different from use-of-force data required by state legislation – which can in turn differ from data required by settlement agreements (consent decrees) or civilian oversight. Today, the federal government collects the following data:

  • Incidents (arrests)
  • Hate/bias crimes
  • Use of force
  • Law enforcement demographics
  • Law enforcement/workforce characteristics
  • Police-Public Contact Survey
  • National Crime Victimization Survey

This is a good start – but the list omits key contextual elements. Capturing only whether a force incident occurred misses valuable scenario context:

  • What force did the officer encounter?
  • Were there any attempts at de-escalation?
  • How did proportionality of force play into the situation?
  • What is the officer’s peer group and the expected force utilization patterns for that peer group?
  • Moreover, how do OTHER parts of that officer’s behavior pattern impact the force utilization and proportionality on any given day?

These are the meaty questions that need to be answered to build truly equitable data and enhance public trust.
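One way to picture the richer record these questions imply is a structured schema. The field names below are illustrative assumptions on our part, not any official reporting standard:

```python
# An illustrative (hypothetical) use-of-force record capturing the
# contextual fields discussed above - not an official schema.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class UseOfForceRecord:
    incident_id: str
    force_encountered: str            # e.g. "verbal resistance", "armed threat"
    force_applied: str                # e.g. "verbal command", "taser"
    deescalation_attempted: bool
    deescalation_methods: list[str] = field(default_factory=list)
    proportionality_note: Optional[str] = None  # reviewer's assessment
    peer_group: Optional[str] = None            # e.g. "night patrol, District 4"

record = UseOfForceRecord(
    incident_id="2023-0001",
    force_encountered="verbal resistance",
    force_applied="verbal command",
    deescalation_attempted=True,
    deescalation_methods=["verbal warning"],
)
print(record.deescalation_attempted)  # True
```

A record shaped like this answers "did de-escalation occur, and how?" directly, rather than leaving it to inference from a bare incident count.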

Not surprisingly, the federal government is looking to augment current data collection with the following:

  • De-escalation incidents
  • Patrol locations
  • Complaints
  • Officer training
  • Community engagement
  • Vehicle pursuit
  • Other

These data points help tell the full story of an incident and an entire pattern of behavior. Still, software platforms cannot be unduly burdensome. Law enforcement officers are dedicated civil servants tasked with protecting their communities, not with building complex data schemas and reports. As a partner and software provider, we constantly ask ourselves: how can we improve the design and deployment of data collection tools to make life easier for the officer on the street while still maximizing the value of the resulting data? How can we collaborate with law enforcement agencies and educate on best practices to simplify data collection and prioritize accountability and transparency?

Ultimately, everyone agreed that data transparency is the key to trust. With analytic platforms, law enforcement agencies can harness data to help tell a full and accurate story of officer behavior.

This session was part of a series of roundtables; OSTP also met with community members, academia, law enforcement agencies, and various other key stakeholders. Ultimately, all of these learnings will inform a final report on Equitable Data due to President Biden on May 25. If that date is not immediately familiar – it’s the third anniversary of George Floyd’s death in Minneapolis. That significance isn’t lost on the Working Group; they understand the importance of this work and the unique role data plays in identifying, predicting, and ultimately preventing negative incidents.

Only through equitable and actionable data will we begin to bridge the divide and enhance public trust in public safety.

As political leadership shifts with each election cycle, law enforcement agencies are often on the front lines of implementing the policing policy priorities of new administrations and congressional leadership. Among these policy changes, Department of Justice (DOJ) consent decrees are generally agreed to have the most far-reaching impacts on an agency’s day-to-day operations, compelling departmental leaders to deploy often drastic changes to operational policies, including those governing data collection and reporting. That need to overhaul data collection and reporting can be the source of the kinds of implementation and paperwork nightmares that keep chiefs up at night.

The success of a consent decree, in large part, hinges on how policy changes are implemented, documented, and reported. While they have little input on the consent decree terms, chiefs and departmental leaders are in the driver’s seat when it comes to fulfilling the terms of the decree and those of the monitor.

Over the next several articles, we’ll be exploring the lifecycle of a consent decree by incorporating conversations with prominent chiefs and law enforcement thought leaders who spoke at Benchmark’s Digital Science Leadership Series held at IACP 2022 in Dallas. Their real-world experience and thoughts on consent decrees form the bedrock of this series.

Consent Decree Beginnings

While the nuts and bolts of DOJ consent decrees vary from agency to agency, they all share a similar origin story. Initially, there is an incident or pattern of incidents that draws scrutiny from the media and the wider public. In the past, this could take months to develop into an issue of national attention, but in the age of 24-hour news and social media, a particularly striking example of alleged misconduct can be reported around the world before the day is over. Without a doubt, the perception of policing as a local issue is changing due to technological advances.

Once an agency is on the radar of the DOJ, federal officials investigate to determine if there is a pattern or practice of misconduct or the deprivation of constitutional rights. In political eras where the DOJ is issuing consent decrees, the process tends to move quickly given the level of bureaucracy involved. Using the contemporary example (late 2022) of the Louisville Metro Police Department, the triggering incident occurred less than two years before the expected announcement of a DOJ consent decree. Once a department is under a consent decree, daily operations shift and management duties change drastically.

“I don’t wish a consent decree on any police chief or on any police department. It is an enormous undertaking unlike anything I have ever experienced in my career,” said Chief Jason Armstrong – currently chief of Apex (NC) Police Department and whose prior experience includes managing the Ferguson (MO) Police Department’s consent decree. “I had real-life problems in the community that needed our focus, but so much of my time and energy had to be pulled away to work on the consent decree,” he shared during Benchmark’s Leadership Series panel discussion at IACP 2022.

Policing Versus Paperwork

A common observation of consent decrees – especially those that fail to reach a timely conclusion, such as Oakland, California’s – is that the requirements imposed by the DOJ aren’t aligned closely enough with the broader goals of reducing crime, improving public safety, and building positive community relations. This can create tension within a department and be a source of stress for chiefs and other departmental leaders attempting to balance their day-to-day duties and larger mission with the performance criteria of the consent decree.

Speaking at Benchmark’s IACP 2022 Digital Science Leadership Series, Virginia Gleason, former deputy director of the Oakland Police Department and 2022 Harvard Advanced Leadership Initiative Fellow, noted the potential for disconnect between a department’s operations and the stipulations of a consent decree.

“There needs to be meaningful interaction between the monitoring team, the court, and the department about what is feasible, what is affordable, and what is practical for that jurisdiction to achieve the principles of [the consent decree] instead of chasing numbers.”

Chiefs are under extraordinary pressure to ‘hit the numbers’ of a consent decree – in many cases, their jobs depend on it. Collecting, collating, and reporting data types that haven’t been dealt with before puts significant strain on an unprepared department. Without careful consideration of data analysis needs, these demands can disrupt the balance between administrative tasks and actual, patrol-oriented police work.

Rethinking Data

Data reporting, especially when there can be a perception of moving the goalposts, is more than just an annoyance in a department – it can take a severe toll on morale and retention, contributing to the staffing crisis experienced by agencies around the country. To help mitigate the effects of enhanced data collection and analysis, chiefs and agency leaders must be intentional in their methodology and policies.

Breaking down complex and multifaceted compliance tasks into smaller, more manageable pieces produced results for Ben Horwitz, Co-Founder of AH Datalytics and speaker at Benchmark’s 2022 Digital Science Leadership Series.

“You have to start by measuring something – with body worn camera compliance,” Horwitz said, speaking of his time managing the New Orleans Police Department’s consent decree. “Step one was ‘do we turn it on when we’re supposed to?’ Within three months of command staff seeing the performance [and acting on that data], we went from 70-80% compliance to 95% and above.”

The path towards compliance in this instance involved not just breaking down reporting into more manageable steps but reporting that data up the chain of command to enable more impactful decision-making. Because that data was intentionally managed, it became more valuable as it was used for both compliance and leadership’s decision-making within the department.
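Horwitz’s “step one” metric – was the camera turned on when it was supposed to be? – reduces to a simple ratio over dispatched calls. A minimal sketch, with hypothetical activation logs:

```python
# Hypothetical activation log: one entry per dispatched call, recording
# whether a body-worn camera activation was required and whether it happened.
calls = [
    {"call_id": 1, "bwc_required": True,  "bwc_activated": True},
    {"call_id": 2, "bwc_required": True,  "bwc_activated": False},
    {"call_id": 3, "bwc_required": False, "bwc_activated": False},
    {"call_id": 4, "bwc_required": True,  "bwc_activated": True},
]

# Compliance rate = activations / calls where activation was required.
required = [c for c in calls if c["bwc_required"]]
compliant = [c for c in required if c["bwc_activated"]]
rate = len(compliant) / len(required)
print(f"BWC activation compliance: {rate:.0%}")  # 2 of 3 required calls -> 67%
```

The metric is deliberately simple; its value came from being reported up the chain of command regularly enough for leaders to act on it.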

Next Steps

A consent decree is an arduous process that taxes an agency’s capacities and resources in many ways. It is expensive, time-consuming, and, without clear goals and effective management, a drain on a department’s morale that can have measurable impacts on its ability to improve public safety. While some parts of the process are outside departmental leadership’s hands, leaders are anything but powerless.

Understanding the intersection of leadership and data analytics is a powerful tool when navigating the ins and outs of a consent decree. Benchmark Analytics is committed to both contributing to the conversation surrounding these core elements and building the best data analytics tools – tools that help law enforcement leaders navigate the demands of data-driven policing and personnel management.

Advancements like drones, virtual reality training, and other hardware-based technologies are said to be “revolutionizing” policing, giving law enforcement leaders a vast array of tools to confront rising crime rates and public demands for action. Useful as these tools are, they cannot inform policy decisions or point to improvements in personnel management strategies – measures that show the greatest potential to positively impact public trust and the effectiveness of police operations as a whole. The emerging field of data science, however, has the capability to move the needle on some of these “big picture” problems.

Without intentional and research-based data collection and analysis practices, policing leaders are effectively flying blind when shaping policies and allocating resources within their agencies. Just as troubling, they may miss out on substantial cost-savings that often come with an investment in their data science capabilities. This article looks at the basics of data science in law enforcement and its potential to guide meaningful and positive changes in law enforcement.

What is Data Science?

Data science is a relatively new technical field, with the term only coming into common usage in the last 25 years. Though those in the field still debate some of the finer points of what data science encompasses, Amazon Web Services (AWS), a global leader in cloud computing, defines it in the following way:

“Data science is the study of data to extract meaningful insights for business. It is a multidisciplinary approach that combines principles and practices from the fields of mathematics, statistics, artificial intelligence, and computer engineering to analyze large amounts of data. This analysis helps data scientists to ask and answer questions like what happened, why it happened, what will happen, and what can be done with the results.”

Many in the field differentiate data science from statistics because it focuses on questions and problems unique to the digital age. Essential to this definition is the notion that data science is not a singular pursuit but, instead, a set of intersecting skill sets and techniques used to study a problem. Crucially, data scientists in fields like law enforcement and public policy use these insights to craft evidence-based solutions to these problems.

Crime Side Advances

Tracking closely with the personal computer revolution of the 1980s – which brought computing and data processing out of university labs and into homes and smaller offices – data science found an immediate use case in policing.

COMPSTAT is one of the earliest and most recognizable examples of data science methodology in law enforcement. Pioneered by the NYPD in the early 1990s, COMPSTAT incorporated crime mapping and trend analysis and is thought to have significantly impacted crime rates in cities that adopted the practice.

Building on the fundamentals of COMPSTAT, a new generation of predictive analytics shows impressive potential as a crime-fighting tool. According to the National Institute of Justice (NIJ), predictive analytics builds on established policing strategies while leveraging growing data sources to inform newer, proactive tactics. It also offers substantial cost savings, enabling policing leaders to deploy their resources more efficiently and, ultimately, more effectively.

Informing Policy and Personnel Management

In addition to its demonstrated success in field operations, data science also has a vital role on law enforcement’s administrative side. The 2015 final report from The President’s Task Force on 21st Century Policing emphasized the importance of data collection and analysis, stating, “[A] lack of relevant data impacts the ability of communities and law enforcement agencies to make informed policy and practice adjustments based on good information.” The report called for enhanced data collection efforts as a means to increase transparency and accountability – ultimately in service of improving public trust in policing.

Accurate and reliable data is crucial to modern law enforcement personnel management strategies. For instance, early intervention systems rely on this data for predictive analysis, and the stakes couldn’t be higher. False negatives derived from faulty analysis are potentially costly, contributing to an agency’s exposure to lawsuits or civil rights claims.
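As a rough illustration of how peer-group comparison can feed an early intervention flag – a deliberately simplified sketch, not Benchmark’s actual methodology – consider a z-score check against hypothetical incident counts:

```python
# A toy early-intervention flag: mark officers whose incident counts sit
# well above their peer group's average. Real systems weigh many more
# factors and are validated against research; this is illustration only.
import statistics

def flag_outliers(counts: dict[str, int], z_threshold: float = 1.5) -> list[str]:
    """Return officers whose count exceeds the peer mean by z_threshold stdevs."""
    values = list(counts.values())
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:          # no variation in the peer group: nothing to flag
        return []
    return [o for o, c in counts.items() if (c - mean) / stdev > z_threshold]

# Hypothetical peer group of five officers and their incident counts.
peer_group = {"A": 2, "B": 3, "C": 2, "D": 3, "E": 12}
print(flag_outliers(peer_group))  # -> ['E']
```

Even this crude version shows why data quality matters: a handful of missing or miscoded records would shift both the mean and the flag, producing exactly the false negatives described above.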

In building First Sign®, Benchmark’s data scientists and engineers leverage the power of the world’s largest multi-jurisdictional officer performance database while incorporating iterative learning that uses cumulative analytics to get “smarter” and more efficient over time. This technology gives supervisors a more holistic picture of an officer’s performance, especially relative to others, and enables them to engage in more targeted and meaningful interventions.

Finally, the use of data science in personnel management has the potential for substantial cost savings. According to a recent paper published by the Ash Center for Democratic Governance and Innovation at Harvard, every dollar spent on data analytics can return up to nine dollars in value to agencies. At a time of economic uncertainty, when municipal budgets are strained, the cost savings inherent in a data-informed personnel management strategy cannot be ignored.

Looking Ahead

At Benchmark Analytics, our purpose is guided by data science and evidence-based analysis. We specialize in public safety personnel management – it is our area of unique expertise. When we gather and analyze data sets, we use the product of that work for personnel management, professional standards, and early intervention. Taking this a step further, we work in partnership with our academic research consortium and use this data to contribute to a broader understanding of policing for the public good.

Incorporating data science more thoroughly into law enforcement operations allows leaders to make smarter, more cost-effective decisions, from personnel management to deploying resources in the field. Benchmark Analytics is proud to be on the leading edge of using data science to produce better policing outcomes while improving community relations.