Unlocking and harnessing the power of “big data” is a common challenge for modern professionals, and law enforcement is no exception.

That was the central theme of one of the opening sessions – “Improving the Collection and Utility of Law Enforcement Use of Force Data” – from the 123rd annual International Association of Chiefs of Police Conference in San Diego, Calif.

But as Thomas Taffe, an inspector with the New York City Police Department (NYPD), noted at the start of the session, data analysis in law enforcement is hardly a new practice. In fact, the NYPD began tracking data tied to officer-involved shootings in 1971. In 1972, there were 994 officer-involved shootings in the city, but what happened next is particularly noteworthy: once the data was reported, officer-involved shootings dropped 33 percent the following year and have continued to drop ever since. In 2015, there were 67 total shootings in New York.

As Taffe described, taking a robust, analytical look at force “helped us get a handle on these issues.” And while he acknowledged the NYPD has had controversial moments when it comes to data, specifically the debate surrounding the effectiveness of its former “Stop-and-Frisk” program, Taffe explained that the department has worked to improve how data is gathered, categorized, analyzed and used.

To do this, the NYPD met with unions, advocacy organizations and others to make sure their input was “in the fold” and to create transparency. As Taffe proudly noted, out of the 422,503 arrests the NYPD made in the last year, there were only 49 officer-involved shootings. In other words, tracking and leveraging the data to improve officer training has been effective in the city. But as Amy Blasher, a unit chief with the FBI, noted, the problem extends far beyond any single city.

“We’ve all seen the headlines, we’re constantly reminded of what’s happening nationwide,” Blasher said. As she explained, law enforcement agencies across state, local, federal and tribal jurisdictions need to take a better look at the data behind officer-involved use-of-force incidents. The challenge, however, has been to find and leverage a consistent data model.

That is why, as Blasher described, law enforcement agencies at all levels joined forces in 2015 to standardize their reporting. While the states themselves are free to decide how use-of-force data is aggregated and reported, the idea is to collect uniform data points, including the total number of officer incidents, where the incidents occurred, the number of agencies involved in each incident, how weapons were involved, and more. Moreover, there is no “free text input of this data”; it is all based on standardized empirical data and yes/no inputs, she added.
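The standardized model Blasher describes, fixed data points and yes/no inputs rather than free-text narrative, can be sketched as a simple typed record. The field names below are purely illustrative, not the FBI's actual schema:

```python
from dataclasses import dataclass

# Hypothetical sketch of a standardized use-of-force record: every field is
# a fixed, typed data point or a yes/no flag -- no free-text narrative.
@dataclass
class UseOfForceRecord:
    incident_date: str          # ISO date string, e.g. "2016-10-15"
    jurisdiction: str           # reporting jurisdiction code (illustrative)
    agencies_involved: int      # number of agencies at the incident
    officer_count: int          # number of officers who used force
    firearm_discharged: bool    # yes/no input
    subject_armed: bool         # yes/no input

def validate(record: UseOfForceRecord) -> bool:
    """Reject records with non-positive counts; all other fields are typed."""
    return record.agencies_involved >= 1 and record.officer_count >= 1

r = UseOfForceRecord("2016-10-15", "CA-SD", 1, 2, False, True)
print(validate(r))  # True
```

Constraining every field to a closed type like this is what makes records from thousands of agencies directly comparable and aggregable, which free-text descriptions would not be.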

As a result of this effort, the FBI will gather all this data from law enforcement agencies across the country and put it into a central hub called the Law Enforcement Enterprise Portal (LEEP), which is set to launch in January 2017.

Gathering the data was the first challenge, but as Blasher noted, the next priority was determining how the data would be published. The decision was made to push use-of-force and other data back to agencies on a quarterly basis, with public reporting to occur on a “semi-regular basis” after that.