
Algorithm as colleague? Effectiveness and legitimacy of predictive policing

Predictive policing faces a great deal of scepticism and criticism. Does this type of policing actually increase the effectiveness and efficiency of the police? The scarcity of empirical evidence makes it difficult to determine the effectiveness of predictive policing. Critics also argue that these systems lack transparency and might lead to profiling and stigmatisation of societal groups.

Predictive policing employs algorithms that analyse where and when criminal activity is most likely to occur. Most analyses of predictive policing focus on the quality of the algorithm but, in the end, the crucial success factor is the police officers who use it. Do they follow the advice of the predictive policing algorithm, or simply ignore it because their experience tells them otherwise?
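To make concrete what such a system does, here is a minimal, purely illustrative sketch in Python. It is not the actual CAS algorithm used by the Dutch police; it simply counts recent incidents in a coarse spatial grid and flags the busiest cells, the kind of 'advice' an intelligence specialist can then follow or overrule. The function name predict_hotspots and the example coordinates are hypothetical.

```python
# Illustrative sketch only: a toy grid-based "hotspot" predictor, not the
# actual CAS system. It counts recent incidents per grid cell and flags
# the cells with the highest counts.
from collections import Counter
from typing import List, Tuple

Incident = Tuple[float, float]  # (x, y) coordinates of a reported incident


def predict_hotspots(incidents: List[Incident],
                     cell_size: float = 0.5,
                     top_n: int = 3) -> List[Tuple[int, int]]:
    """Return the top_n grid cells with the most recent incidents."""
    counts = Counter(
        (int(x // cell_size), int(y // cell_size)) for x, y in incidents
    )
    return [cell for cell, _ in counts.most_common(top_n)]


# Hypothetical example data: incident coordinates from the past week.
recent_incidents = [(1.2, 3.4), (1.3, 3.1), (1.1, 3.5), (4.0, 0.2), (4.1, 0.3)]
print(predict_hotspots(recent_incidents, top_n=2))
# e.g. [(2, 6), (8, 0)] -- cells an officer could then patrol, or ignore
```

Even in this toy version, the output is only a suggestion: whether the flagged cells actually receive extra attention depends entirely on the people reading it.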

The algorithm is used for a human practice: policing. To understand its effects, we therefore need to understand the interaction between predictive policing algorithms and the intelligence specialists in the police. An algorithm that is ignored effectively does not exist! We studied the practices of a police region in the Netherlands to develop a better understanding of the role that predictive policing systems play in the organisation.

A key finding is that algorithms can be seen as ‘colleagues’ of the intelligence specialists. Many specialists talk about them in basically the same way as they would talk about colleagues. At the research site, we found three types of interaction between the predictive policing algorithm and the intelligence specialists:

  1. The algorithm as a supporting colleague: The intelligence specialists see benefits in using the Crime Anticipation System (CAS) because it provides new insights, but they still deem their own expectations and experience most important and base their advice on the local trends and issues they observe. Hence, human practices remain leading in the interaction.
  2. The algorithm as an unsuitable or unnecessary colleague: The algorithm adds no value to the advice of the intelligence specialists, either because its predictions are inaccurate or because the specialists see no benefit at the operational level. Therefore, the intelligence specialists develop their advice based on their own expectations and knowledge.
  3. The algorithm as a directive colleague: The algorithm is a directive tool that directly advises the street-level officers. The intelligence specialists explicitly communicate the output of CAS and perceive it as a legitimate tool for decision-making.

The dominant pattern recognised during the research was the supporting colleague: intelligence specialists see the algorithm as a helper, but in the end they rely on their own judgment.

These findings highlight that we should not (yet) be afraid that systems will take over the judgment of intelligence specialists. At the same time, we need to develop a better understanding of more subtle changes:

  • How do these automated colleagues change the specialists’ perspectives on the world out there?
  • Are these algorithms good friends that empower the intelligence officers, or bad friends that strengthen their prejudices?

Managing police organisations may increasingly mean being able to stimulate positive friendships between man and machine and to intervene when this relationship requires counselling.

Martijn Wessels & Albert Meijer

University of Utrecht
