Smart Police: The Worrisome Predictive Policing Software

There is no shortage of techno-security innovations on the “safe cities” market. Among them, the Smart Police software aims to facilitate the work of municipal police by offering, among other things, a delinquency prediction module. Like other “predictive policing” tools, however, it could threaten certain public freedoms.

“350 cities improve the peace and security of their citizens with our solutions,” asserts Edicia, the Nantes-based company behind the Smart Police software, on its website. The description that follows suggests an interface with multiple objectives: a sort of “all-in-one” tool to facilitate the work of municipal police.


Smart Police makes it possible to dematerialize administrative tasks, but that is not all. “Optimize your means and resources thanks to Artificial Intelligence,” the designer's website also tells us: the software is meant to enable the “anticipation and visualization of delinquency events,” thanks to its predictive policing module.

AI to predict delinquency

Science did not wait for the explosion of digital technologies to try to predict crime: in the middle of the 19th century, the French statistician André-Michel Guerry was already studying the “social attitudes” that could lead people to break the law. With the development of AI, however, predictive policing has moved to another level.

In 2018, Edicia filed a patent titled “Method and system for monitoring and preventing dysfunctions in territorial security”. The Smart Police predictive module was born. Without claiming to predict crimes, it is supposed to anticipate incidents of delinquency that could occur soon, indicating “where” and “when” precisely enough to allocate the necessary staff.

As part of its action-research project Technopolice, the association La Quadrature du Net has investigated several predictive policing tools. It obtained the Smart Police user manual, in which several features are presented in detail.


Thus, two sources of data would be used to feed the prediction algorithm (an illustrative sketch of how such inputs might be combined follows this list):

  • On the one hand, field data entered into Smart Police, in particular the location of patrols or the “alerts” issued by agents.
  • On the other hand, “external” data, such as socio-economic indicators on the population, town-planning data, the weather, or information drawn from social networks.
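How exactly such inputs would be weighted and combined is not public. Purely as an illustration, and under invented assumptions (the sector names, features and weights below are made up and bear no relation to Edicia's actual model), a generic per-sector risk score might mix field data and external indicators along these lines:

```python
# Purely illustrative sketch: a generic per-sector "risk score" combining the
# kinds of inputs listed above. This is NOT Edicia's algorithm (which is not
# public); sector names, features and weights are invented for the example.

from dataclasses import dataclass


@dataclass
class Sector:
    name: str
    recent_alerts: int        # "alerts" raised by agents in the app (field data)
    patrol_hours: float       # patrol presence logged in the app (field data)
    unemployment_rate: float  # socio-economic indicator (external data)
    rain_probability: float   # weather forecast, 0..1 (external data)


def risk_score(s: Sector) -> float:
    """Toy hand-weighted sum; a real system would learn weights from data."""
    score = (
        2.0 * s.recent_alerts
        + 10.0 * s.unemployment_rate
        - 0.5 * s.patrol_hours        # patrols already present lower the "need"
        - 3.0 * s.rain_probability    # assumes bad weather correlates with fewer incidents
    )
    return max(score, 0.0)


sectors = [
    Sector("centre", recent_alerts=4, patrol_hours=6.0,
           unemployment_rate=0.08, rain_probability=0.2),
    Sector("gare", recent_alerts=9, patrol_hours=2.0,
           unemployment_rate=0.15, rain_probability=0.2),
]

# Rank sectors by descending risk, as a dashboard might
for s in sorted(sectors, key=risk_score, reverse=True):
    print(f"{s.name}: risk {risk_score(s):.1f}")
```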

How this data is then processed by the AI is a question that an outside observer cannot answer: “The Edicia user manual does not provide precise information that would allow one to see how the predictive policing module works,” explains Félix Tréguer.

This lack of transparency creates a “black box” effect, says Simon, a fundamental-rights lawyer and member of the local Technopolice campaign in Montpellier. “The private nature of the companies that manufacture this type of software means that the programs are not ‘free software’,” he specifies. “The source code is not made public, for reasons of industrial secrecy.”

Tom Cruise in Minority Report, the reference in sci-fi films on predictive policing

Correlation is not causation

Certain risks are common to the majority of predictive policing tools, such as their reliance on criminological beliefs that sociology has called into question. One example is the “broken windows” theory, according to which small acts of delinquency call for a firm and rapid police response in order to avoid an escalation towards more serious acts.

More broadly, predictive policing technologies sometimes rely on statistical correlations that are not sufficient to establish causal links. Smart Police seems to be no exception: “Postulates of the same order (as the broken windows theory, editor's note) seem to be built into the Smart Police predictive module,” deplores Félix Tréguer. “According to our sources, workshops were even organized with the police to code into the algorithm the knowledge or beliefs that agents may have formed on the basis of their experience.” On this last point, none of the municipal police forces we contacted responded.

Moreover, a recent report from the European Union Agency for Fundamental Rights indicates that one of the risks linked to predictive policing algorithms is that of “self-reinforcing effects”, or “feedback loops”. The Quadrature du Net report describes it as follows: “When a large number of patrols are sent to a given area in response to the algorithm's recommendations, they will be led to record infractions, even minor ones, and to collect data relating to this area, which will in turn be taken into account by the software and will contribute to increasing the probability that this same area will be perceived as ‘at risk’.” This effect can then lead to a strengthening of the police presence in certain neighborhoods.
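To make this loop concrete, here is a minimal, entirely hypothetical simulation (the zones, rates and patrol counts are invented and not drawn from Smart Police or any real deployment): two zones have identical underlying incident rates, patrols are allocated in proportion to past recorded incidents, and incidents are only recorded where patrols are present. The zone that happens to start with slightly more records keeps attracting more patrols and therefore keeps generating more records.

```python
# Minimal, entirely hypothetical simulation of a "feedback loop": both zones
# have the SAME true incident rate, but patrols go where past records are
# higher, and incidents are only recorded where patrols are present.
import random

random.seed(0)

TRUE_RATE = {"A": 0.10, "B": 0.10}   # identical underlying incident rates
recorded = {"A": 12, "B": 10}        # zone A happens to have a few more past records
TOTAL_PATROLS = 100

for week in range(1, 11):
    total = sum(recorded.values())
    # Allocate patrols proportionally to past recorded incidents
    patrols = {z: round(TOTAL_PATROLS * recorded[z] / total) for z in recorded}
    for zone, n in patrols.items():
        # Each patrol can only record the incidents it actually witnesses
        witnessed = sum(random.random() < TRUE_RATE[zone] for _ in range(n))
        recorded[zone] += witnessed
    print(f"week {week:2d}: patrols {patrols}, recorded {recorded}")
```

Even in this toy setting, the recorded statistics end up reflecting where patrols were sent at least as much as where incidents actually occur, so the initial gap between the two zones is never corrected.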

As for the effectiveness of the software, the company claims on its site “+50% effectiveness of delinquency prevention plans” (sic), “-20% reduction in incivility and unsanitary conditions” and “-30% reduction in the feeling of insecurity”, but it has so far not released any study allowing these figures to be verified.

What real use on the ground?

Documents attest to the sale of Smart Police to several local authorities, such as the Nice Metropolis, the Roissy Pays de France urban community and the town of Etampes, which renewed its contract in December 2023. However, it is difficult to assess how Smart Police is actually used in the field: “Our CADA requests did not provide any evidence of the use of the predictive component of Smart Police, apart from the city of Nice, which indicated that it had never used it,” notes the Quadrature du Net report.

Today, Smart Police appears to be bundled with two other programs, Smart Control and Smart Security, on a single platform called City Zen, whose trademark was registered by Edicia on February 26, 2018.

Furthermore, Edicia does not seem content with the French market alone. A 2018 report from the town of Morrison, in the United States, indicates that its police department signed a contract with the company. In 2019, Edicia sold its software to Denver, the capital of Colorado, where it even opened a local office.

While the extent of Smart Police's real-world use may be difficult to assess, La Quadrature du Net is not alone in worrying about the issues linked to predictive policing: during the discussions around the AI Act in June 2023, the European Parliament voted in favor of classifying these technologies in the category of so-called “unacceptable” AI.

However, the political agreement reached in December of the same year contains only a partial ban on these technologies: it does not prohibit predictive policing software based on geographical approaches, such as Smart Police.

