JUSTICE

The justice administration in smart cities: between streamlining and automation

Author | Eduardo Bravo

Improving the living standards of citizens in smart cities does not just involve urban planning, environmental or mobility issues. It also affects aspects such as effective judicial protection, the administrative management of fines and claims relating to these.

According to the Eurobarometer published in 2019, more than 50% of Spaniards rate the Spanish judicial system as “quite bad” or “very bad”. Only in Croatia, Slovakia and Bulgaria is public opinion of the justice system lower.

To improve this situation, the Government has started using artificial intelligence to resolve some simple cases involving traffic and administrative fines, and to streamline the processing of certain judicial cases which, given their complexity, involve large volumes of information.

Accordingly, the General Council of the Judiciary signed an agreement in 2017 with the Secretary of State for the Information Society and the Digital Agenda for the Spanish justice administration to use artificial intelligence to consult case law and complex proceedings quickly and effectively. Thanks to language technologies, the system analyses hundreds or thousands of documents and interprets them as a human would, in order to locate any data relevant to a specific case.
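The article does not describe how these systems work internally, but the core idea of ranking a large document collection by relevance to a query can be illustrated with a deliberately simple TF-IDF sketch (the sample documents and query below are invented; production case-law tools use far richer language models):

```python
import math
from collections import Counter

def tf_idf_rank(query, documents):
    """Rank documents by cosine similarity of their TF-IDF vectors.

    A toy illustration of text retrieval; real legal-tech systems
    rely on much more sophisticated language technologies.
    """
    tokenized = [doc.lower().split() for doc in documents]
    n = len(tokenized)
    # Document frequency: in how many documents does each term appear?
    df = Counter(term for doc in tokenized for term in set(doc))
    # Inverse document frequency, smoothed so common terms still count.
    idf = {t: math.log(n / df[t]) + 1.0 for t in df}

    def vec(tokens):
        tf = Counter(tokens)
        return {t: tf[t] * idf.get(t, 0.0) for t in tf}

    def cosine(a, b):
        dot = sum(a[t] * b.get(t, 0.0) for t in a)
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    q = vec(query.lower().split())
    scores = [(cosine(q, vec(doc)), i) for i, doc in enumerate(tokenized)]
    return [i for _, i in sorted(scores, reverse=True)]

# Hypothetical mini-corpus of case summaries.
docs = [
    "ruling on traffic fine appeal and speeding",
    "property dispute over boundary wall",
    "appeal against administrative traffic sanction",
]
print(tf_idf_rank("traffic fine appeal", docs))  # most relevant first
```

Here the document sharing all three query terms ranks first, the one sharing two ranks second, and the unrelated one last.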

In this regard, Microsoft is currently developing technologies to assist judges. Machine Reading Comprehension (MRC) is one particular field that could help reduce certain expenses. The company is currently working on this technology to help in reviewing case law and, according to Kirk Arthur, Sr. Director of Microsoft’s Government Industry division, “although the technology has a ways to go, the promise of MRC can greatly improve the amount of time it takes a judge to review existing case law in order to pass down a ruling”.

Part of citizens’ mistrust of the justice system stems from doubts about the impartiality of magistrates: in any judicial decision there is a human element that could compromise an objective analysis of the facts. Given this situation, in common law countries a growing number of companies are emerging that use artificial intelligence to analyse the rulings issued by a judge and identify patterns in the criteria applied when judging.

«Instincts are good. Foresight is better» is the slogan of Blue J Legal, a company that claims to be able to «accurately predict court outcomes». Premonition goes one step further, claiming that it can determine «which lawyers win which cases in front of which judges», since knowing the strategy used by the opposing lawyers is also important.

This mistrust of possible human error leads many people to believe that artificial intelligence could deliver aseptic, efficient and equitable justice: if machines were judging, they believe, there would be no margin for error. The problem is that algorithms still depend on a human component. From the moment a human being is responsible for programming the artificial intelligence, it may be contaminated by that same person’s cognitive biases.

An example of this is what happened in the United States with COMPAS (Correctional Offender Management Profiling for Alternative Sanctions), an artificial intelligence system used in the justice administration which, when informing the sentence, does not just assess the facts of the defendants’ lives, but also data concerning their place of birth, the district in which they live or their family history. This has led people who live in troubled districts, or whose families have a history of violence, to receive longer sentences than others for the same acts.
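COMPAS itself is proprietary, but the mechanism described above can be illustrated with a deliberately naive sketch. All data and the scoring rule here are invented for illustration: a "risk score" learned from skewed historical records ends up penalising a defendant's district rather than their conduct.

```python
from collections import defaultdict

# Hypothetical historical records: (district, reoffended) pairs.
# The imbalance reflects heavier policing of district "A", not
# different individual behaviour.
history = [
    ("A", True), ("A", True), ("A", False),
    ("B", False), ("B", False), ("B", True),
]

def district_risk(records):
    """Naive 'risk score': observed reoffence rate per district."""
    counts = defaultdict(lambda: [0, 0])  # district -> [reoffences, total]
    for district, reoffended in records:
        counts[district][0] += int(reoffended)
        counts[district][1] += 1
    return {d: r / t for d, (r, t) in counts.items()}

risk = district_risk(history)
# Two defendants with identical conduct receive different scores
# purely because of where they live.
print(round(risk["A"], 3), round(risk["B"], 3))
```

The point of the sketch is that the bias lives in the training data: any model fitted to these records, however sophisticated, will reproduce the disparity between districts.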

At the same time, protecting all this information is of critical importance, especially in countries which keep alleged offenders anonymous until proceedings are under way or they have been convicted. Microsoft, like any other company, must invest in the protection of its services, but it must also comply with rules that can differ greatly from country to country. In the European Union, deploying technologies that involve personal data must be done in compliance with the GDPR.

Given these experiences, many argue that it is too soon to entrust artificial intelligence with decisions related to effective judicial protection. For now, it is used only to impose administrative fines where the margin of discretion is small. In Spain, for example, the Directorate-General for Traffic is using drones that work with algorithms and machine learning to detect driving offences such as talking on the phone, not wearing a seat belt, ignoring traffic signs or driving in restricted areas. This is just a small taste of what lies ahead.

Images | Sang Hyun Cho, Succo, NakNakNak, Gerd Altmann