The probation service uses algorithms irresponsibly. Its most important algorithms have several flaws that could have negative consequences for society, suspects, and convicted offenders. This is evident from research conducted by the Inspectie Justitie en Veiligheid (Inspectie JenV).

For example, formulas were swapped in one of the algorithms. As a result, this algorithm never accurately predicted the risk of a suspect or convicted person reoffending. Calculations by the Inspectorate show that this led to the risk of recidivism being incorrectly assessed in approximately a quarter of the recommendations. This algorithm also contains variables that could lead to discrimination. The probation service must quickly remedy the shortcomings or temporarily suspend that algorithm, according to the Inspectorate.
The probation service provides advice on suspects and convicted offenders to the Openbaar Ministerie (Public Prosecution Service) and to judges. It uses algorithms to predict the risk of someone reoffending. The OXREC algorithm in particular is risky: it is used in every advisory process, approximately 44,000 times a year.
The Inspectie JenV has found errors in the OXREC software. Since its introduction in 2018, formulas for prisoners and suspects have been mixed up and incorrect figures have been used. In most cases, this results in the risk of recidivism being underestimated, particularly for drug users and people with serious mental illnesses such as psychosis. The Inspectorate has not investigated to what extent probation officers and judges adopted the incorrect advice, with possible risks for society, suspects, and convicted offenders themselves.
The OXREC data is significantly outdated, and the algorithm was developed for a different target group than the one to which the probation service applies it. Furthermore, the OXREC does not comply with privacy legislation. The Inspectorate also sees a risk that employees will adopt its results too readily and no longer rely on their own judgment: they are told that their own judgment is as reliable as "flipping a coin," a comparison that overstates the algorithm's reliability.
The OXREC uses variables that can lead to discrimination, namely 'neighborhood score' and 'income level.' Several years ago, the Netherlands Institute for Human Rights stated that the use of these variables in algorithms is prohibited in principle, unless it can be substantiated why their use is necessary and additional measures are taken. However, the probation service has not done so.
According to the Inspectorate of Justice and Security, the probation service does not sufficiently monitor the use and maintenance of the algorithms. As a result, the shortcomings identified have gone unnoticed for years.
The Inspectorate recommends that the shortcomings be remedied as quickly as possible or that the OXREC be shut down. In addition, the Inspectorate recommends that the probation service set up and implement a structure for the development, use, and maintenance of algorithms. The probation service must retrain the algorithms on new data and verify that they are being used correctly. It must also comply with the relevant legal standards and guidelines, such as privacy testing. The Inspectie JenV wants to receive updates from the probation service every six months on the progress of these improvements. The probation service has announced that it will temporarily stop using the OXREC.
