The Utrecht Data School and Prof. Janneke Gerards developed the Human Rights and Algorithms Impact Assessment (IAMA), a tool that supports careful decision-making about the use of algorithms. On Tuesday, the House of Representatives adopted a motion by MPs Kauthar Bouchallikh (GroenLinks) and Hind Dekker-Abdulaziz (D66) to make the use of the IAMA mandatory.

Algorithms may seem unbiased, but they are not. If they are to support governments and businesses in carrying out legal obligations in the future, carelessness, ineffectiveness or, worse, human rights violations must be ruled out. A repeat of the benefits affair, the scandal surrounding ethnic profiling by the Tax Administration, must be prevented at all costs.
To this end, Mirko Schäfer, Arthur Vankan and Iris Muis of the Utrecht Data School, together with Professor of Fundamental Rights Janneke Gerards, developed the IAMA at the request of the Ministry of the Interior and Kingdom Relations (BZK). The adopted motion paves the way for mandatory impact assessments before algorithms are used to evaluate people and make decisions about them.
