PONT Data&Privacy
Marlies van Eck: "Every automated decision system should have a Kafka button"

In order to make decisions quickly and efficiently, the government often uses automated decision-making. However, because the decision-making systems are complex, it is difficult to test whether they are in accordance with the law, so the danger of human rights violations lurks. If a mistake is made, citizens often have to take legal action to obtain their rights. How can citizens be offered legal protection in automated decision-making? Data&Privacyweb talked about this with Marlies van Eck. She wrote a dissertation on legal protection in automated chain government decisions.

27 September 2021

One pitfall with automated decisions is that some facts and circumstances cannot be considered in the decision. How can this be explained?

"A computer can only work with hard standards. But the law also contains many open standards. Suppose an article of law states that a person is legally resident in a municipality if he or she resides there 'for the most part'. A computer can do nothing with a term like 'for the most part'. Such a term will therefore have to be translated into a hard standard, say '80 percent of the time'.
When a computer then starts making decisions based on such a hard standard, the nuance falls away. In cases where the situation is not immediately clear-cut, a problem arises. For example, a person may alternate between periods of living abroad for work and living in the municipality. That is difficult to translate into a hard standard."
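The loss of nuance Van Eck describes can be illustrated with a minimal sketch. The function, threshold, and day counts below are all hypothetical, chosen only to mirror the '80 percent of the time' example above:

```python
# Hypothetical sketch: the open norm "resides there for the most part"
# hard-coded as an 80 percent threshold. All names and numbers are
# illustrative, not an actual government rule.

def is_legal_resident(days_in_municipality: int, total_days: int,
                      threshold: float = 0.80) -> bool:
    """Hard standard: resident if present at least `threshold` of the time."""
    return days_in_municipality / total_days >= threshold

# A clear-cut case works fine:
print(is_legal_resident(330, 365))  # True

# Someone who alternates between working abroad and living in the
# municipality falls just under the threshold and is rejected,
# with no room for the nuance an official could apply:
print(is_legal_resident(280, 365))  # 280/365 ≈ 0.77 → False
```

The computer cannot see that the second case may well satisfy the open legal norm; it only sees a number below the threshold.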

But the citizen can object when a mistake is made, right?

"That's right. According to Article 22 of the General Data Protection Regulation (GDPR, in Dutch: AVG), automated decisions must come with adequate safeguards for the data subject. According to the legislator, the Netherlands has fulfilled this duty through the objection and appeal procedures against government decisions. My research shows that those procedures provide insufficient legal protection. It still happens too often that someone calls to rectify a situation and is told by the employee, 'I can't do anything about it; I didn't build the system either.' That should change."

How can this deficiency be addressed?

"No matter how hard the law is, there will always be exceptional situations. So computer programs must be built in such a way that it is possible to deviate when someone invokes an exception. Every automated decision system should have a 'Kafka button.'"
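The 'Kafka button' idea can be sketched in a few lines. This is only one possible reading of the safeguard Van Eck proposes, with all names invented for illustration: the automated rule still decides the normal cases, but invoking an exception routes the case to a human instead of letting the computer decide.

```python
# Hypothetical sketch of a "Kafka button": an escape hatch that takes the
# decision away from the system when a citizen invokes an exception.
# All names are illustrative.

from dataclasses import dataclass

@dataclass
class Decision:
    outcome: str     # "granted", "denied", or "human_review"
    decided_by: str  # "system" or "caseworker"

def decide(meets_hard_standard: bool, exception_invoked: bool) -> Decision:
    if exception_invoked:
        # The Kafka button: deviation is possible, a person takes over.
        return Decision("human_review", "caseworker")
    return Decision("granted" if meets_hard_standard else "denied", "system")

print(decide(True, False))   # routine case: the system decides
print(decide(False, True))   # exception invoked: routed to a caseworker
```

The design choice is that the override does not overrule the law; it only moves the case out of the automated path, so that an official can weigh the facts the hard standard cannot see.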

With chain decisions, the risk of Kafkaesque scenes is even greater. What safeguards are needed here?

"Government services are very sophisticated because of data sharing. It's great that you can buy a car and receive the registration certificate within a day. But when something goes wrong, the burden is often placed on the citizen. In practice, the citizen often has to go to court to obtain justice. One solution the government is now working on is the 'no wrong door' principle. This means that a citizen with a problem can report it to the government and will not be sent from counter to counter. The government then has a duty to go through all the chains and find out what went wrong. For victims of identity fraud, a start has been made on this: the government then offers help. It would be better still if the government actively fixed errors at a central point."

A second pitfall is that decision systems are often so complex that they cannot be understood by the average person. Who tests whether such a system complies with the law?

"Again, you see that it often ends up on the citizen's plate. Only when a citizen brings a lawsuit does the judge, in theory, test the legality of the decision system. It's a strange idea that both the citizen and the judge should have to understand how those underlying algorithms work, because that is a profession in itself. It would be better if the government were given a duty of care and held accountable for what systems it has put in place and what the risks are. A regulator would then have to review whether the systems actually constitute a proper implementation of the applicable laws. This gives the citizen facing a decision confidence that the system has recently been tested and no oddities were found. Then, if there is still an error in a decision, the judge can look at that individual decision without having to test the system itself."

How can citizens gain access to the decision-making process?

"Under the GDPR, a data subject already has the right to an explanation of how the automated process works. I already see insurers doing this, informing you as an applicant. The government could do this too. There are already experiments with, for example, a viewing guide for algorithms. So it is emerging, but it could be much better."

Are there any other parties who can contribute to legal protection for automated decisions?

"When it comes to automated decisions, almost no lawyer in the field of administrative law makes the link to the GDPR. Conversely, when litigating about the GDPR, a link is almost never made to the concept of a decision and to administrative law, while there are an incredible number of connections. I would therefore urge lawyers to get to work on this. After all, case law can only emerge if there are lawyers who know how to take this route."

Marlies van Eck currently works at Radboud University Nijmegen and at Hooghiemstra and Partners, where she is working on an evaluation of the General Data Protection Regulation Implementation Act.
