
Presumption of algorithmic discrimination successfully substantiated for the first time

A student has succeeded in presenting sufficient facts to establish a presumption of algorithmic discrimination. The woman complained that the Vrije Universiteit Amsterdam (VU) discriminated against her by deploying anti-cheating software that uses face detection algorithms. The software failed to detect her when she needed to log in for exams, and she suspects this is because of her dark skin color. The university now has ten weeks to prove that the software did not discriminate.

Netherlands Institute for Human Rights, December 9, 2022

News press release

This is evident from the interim opinion published today by the Netherlands Institute for Human Rights (the College).

In brief 

  • The VU deployed software to prevent students from cheating during exams that had to be taken at home during the coronavirus pandemic. A student with a dark skin color had problems logging in several times. Several times she also lost access to the exam questions mid-exam and had to log in again and again. The woman suspects that these problems are related to her skin color.

  • The College finds that the woman succeeded in presenting sufficient facts for a presumption of discrimination. Both the university and the woman agree that the software inconvenienced her on several occasions. Academic research also shows that face detection algorithms perform less well on people with darker skin. At the same time, the VU has provided too little verifiable data showing that the software did not discriminate.

  • This is the first time that someone has succeeded in making algorithmic discrimination plausible before the College. The university now has ten weeks to prove that the software did not discriminate.

Important milestone 

"While this is an intermediate judgment, it is still an important moment in our history of judgments," said Chair Jacobine Geel. "After all, someone managed to say, 'Hey, what this algorithm is doing is strange. I suspect the algorithm is discriminating." That's a very meaningful first step. Regardless of what the final verdict will be, the chair hopes this interim verdict is an incentive for anyone to report to the College if he, she or they suspect an algorithm is discriminating.  

'Face not found' 

A woman was pursuing a master's degree at the Vrije Universiteit Amsterdam during the 2020/21 academic year. Because of the coronavirus pandemic, the university administered its exams that year (mostly) online. To prevent fraud, the institution used, among other things, so-called proctoring software ("anti-cheating software").

To access the exam questions, students had to go through a number of checks, including a "webcam check". The anti-cheating software assessed whether someone was present by applying a face detection algorithm to the webcam images. During the exam, the software also monitored movements at the student's workstation via the webcam.
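The press release does not disclose which detection algorithm the vendor uses. Purely as an illustration of how such a webcam check can produce messages like "face not found", here is a minimal sketch built on a generic off-the-shelf detector (an OpenCV Haar cascade); the function, thresholds, and messages are hypothetical, not the vendor's actual logic:

```python
import cv2

# Generic off-the-shelf face detector; the vendor's actual algorithm
# is not disclosed in the press release.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def webcam_check(frame) -> str:
    """Hypothetical presence check run before releasing exam questions."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Arbitrary illustrative threshold: flag frames that are too dim.
    if gray.mean() < 40:
        return "room too dark"
    # Haar cascades key on brightness-contrast patterns, one documented
    # reason classical detectors fail more often in poor lighting and
    # on darker skin tones.
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return "ok" if len(faces) > 0 else "face not found"

cap = cv2.VideoCapture(0)  # default webcam
ok, frame = cap.read()
if ok:
    print(webcam_check(frame))
cap.release()
```

A detector like this returns a hard yes/no per frame, so any group-level difference in detection rates translates directly into some students being locked out more often than others.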

The woman claims that the system often failed to detect her. She then received messages such as "face not found" or "room too dark". In addition, the software denied her access to the exam questions several times during an exam, after which she had to log in again, a so-called "restart".

VU: no discrimination 

The VU argues that there is no discrimination. It has inquired with the supplier, which denies any discrimination by the software. The supplier asked an external party to investigate, and that party found nothing to indicate discrimination.

The university also reviewed the log data, which, it argues, does not show that the woman was inconvenienced markedly more than other students when logging in, or that she experienced an exceptionally high number of restarts.

Suspicion: software works worse for people with darker skin color

The woman had no effective opportunity to examine how the anti-cheating software works. The Court of Justice of the European Union has ruled in comparable cases that a person in such a situation may support a presumption of discrimination with more general data.

The College notes that there is academic research showing that face detection software generally performs worse on people with darker skin.

In addition, both the woman and the university agree that the woman was inconvenienced by the software on some occasions.   

Together, these facts are enough to put the ball back in the university's court and ask it to produce more evidence that the algorithm did not discriminate.  

The College finds that the university has not (yet) met that requirement. The external research cited by the software vendor is not public and cannot be reviewed. The data provided by the university itself is incomplete; for example, it does not sufficiently compare the woman's situation with that of other students. The VU is therefore asked to further substantiate its position that there is no discrimination.
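The opinion does not spell out what an adequate comparison would look like. As a rough sketch of the kind of log-data analysis at issue, one could compare an individual student's restart rate against the cohort's distribution; all field names and figures below are invented for illustration:

```python
from statistics import mean, stdev

# Hypothetical log records, one entry per student per exam sitting.
# The VU's actual log format is not public.
logs = [
    {"student": "A", "restarts": 0},
    {"student": "A", "restarts": 1},
    {"student": "B", "restarts": 0},
    {"student": "C", "restarts": 4},
    {"student": "C", "restarts": 3},
]

# Average number of restarts per student across their sittings.
per_student = {}
for entry in logs:
    per_student.setdefault(entry["student"], []).append(entry["restarts"])
rates = {s: mean(r) for s, r in per_student.items()}

# Compare one student's rate with the cohort distribution: a z-score
# well above zero would suggest she was hit markedly more often.
cohort = list(rates.values())
mu, sigma = mean(cohort), stdev(cohort)
z = (rates["C"] - mu) / sigma
print(f"student C: {rates['C']:.1f} restarts/exam, cohort mean {mu:.1f}, z = {z:.1f}")
```

A real analysis would of course need far more sittings, control for factors such as exam length and connection quality, and define up front what counts as "exceptionally more inconvenienced".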

After ten weeks

In ten weeks, the College will review the new evidence submitted by the university. It may find it necessary to hold another hearing to discuss that evidence, or it may be able to reach a verdict after reviewing the evidence alone; the College will decide that only once it has seen the evidence.

Report strange decisions to the College! 

People who have had similar experiences, or who have received inexplicable decisions from organizations (employers or the government, for example), are encouraged by the College to first inquire whether an algorithm was used and, if so, to ask how the algorithm works. If the organization cannot explain this to their satisfaction, the College invites them to get in touch: there may have been algorithmic discrimination.
