Board rules: Facebook's ad algorithm discriminates based on gender

The Clara Wichmann Foundation and Global Witness asked the Board to assess whether Facebook's advertising algorithm discriminates on the basis of gender by showing job ads for stereotypically male and female positions primarily to users of the corresponding gender.

Human Rights Board, February 19, 2025

Meta Ireland is responsible for offering Facebook in Europe. The company makes money by selling personalized ads, including job ads. An algorithm determines which ad is shown to which user.  

Gender is part of the algorithm

Meta's algorithm learns from the click behavior of Facebook users, which can create a one-sided picture of that user. Left unchecked, this can lead the algorithm to reinforce stereotypes. Meta acknowledges that gender is one of the data points used by the algorithm and does not dispute that the algorithm can produce stereotyping.
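To make the feedback-loop concern concrete, here is a minimal toy simulation in Python. It is emphatically not Meta's system: the click-through rates, batch sizes and the greedy "show the ad to whoever clicked more" rule are all hypothetical. It only illustrates how a small early difference in click behavior can snowball into an extreme skew in who gets to see a job ad.

```python
# Toy feedback-loop simulation; all numbers and the delivery rule are hypothetical.
import random

random.seed(0)

# Assumed "true" click-through rates: women click this particular ad slightly more often.
TRUE_CTR = {"female": 0.06, "male": 0.05}

shown = {"female": 50, "male": 50}    # small, balanced pilot delivery
clicked = {"female": 3, "male": 2}    # clicks observed during the pilot

for _ in range(50):                   # 50 delivery rounds of 100 impressions each
    # Estimate the click-through rate per gender from the behavior observed so far...
    est_ctr = {gender: clicked[gender] / shown[gender] for gender in shown}
    # ...and greedily give the entire next batch to the "best-performing" gender.
    target = max(est_ctr, key=est_ctr.get)
    shown[target] += 100
    clicked[target] += sum(random.random() < TRUE_CTR[target] for _ in range(100))

total = sum(shown.values())
for gender, impressions in shown.items():
    print(f"{gender}: {impressions / total:.0%} of impressions")
```

Because the group that falls behind stops receiving impressions, its click-through estimate is never corrected, so the run ends with one gender receiving nearly all impressions. That self-reinforcing effect is exactly what unmonitored click-based optimization can produce.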

Research by Global Witness shows that a job ad for a receptionist position was shown to female Facebook users in 96% (2022) and 97% (2023) of cases. A job ad for a mechanic position was shown to male Facebook users in 96% of cases in both 2022 and 2023.
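Assuming, purely for illustration, equally large groups of male and female users who could have seen each ad (an assumption not made in the research itself), those percentages translate into large per-person exposure gaps; the short Python sketch below just works out the ratios from the reported shares.

```python
# Delivery shares as reported by Global Witness; the equal-audience assumption is hypothetical.
reported_shares = {
    ("receptionist", 2022): {"female": 0.96, "male": 0.04},
    ("receptionist", 2023): {"female": 0.97, "male": 0.03},
    ("mechanic", 2022):     {"female": 0.04, "male": 0.96},
    ("mechanic", 2023):     {"female": 0.04, "male": 0.96},
}

for (job, year), shares in reported_shares.items():
    majority = max(shares, key=shares.get)
    minority = min(shares, key=shares.get)
    ratio = shares[majority] / shares[minority]
    print(f"{job} ({year}): shown to {majority} users roughly {ratio:.0f}x as often as to {minority} users")
```

On these figures the receptionist ad reaches women roughly 24 to 32 times as often as men, and the mechanic ad shows the mirror image.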

Indirect discrimination

The Board ruled that there was indirect discrimination based on gender. With indirect discrimination, a practice or provision appears neutral but nevertheless disproportionately affects people of a particular gender. Indirect discrimination is prohibited unless there is a good reason for it (an objective justification). Whether such a justification exists depends on the purpose of the distinction and the means used to achieve it: the aim must be legitimate, and the means employed must be appropriate and necessary.

Legitimate purpose?

Meta states that it wants to offer advertisers the best service and value for money, by delivering job ads in a way that helps advertisers achieve their goals. Meta also wants to give Facebook users the best experience by showing them the ads most likely to interest them. The Board ruled that this is a legitimate aim.
But is the means used necessary to achieve these aims? For that, the means must meet two requirements:

  • Proportionality: is the interest served in proportion to the intrusion it causes?

  • Subsidiarity: is there a less intrusive means of achieving the goal?

Means too far-reaching

The Board ruled that the means did not meet these two requirements. As a social media platform, Meta has a responsibility to properly monitor how the algorithm works. It also needs to investigate whether and how stereotyping occurs in the ad algorithm. What data are the algorithms trained on? And how do the algorithm's selections work out for different groups of people? Where necessary, Meta must take measures to counteract the reinforcement of stereotypes.
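As one illustration of what such monitoring could look like (a sketch under assumed numbers, not Meta's actual tooling), the Python snippet below compares each ad's delivery share per gender against a hypothetical 50/50 eligible-audience baseline and flags ads whose deviation exceeds a review threshold. The baseline, the threshold and the impression counts are assumptions; only the percentages echo the Global Witness findings.

```python
# Sketch of a demographic-parity check on ad delivery. The baseline, tolerance and
# impression counts are hypothetical; this is not Meta's actual monitoring.
from dataclasses import dataclass

AUDIENCE_BASELINE = {"female": 0.50, "male": 0.50}  # assumed eligible audience split
TOLERANCE = 0.20                                    # assumed review threshold

@dataclass
class AdDelivery:
    ad_id: str
    impressions: dict  # gender -> number of impressions delivered

def flag_skewed_ads(deliveries):
    """Return (ad_id, gender, share) for ads deviating from the baseline by more than TOLERANCE."""
    flagged = []
    for ad in deliveries:
        total = sum(ad.impressions.values())
        shares = {g: ad.impressions.get(g, 0) / total for g in AUDIENCE_BASELINE}
        worst = max(shares, key=lambda g: abs(shares[g] - AUDIENCE_BASELINE[g]))
        if abs(shares[worst] - AUDIENCE_BASELINE[worst]) > TOLERANCE:
            flagged.append((ad.ad_id, worst, round(shares[worst], 2)))
    return flagged

ads = [
    AdDelivery("receptionist-2023", {"female": 970, "male": 30}),
    AdDelivery("mechanic-2023", {"female": 40, "male": 960}),
    AdDelivery("project-manager", {"female": 520, "male": 480}),
]
print(flag_skewed_ads(ads))
# [('receptionist-2023', 'female', 0.97), ('mechanic-2023', 'female', 0.04)]
```

A real audit would go further, for example by also examining the training data and by using a baseline that reflects the users actually eligible or relevant for the job, which is the kind of investigation the Board expects from Meta.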

Conclusion

The Board did not find that Meta is taking these steps and concludes that the indirect discrimination is not necessary and thus lacks an objective justification. Meta's advertising algorithm therefore discriminates on the basis of gender.

Read the entire judgment 2025-17.
