There are growing concerns in society about the impact of social media on young people, including exposure to harmful content and the influence of addictive algorithms. Privacy First advocates for an approach that respects young people's privacy and autonomy, while taking steps to protect them from the risks of social media. This includes parental oversight and education as well as strict regulation and accountability on the part of the platforms themselves.
In March this year, the House of Representatives passed a motion calling for a minimum age of 15 for the use of social media platforms.[1] The motion focuses mainly on platforms that use addictive algorithms, such as TikTok, Instagram and Snapchat. The motion does not come out of the blue: many people are concerned. In a panel survey of parents by EenVandaag, 71% of respondents supported a minimum age of 15 for social media. Doctors, scientists and practitioners also recently published an urgent open letter calling for limits on screen time and social media use.[2]
The responsible State Secretary has announced that he will issue advice on a minimum age for social media and on recommended screen times for children before the summer of 2025. He will also consult with other European countries to develop joint guidelines. Several European countries have expressed support for an age limit for social media. French President Macron has said he wants a social media ban regulated at the European level within a few months; should the EU not act quickly enough, he says France will implement the ban itself.[3]
The discussion is not limited to the Netherlands and Europe: the use of social media by young people raises questions about age limits around the world. In November 2024, Australia passed a law banning social media for children under 16.[4] Platforms that violate these rules risk hefty fines of up to €31 million. The ban does not take effect immediately; implementation will take another year.
Social media can expose young people to inappropriate or harmful content, and an age limit can help protect children from it. Consider videos or photos that romanticize eating disorders, normalize self-harm, or contain extreme violence or hateful messages. Confrontation with such content can have a major impact on young people's mental health, especially if they feel vulnerable or insecure. Fake news and conspiracy theories can also be harmful and give a distorted view of the world. What makes this extra complicated is that algorithms tend to show more of this type of content once you click on it or linger over it a few times, creating the danger of getting caught in a negative spiral.
Social media algorithms are designed to keep users on the platform for as long as possible. They do this by analyzing behavior and offering increasingly targeted content that capitalizes on emotions, curiosity and vulnerabilities. For young people, whose brains are still developing and extra sensitive to rewards, social affirmation and peer pressure, this is a risky combination. The algorithm learns at lightning speed which videos, images or messages grab their attention, then feeds them more of the same, or more extreme versions of it.
This mechanism leads to so-called "rabbit holes": young people end up in a bubble of one-sided or harmful content, for example around appearance, eating behavior, violence or fake news, often without being aware of it. They have fewer skills to critically filter that stream or to pull themselves away from it.[5]
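The feedback loop described above can be illustrated with a small simulation. This is a hypothetical sketch, not any platform's actual algorithm: we assume a recommender that shows topics in proportion to learned weights and reinforces whatever the user clicks. Because one topic is clicked slightly more often, the feed narrows toward it over time.

```python
import random

def recommend(weights):
    """Pick a topic with probability proportional to its learned weight."""
    topics = list(weights)
    total = sum(weights.values())
    r = random.uniform(0, total)
    for topic in topics:
        r -= weights[topic]
        if r <= 0:
            return topic
    return topics[-1]

def feedback_loop(engaging_topic, steps=300, seed=42):
    """Toy engagement-driven recommender: every click on a topic
    increases that topic's weight, so the feed narrows over time."""
    random.seed(seed)
    weights = {"sports": 1.0, "music": 1.0, "dieting": 1.0}
    for _ in range(steps):
        topic = recommend(weights)
        # The user lingers slightly more often on one vulnerable topic...
        clicked = random.random() < (0.7 if topic == engaging_topic else 0.3)
        if clicked:
            weights[topic] *= 1.1  # ...and the algorithm reinforces it
    return weights

w = feedback_loop("dieting")
share = w["dieting"] / sum(w.values())  # fraction of the feed the topic now occupies
```

After a few hundred steps, the slightly-more-engaging topic typically dominates the feed almost entirely: a modest difference in attention, amplified by the reinforcement loop, is what produces the "rabbit hole".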
The popular series Adolescence reveals the profound influence of social media on young people's lives. Social media plays a major role in identity formation, self-image and relationships. Intensive use can have a major impact on young people's well-being, including risks such as bullying and low self-esteem.
The use of social media does not have only negative sides. Used consciously, social media also offer clear benefits to young people. They strengthen social connectedness, help with identity development and provide a platform for creative expression. Young people can turn to them for information and education, often in a way that suits their perceptions better than traditional media do. In addition, social media offer room for involvement in social issues and are a valuable source of support and recognition, especially for young people who feel less seen offline. Used "properly," social media can contribute to personal growth, self-confidence and global citizenship, and offer young people opportunities for social interaction and participation in online communities. A ban or an age limit restricts a large group of young people's opportunity to participate digitally and, for example, to find like-minded people online.[6]
Verifying age is complex. Young people can easily circumvent age restrictions by entering a false birth date. Age verification should be implemented without sharing (sensitive) personal data (privacy by design). Techniques such as facial recognition or fingerprinting raise (too) many privacy questions.
The European Commission has decided to build an age verification app to address these concerns. The idea is that the app will allow young people to prove they are old enough without compromising their privacy. Social media providers would then be required to adapt their access checks to use this app.
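The Commission's actual protocol is not described here, but the underlying idea of privacy-by-design age verification can be sketched. In this minimal, hypothetical example, a trusted issuer (say, the verification app) signs only a single boolean claim ("over 15"), so the platform can check the signature without ever seeing a birth date or name. A shared demo secret stands in for the public-key signatures or zero-knowledge proofs a real system would use.

```python
import base64
import hashlib
import hmac
import json

# Demo secret only: a real deployment would use public-key signatures,
# so platforms could verify attestations without being able to forge them.
ISSUER_KEY = b"demo-issuer-secret"

def issue_attestation(over_15: bool) -> str:
    """Issuer side: sign only the minimal claim; no birthdate, no name."""
    claim = base64.urlsafe_b64encode(json.dumps({"over_15": over_15}).encode())
    tag = hmac.new(ISSUER_KEY, claim, hashlib.sha256).hexdigest()
    return claim.decode() + "." + tag

def verify_attestation(token: str) -> bool:
    """Platform side: check the signature, then read only the boolean claim."""
    claim_b64, tag = token.split(".")
    expected = hmac.new(ISSUER_KEY, claim_b64.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(tag, expected):
        return False  # tampered or forged token
    return json.loads(base64.urlsafe_b64decode(claim_b64))["over_15"]

token = issue_attestation(True)
```

The point of the design is data minimization: the platform learns exactly one bit (old enough or not), and the issuer never learns which platform the attestation is shown to.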
The GDPR (known in the Netherlands as the AVG) explicitly states that children deserve extra protection when their personal data are processed. In the Netherlands, young people under 16 may not independently consent to the use of their data on social media, so platforms must verify that parental consent has been obtained. Moreover, platforms may not collect sensitive data from this age group or show them personalized ads. The GDPR also requires platforms to make active efforts to protect young people from data abuse.[7]
Since 2024, the Digital Services Act (DSA) has been in effect in the EU. Major platforms such as TikTok, YouTube and Instagram are now required to identify and mitigate the risks their systems pose to young people. Think of algorithms that recommend harmful content, or mechanisms that encourage addiction. Platforms must take measures to protect minors, and transparency about how their recommendation algorithms work is mandatory.
Thanks to the DSA, changes are becoming visible. Meta now provides more information on options for parents to regulate screen time and adjust privacy settings, and YouTube has announced new default settings. But it is not nearly enough, and it is not going fast enough: the DSA's guidelines should be tightened and more needs to be done on enforcement.
The Digital Fairness Act (DFA) offers opportunities. The DFA is an important next step in the European Union to protect users from unethical digital practices. The proposal is expected in 2026.
Parental supervision is often cited as part of the solution, but such monitoring often raises privacy concerns of its own. The Jeugdjournaal recently featured an item about young people being tracked by their parents through an app; it rightly pointed out the importance of a free space in which young people can experiment. The right to privacy, and thus the right to decide what information to share with their parents, applies to young people too.
The Volkskrant spoke to 18 children about their phone use.[8] Many of them report being confronted online with images they would rather not have seen, and the threshold for involving their parents appears to be high. Other research also shows that children are reluctant to go to their parents when unpleasant things happen online.
Simone van der Hof (Professor of Law and Digital Technology, Leiden University) examines the impact of technology on children's rights. She criticizes the proposal to ban social media for children under 16.[9] She argues that such a ban excludes children from an important part of their social lives, including the positive aspects of social media.
It is the world turned upside down: we deny children access instead of forcing social media platforms to provide a safe environment for young people. The UN Committee on the Rights of the Child has made explicit that children's rights also apply in the digital world: young people have a right to protection, privacy, information and participation, online as well as offline. Platforms must adapt their design and policies accordingly.[10]
Instead of banning social media for young people, Privacy First advocates first and foremost respecting children's privacy and autonomy. In addition, we must take measures that protect young people: parents can help their children here, and attention to media literacy in education can also contribute.
But let us look primarily to the platforms themselves, for example by strengthening, tightening and enforcing regulations such as the DSA, the GDPR and, in the future, the DFA, and by forcing changes in social media design. Think of rules on addictive design and manipulative interfaces such as endless scrolling, and of better reporting options.
It is time for platforms to mature enough to provide a safe environment for their youngest users as well.
[1] Motion by member Van der Werf et al. on differentiated age limits for social media platforms (Feb. 20, 2025). See also NRC, Lower House wants minimum age of 15 for social media, https://www.nrc.nl/nieuws/2025/03/04/tweede-kamer-wil-minimumleeftijd-15-jaar-voor-sociale-media-a4885178.
[2] See https://smartphonevrijopgroeien.nl/brandbrief/. See also NRC, Experts and parents want social media ban, https://www.nrc.nl/nieuws/2025/05/26/experts-en-ouders-willen-een-socialemediaverbod-komt-dat-er-ook-a4894706.
[3] https://www.security.nl/posting/891751/Franse+president+wants+European+ban+on+social+media+for+under+the+15+years
[4] NOS, Australia wants social media ban on children under 16, https://nos.nl/l/2546272
[5] Amnesty International (2023), TikTok's 'For You' feed risks pushing children and young people toward harmful content, https://www.amnesty.org/en/latest/news/2023/11/tiktok-risks-pushing-children-towards-harmful-content/.
[6] https://netwerkmediawijsheid.nl/een-minimumleeftijd-voor-sociale-media-een-verbod-kan-zelfs-een-tegengesteld-effect-hebben/
[7] See art. 8 AVG.
[8] Volkskrant, How do 10-year-olds handle their phones? For once, they tell it themselves. 'It's really gross, I can't believe I saw it', https://www.volkskrant.nl/wetenschap/hoe-gaan-kinderen-van-10-om-met-hun-telefoon-dat-vertellen-ze-nu-eens-zelf-het-is-heel-erg-vies-ik-kan-niet-geloven-dat-ik-het-heb-gezien~b1032b77/.
[9] NRC interview with Simone van der Hof: Banning social media solves nothing, says professor, https://www.nrc.nl/nieuws/2024/12/13/sociale-media-verbieden-lost-niets-op-zegt-hoogleraar-dwing-platformen-veilig-te-zijn-voor-kinderen-a4876442.
[10] For further explanation of the rights of the child in relation to the digital world, see General comment No. 25 of the UN Committee on the Rights of the Child (2021): https://open.overheid.nl/documenten/ronl-ad4f21ab-3f0d-4dfd-82a4-e6822c23c203/pdf.