A number of parties in the Dutch House of Representatives are looking with interest at Denmark, where people were recently granted copyright on their own faces and voices. The measure is intended as a weapon in the fight against deepfakes: AI-manipulated images and audio that are often almost indistinguishable from the real thing. A majority of the House wants to know whether such an extension of copyright law is also possible in the Netherlands, with the goal of better protecting victims of online abuse. But is that legally possible, and what would it mean in practice? In this article, Evert Stamhuis, professor of Law and Innovation at Erasmus School of Law and affiliated with the Law and Technology master's program, addresses these questions and interprets the legal framework.
According to Stamhuis, current law already offers possibilities to act against misuse of face and voice. Recordings of a face or voice, for instance, count as biometric data. "This data therefore enjoys an increased level of protection under the GDPR," Stamhuis says. In addition, the Copyright Act offers protection through portrait rights. "It does not seem difficult to assume a reasonable interest on the part of the person depicted in opposing the use of an image in deepfake material."
For voices, this is more complicated, the professor acknowledges. Although misuse of audio is illegal in many cases, it is not yet clear whether extending copyright really adds much. "After all, you have to take action against the abuse yourself in order to be compensated. That demands quite a lot of the person involved." The benefit, according to Stamhuis, could be that there would be less doubt about the illegality of the content, making it easier to call on platforms to remove it quickly.
According to Stamhuis, dealing with deepfakes depends largely on the attitude of large platform companies. They already have a legal duty to combat illegal content. "That is a system of moderation on their own initiative plus a built-in possibility to actually take the content offline after a notice." That so-called notice-and-take-down procedure can be especially effective if civil society organizations cooperate in detecting illegal material.
In practice, however, platforms are not always cooperative. Stamhuis: "Some providers do not exactly adopt a loyal attitude towards the injured party in practice. They dispute that the content is obviously illegal, argue that it could fall under freedom of speech, and some still claim that they are merely pass-through providers." An extension of copyright to face and voice could make a difference here, according to Stamhuis, because it would settle the question of illegality more clearly. Still, he points to the limitations: "Since many tech companies are based in the U.S., legislation from a national lawmaker in the Netherlands or Denmark might not have as much impact on the platform provider's stance as hoped."
There are also international developments relevant to the Dutch situation. Deepfake technology, for example, has not been classified as "high risk" under the EU's AI Act, so the obligations for providers are limited: for now, they do not go beyond transparency requirements when AI is used in communications. The European Commission is, however, working on proposals to combat online child abuse, some of which may also be relevant to deepfakes.
Also relevant is a recent UN treaty requiring states to criminalize the distribution of intimate images without consent. "Unfortunately, the text of that treaty does not address deepfake video material," he says. This leaves current international and domestic law fragmented, according to the professor. "Use of face and voice in deepfake video can only be addressed under criminal law if other crimes are committed with it, such as extortion, defamation, non-consensual sexting or stalking."
An important question is what impact a copyright on face and voice would have on freedom of speech, satire and art. Stamhuis expects this to lead to discussions similar to those around cartoons and satire. "Public figures, such as politicians, vloggers, regular talk show guests and the like, will have to tolerate somewhat more than a private individual when it comes to mockery or satire using their face or voice." This, according to Stamhuis, is where the judiciary can strike a balance between protection against abuse and room for public debate.
Another concern is enforceability, especially when deepfakes are spread via the dark web. "If the misuse of face or voice is spread through the dark web, combating it with whatever tool is available will be problematic. Only when a deepfake surfaces, so to speak, can something be done."
On paper, giving citizens copyright over their own faces and voices seems an attractive way to empower victims. Yet Stamhuis has reservations about its practical significance, and he therefore doubts that extending copyright will bring the hoped-for breakthrough in the fight against deepfakes.