The Dutch Data Protection Authority (AP) is concerned about how organizations that use so-called generative artificial intelligence (AI) handle personal data. It is paying special attention to apps aimed at young children in which this form of AI is applied.
The AP is therefore taking several actions in the coming period. For example, it has asked a tech company to explain how the chatbot integrated into its app, which is popular with children, works. Among other things, the AP wants to know how transparent the chatbot is about its handling of users' data.
Developments around the deployment of AI are moving at lightning speed, and generative AI applications in particular are being used ever more widely. The AP sees major risks in offering applications such as AI chatbots to children, because children are often insufficiently aware of those risks. It is important, for example, that they know what happens to their chat messages and whether they are talking to a chatbot or a real person.
The questions the AP is putting to the tech company include how transparent the app is about its use of data and whether a retention period for the collected data has been properly considered. Based on the answers, the AP will determine whether follow-up steps are necessary.
The AP announced earlier that it was taking several actions aimed at organizations that use generative AI. This summer, for example, it asked software developer OpenAI for clarification about the chatbot ChatGPT, including how the company handles personal data when training the underlying system.
In addition, the AP is part of the ChatGPT Taskforce of the European Data Protection Board (EDPB), the alliance of European privacy regulators.
Generative AI can generate text or images, among other things, in response to a prompt. To make this possible, the model is trained on a large amount of data taken from the internet, often collected through scraping.
The questions people ask in such a tool can also be a source of data, and these sometimes contain very personal information, for example when someone seeks advice about a marital dispute or medical issues. Generative AI tools therefore often process personal data, and developers must comply with privacy law when building these tools.
The AP will soon issue a handbook on scraping for companies.