The Information Security Service (IBD) has published a Guide to AI and Algorithms for Municipalities (1). This guide gives the Data Protection Officer (FG) and the Privacy Officer (PO) in municipalities an overview of artificial intelligence (AI) and algorithms: their applications, the potential risks, the measures to be taken, and considerations for the responsible use of these technologies.

Municipalities are increasingly using AI and algorithms to improve services, optimize decision-making processes and address societal challenges. Within municipalities, AI and algorithms can be applied to work more efficiently and effectively while delivering better services to residents. For example, AI and algorithms can support interaction with residents, help analyze data, and make predictions about future events so that preventive measures can be taken. In addition, they can contribute to sustainability and energy management, be used to identify suspicious patterns, and help manage traffic flows more effectively.
The use of AI and algorithms carries risks. To manage these risks, it is important that municipalities act proactively and take appropriate measures. Consider the following risks and associated measures:
Violation of privacy and data protection through improper collection, storage, or use of personal information can lead to reputational damage and possible legal consequences. Associated measures include handling personal information carefully, ensuring that AI systems comply with applicable privacy laws, and anonymizing and securing data.
Overreliance on AI may lead to reduced human involvement and responsibility, and consequently to less scrutiny of ethical and moral aspects. An appropriate measure is to involve citizens, experts and stakeholders in the design and implementation of AI systems.
Unintended discrimination due to bias in the data set can lead to unequal treatment of citizens. A clear practical example is the Dutch Tax Administration, about which we previously wrote an article (2). To counter this risk, it is necessary to test and monitor whether the data used are representative and free of bias.
Results that are difficult to understand and a lack of transparency can reduce trust in AI systems. An appropriate measure is to take responsibility for transparency by understanding, and being able to explain, how AI systems work.
Vulnerability to cyber attacks and misuse can lead to unauthorized access to sensitive information. An appropriate measure is to ensure that AI systems meet high security standards.
The use of AI should not be at the expense of human contact and the relationship between the municipality and its residents. AI can make work easier and faster, but it is also important to take measures so that human contact is maintained.
Important functions
The FG and the PO play an important role in ensuring privacy and data protection within municipalities when AI is used. The FG supervises the processing of personal data. The PO ensures the protection of personal data and the privacy of individuals by supporting municipalities in using AI applications in accordance with applicable privacy legislation. The guide explains the specific responsibilities of these roles in more detail. In addition to the FG and the PO, it is important to put together a multidisciplinary team with representatives from different departments and areas of expertise. The guide also describes the various roles and functions within this team.
Online crowd meter
The guide discusses two examples of AI applications within municipalities. One of these examples concerns the use of an online crowd meter, which allows residents and visitors to see in real time how busy it is in shopping malls, parking lots or stations. This topic is particularly topical, as the Overijssel District Court recently overturned a privacy fine in a similar case involving the municipality of Enschede. Read more about this ruling in this article (3).
An online crowd meter involves several privacy risks:
The collection of location data can violate individuals' privacy by tracking their movements and thereby inferring their behavior and habits.
Location data can be combined with other information to build profiles of individuals, which can lead to unwanted tracking and profiling.
Tracking location data makes it possible to analyze individuals' behavioral patterns and preferences, which can violate their privacy.
Municipalities can manage these risks by taking the measures below. Read more about measures 3 through 5 in this article (4).
Consider first whether the intended purpose can be achieved in a less intrusive way, possibly without the use of personal data.
Anonymize and aggregate location data before display to make identification of individuals difficult.
Obtain explicit consent from data subjects for the collection and processing of their location data, and ensure understanding about the use of their data.
Increase the sense of control by allowing data subjects to manage or revoke their location data at any time.
Avoid collecting additional personal information.
Implement robust security measures to protect location data from unauthorized access.
Limit the retention period of location data and delete it in a timely manner to minimize unnecessary exposure. The municipality of Eindhoven also faced this issue; read more about it in this article (5).
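To make the anonymization and aggregation measure above more concrete, the sketch below shows one possible way a crowd meter backend could reduce raw location pings to per-area counts before display. This is an illustrative example, not part of the guide: the data format, the function name `crowd_levels`, and the suppression threshold are all assumptions. Only aggregate counts leave the function, device identifiers are discarded, and areas with very few visitors are suppressed to hinder re-identification.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical raw pings: (device_id, area, timestamp). In practice the
# device identifier should be dropped or rotated as early as possible.
K_THRESHOLD = 10  # suppress counts below this to hinder re-identification


def crowd_levels(pings, window_start, window_minutes=15):
    """Aggregate pings within a time window into per-area visitor counts,
    discarding device identifiers and suppressing small groups."""
    window_end = window_start + timedelta(minutes=window_minutes)
    # Collect unique devices per area within the window.
    seen = defaultdict(set)
    for device, area, ts in pings:
        if window_start <= ts < window_end:
            seen[area].add(device)
    # Keep only the counts; the identifiers themselves are discarded here,
    # and areas with fewer than K_THRESHOLD visitors are suppressed.
    return {area: len(devices)
            for area, devices in seen.items()
            if len(devices) >= K_THRESHOLD}
```

Because only counts above a threshold are published, the displayed busyness figures cannot be traced back to individual visitors; combined with short retention of the raw pings, this addresses several of the risks listed above.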
(1) https://www.informatiebeveiligingsdienst.nl/product/handreiking-ai-en-algoritmen/
(2) https://www.nysingh.nl/blog/belastingdienst-krijgt-recordboete-ap-opgelegd/
(3) https://www.nysingh.nl/blog/rechtbank-vernietigt-privacyboete-e600-000-aan-gemeente-enschede/
(4) https://www.nysingh.nl/blog/gemeenten-onderzoeken-smart-city-toepassingen/
(5) https://www.nysingh.nl/blog/eindhoven-onder-extra-toezicht-om-privacyrisicos/
