PONT Data&Privacy

Algorithms? Transparency!

With the benefits scandal still fresh in memory - its settlement is far from complete - it recently came to light that the Dienst Uitvoering Onderwijs ("DUO") also used an algorithm that may have led to discrimination. The use of algorithms brings both advantages and disadvantages. While algorithms can make decision-making cheaper, faster, more rational and more predictable, they can also have negative consequences. To combat discrimination and arbitrariness as much as possible, the Dutch government's Algorithm Register was set up. This blog explains how the Algorithm Register works and where there is still work to be done.

July 12, 2023

Background articles

The registry of Dutch government organizations

Openness about the digital systems used by the government, including algorithms, is important. To that end, an online Algorithm Register has been available since December 2022. In the register, government agencies publish information about the algorithms they use: the purpose and impact of the algorithm, any Data Protection Impact Assessment ("DPIA") or Human Rights and Algorithms Impact Assessment ("IAMA") that has been carried out, and the data sources used.
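To make this concrete, the sketch below shows what such a register entry might look like as structured data, together with a simple completeness check before publication. The field names and values are purely illustrative assumptions for this blog; they do not follow the register's official publication standard.

```python
# Hypothetical sketch of an Algorithm Register entry as structured data.
# All field names and values are illustrative only; they are not taken
# from the register's official publication standard.

register_entry = {
    "organization": "Example government agency",
    "name": "Example risk-selection algorithm",
    "purpose": "Illustrative: selecting files for manual review",
    "impact": "May influence which citizens receive extra checks",
    "dpia_conducted": True,   # Data Protection Impact Assessment
    "iama_conducted": False,  # Human Rights and Algorithms Impact Assessment
    "data_sources": ["Illustrative source A", "Illustrative source B"],
}

# A simple completeness check an agency might run before publishing.
required_fields = {"organization", "name", "purpose", "impact", "data_sources"}
missing = required_fields - register_entry.keys()
print("complete" if not missing else f"missing: {sorted(missing)}")
```

The point of the sketch is only that a register entry bundles purpose, impact, assessments and data sources into one publishable record, which is exactly the transparency the register aims for.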

Filling out the Algorithm Register is currently voluntary, and it does not yet seem to have gained much traction among the target audience. A Tweakers survey, for example, shows that in six months only eight new algorithms were added to the register. However, it follows from the Werkagenda Waardengedreven Digitaliseren (Work Agenda on Value-Driven Digitalization) that from 2025 all algorithms relevant to citizens must be included in the Algorithm Register. Deviation from this obligation is permitted only where it explicitly follows from a law or from justified considerations.

In principle, government organizations themselves will be responsible for providing access to their algorithms and thus for creating and managing an algorithm register. The Dutch Data Protection Authority (Autoriteit Persoonsgegevens, "AP"), in its role as algorithm regulator, supervises the use of algorithms in a general sense. The latter, incidentally, applies not only to government agencies, but (of course) also to companies and other organizations.

These factors will hopefully lead to a more proactive attitude among government agencies.

High-risk systems

Algorithms with a "high risk" must in the future, according to the Werkagenda Waardengedreven Digitaliseren, have a CE mark and in any case be included in the Algorithm Register. It is not ruled out that other types of algorithms will also fall under the Dutch register requirement.

The aforementioned "action" - a concrete act based on the goals set out in the Work Agenda - is in line with the approach of the Artificial Intelligence Act (AI Act). After all, the AI Act uses a risk-based methodology, under which the obligations that apply to the use of an algorithm depend on its risk level. Those obligations include transparency requirements for high-risk AI systems and their providers, as well as the obligation for users to register their AI system in an EU database.

The registration obligation for Dutch government bodies could, depending on its final design, be seen as an additional safeguard on top of the AI Act. Where the AI Act in principle leaves little room for additional rules with respect to providers of AI systems, it is possible (based on the current text) to impose additional rules on users, such as, in this case, government bodies. How the Dutch register obligation will eventually be formulated depends in part on the final text of the AI Act, which is expected at the end of 2023. The (additional) Dutch rules regarding the Algorithm Register are expected in early 2024.

Incidentally, it is assumed that where the Werkagenda Waardengedreven Digitaliseren refers to "high risk," it follows the definition given to that term by the AI Act - at least in its current version.

Implementation framework 'Responsible use of algorithms'

With the entry into force of the AI Act still some time away, and with government agencies increasingly using systems that will fall under the scope of the AI Act in the meantime, an Implementation Framework "Responsible Use of Algorithms" ("IKA") has already been developed (1).

The IKA provides an overview of the most important standards and measures that must be met, as well as standards and measures that are not mandatory but serve as a guide for safeguarding so-called public values. Using the overview in the IKA, it can be assessed to what extent an algorithm complies with the standards and whether its deployment leads to (major) risks. Thus, pending EU legislation, the IKA can already be used to identify and minimize risks to citizens.

Unlike the AI Act, the IKA is a "dynamic document," meaning that any requirements can be added as appropriate based on relevant developments.

How to move forward?

Both the Algorithm Register and the IKA are already available online. At this time, however, there are no concrete Dutch obligations regarding either instrument, because the final text of the AI Act is still awaited. Currently, the European Parliament (2) is negotiating with the Council and the Commission on the final form of the AI Act.

However, this does not mean that government agencies cannot (and perhaps should not) take steps now on transparency and risk mitigation in the use of algorithms. The Algorithm Register can already be used and the IKA can already be followed. Work can also begin on frameworks and procedures for the use of algorithms, and on making the internal organization aware of the potential risks of using algorithms and AI, and of the AI Act in general.

In short, a call to all active algorithm users in government: get to work!

  1. https://open.overheid.nl/documenten/9b7b55fd-1762-499b-b089-2b7132c12402/file

  2. https://www.europarl.europa.eu/news/en/press-room/20230609IPR96212/meps-ready-to-negotiate-first-ever-rules-for-safe-and-transparent-ai

AKD


Knowledge partner

Martin Hemmer