Autoriteit Persoonsgegevens: trends, risks and recommendations in the Labor and Social Security sector

Last month, the Autoriteit Persoonsgegevens ("AP") released the Labor and Social Security Sector Assessment. In its sector picture, the AP describes the trends it observes in the area of Labor and Social Security, the risks of these trends and recommendations to counter these risks. In this blog, I discuss some of these trends, risks and observations made by the AP.

16 May 2024

Trends and risks

The AP identified several trends, particularly in the Labor sector, as it touches almost all sectors in the Netherlands and many developments are taking place within it.

Trend: health checks in the workplace
The AP describes a trend of growing interest among employers in knowing the health status of their employees, for example by administering an alcohol, drug or medication (ADM) test at the gate. The AP points out that this is not permitted, except for ADM tests for certain occupations listed in the Shipping Act, Railroad Act, Local Railroad Act and Aviation Act.

The AP indicates that employers usually have no basis under the AVG (the Dutch term for the GDPR) to process employee health data, and that employers who do process such data therefore risk doing so unlawfully. The AP also emphasizes that explicit consent usually cannot be relied on as a basis for processing health data, given the hierarchical relationship between employer and employee. The AP recommends involving the occupational health and safety service or the company doctor when an employer wants to process health data.

Furthermore, the AP mentions that an amendment to the Occupational Health and Safety Act has been in preparation since 2020 that will create a legal basis for employers to administer ADM tests. This concerns only specific safety-critical functions within Brzo companies (companies that work with large quantities of hazardous substances), and strict conditions will apply to the use of ADM testing. The AP also mentions that the possibility of creating a legal basis for other specific functions with a high safety risk is being explored.

Trend: employee tracking systems
Employee tracking systems are digital applications that employers use to track and/or monitor employees. The AP points out that employers do not always have a basis for processing (special) personal data with such systems. According to the AP, the use of personnel tracking systems, especially in combination with AI and algorithms, can also increase the pressure on employees, because it allows employers to monitor them more closely, continuously and cost-efficiently.

The AP recommends that employers determine whether an exception ground applies for processing special personal data, for example because the processing is necessary for authentication or security purposes. The AP also mentions the role of the works council in the deployment of personnel tracking systems: the works council has a right of consent when a personnel tracking system is introduced.

Trend: AI and algorithms in the job application process
The AP describes that employers are increasingly using AI and algorithms to improve existing work processes and increase production, and are also using algorithms in the recruitment and selection of potential employees.

Risks the AP points out when algorithms are used in the application process are discrimination and profiling. In addition, algorithms and AI applications are not always transparent or accurate, and they carry a risk of bias. When applicants or employees are not informed that they are subject to AI and algorithms, this violates, among other things, the right to information and the right not to be subject to automated decision-making. The AP recommends that employers ask themselves whether the use of AI and algorithms in existing work processes is necessary. If it is, the employer should know how the algorithm is set up and be able to explain it to employees.

The AP also reflects on the upcoming AI Act, which places several obligations on users of certain AI systems. AI systems in the areas of employment, workforce management and access to self-employment, including AI systems for recruitment, selection and screening, are classified as high-risk systems, and various obligations apply to users of such systems. There is, however, an exception to this high-risk classification: when such an AI system does not pose a significant risk to the health, safety or fundamental rights of natural persons. This applies only to a limited number of AI systems, such as an AI system intended to perform a narrow procedural task or an AI system intended to improve the result of an activity previously completed by a human. When an AI system is used for profiling natural persons, however, it is always classified as high-risk. In addition to high-risk systems, the AI Act also prohibits certain AI applications, such as AI systems for emotion recognition in the workplace. The AI Act has not yet entered into force, but is expected to do so this month, after which it will apply incrementally over the next few years.

In that light, the AP also points to the upcoming Platform Directive. The Platform Directive applies to work platforms and to people employed through such platforms, and contains new rules for algorithmic management: the automation of managerial control over work. It also contains new rules for automated monitoring and decision-making by platforms. Among other things, the Platform Directive aims to increase transparency, fairness and accountability in the use of automated monitoring and decision-making systems by platforms. The Platform Directive is not yet in force; once it is, member states will have two years to implement it into national law.

Trend: use of algorithms in the social domain
The AP indicates that in the social insurance sector/social domain, various government organizations process large amounts of (special) personal data, and that these parties often do more with these data than their statutory duties allow. According to the AP, these parties act from the idea of providing the best possible service to clients, so that, for example, personal data are linked or exchanged in order to trace citizens who may be entitled to benefits but do not claim them. By linking or exchanging personal data, these parties risk exceeding the boundaries of what is allowed under the AVG or other applicable laws and regulations. The AP cites as examples the blurring of purpose limitation, the lack of a basis for processing, and an overly broad interpretation of the legal principle of necessity.

Furthermore, the AP points to the development within the social domain in which government organizations use algorithms to profile citizens, for example welfare recipients, for fraud risk. After such profiling, citizens with a higher fraud score face more intensive monitoring by their municipality, according to the AP. The AP also notes that profiling on the basis of the neighborhood where someone lives or their type of housing, and thus indirectly on the basis of 'race', can result in discrimination. According to the AP, the deployment of algorithms and experimentation with AI is often not properly thought through, one possible reason being that the right people, such as the Data Protection Officer ("FG"), are not at the table.

The AP has the following recommendations for the social insurance/social domain:

  • Linking/exchanging personal data: linking or exchanging personal data - even for a noble purpose - is only permitted when a basis can be invoked for it. The AP recommends that the various parties investigate the legal possibilities; guidance can also be requested from the AP if necessary. If the problem is structural, the AP recommends bringing it to the attention of the legislator and asking the legislator to create a new basis.

  • Protection of personal data and involving the FG: the AP stresses the importance of considering the protection of personal data, and of involving the FG, from the start of any new (technological) project in which personal data are processed. The AP further advises making good use of a DPIA (Data Protection Impact Assessment), which identifies the risks of a proposed processing operation so that measures can then be taken to reduce those risks. Where necessary, the AP can be consulted. In addition, the AP points to testing whether a processing operation is necessary, whether a basis can be invoked, and whether there is transparency.

  • Sound security policy: the AP points out the importance of a sound security policy, which reduces the chance of data breaches. The AP mentions the undesirable scenario of sensitive personal data of benefit recipients ending up on the street. It is up to benefit agencies to ensure that this does not happen. In addition to a sound security policy, regular employee training also contributes to this, and employee awareness can also be raised via the intranet, for example.

Main observations of the AP

The AP notes that there are wide variations in the privacy maturity of organizations, both in terms of leadership and oversight and in terms of awareness:

  • Dealing with privacy: some organizations consciously engage with privacy and have it on the board's agenda, while for others it is a box-ticking exercise or an afterthought.

  • Role of the FG: in some organizations, the FG has a proactive role and can therefore function well; in other organizations this is not the case, or only to a much lesser extent. An active FG usually has a positive impact on the organization. If an FG also performs other work, the organization should ask whether the FG has enough time to properly perform the FG duties.

  • Training and awareness: many organizations pay close attention to training their employees on privacy, but this is not true for all organizations. Training is important to increase employees' privacy awareness.

  • Works council: there are differences between works councils in their knowledge of the AVG. Works councils that lack this knowledge cannot properly exercise their rights under the Works Councils Act.

AKD
