The General Data Protection Regulation (AVG) specifically addresses profiling and automated decision-making, whether or not that decision-making involves profiling. Both are being used in more and more industries: consider banking, healthcare, insurance, marketing, and advertising. Thanks to technological developments and the possibilities of big-data analysis and artificial intelligence, it has become ever easier to create profiles and make automated decisions.

Profiling and automated decision-making can be beneficial to both data subjects (those to whom personal data relates) and organizations. After all, organizations are better able to segment their market and target audiences, and can thus better tailor their products and services to individual needs, which ultimately benefits individuals.
However, profiling and automated decision-making can also pose risks to the privacy of data subjects. Many people do not know they are being profiled and do not understand how it works. In addition, profiling can pigeonhole a person and, if the underlying data is inaccurate or incomplete, lead to inaccurate predictions, (unjustified) refusal to provide certain products or services, and even discrimination.
The AVG does not only address decisions made as a result of automated processing or profiling; it also applies to the collection of data for profiling and to the application of those profiles to individuals.
The AVG defines profiling as "any form of automated processing of personal data in which, on the basis of personal data, certain personal aspects of a natural person are evaluated, in particular with the aim of analyzing or predicting his professional performance, economic situation, health, personal preferences, interests, reliability, behavior, location or movements."
With regard to automated decision-making, the AVG stipulates in a specific provision that data subjects have the right not to be subjected to a decision based solely on automated processing (which may include a measure) evaluating personal aspects concerning them, where that decision has legal consequences for them or similarly significantly affects them. Examples are the automatic refusal of a credit application submitted online or the processing of job applications via the Internet without human intervention.
Thus, when it comes to profiling and automated decision-making, three forms are conceivable:
1. Profiling
2. Profile-based decision-making (e.g., a person decides whether a loan can be granted based on a profile)
3. A decision based purely on automated processing, whether or not based on a profile (e.g., an algorithm determines whether the loan can be granted and the decision is automatically communicated to the data subject, without human intervention)
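To make the distinction between these three forms concrete, here is a minimal, purely illustrative sketch in Python. The names, scoring formula, and threshold are all hypothetical assumptions invented for illustration; they do not reflect any real credit-scoring model or the AVG's legal tests.

```python
from dataclasses import dataclass


@dataclass
class Profile:
    # Hypothetical credit profile produced by automated processing.
    applicant_id: str
    credit_score: int  # illustrative score only, not a real scoring model


def build_profile(applicant_id: str, income: int, debts: int) -> Profile:
    """Form 1: profiling -- personal aspects evaluated automatically."""
    score = max(0, min(1000, income // 100 - debts // 50))  # invented formula
    return Profile(applicant_id, score)


def human_decision(profile: Profile, officer_notes: str) -> bool:
    """Form 2: a person decides, using the profile as one input.
    Because the officer weighs additional factors, the decision is
    not 'based solely on automated processing'."""
    return profile.credit_score >= 600 or "strong references" in officer_notes


def automated_decision(profile: Profile) -> bool:
    """Form 3: fully automated decision with no human intervention --
    the form targeted by the AVG's prohibition."""
    return profile.credit_score >= 600
```

In this sketch, `human_decision` only escapes the prohibition if the officer's review is meaningful; an officer who rubber-stamps `automated_decision`'s output would still leave the decision "based solely on automated processing".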
The prohibition on automated decision-making applies only if there is no human intervention at all. If a fully automated process provides a recommendation regarding a data subject, but an employee first weighs other factors before making the final decision, the decision is not based solely on automated processing.
Note that as an organization (data controller) you cannot circumvent the prohibition by fabricating human intervention. If, for example, an employee routinely applies automatically generated profiles to data subjects without any actual influence on the outcome, the result is still a decision based solely on automated processing.
To qualify as human intervention, the controller must ensure that any oversight of the decision is meaningful rather than a mere token gesture. It must be carried out by someone with the authority to change the decision, who considers all available input and output data as part of the analysis.
The ban on exclusively automated decision-making applies only if the decision has legal effect (think of the refusal or granting of rent allowance or child support, or the automatic blocking of your cell phone because the bill has not been paid) or otherwise significantly affects the person concerned.
For the latter, the AVG's recitals cite only two examples: the automatic refusal of a credit application submitted online and the processing of job applications over the Internet without human intervention. Determining on this basis whether a decision "significantly affects" a data subject is not easy. When it comes to credit applications, for instance, it would mean that not only an automatic credit check on a mortgage application is covered, but also one performed when renting a bicycle abroad or buying a television on credit.
Automated decision-making is particularly relevant in online advertising. The European regulator has indicated that targeted advertising in principle has no significant impact; consider, for example, an advertisement for an online fashion webshop based on a simple demographic profile: 'women in Amsterdam'. However, situations are conceivable in which a targeted advertisement does significantly affect a data subject. This depends on the intrusiveness of the profiling, the expectations of the persons concerned, the way the advertisement is delivered, and the specific vulnerabilities of the persons targeted.
Processing that generally has little effect on individuals may in practice have a significant impact on certain groups in society, such as minority groups or vulnerable adults. For example, someone in financial difficulty who regularly sees advertisements for online gambling may sign up for these offers and potentially incur further debt.
However, automated decision-making, whether or not based on profiling, remains permissible where it is expressly authorized by law applicable to the controller (for example, for monitoring and preventing tax fraud and evasion, or for ensuring the security and reliability of a service provided by the controller), where it is necessary for the conclusion or performance of a contract between the data subject and a controller, or where the data subject has given his or her explicit consent.
Assessing when an automated decision falls under the AVG's prohibition is therefore not yet straightforward. The European regulator has tried to clarify matters in guidelines issued in October 2017, but it is expected that these will need to be crystallized further, by national regulators or otherwise.
More articles by SOLV Lawyers
