The City of Amsterdam has been trying for some time to curb unrestricted home rentals through Airbnb. It was recently announced that Amsterdam will conduct a trial with an algorithm to track down residents who do not comply with the rules for renting out homes through such platforms. According to the municipality, privacy will be closely safeguarded in the process. And it should be, after a court ruled in early February that the fraud detection system SyRI violates the human right to respect for private life (Article 8 ECHR). This blog takes a closer look at the regulation and enforcement of platforms such as Airbnb and the government's use of algorithms for fraud detection.
The regulation and enforcement of platforms like Airbnb is not uncontroversial. In a December 2019 ruling, the European Court of Justice held that Airbnb qualifies as an "information society service." As a result, EU member states cannot, in principle, subject information society services from another member state to rules stricter than those of the country where the service is established (in this case Ireland). In practice, this means that Amsterdam cannot impose rules on Airbnb itself, but must impose rules on residents who rent out their homes through the platform.
Enforcement against residents is not without controversy either. Amsterdam had stipulated in its municipal regulations that renting out a home to tourists without a permit was allowed, provided the landlord, among other things, reported the rental to the municipality. In late January of this year, the Council of State ruled that these municipal rules violated the Housing Act. The consequence: renting out without a permit (but with notification) is not allowed at all, and neither is imposing a fine for failing to report. Amsterdam must therefore revise its rules.
However, other rules remain in place: Amsterdam continues to enforce the rules stipulating that a home may be rented out for a maximum of 30 nights and to a maximum of 4 people. Moreover, the landlord must be the main occupant and must be registered in the Basic Registration of Persons. Amsterdam therefore wants to use the algorithm to make checking compliance with these rules easier and less labor-intensive. In doing so, careful attention must be paid to privacy law.
Last month, the District Court of The Hague ruled that System Risk Indication (SyRI), a statutory instrument used by the government to detect fraud in, among other areas, social security, violates the human right to respect for private life (Article 8 of the European Convention on Human Rights (ECHR)). In SyRI, an algorithm searched for connections among data from different government agencies. Based on those connections, SyRI automatically created risk profiles for social security fraud, and if SyRI deemed a person worthy of investigation, a risk notification was issued. This is strongly reminiscent of the algorithm that Amsterdam now wants to deploy.
In its assessment, the District Court of The Hague took into account, among other things, the general data protection principles of the General Data Protection Regulation (GDPR). According to the court, combating fraud is in itself a legitimate aim that can justify an interference with private life. But in the case of SyRI, the court ruled that there are insufficient safeguards, because its deployment is insufficiently transparent and verifiable.
It follows from the SyRI judgment that, when deploying algorithms for fraud detection, the following points, among others, must be considered.
Duty to inform. Everyone has the right to be able to track, to a reasonable extent, what happens to their personal data. Residents must therefore be informed about the data processing under the project and, in an individual case, that a risk notification has been made about them. Informing them only when a check follows the risk notification is not sufficient.
Transparency of the system. The system must be transparent and verifiable. According to the court, the use of an instrument like SyRI entails a special responsibility, because it is difficult for residents to oversee the effects of such an instrument. It must therefore be clear how the risk model works, so that it can be verified, including by residents themselves, that data is processed on the right grounds.
Stereotyping. The deployment of a system like SyRI can have discriminatory effects, for example because it is deployed only in "problem neighborhoods" and unintended correlations are drawn. This risk must therefore be adequately addressed.
Purpose limitation and data minimization. With SyRI, a large amount of data from different government agencies could be combined, such that, in the court's words, "it is hard to think of any personal data that is not eligible for processing in SyRI." Whether it was really necessary to provide or link all that data was not assessed beforehand.
Not only in case law, but also among regulators, the use of algorithms is being watched closely. In February, the Dutch Data Protection Authority (Autoriteit Persoonsgegevens, AP) published a document describing points of attention for the use of algorithms. These partly correspond to what the District Court of The Hague ruled. Moreover, the AP will focus specifically on AI and algorithms in its supervisory work through 2023. The European Commission is also paying attention to AI.
The nuisance caused by tourists in big cities like Amsterdam is partly attributed to platforms like Airbnb. Whether that is entirely justified remains to be seen. What is certain is that government enforcement against Airbnb itself is not easy. Whether enforcement via the landlords, by deploying a fraud detection algorithm, is legally permissible is equally uncertain. In any case, Amsterdam will have to proceed very carefully when deploying such algorithms, even though many Amsterdam residents are unhappy with the large number of homes being rented out through Airbnb.
More articles by SOLV Lawyers