We are concerned about the lack of control over police use of facial recognition. A request we made under the Open Government Act shows that the police are not complying with existing legislation around data protection and privacy. We are sounding the alarm with the Personal Data Authority.
We have known since 2016 that the police have been using facial recognition. Now the police are experimenting with new applications of this sweeping surveillance technology: an expansion of the CATCH database to include images of suspects who could not previously be identified, and FaceF1nder, which allows the police to search other images for every place a particular face appears.
Existing data protection legislation, which also applies to the police, requires that for processing that poses a high risk to privacy and data protection rights, the police involve the Data Protection Officer and carry out an inventory and assessment of the impact on those rights. This is known as a data protection impact assessment. The documents we obtained through our request under the Open Government Act now show that the police have not done this for these new applications.
It has become clear by now that the police do not take the law very seriously when deploying facial recognition. For example, the deployment of CATCH was never preceded by a democratic debate, and so there has never been a specific legal basis for the police's use of facial recognition. The police are now trying to fill that gap with a self-drafted "Deployment Framework for Facial Recognition." This leaves the judgment of whether a particular new application is acceptable - and may therefore be deployed - to an internal committee.
Not for nothing is the Personal Data Authority critical here. Its chair told NOS that this course of action is the wrong way around: there should be legislation first, and only then can the police start working on a protocol. That legislation is not currently in place.
When the police go their own way and place the review internally, it is all the more important that the internal safeguards for the protection of human rights and our free society are in place. That is why we requested from the police - with a request under the Open Government Act - the documents that should exist before such an application is deployed: the assessment of the impact of the application on the right to privacy and data protection, as well as the opinion of the Data Protection Officer. And what turns out to be the case? They do not exist. In fact, it appears that the Data Protection Officer was not actively informed, but had to indicate herself, after hearing rumors, that she wanted to be informed.
See the decision and documents for yourself here (1)
We are deeply concerned. Of course about the effects of deploying facial recognition technology on our fundamental rights and free society, but also about the position of internal oversight within the police. That is why we have submitted a tip to the Personal Data Authority. It is time to stand up for the rights and freedoms of citizens and bring internal supervision up to standard.
See our tip to the Personal Data Authority here (2).
(1) https://www.bitsoffreedom.nl/wp-content/uploads/2024/03/20240321-besluit-plus-documenten.pdf
(2) https://www.bitsoffreedom.nl/wp-content/uploads/2024/03/tip-ap-inzetkader-positie-fg.pdf