The number of street cameras in the Netherlands is increasing every year. Earlier this year, the mayor of Amsterdam announced an expansion of camera surveillance at several locations in the city to counter the threat of serious incidents. Although research shows that Amsterdammers value street safety more highly than their own privacy, a balance must still be struck between the two interests. With the introduction of a blurring algorithm for camera images, the City of Amsterdam aims to achieve just that.

The City of Amsterdam has announced a new algorithmic application for street images recorded by surveillance cameras. The algorithm, called "Blurring as a Service," blurs people appearing in camera footage so that they become unrecognizable; it also anonymizes car license plates. Alderman Alexander Scholtes explains that the purpose is to better safeguard the privacy of Amsterdam residents, so that citizens no longer have to feel spied on when they walk down the street.
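The municipality has not published its implementation, but the core idea behind such a tool, detecting sensitive regions and then blurring them in place, can be sketched in a few lines of Python. The detector, file names, and blur strength below are illustrative assumptions, not the municipality's actual method; a production system would likely use a modern neural detector rather than the classic OpenCV cascade shown here.

    # Minimal sketch of a detect-and-blur pipeline (illustrative only).
    import cv2

    def blur_regions(image, boxes, kernel=(51, 51)):
        """Replace each detected region with a heavily blurred version."""
        for (x, y, w, h) in boxes:
            roi = image[y:y + h, x:x + w]
            image[y:y + h, x:x + w] = cv2.GaussianBlur(roi, kernel, 0)
        return image

    # Haar cascades ship with OpenCV; chosen here only for simplicity.
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_fullbody.xml")

    image = cv2.imread("street_scene.jpg")  # hypothetical input file
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    boxes = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=4)
    cv2.imwrite("street_scene_blurred.jpg", blur_regions(image, boxes))

The same pattern extends naturally to license plates: a second detector finds the plate region, and the identical blurring step anonymizes it.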
The municipality has already begun anonymizing individuals in panoramic images taken from street level. These images are collected annually by the municipality for various purposes related to the inspection of public space. The municipality also points to other situations where the algorithm can be used, such as images from scan cars and photos taken by citizens when filing a report. In the future, the number of applications may grow by training the algorithm on other types of photos, so that people become unrecognizable in ordinary images as well. "This will make Blurring-as-a-Service more widely applicable to municipal tasks and services," the municipality said. The self-developed software may also be offered as a service to other municipalities.
The city's Computer Vision Team trained the new tool on about 10,000 raw images to teach it to recognize people, keeping a portion of them aside for the testing phase. The municipality states that as many demographic groups as possible are represented in the dataset to avoid bias against, for example, people of color or certain age groups. In a fairness impact assessment, the Computer Vision Team states that this bias has now been minimized with respect to gender, age, and skin color (1). The report shows, however, that errors can still occur, for example with children standing far from the camera. The municipality adds that "The Computer Vision Team is going to further develop the algorithm so that it is even better trained to successfully recognize and anonymize children even at a great distance from the camera," but does not mention a specific deadline.
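The split-and-evaluate procedure the article describes can be illustrated with a short sketch. The group labels, the 80/20 ratio, and the use of scikit-learn are assumptions for illustration; the municipality has not published these details.

    # Illustrative sketch of a stratified train/test split for a
    # fairness check; all labels and ratios here are hypothetical.
    import random
    from sklearn.model_selection import train_test_split

    # Stand-in for ~10,000 annotated images, each tagged with a
    # demographic group label used later to compare error rates.
    image_ids = list(range(10_000))
    groups = [random.choice(["child", "adult", "senior"]) for _ in image_ids]

    train_ids, test_ids, train_grp, test_grp = train_test_split(
        image_ids, groups,
        test_size=0.2,      # hold a portion aside for the testing phase
        stratify=groups,    # keep every demographic group represented
        random_state=0)

Comparing detection error rates per group on the held-out set is what surfaces the kind of bias the report mentions, such as lower recognition rates for children far from the camera.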
In a recent report, the Amsterdam Court of Audit assesses the management framework for algorithms deployed by the municipality (2). The Court finds that there is not enough clarity on the requirements for the in-house development of algorithms. While the Amsterdam management framework requires suppliers of software purchased by the municipality to continuously monitor and update it, no similar obligation applies to algorithms developed in-house by the municipality, such as "Blurring as a Service." Nevertheless, the entry for the new software in the municipality's Algorithm Register shows that a feedback process for correcting errors in the software has been put in place, along with an annual review.
(1) https://s3.eu-west-1.amazonaws.com/assets.saidot.ai/customerAssets/Q2l0eSBvZiBBbXN0ZXJkYW0%3D/Fairness%20analyse.pdf?AWSAccessKeyId=AKIAUKM7DXWQ2RQVHRRF&Expires=1856104281&Signature=LajisMnim1LuTtIVYZmIyfU4Rm8%3D
(2) https://publicaties.rekenkamer.amsterdam.nl/algoritmenonderzoeksrapport
