Stratumseind 2.0: Privacy, powered by tech
How can new technologies be used to make a city safer and more livable while safeguarding citizens' privacy? Tinus Kanters has been appointed by the Municipality of Eindhoven as project manager of Stratumseind 2.0, a project on tracking and nudging in the longest pub street in the Netherlands.
Authors: Léonhard Weijmar Schultz & Anna de Haas
Léonhard: What techniques are being used in Stratumseind to enhance the livability of the street without violating the privacy of pub-goers?
Tinus: We work from the principle of privacy by design. If we fail to anonymize information in a timely manner, we shut down the technology behind it. If one of our counting cameras records a person, that recording is stored locally for a maximum of two minutes.
Using edge computing, the on-site registration is converted into anonymous data, and only that anonymous data is eventually sent to a server. So the software that anonymizes the data sits right next to the counting camera. At present, everything is still connected by fiber optics, but we will soon replace this with a "LoRa network": a wireless connection over which data packets of up to 64 kB can be sent. An added advantage: this makes the system unhackable. The bandwidth is so low that it is never possible to gain access from the server to the counting camera or the software next to it.
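The edge-computing setup described above can be sketched as follows. This is a minimal illustration, not the project's actual software: the class name, fields, and the person-detection input are all assumptions; only the two-minute local retention and the counts-only output come from the interview.

```python
import time
from collections import deque

RETENTION_SECONDS = 120  # raw footage is kept locally for at most two minutes


class EdgeAnonymizer:
    """Runs beside the camera: raw frames stay local, only counts leave."""

    def __init__(self):
        self._raw_buffer = deque()  # (timestamp, frame) pairs, never transmitted

    def ingest(self, frame, detected_persons, now=None):
        now = time.time() if now is None else now
        self._raw_buffer.append((now, frame))
        # Purge anything older than the retention window.
        while self._raw_buffer and now - self._raw_buffer[0][0] > RETENTION_SECONDS:
            self._raw_buffer.popleft()
        # Only this anonymous payload would ever be sent to the server.
        return {"timestamp": int(now), "person_count": detected_persons}


anonymizer = EdgeAnonymizer()
payload = anonymizer.ingest(frame=b"<raw pixels>", detected_persons=7)
print(payload)  # contains a count and a timestamp, no image data
```

The key design point is that the raw frame never appears in the return value, so a compromise of the server yields only aggregate counts.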
Doesn't such a low bandwidth require you to wait a very long time for the data?
If we sent a lot of data, we would indeed have to wait a long time. But we don't. The sound cameras hanging throughout Stratumseind, for example, each contain 64 microphones that can accurately pinpoint where a glass falls or fireworks are set off. Such a sound camera records 50 MB of data per second. If you wanted to store all the data of a weekend, you would need a large hard disk. Nine times out of ten, however, nothing happens. Therefore, in the interest of data minimization, we throw away what we don't need even before it is sent to a server. Thus, only a small amount of data needs to be sent. Here we also look at purpose limitation: why are we storing something? Especially if we start combining anonymized data, we need very little information to recognize deviant behavior. Tilburg University helped us create an algorithm that recognizes when a number of dots move differently than the dots around them. The algorithm then draws conclusions from that. A nice example of privacy by design; in the end, you don't need anything more than a dot.
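The dot-based anomaly idea can be illustrated with a small sketch. This is not the Tilburg University algorithm itself, which the interview does not detail; it is a simple stand-in that captures the stated principle: each person is just an anonymous dot, and a dot is flagged when its movement deviates strongly from the movement of the dots around it. The function name and the threshold are assumptions.

```python
import math


def flag_deviant_dots(velocities, threshold=2.0):
    """Flag dots whose movement deviates strongly from the crowd average.

    velocities: list of (vx, vy) movement vectors, one per anonymous dot.
    Returns the indices of dots whose velocity differs from the group
    mean by more than `threshold` (in the same units per frame).
    """
    n = len(velocities)
    mean_vx = sum(v[0] for v in velocities) / n
    mean_vy = sum(v[1] for v in velocities) / n
    deviants = []
    for i, (vx, vy) in enumerate(velocities):
        if math.hypot(vx - mean_vx, vy - mean_vy) > threshold:
            deviants.append(i)
    return deviants


# Three dots drifting slowly with the crowd, one sprinting against the flow:
dots = [(0.1, 0.0), (0.2, 0.1), (0.1, 0.1), (-3.0, 0.5)]
print(flag_deviant_dots(dots))  # [3]
```

Note that no identity, image, or audio is needed as input: position changes per dot are enough, which is what makes this a privacy-by-design approach.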
How do you anonymize conversations?
The sound camera does not listen for words, but for sound characteristics such as stress. The software still needs to be trained in this. It has caught every fight so far, but it also sometimes flags a noisy bachelor party. For each report from the automated system, the police tell us whether there was actually a fight or not. We keep a list and correct the software where necessary. At some point you can hand this over to deep learning. Then the software itself recognizes when something can be thrown away. We hope that the software will eventually recognize patterns, so that in the future the system can sound the alarm a few seconds before a fight breaks out. Deep learning also has a downside. The system draws conclusions based on the information it has. Algorithms can therefore teach themselves strange properties. There are plenty of examples of this, yet they are being used more and more. I am glad that the General Data Protection Regulation (GDPR; in Dutch, AVG) says something about the use of algorithms. For the creation...
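The correction loop described above, in which police feedback is used to relabel the system's alerts, can be sketched as follows. This is a hypothetical illustration, not the project's software: the function name, the label values, and the data shapes are all assumptions; only the idea of using per-report police feedback to correct predictions comes from the interview.

```python
def correct_labels(alerts, police_feedback):
    """Relabel automated alerts with police ground truth for retraining.

    alerts: dict mapping alert_id -> predicted label ("fight" / "no_fight")
    police_feedback: dict mapping alert_id -> True if a fight actually occurred
    Returns (corrected training pairs, list of false-positive alert ids).
    """
    corrected, false_positives = [], []
    for alert_id, predicted in alerts.items():
        actual = "fight" if police_feedback.get(alert_id, False) else "no_fight"
        corrected.append((alert_id, actual))
        if predicted == "fight" and actual == "no_fight":
            false_positives.append(alert_id)  # e.g. a noisy bachelor party
    return corrected, false_positives


alerts = {"a1": "fight", "a2": "fight"}
feedback = {"a1": True, "a2": False}  # a2 turned out to be a party
corrected, fps = correct_labels(alerts, feedback)
print(fps)  # ['a2']
```

The corrected pairs would then feed the next training round, which is how the system gradually learns to tell a fight from a loud but harmless crowd.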
Read the rest of the article 'Stratumseind 2.0: Privacy, powered by tech' in the free magazine Privacy and the Municipality.