When security scanners were introduced at Schiphol ten years ago, there was a fuss about these 'naked scanners'. Opinion makers and media raised questions. Wasn't the way travelers had to 'expose' themselves in the scanner going too far?
This article was written by: Marijn Biesiot, Erik de Bakker, Tim Jacquemard & Rinie van Est
At the request of the National Police, the Rathenau Institute is investigating how citizens view the use of sensor data to improve livability and safety. To get a good idea of this, we are organizing six focus groups with Dutch citizens. With the aid of a number of scenarios, we will bring them into discussion with each other. The focus groups should provide insight into the perceptions of various citizens and the motivations behind their opinions.
Our goal is that as many aspects related to citizens' perceptions as possible are discussed in the focus groups. To do so, we use insights from previous surveys.
This article lists what scientific literature says about citizens' perceptions regarding the use of sensors and sensor data. We introduce a conceptual framework that helps us prepare and shape the focus groups according to the above objective.
Framework of terms for engaging with citizens
When the security scanners were introduced at Schiphol ten years ago, there was a fuss about these 'naked scanners'. Wasn't the way travelers had to 'expose' themselves in the scanner going too far? Could people trust that security handled the data carefully? Was it certain that there were no negative health effects? Were the scanners the right tool in the fight against terrorism?
These questions point to factors that may influence the extent to which citizens may or may not find a sensor application acceptable, such as the handling of personal data and health safety.
Preparing focus groups
In this article, we look for more of these types of factors. Understanding what is already known about citizens' perceptions helps us prepare the focus groups. In the focus groups, ordinary Dutch people will discuss in small groups various situations in which sensors (1) and sensor data are used to improve livability and safety (2). What do they think of those situations, what considerations do they make and what are the underlying reasons for their opinions?
In this paper, we introduce a conceptual framework for citizens' perceptions about the use of sensors and sensor data. Our starting point is: someone (a subject) has a perception of something (an object). In other words, a citizen finds a sensor application acceptable or not. The subject and object of perception form the first level of our conceptual framework.
Introduction conceptual framework. Source: Rathenau Institute
Whether a citizen finds a sensor application acceptable (or not) may have to do with characteristics of the citizen himself (is someone open to sharing information?) and of the sensor application (what kind of information is collected?). The conceptual framework is a tool to search for relevant factors in practical examples and scientific literature, and then to organize the factors in a logical way into dimensions related to the subject or object.
How is this article structured?
We first explain why in this article we focus primarily on surveillance and not on the other forms of sensor surveillance we identified in the previous article, 'Eyes and ears everywhere'. We then study two practical examples of surveillance with sensors and identify three important dimensions of sensor applications. Next, we search scientific studies for factors that can influence the extent to which citizens find a sensor application acceptable or not. With these insights, we further complete our conceptual framework. Finally, we return to the various forms of sensor surveillance and, based on the insights from this article, formulate some guidance for the focus groups.
Surveillance with sensors
The security scanners at Schiphol Airport are an example of surveillance with sensors to enhance security. Surveillance involves "targeted, systematic and routine" searches for personal details.(3) In this case, airport scanners look for objects on the body, but sensors can collect all kinds of data (such as locations, fingerprints, sounds and images).
In the previous article, we saw that "sensor surveillance" is a dynamic and interactive process. It does not happen from one central point. Citizens, government agencies and companies watch each other and are watched. In addition to government agencies and companies monitoring citizens (surveillance), citizens conversely also use sensors to monitor organizations (sousveillance) and other citizens (horizontal surveillance). In addition, citizens deploy sensors to hold themselves to rules for livability and safety (self-surveillance). Figure 1 illustrates these four forms of sensor surveillance.
In this article, we focus on surveillance. This is because more is known about what citizens think about surveillance with sensors than about the other forms of sensor surveillance. The factors we find from this perspective may also play a role in the other forms of sensor surveillance. We will return to this at the end of this article.
Four forms of sensor surveillance. Source: Rathenau Institute
Two examples: questions about surveillance in practice
We take a closer look at two practical examples in which sensors and sensor data are being used to improve livability and safety: security scanners at Schiphol Airport and a trial with sensor data in Roermond. We highlight some of the social questions raised by opinion makers.
1. Security scanners at Schiphol Airport
At Schiphol Airport, security scanners check that travelers are not carrying prohibited items on their bodies. In 2009, a Nigerian man's attempt to blow up a plane from Schiphol Airport to Detroit with a bomb failed. Since then, more and more security scanners have come into use to prevent such attacks. At Schiphol Airport, these sensors use millimeter waves that pass through clothing but are reflected by skin and other materials. Thus, suspicious objects are detected.
In 2007, passengers at Schiphol Airport were first checked by a security scanner on a small scale.(4) These scanners showed prohibited materials hidden under clothing, as well as a three-dimensional black-and-white image of the traveler's body. As a result, the scanners quickly became popularly known as 'nude scanners'. To ensure privacy, a security guard reviewed the image in a separate room. In the process, the traveler's face was shielded and the image was deleted after review. The use of the security scanners at Schiphol Airport was changed after criticism of the 'nude images' of travelers.
Modification of scanners
The security scanners at Schiphol now indicate the spot of suspicion on a drawn figure. This picture of the scan is visible on the spot to both the traveler and the security officer. Thus, the results of the scanning process have been made transparent to the traveler. Schiphol says: 'The scan is analyzed by a computer and not by a security employee. The scan cannot see through your body, nor does it see you naked. This guarantees your privacy.'
Civil society organizations and opinion makers have raised several critical questions about the security scanners at Schiphol Airport. In 2009, civil rights organization Bits of Freedom expressed concerns about the infringement of the 'naked scanners' on privacy and bodily integrity. In their view, travelers are being 'digitally undressed'. How can you be sure that security personnel handle the imagery with integrity? In addition, Bits of Freedom pointed out that the scanners are not error-free, involve high costs and can only partially reduce the risk of attacks. Ancilla van de Leest, then party leader of the Pirate Party, which advocates for digital civil rights, raised the issue in the summer of 2015 of the rules and procedures that apply to these scans for pregnant women.
So in the eyes of Bits of Freedom and Van de Leest, security scanners also have less desirable sides. They question whether the use of security scanners is an effective tool in the fight against terrorism. Can attacks be prevented with a measure like the security scan? On the other hand, what do citizens think if an attack could have been prevented with a security scanner and this technology was not deployed?
2. Sensors as point counters in Roermond
The city of Roermond is heavily affected by shoplifters and pickpockets, who come mainly from Eastern European countries. In 2017, there were 456 reports of pickpocketing and 383 reports of shoplifting.(5) Roermond is an attractive location for this type of crime, partly because of the large number of foreign visitors from Asia and Russia who come (often with cash) to Europe's largest outlet center. About eight million visitors flock to the Designer Outlet Roermond every year. Together with the other facilities, Roermond attracts a total of about fourteen million visitors annually, against a population of less than sixty thousand.
Despite the deployment of private security guards, collaborations with the German and Romanian police, and adjustments in the local ordinance, the number of incidents has not decreased significantly over the past three years. The police are testing in a so-called "living lab" whether a new approach with sensor technology can change this.(6) In July 2018, the police together with the municipality of Roermond, TU Eindhoven and the Public Prosecutor's Office launched a trial in which sensor data are linked to detect possible gangs of itinerant criminals as early as possible.
From data to action
The living lab in Roermond works with smart sensor applications that recognize patterns, combined with an associated point system. A "suspicious profile" can emerge from the analysis of the linked data. Not all sensors and sensor data are being used yet, but the idea for the first profile is as follows: ANPR (Automatic Number Plate Recognition) cameras above the access roads to the Designer Outlet Roermond register the license plates of, and the number of passengers in, all cars. If a car with a Romanian license plate drives by, for example, it receives ten points. If there are four people in a white rental car, additional points are added. Wi-Fi trackers analyze phone data and track further movement patterns.
When enough points have been awarded to (a group of) people, the police take action. For example: heightened attention if the vehicle stands out again in the coming week, a request for extra surveillance by security or, for cars recognized from previous incidents, a follow-up action by the police themselves, such as addressing the driver. Such concerted action based on sensor data should also act as a deterrent.
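The point system described above can be sketched in code. Note that this is purely an illustrative reconstruction: the article only mentions examples (a Romanian license plate scoring ten points, extra points for four people in a white rental car, action on recognized vehicles), so all other rules, point values and the alert threshold below are assumptions, not the actual Roermond system.

```python
# Hypothetical sketch of a point system like the one piloted in Roermond.
# Point values and the threshold are illustrative assumptions; only the
# example rules are taken from the article.

from dataclasses import dataclass

@dataclass
class Observation:
    country_code: str            # from the ANPR plate read, e.g. "RO"
    occupants: int               # registered number of passengers
    rental_car: bool
    color: str
    seen_in_prior_incident: bool # vehicle recognized from earlier incidents

THRESHOLD = 25  # assumed score at which the police themselves act

def score(obs: Observation) -> int:
    """Assign points to one observation according to the (assumed) rules."""
    points = 0
    if obs.country_code == "RO":
        points += 10  # the article's example: a Romanian plate scores ten points
    if obs.occupants == 4 and obs.rental_car and obs.color == "white":
        points += 10  # assumed weight for the 'four people in a white rental' pattern
    if obs.seen_in_prior_incident:
        points += 15  # assumed weight for recognized vehicles
    return points

def action(points: int) -> str:
    """Map a score to a follow-up, mirroring the article's examples."""
    if points >= THRESHOLD:
        return "police follow-up (e.g. address the driver)"
    if points > 0:
        return "flag for extra attention / ask security to watch"
    return "no action"
```

The sketch makes the societal questions raised later concrete: a rule such as `country_code == "RO"` scores every Romanian driver, which is exactly the a-priori suspicion and false-positive risk that Bits of Freedom points to.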
Practice
As of early 2019, the linking of sensor data and the joint action in response to suspicious profiles described above had not yet been fully implemented. At that time, license plates were already being recorded, but not yet the number of occupants. Movement patterns were also not yet being tracked.
In local politics there is support for this testing ground. However, there are cautionary notes from society. Digital civil rights organization Bits of Freedom expressed concern in September 2018 that citizens are a priori suspect when they behave abnormally or meet certain criteria. If similar systems are deployed more frequently in the future, Bits of Freedom argued, there is potentially nothing left that escapes the regulator's eye, and any form of wrong behavior can be punished by automated means. But what happens when innocent people are falsely identified as suspects (false positives)? Who is responsible for any negative consequences? An algorithm? The Roermond living lab does not involve automated punishment, but Bits of Freedom sees the danger of a slippery slope here.
Lokke Moerel, professor of Global ICT Law at Tilburg University, wonders if there aren't less intrusive ways to track down pickpockets than collecting data on every passerby. The issue she raises is whether collecting information on large numbers of citizens is proportionate to apprehending a possibly small number of pickpockets.
The Personal Data Authority (AP) pointed out in November 2018 that digital tracking of people on the street, in shopping malls and (semi-)public places is only allowed under strict conditions. Wi-Fi tracking falls under personal data and thus under privacy rules. AP board chairman Aleid Wolfsen: 'There are virtually no reasons that make tracking shoppers or travelers lawful. Moreover, there are less intrusive methods to achieve the same goal without violating privacy.' (7)
Three dimensions of sensor applications
The above two practical examples touch on several societal questions about surveillance. These questions relate to three main dimensions of sensor applications:
the operation of the sensor technology;
the social practice in which the technology is applied and the actors within it;
the broader social, cultural and institutional context within which said practice is shaped.
Perceptions of citizens can be about all these dimensions of a sensor application. So can the factors that can influence these perceptions. In Figure 3, we have included relevant factors from the practical examples in the conceptual framework.
Concerns about health risks from security scanners, for example, focus on the operation of the technology itself. The desire of travelers to be able to see on the spot how the scanner displays data about their bodies is about the application in practice at Schiphol Airport. Trust in security services is about the social and institutional context in which the application takes place. In practice, the three dimensions are not separate. The extent to which a sensor application infringes on people's privacy, for example, may be related to the design of the technology itself, but also to the procedures in the specific practice in which the technology is used.
In the second dimension (social practice), 'actors' is a collective term for all parties involved in the sensor application. These include, for example, the person or organization collecting sensor data, the analyst who gets to work with the results of the computer programs that link and analyze all the sensor data, and the police officer on the street or the security guard in the store who takes action. In addition, sensor data can be collected on different (groups of) people: for example, anyone walking down a shopping street or driving their car past an ANPR camera.
In the practical examples discussed, sensor data are automatically collected and analyzed by computer programs, but there is always an agent or security officer who studies the information and makes a decision on what action is needed. Thus, within the system of collecting, analyzing and applying sensor data, humans still play a decisive role.
Inventory of factors from case studies. Source: Rathenau Institute
Later in the article, we further fill in the conceptual framework with factors related to the subject. To do so, we first look at Dutch studies.
Factors from Dutch research
Every year, Capgemini surveys how citizens in the Netherlands view developments in the security domain. Figures from Trends in Security 2018 show that 23 percent of citizens in the Netherlands feel very confident and safe with the growing number of cameras in public spaces, while 9 percent say they experience the opposite.
Capgemini also asks citizens how they feel about the use of sensors to enhance security. For example, 78% of those surveyed are (very) positive about the use of bodycams by the police and 6% (very) negative. The survey does not provide insight into the arguments and considerations behind citizens' opinions.
Little (recent) empirical scientific research has been done in the Netherlands on the wishes, concerns and considerations of citizens about the use of sensor data to improve liveability and safety. Research from the 2000s shows that citizens are more inclined to allow more privacy-sensitive interventions when it comes to solving serious crimes, and that citizens are more positive about reviewing camera images than about phone tapping or house searches.(8) Citizens also find it important to know what happens to their information and for what purpose it is used. They trust the police more than private security companies.(9)
Furthermore, it emerges that citizens who are more open to sharing information are more positive about the deployment of sensors for surveillance purposes than citizens who are less inclined to share information.(10) A study also shows that sex (gender) can influence perceptions: men are more likely to consider the party deploying the sensor (such as the police or a security company) to be more important, while women are more likely to value the purpose of detection.(11)
A factor such as the effectiveness of the sensor application emerges as hardly relevant (12): citizens weigh the degree of personal information and the purpose for which sensor data are used, but whether the tool actually works appears to matter less in their considerations.
Three personal dimensions
The above studies show that personal characteristics (such as gender) and general attitudes of people (such as openness to sharing information) can also play a role in the considerations people make about sensor applications.
How citizens view a sensor application can be influenced by three main dimensions of the sensor application, as well as personal dimensions that color the "glasses" through which a person views a sensor application. Figure 4 illustrates these dimensions of the subject and object of perception.
Here the social, cultural and institutional context plays a role in various ways. In a broad sense, this dimension can be about legal rules for camera surveillance and about a general social trust or distrust in authorities. On a smaller scale, people's views are influenced by their immediate social environment, such as the neighborhood they live in and their work environment.
Conceptual framework: dimensions of subject and object. Source: Rathenau Institute
Factors from European research
Internationally, a number of current studies can be found on how citizens view sensor applications. From the perspective of our research, a limitation of these studies is that they focus heavily on privacy and are often studies within the American context.
In particular, the European study "SurPRISE" is interesting for the question we are investigating. This study provides an overview of factors that may play a role in whether or not citizens find a sensor application acceptable.
SurPRISE in brief
SurPRISE is a recent and large-scale study of the public acceptance of surveillance-oriented sensor technologies. The study was conducted in nine European countries. In 2012-2015, public acceptance was surveyed both quantitatively and qualitatively in Denmark, Germany, Hungary, Italy, Norway, Austria, Spain, United Kingdom and Switzerland.
The topics included (smart) cameras, biometric identifiers, drones, smartphone location tracking and deep packet inspection (a far-reaching monitoring of electronic data traffic). Two thousand citizens (about two hundred per country) were surveyed and participated in participatory workshops.
Based on an extensive literature review, SurPRISE arrives at thirty factors that potentially influence citizens' perceptions around surveillance-oriented sensor technologies. These various influencing factors were then empirically tested.
Factors influencing acceptance of surveillance with sensors
The qualitative SurPRISE study found that seven factors, nuances aside, significantly influence people's views on sensor technologies used for surveillance:
General attitude toward technology: a positive attitude toward the potential of technologies to enhance security feeds through into greater acceptance of sensor technologies. Conversely, a critical or skeptical attitude reduces their public acceptance.
Trustworthiness of institutions: trust in the relevant authorities contributes to public acceptance of sensor technologies. The use of more acceptable technologies also helps security authorities appear more trustworthy.
Social "proximity": when technologies are targeted at specific groups, such as suspects and criminals, they are more acceptable than technologies that are deployed indiscriminately.
Sense of intrusion: the more people feel a technology intrudes into personal or everyday life, the less acceptable it is.
Perceived effectiveness: the more people are convinced that a sensor technology is effective, the more acceptable it becomes.
Substantive privacy concerns: the higher the concerns about personal data and (bodily) integrity, the lower the public acceptance.
Age: older people are more likely to accept sensor technologies than younger people.
For all of these factors, people may not always be aware that these factors influence their opinions, feelings and trade-offs.(13)
The study found that seven other factors, nuances aside, had little or indirect influence on public acceptance of sensor technologies.(14) The researchers found that "feelings of security threat" and "degree of familiarity with sensor technologies" had no significant influence. The same was true of "income," "education," "physical proximity to the technology," "expectation that the technology would have a significant impact in the future," and "representation that security and privacy are a trade-off."
'Rules of the game' for surveillance with sensors
SurPRISE also provides insight into what conditions or 'rules of the game' citizens believe must be met to make a sensor application more acceptable.(15) These rules of the game indicate what citizens consider important in surveillance with sensors.
In the participatory sessions within SurPRISE, citizens provided criteria and arguments to support their opinions and feelings about sensor technologies.(16)
Citizens in the SurPRISE study find surveillance-oriented sensor technologies more acceptable when:
they are covered by a European framework of rules and subject to the oversight of a European body;
their implementation is embedded in transparent data protection and accountability procedures;
their implementation is in the hands of government agencies and they are used only for public purposes; should private parties also be involved, this should be strictly regulated;
their benefits far outweigh their costs, especially when compared with less intrusive non-technical alternatives;
their implementation can be regulated through consent of the data subjects (opt-in approach);
affected individuals have access to the data about them and the ability to modify or delete it;
they focus as much as possible on less sensitive data and spaces, according to criteria and goals that are publicly known;
they are not deployed indiscriminately, but are tied to specific goals, times and places;
they are designed on a privacy-by-design basis;
they are used in conjunction with non-technical measures and social strategies that target the social and economic causes of insecurity.
If we look at our conceptual framework, these ground rules are mainly about the dimension of practices and actors: how sensors are applied in practice, by whom and for what purposes.
Yet citizens also place demands on the technology itself and the institutional context. Citizens consider it important that a sensor technology be designed to avoid privacy problems, for example, by minimizing data storage, opting for anonymization, and securing all information with encryption. 'Privacy by design' principles can also be part of the procedures surrounding the use of sensor technology, such as agreeing to give as few people as possible access to the data.
Citizens also value the broader institutional context: compliance with the framework of European rules and their supervision. This also presupposes a confidence on the part of citizens that the European rules are the right ones and that they are properly supervised. In practice, these rules are in turn translated into a specific social context.
Inventory of factors from scientific studies
We organized the factors from the Dutch studies and the European SurPRISE study into our conceptual framework. Figure 4 provides an overview of factors from scientific research that may influence citizens' perceptions of surveillance with sensors.
Inventory of factors from scientific literature. Source: Rathenau Institute
In conclusion
In this article, we have developed a conceptual framework through which we offer insight into how people arrive at judgments about (the acceptability of) the application of sensor technology to improve livability and security. This framework was developed by studying surveillance.
We distinguish three personal dimensions and three dimensions of sensor applications that may play a role in people's considerations. Based on practical examples and scientific research on citizens' perceptions, we found factors that may play a role in what people think about surveillance with sensors.
Those factors can be placed within the developed conceptual framework. They thus support and give substance to that framework.
What does this mean for our further research? We conclude this article with two reflections. First, we show that the framework can also be used to look at perceptions of other forms of sensor surveillance. Second, based on our conceptual framework and the inventory of possible factors, we formulate two tools that will help us shape the focus groups.
1. Reflection on other forms of sensor surveillance
Although the conceptual framework was developed by studying surveillance, the framework is also well suited to looking at perceptions of other forms of sensor surveillance.
Below we show that the three dimensions of sensor applications also apply to sousveillance, horizontal surveillance and self-surveillance. In this way, we also gain insight into some relevant similarities and differences between the various forms of sensor surveillance.
Sensor technology
In various forms of surveillance with sensors, the technology is mostly the same. For example, video cameras can be used for surveillance (CCTV), sousveillance (cameras on cell phones) and horizontal surveillance (security cameras at the front door).
There are also differences. For example, police may use different types of sensor technologies than civilians.
Social practice and actors
The "social practice and actors" dimension contains important differences between forms of sensor surveillance. In surveillance, citizens are watched with cameras and other sensors by government agencies and companies. We have seen in previous studies that it can matter to people exactly who collects and uses the sensor data. Is personal data being used by public or private parties, and for public or for commercial purposes? What rules are these parties bound by?
In sousveillance, something interesting happens. Here, in fact, the "line of sight" of using sensors is reversed: citizens themselves wield the camera to monitor government agencies and businesses. For example, when filming police officers or emergency services at work.
In the case of horizontal surveillance, the viewing direction of the use of sensors also changes. In horizontal surveillance, citizens can be filmed by other citizens as well as film other citizens themselves.
In self-surveillance, people use sensors to monitor and control themselves (see Table 1).
These different viewing directions change not only who collects and uses the sensor data, but also about whom the sensor data are collected. They also underlie different purposes and motivations for collecting personal data.
According to the SurPRISE study, these are all factors that can influence whether or not citizens find a sensor application acceptable.
Social, cultural and institutional context
Surveillance involves trust in the police or businesses, for example. Horizontal surveillance and sousveillance are about trust in fellow citizens, including with respect to whether they comply with privacy laws, for example. Here the police and municipalities are bound by different rules than companies and citizens.
2. Tools for the focus groups
In the focus groups we want to collect various opinions and considerations. We do this by presenting scenarios to small groups of ordinary Dutch people in which sensors and sensor data are used to improve livability and safety.
To what extent do ordinary Dutch people find these sensor applications acceptable, and why? When do citizens expect sensors to be used to improve livability and safety? What reasons, experiences and feelings lie behind their opinions? What advantages and disadvantages do people see?
We want the focus groups to discuss as many aspects related to the perceptions of ordinary Dutch people as possible. Based on our conceptual framework and the inventory of possible factors, we formulate two tools that will help us shape the focus groups.
These tools refer to the two central tenets of our conceptual framework: someone (the subject) has an opinion about something (the object). People's perceptions thus involve properties of both the subject and the sensor application:
Ensure diversity in the group and have an eye for the individual
We are interested in 'perceptions of citizens'. This sounds abstract, but at its core it is about opinions and experiences of ordinary Dutch people 'of flesh and blood'. We have seen that personal characteristics, general attitudes and people's immediate social environment can influence how someone views sensor applications. Factors such as age, gender, education and the neighborhood in which a person lives can play a role. These are aspects we can consider when recruiting a diverse group of participants. In addition, it may be useful to gain more insight into people's attitudes before talking to them about specific sensor applications, such as whether someone generally trusts the police.
Ask participants about the key dimensions of sensor applications
We ensure diversity in the scenarios by making variations within the three key dimensions of sensor applications. After all, perceptions of citizens can be about the sensor technology being used, the specific practice and parties within it, and the social, cultural and institutional context. Does it matter to anyone what type of sensor data is collected? Does it matter whether the data are collected by the police or private security companies? Does it matter about whom the data are collected? To what extent does people's trust in the neighborhood cop or in the police as an institution play a role? And what role does "being filmed" and "filming yourself" play in the extent to which people find a sensor application acceptable?
With these tools, we will develop the approach for the focus groups. Police can also use these tools when having conversations inside and outside the organization about the socially responsible use of sensors and sensor data.
(1) By 'sensors' we mean digital measuring instruments that collect data about the physical and social environment. Examples of digital sensors are the camera and GPS on a smartphone.
(2) The research 'Citizen's perspective on the use of sensor data for liveability and safety' focuses on a broad spectrum of liveability and safety. By livability we mean minor offenses, such as throwing garbage on the street. At the other end of the spectrum are more serious crimes that cause a great sense of insecurity, such as street robbery, threats, assault and serious forms of crime such as drug and human trafficking. This study thus stays close to the mission of the police, which at its core focuses on law enforcement and emergency response. Therefore, when choosing to discuss a practical example, we always ask the question: is this a situation for which you might call the police?
(3) Boutellier, H. (2015). The secular experiment. How we started living together apart from God. Amsterdam: Boom Publishers, Chapter 4
(5) & (6) Driessen, G. 'Hunting the pickpocket'. In: de Limburger 12 July 2018.
(8) Koops, E.J. & A. Vedder (2001). Detection versus privacy: citizens' perceptions. The Hague: Sdu Publishers; Dinev, T. et al. (2005). 'Internet Users, Privacy Concerns and Attitudes towards Government Surveillance - An Exploratory Study of Cross-Cultural Differences between Italy and the United States'. BLED 2005 Proceedings 30.
(9) Schildmeijer, R., C. Samson & H. Koot (2005). Citizens and their privacy: opinion among citizens. Amsterdam: TNS NIPO Consult.
(10) Dinev, T. et al. (2005). 'Internet Users, Privacy Concerns and Attitudes towards Government Surveillance - An Exploratory Study of Cross-Cultural Differences between Italy and the United States'. BLED 2005 Proceedings 30.
(11) & (12) Dinev, T. et al. (2005). 'Internet Users, Privacy Concerns and Attitudes towards Government Surveillance - An Exploratory Study of Cross-Cultural Differences between Italy and the United States'. BLED 2005 Proceedings 30.
(13) Pavone, V., E. Santiago & S. Degli-Esposti (2015). SurPRISE. Surveillance, Privacy and Security: A large scale participatory assessment of criteria and factors determining acceptability and acceptance of security technologies in Europe, p.77
(14) Pavone, V., E. Santiago & S. Degli-Esposti (2015). SurPRISE. Surveillance, Privacy and Security: Final publishable summary report, p.6
(15) Pavone, V., E. Santiago & S. Degli-Esposti (2015). SurPRISE. Surveillance, Privacy and Security: A large scale participatory assessment of criteria and factors determining acceptability and acceptance of security technologies in Europe
(16) Pavone, V., E. Santiago & S. Degli-Esposti (2015). SurPRISE. Surveillance, Privacy and Security: Final publishable summary report, p.7
Source: Rathenau Institute
This article can also be found in the Internet of Things dossier