B. Technology in service of democracy and fundamental rights
Privacy and the protection of personal data are essential for our freedom and security. If governments or companies infringe too deeply upon our privacy, we are prevented from thinking freely, speaking freely, and exchanging ideas freely. This leads to conformism. An all-seeing government stifles diversity and creativity in society. If companies know too much about us, we are exposed to the risk of having our opinions and preferences manipulated. Privacy is not only an individual right, but also a common good.
Smart cities are only really smart when they handle personal data carefully. They need a good reason to collect and process personal data, and they must be able to explain it. This follows from the European Union's General Data Protection Regulation (GDPR) and its underlying principles: lawfulness, fairness, transparency, purpose limitation, data minimisation,[1] accuracy, storage limitation, integrity, confidentiality, and accountability. Cities must demand the same from the companies they work with. Contracts with companies partnering with the smart city must be public, especially where the contracted tasks involve collecting and processing personal data; the GDPR's transparency principle demands no less.
The supervisory role of municipalities need not be limited to their own organisations and the companies they contract. They can make agreements about privacy and data protection with all companies and institutions operating within the municipal boundaries.[2] Rules that apply to everyone can be laid down in local regulations, for example concerning the use of sensors in public spaces.
Some companies treat personal data as merchandise. However, rewarding people for their data confronts them with an improper choice between economic gain and preserving their privacy. Trade in personal data undermines privacy as a common good and leads to a society in which the rich have more privacy than the poor. Municipalities should not support companies that buy or resell personal data, whether they are start-ups or tech giants.
Even when governments lawfully collect and process personal data in the performance of official tasks, they should seek opportunities to give citizens as much control as possible over that data, for example by offering a privacy-friendly alternative in situations where showing a passport, identity card, or driving licence is currently required.
The open source app IRMA (I Reveal My Attributes) enables citizens to reveal properties (attributes) of themselves without disclosing personal information that is irrelevant in the situation at hand. Citizens can thus fill out municipal web forms without entering their official digital identity code: IRMA lets them prove that they are residents of the municipality. At the door of a nightclub, ‘over 18’ and a digital passport photo are the only personal attributes needed to get in, and they are the only data the bouncer gets to see when scanning the QR code on the phone of a young person using the IRMA app.
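In outline, this idea of selective disclosure can be sketched as follows. The sketch is purely conceptual: the credential, attribute names, and `disclose` function are invented for illustration, and the real IRMA app enforces the same guarantee with cryptographic attribute-based credentials rather than plain data structures.

```python
# Conceptual sketch of selective disclosure. All names are invented;
# the real IRMA app uses cryptographic attribute-based credentials,
# not plain Python data structures.
from dataclasses import dataclass

@dataclass(frozen=True)
class Credential:
    """A credential issued to a citizen, bundling several attributes."""
    issuer: str
    attributes: dict

def disclose(credential: Credential, requested: set) -> dict:
    """Reveal only the requested attributes; everything else stays hidden."""
    missing = requested - set(credential.attributes)
    if missing:
        raise ValueError(f"credential lacks attributes: {missing}")
    return {name: credential.attributes[name] for name in requested}

# Issued once by a trusted party (e.g. the municipality).
passport = Credential(
    issuer="municipality",
    attributes={
        "name": "A. Janssen",
        "over18": True,
        "resident": True,
        "photo": "<digital passport photo>",
        "national_id": "123456789",  # never shown to the bouncer
    },
)

# The nightclub's verifier asks for exactly two attributes, nothing more.
print(disclose(passport, {"over18", "photo"}))
# Only 'over18' and 'photo' are revealed; name and national_id stay hidden.
```

The design point is that the verifier states which attributes it needs and the citizen's app reveals those and nothing else; the municipal web form example works the same way with a single 'resident' attribute.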
The more companies and governments facilitate the use of IRMA, the less often people need to hand over their name, address, passport number, or national identification number. That enhances their privacy and reduces the risk of identity fraud.[3]
Initiatives such as IRMA show that governments can use technology to give citizens more control over their data. However, governments also use technology to gain more control over citizens. When it comes to combating benefit fraud, the principle of purpose limitation – personal data may be used only for the purpose for which it was collected, or for a compatible purpose – has become virtually meaningless. Governments feed algorithms with a wide range of personal data, from dog ownership to holiday destinations, in order to assign secret risk profiles to benefit recipients.[4] Those profiled as high-risk are treated as prima facie suspects and subjected to investigation.
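To make the objection concrete, the pattern can be caricatured in a few lines. Everything below is invented for illustration; real profiling systems are far more elaborate and, as the text notes, secret. What the sketch shows is how a score can be driven entirely by group characteristics and repurposed data rather than by behaviour that has anything to do with fraud.

```python
# Deliberately crude caricature of risk scoring for fraud detection.
# Every feature, weight, and threshold is invented. Note that none of
# the inputs describes individual behaviour relevant to benefit fraud.
WEIGHTS = {
    "owns_dog": 0.2,          # repurposed dog-licence data
    "holiday_abroad": 0.5,    # repurposed travel data
    "single_household": 0.3,  # a group characteristic, not behaviour
}
THRESHOLD = 0.6

def risk_score(profile: dict) -> float:
    """Sum the weights of every feature present in the profile."""
    return sum(w for feature, w in WEIGHTS.items() if profile.get(feature))

def flagged_for_investigation(profile: dict) -> bool:
    """Recipients above the threshold become prima facie suspects."""
    return risk_score(profile) >= THRESHOLD

recipient = {"owns_dog": True, "holiday_abroad": True}
print(risk_score(recipient), flagged_for_investigation(recipient))  # 0.7 True
```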
A society in which your socio-economic status determines the extent to which you are entitled to privacy and data protection is guilty of class injustice. Discrimination lurks every time people are given a risk profile that is based not on their individual behaviour but on group characteristics. Moreover, if every contact you have with a government produces data that might be repurposed to assign a risk profile to you, public trust in government erodes. Support for useful technological innovations might also crumble if citizens find out that their data is being used improperly: "Your waste card tells us that you produce a lot of waste. We are here to check whether you are in fact entitled to a single person's allowance." That is why national and local politicians must prevent the data dragnet for profiling from being cast too widely. Select before you collect: governments must demonstrate the necessity and proportionality of each category of personal data they use, especially special categories of personal data, such as data about health.