A. Democratising the development of technology

4. Anticipate the unforeseen consequences of technology. Call upon the imagination of scientists, philosophers, and artists. Take responsibility.

 
New technology always has unexpected and unintended consequences. If a municipality relies too heavily on reports submitted by citizens via an app to steer the maintenance of public space, there is a risk of discrimination: in poorer neighbourhoods, where people are less adept at complaining digitally, street furniture is not repaired as quickly as elsewhere.[1] Sensors that monitor the well-being of elderly people living alone do not always deliver the promised time savings for care workers and informal carers; some people, for example, deliberately leave the refrigerator door open for too long, just to receive a phone call from a carer.[2]

We can try to anticipate such consequences by drawing lessons from the past and sketching scenarios for the future. Governments can benefit from the knowledge and imagination of historians, philosophers, ethicists, and artists to map the possible consequences of technological innovations for people and society.

One way to reflect on the unforeseen consequences of technology is to develop techno-moral vignettes: fictional scenarios, written or visual, about the (ethical) changes that technology may bring about.[3] How freely do we move through the city, for example, when cameras with facial recognition hang everywhere? Or when passers-by, using smart glasses such as Google Glass, are capable of uncovering our identity and consulting our social media profiles?

Any government can bring together thinkers, experts, and citizens in an impact assessment committee that provides solicited and unsolicited advice on new technologies.[4] For instance, such a committee can sound the alarm if it thinks the precautionary principle needs to be applied. This principle dictates that when human activities may lead to morally unacceptable harm that is scientifically plausible but uncertain, actions shall be taken to avoid or diminish that harm.[5]

Scientists disagree on the risks that electromagnetic fields pose to public health. The Belgian region of Flanders applies a strict limit to the electromagnetic radiation of telecommunication antenna stations in the vicinity of homes, schools, and nurseries. In some other EU countries, there are no legal limits.[6] Now that the roll-out of the fast 5G network will considerably increase the number of small antennas, it is up to municipalities in those countries to decide whether or not to curb radiation as a precaution.

The introduction of technological innovations requires regular evaluation. Technology needs a constant critical eye, including from audit authorities and ombudspersons. For example, research into neighbourhood watch apps shows that, instead of improving security, these apps can fuel fear, mutual distrust, discrimination, and vigilantism.[7]

Unforeseen damage should not be passed on to society or affected individuals. Designers and providers of technology, as well as the companies and authorities that make use of it, must take responsibility.[8]


Further viewing & reading

Video: Rathenau Institute, Bioluminescent Streetlamps, a techno-moral vignette

 

Data Ethics Questions

Dossier: DataEthics, It's time for a Data Ethics Impact Assessment

Footnotes

[1] Burak Pak et al., 'FixMyStreet Brussels: Sociodemographic Inequality in Crowdsourced Civic Participation', Journal of Urban Technology, 2017
[2] WRR, De robot de baas. De toekomst van werk in het tweede machinetijdperk, 2015, p. 118 (in Dutch)
[3] For examples of techno-moral vignettes using video images, see the animation video above and Rathenau Institute, SynBio Politics, 2013, under 'Future Scenarios'
[4] The cities of Paderborn and Enschede, for instance, have recently set up ethics councils for data and digitalisation. See Dietmar Kemper, 'Umstrittene Digitalisierung', Westfalen-Blatt, 24 January 2020 (in German) and City of Enschede, Ethische Commissie, 2020 (in Dutch).
[5] UNESCO, The Precautionary Principle, 2005, p. 14. The precautionary principle is one of the leading principles in the environmental policies of the European Union, according to article 191 of the Treaty on the Functioning of the European Union.
[6] RIVM, Comparison of international policies on electromagnetic fields, 2018
[7] Rani Molla, 'The rise of fear-based social media like Nextdoor, Citizen and now Amazon's Neighbors', Vox, 7 May 2019 and Clara van de Wiel, 'Amper beleid bij forse groei buurtpreventie door burgers', NRC Handelsblad, 18 April 2019 (in Dutch)
[8] See, for instance, the Eindhoven smart society IoT charter. Amsterdam’s hotline for chained errors is a good example of taking responsibility; see principle 10.


Comments

Jüri Ginter

I recommend explaining what it means to take responsibility. People often say that they are responsible, but in fact they are not. Maybe you can find some good examples.

Richard

@Jüri: We've added an example of taking responsibility in footnote 8.
