
Are we ready for the robotic revolution? by Oane Visser and Pieter Medendorp

Japan has a hotel where guests are served by robots, and in Australia self-driving tractors autonomously harvest crops, day and night. Robots help care for residents in some Dutch nursing homes; once in your house, they can order you a taxi or your food, or mow your lawn. Robots of all shapes and sizes are beginning to penetrate our lives. Do they make our lives smarter and happier? What are the implications of a robotic revolution for our freedom and autonomy?

For a long time, views on future robotisation in films, novels and public debate have been divided between utopian and dystopian visions. In 1921, robots appeared in the play R.U.R. (Rossum’s Universal Robots) by Karel Čapek; the word is derived from the Czech ‘robota’, which translates to ‘drudgery’ or ‘forced labour’. In ‘I, Robot’—Asimov’s classic 1950 science fiction collection—robots start off as helpful comrades, but end up controlling humans.

When discussing the positive effects of robotisation, medical applications come to mind alongside conveniences such as robots taking over household chores. Robot radiologists could analyse your X-rays, possibly much better than your doctor. Robotic surgery is already common practice, easing a surgeon’s work. Rehabilitation robotics is making good progress in helping the paralysed to walk: new research shows paralysed patients steering robotic arms and legs with their thoughts, based on neural signals that are decoded and converted into robotic guidance. Concurrently, advances in robotisation raise concerns about the expansion of surveillance and the erosion of (mental) privacy and identity.

When discussing the societal impacts of robotisation, two starting points could be helpful. First, we should look beyond visible incarnations of automation, such as robots. The algorithms within robots are increasingly central in everything we do online, as well as within the internet of things—from machines, cars and refrigerators to smart cities, everything gets connected and exchanges data. Second, distinguishing a person’s role as consumer, employee, or citizen helps categorise the manifold effects of such automation.

Algorithmic society

As (online) consumers, we tend to be winners. An ever-increasing range of products is just a mouse click away, thanks to increasingly sophisticated online ordering algorithms and massive investments in the ‘last mile’ of consumer product logistics. ‘The client is king’ already seems outdated; the consumer has been turned into an ‘emperor’. The flipside of this apparent consumer Valhalla with almost real-time delivery is the worsening labour conditions of workers in the value chains that enable it. In the distribution centres of companies like Amazon, underpaid workers often operate under a regime of unrealistic work targets, rigid digital surveillance, and a work pace set by robots. Even farms are affected—in India, the rise of e-agriculture has been found to contribute to agrarian distress (Stone 2011).

As citizens, our agency and our physical and mental privacy seem to be increasingly under threat as both Big Tech and governments try to target or nudge citizens with algorithms that are unregulated and lack transparency. Think of the targeted advertisements in your web browser for weeks after a single visit to an online shoe store. It is increasingly difficult to escape individual media bubbles or corporate surveillance, let alone the mass surveillance of those living under authoritarian regimes. In such an algorithm-led society, will people end up as emperors without clothes?

2018: a watershed year

Looking back, 2018 seemed a watershed in the public debate, which shifted from viewing tech companies as drivers of technological innovation (for consumers) and hence of freedom (benefiting us as citizens), to scrutinising these companies’ troublesome record in preventing fake news, let alone respecting privacy and democracy. With a recurring pattern of irresponsible conduct regarding privacy and fake news, Facebook has come to symbolise the downsides of an algorithm-led society.

Regarding the impact on us as employees, international agencies like the OECD have recently issued unsettling reports on the effects of automation on labour, which are likely to exacerbate inequalities both within countries and between the Global South and North. Finally, policy action has been stepped up in the past year, with the EU taking the lead, for instance with the GDPR privacy regulation and legislative proposals to curb the excessive power of Big Tech.

Robotisation has arrived and will continue to change the way we consume, work, and live. As with all technologies, it can be used for good and for bad. Robotisation can augment us, help us innovate, and help address many of society’s grand challenges, yet it can also lock us into undesirable competition, eroding privacy, dignity, and identity. To make robotisation, algorithms, and data science beneficial and inclusive, it is time that governments, tech companies, civic organisations, hospitals, ethicists, and (social) scientists start a serious dialogue on how to make this digital revolution ‘the best rather than the worst thing ever to happen to humanity’.[1]

[1] We loosely paraphrase Stephen Hawking [hyperlink: https://www.cam.ac.uk/research/news/the-best-or-worst-thing-to-happen-to-humanity-stephen-hawking-launches-centre-for-the-future-of]

Image credit: Franck V. on Unsplash

About the authors:

Prof. dr. Pieter Medendorp is a professor of Sensorimotor Neuroscience at the Donders Institute for Brain, Cognition and Behaviour, and Director of the Centre for Cognition, Radboud University Nijmegen, The Netherlands.


Dr. Oane Visser (associate professor, Political Ecology research group, ISS) leads an international research project on the socio-economic effects of, and responses to, big data and automation in agriculture.

Legal mobilisation in the court of public opinion by Lotte Houwing and Jeff Handmaker

The idea of a dystopian government that is all-powerful, unrestrained and especially all-seeing is centuries-old. Machiavelli, Orwell and many others have pondered the opportunities and challenges of allowing a government, particularly an authoritarian one, to have access to a system of surveillance that provides every detail of people’s lives. But few could have imagined the implications of modern technologies, such as DNA testing and facial recognition software. What can be done by way of legal mobilisation, beyond the courtroom, to restrain the government when threats to human rights by surveillance agencies are regarded as unacceptable?

The societal debate in The Netherlands regarding privacy and surveillance has been accelerated by the reform of the Dutch Intelligence and Security Services Act (in Dutch, the WIV). The Bill was met by an unprecedented level of public reaction in a consultation round that took place over the Internet (reference in Dutch). Shortly thereafter, five students from Amsterdam took the initiative to petition for a referendum on the Bill, accompanied by a public campaign in which the students collected 344,126 signatures more than the required 40,000. After the students succeeded, several organisations joined in campaigning, highlighting a variety of human rights concerns. Subsequently, the Public Interest Litigation Project (PILP) announced that it would explore the possibility of starting strategic litigation concerning a number of human rights violations that it alleged would be a direct consequence of the proposed amendments to the Act.

The outcome of the referendum confirmed that the majority of Dutch citizens were against the Act as it was drafted by the government. This was a huge victory for the students, organisations and other privacy advocates. In response, the government formulated a proposal to make certain changes to the Act. Unfortunately, these changes were not much more than cosmetic. However, since the proposal entails a new legislative process, there is a fresh opportunity to lobby Parliament to introduce more far-reaching amendments.

These forms of legal mobilisation—petitioning for a national referendum, law-based campaigns, (the threat of) strategic litigation—and now a renewed opportunity to lobby Parliament on the revised Bill, reveal the power of public pressure to restrain government over-reach and leverage possibilities for rights-based advocacy and reform.

Where does it hurt?

One of the guiding questions of the PILP in assessing the challenges and potential for launching strategic litigation is: “where does it hurt?” The general problem with the Act is that it contains several capabilities that allow the collection of data on people who are not targets of the intelligence and security services. Bulk interception, for example, entails the automatic collection of incredibly large amounts of data before the data is even analysed by anyone.

The problem with this capability is that the (communications) data of anyone can be gathered, without taking into account whether the individuals concerned pose any risk at all from a national security standpoint. It is this specific capability that led to the name “sleepwet”, a portmanteau of the Dutch words for dragnet (“sleepnet”) and law (“wet”). Besides bulk interception, the Act includes other capabilities with untargeted effects: the capability to hack third parties; to gain real-time access to databases; to acquire bulk (personal) datasets; and to exchange (unevaluated) data with foreign intelligence agencies.

Apart from the direct consequences of exercising these capabilities to obtain and share large amounts of data on innocent people, there is the chilling effect. This effect refers to the inhibition or discouragement of the legitimate exercise of certain fundamental rights caused by surveillance measures. For example, in an age of social media, most people recognise the situation of typing something, and then deleting the social media post before sending it, because they do not have control over who will read it. Sometimes such restraint can be a good thing. However, it is harmful for a democracy when political dissidents or whistleblowers begin censoring themselves and are discouraged from making political statements or revealing something bad that is happening.

A broader campaign on privacy

The controversial reform of the Act fired up a broader public debate and, especially in the run-up to the referendum, led to accompanying campaigns on privacy in The Netherlands. The most common reaction has been: “but I do not have anything to hide”. However, the campaign waged against specific parts of the Act succeeded in planting seeds of doubt and criticism of this popular, though indifferent, attitude. It was also the first time that the secrecy of the Dutch surveillance regime was brought into question.

Beyond the Netherlands, the debate has international ramifications. The Netherlands is not the only country in the midst of an overarching law reform regarding its intelligence and security services; France, Germany, the U.K. and Finland, among others, are undergoing comparable processes. The debate in the Netherlands is of international relevance because the Dutch law reform fits an international trend in which untargeted surveillance measures are introduced, Internet service providers become more involved in the application of the capabilities, and the focus shifts from content to metadata. Nevertheless, there is a sufficient degree of transparency and free speech in The Netherlands to have an open debate—circumstances which enable legal mobilisation to play a crucial role in bringing issues to the public’s attention, i.e. beyond the courtroom. The broader debate and campaign over privacy therefore remain highly relevant.

What is the role for strategic litigation?

The PILP coalition, which has been discussed in an earlier blog post, focuses on strategic litigation for human rights. Strategic litigation, a specific form of legal mobilisation, involves the strategic use of legal procedures to bring about certain social, political or legal changes. Strategic litigation often accompanies campaigns or other means of amplifying the voice of the people and/or organisations fighting for this change.

What is PILP doing in this specific case?

Regarding the Intelligence and Security Services Act of 2017, PILP is coordinating the legal procedures of a broad coalition of lawyers, journalists, NGOs, and IT/tech companies. This coalition is legally represented by the renowned law firm Boekx Advocaten. Within this case, two separate procedures are underway. First, PILP petitioned in an urgent procedure to postpone the entry into force of the Act until the proposed changes had been passed by the Dutch Parliament. Unfortunately, the judge rejected this claim.

Second, the coalition is assessing the possibility of starting strategic litigation to challenge the untargeted effects of the aforementioned capabilities provided for in the Act itself against the framework of the European human rights treaties. This procedure will be pursued if the changes made by Parliament prove insufficient to address the fundamental human rights problems of the Act.

Given the unpredictability of the judicial system, it is difficult to predict the outcome of the lawsuit. However, it is very clear that the other forms of legal mobilisation—a law-based referendum and campaign—have done more than underscore the value of taking matters to the formal courts. They have been effective in their own right, restraining government through the court of public opinion.

Picture credit: Magic Madzik

About the authors: 

Lotte Houwing is File Coordinator at the PILP concerning the WIV. Her views do not necessarily represent those of the organisation.

Jeff Handmaker is a senior researcher at the International Institute of Social Studies (ISS) and focuses on legal mobilisation.