Who will take care of me when I end
up in a nursing home? A robot? Most probably. A robot so realistic that I,
vulnerable as I will be by then, will perceive “him” as the actual person
taking care of me. Do we want this? That is not the real question, because it will
happen anyway. What we do need to decide is how we want it to happen without losing
our human dignity.
The Council of Europe asked the Dutch Rathenau Institute to advise
governments on how to preserve human dignity and human rights in the Robot Age.
The main findings of the Rathenau report are described below.
What happens to our personal information?
Once our castle was our home. But
in the age of technology our home has become a place where we are continuously
being "watched" via smartphone, smart television, connected objects,
etc. This surveillance generates enormous amounts of data about us, about what
we read, what we think, how we feel, who we meet, etc. We barely know what
happens to this very personal information.
One thing is certain: you don’t own
it. You gave it away to platforms like Facebook and Google. And don’t think
you “use” these services: they are using you by selling data about you to
third parties. These services can only be free because we – the users – are being sold. We
are nothing but their data, their "raw material".
To respect our dignity, social
media users should be fairly compensated when their personal information is
traded. The European Data Protection Supervisor warns against the idea that
people can pay with their data in the same way they pay with money. But
businesses don't like the idea of regulating the ownership of data. The Rathenau
report recommends that guidance be provided in this respect.
We are biased by social media algorithms
Social media show us information selected by algorithms. The risk is
that these algorithms show us only what we want
to see, not the complete picture. Many people were surprised when Donald Trump was
elected because their news feeds had filtered out content that did not appeal to
them (the ‘echo chamber’ effect). Guidance is needed on the role of the gatekeepers
of our information society, such as Facebook and Google.
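To make the echo-chamber mechanism concrete, here is a purely illustrative Python toy (no real platform's ranking works exactly like this; all names and data are invented). It ranks a feed by similarity to past clicks, so content the user never engaged with sinks out of view:

```python
# Toy illustration of an echo chamber (hypothetical, not any real platform's
# algorithm): rank items so topics the user already clicked come first.

def rank_feed(items, clicked_topics):
    """Order items so previously engaged topics come first.
    Python's sort is stable, so ties keep their original order."""
    return sorted(items, key=lambda item: item["topic"] not in clicked_topics)

feed = [
    {"title": "Candidate A rally draws crowds", "topic": "candidate_a"},
    {"title": "Candidate B policy analysis", "topic": "candidate_b"},
    {"title": "Candidate A endorsement", "topic": "candidate_a"},
]

# A user who only ever clicked candidate_a stories sees them first; over
# repeated rounds of click-then-rerank, candidate_b items keep sinking.
personalized = rank_feed(feed, clicked_topics={"candidate_a"})
print([item["topic"] for item in personalized])
# → ['candidate_a', 'candidate_a', 'candidate_b']
```

The point of the sketch: the filtering is not a bug but the direct consequence of optimizing for engagement, which is why guidance on gatekeepers has to target the ranking objective itself.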
We are profiled and sometimes discriminated against
With so much personal data freely available to companies, they can profile
us. For example: instead of treating people equally, companies can adjust their prices to
whether a user is categorized as poor or rich. In 2012 it was reported that the travel website
Orbitz showed Apple users more expensive offers than non-Apple users. This form
of algorithmic profiling in the allocation of resources is discriminatory. Advice
is needed on ways to combat algorithmic discrimination, for example by imposing
transparency standards: cheaper offers should not be hidden from richer clients.
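The price-steering mechanism, and the transparency rule that counters it, can be sketched in a few lines of Python. This is a hypothetical toy (the offers, prices and device heuristic are invented, not Orbitz's actual system):

```python
# Toy sketch of device-based price steering and a transparency norm that
# counters it: the ranking may vary, but the cheapest offer is never hidden.

OFFERS = [("budget hotel", 60), ("midrange hotel", 110), ("luxury hotel", 240)]

def steered_offers(device):
    """Opaque profiling: users assumed richer (here: Mac owners, a crude
    wealth proxy) see the pricier offers first."""
    if device == "mac":
        return sorted(OFFERS, key=lambda o: o[1], reverse=True)
    return sorted(OFFERS, key=lambda o: o[1])

def transparent_offers(device):
    """Transparency standard: whatever the profile, the cheapest available
    offer must appear at the top of the list."""
    ranked = steered_offers(device)
    cheapest = min(OFFERS, key=lambda o: o[1])
    if ranked[0] != cheapest:
        ranked = [cheapest] + [o for o in ranked if o != cheapest]
    return ranked

print(steered_offers("mac")[0])      # profiled user is steered to the priciest offer
print(transparent_offers("mac")[0])  # cheapest offer stays visible to everyone
```

The design point: a transparency standard does not forbid personalization outright; it only constrains what may be hidden, which is a lighter regulatory touch than banning profiling altogether.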
Social robots: are we in good hands?
Technology is neither good nor bad, but it is not neutral. Care robots,
for example, are good (they enable a person to remain autonomous) but also bad (when
they become too pushy about taking your medication). The question is: where
does autonomy end and unwanted paternalism start? We must avoid the slippery
slope towards the "authoritarian robot".
Softbots (robots and software) can
be so realistic that they manage to engage with us on an emotional level. Many
people are already addicted to their smartphones, to a virtual girlfriend
or to game worlds. As a result, they may lose the capacity to enter
into meaningful contact with real people. Companies can exploit this
vulnerability by building "persuasive technology" to influence our
decisions. Advice is needed to prevent persuasive technology from becoming too addictive
or too authoritarian.
We should have the right to meaningful human contact
The Rathenau Report suggests a new
human right: the right to meaningful human contact. It considers that there are
situations that cannot be left to machines only: raising children (teaching
robots), caring for elderly people (care robots) and warfare (drones). In these
cases, technology should only facilitate and not replace human contact and control.
We should have the right to be let alone
The report recommends a second new
human right: the right not to be measured, analyzed or coached. This right is
not currently upheld in the Netherlands. The former government stated that if
people do not want to be tracked in shops via their Wi-Fi signal, they should
simply turn off their smartphone. This means that the right to track people is deemed
more important than their privacy rights. This is potentially harmful: research
suggests that “the right to privacy and
the possibility to perform everyday undertakings without being seen, monitored
or noticed, may be fundamental to the development of a sane personality”.
We don’t always have a fair trial
Softbots can speed up court proceedings when ICT tools are used for legal decisions,
for example to calculate conditions for parole and bail. But it has
been reported that such software is biased against African Americans. Advice is
needed on how to avoid biased software, for example by developing norms for the
use of softbots in court proceedings. The suspect’s lawyer should know, for
example, whether a tool was used and how it affected the final decision.
We should enjoy (e-)possessions
Technology has changed the way we enjoy possessions. We feel we own our
land, but do we also own the Pokémon that the developer of this game planted on
our land in augmented reality? This question is relevant because people go to the
places where the Pokémons "are" and sometimes cause damage. The
municipality of The Hague sued the Pokémon game developer because gamers
looking for virtual Pokémons had damaged a protected nature area in
doing so. Guidance is needed on the notion of ownership in a world of augmented
reality.
Can we trust legislation about self-driving cars?
Who is responsible when a self-driving car causes an accident? Is it the
car manufacturer, the software developers, the seller, the buyer? Or is it the
road authorities, as the car will depend in part on digital roadside systems?
But more urgently: who is responsible for an accident caused by a car that is
driven partly by a human and partly by software, as with remote-controlled parking? Accidents
like these already happen. Current laws, however, are not clear about how to
apportion liability with regard to robotics. Advice is needed here.
Conclusion
The main recommendation of the Rathenau report is to establish a
Convention on Human Rights in the Robot Age that would help answer the
questions raised above. Rathenau’s recommendations have already prompted the Council
of Europe to put forward a number of proposals. One of them is to establish cooperation
between UNESCO, the Council of Europe and the European Union to develop a harmonized
legal framework and regulatory mechanisms at the international level. UNESCO’s contribution is already in the making: a report on the ethics of robotics by
UNESCO’s World Commission on the Ethics of Scientific Knowledge and Technology (COMEST).
It will be presented at UNESCO in November by the Dutch philosopher Peter-Paul
Verbeek.
Read also:
- Preliminary draft of UNESCO's report on the ethics of robots.
- Ethics of robots: questions and answers.
- My blog post about "The limits of humanity" by Peter-Paul Verbeek.