
Why do we need a Bill of Digital Rights?

Defining human rights

The UN Universal Declaration of Human Rights and the US Declaration of Independence do not adequately define what inalienable rights are, especially in the context of modern technology and its impact on human nature. A Bill of Digital Rights would define the fundamental rights that humans should have in the digital world in which we now live.

A Bill of Digital Rights needs to cover several areas where AI applications can harm humanity.

Cognitive acuity

Through reliance on decision support systems (e.g. in financial, judicial and medical domains), we are likely to lose cognitive acuity over time. In addition, as the Commission is already aware, these systems incorporate bias and lack transparency with respect to the decisions they reach; such systems only deliver a probability. It is naïve to think that such algorithms can be made either transparent or free of bias, given that they are usually stochastic rather than rule-based processes, and that data will always be biased because humans are biased. Decisions that impact individuals and groups must always have human oversight and a right of appeal, with that process involving human assessment and judgement only.

Relationships

Over-engagement with and reliance on digital assistants, along with the drive to create ever more realistic simulations of humanness, is affecting relationships and communication, as well as encouraging gender stereotyping. Whilst we do not propose an outright ban on such devices, we believe it should be a requirement that users always know they are interacting with an artefact, not a human. We propose that more empirical research be conducted on the harms in this area, and that research be conducted on methods to ensure that such artefacts do not appear human (e.g. non-human voices). The evaluation of a user's emotions, personality and character by AI-based artefacts simulating a dialogue (e.g. interviewing systems) should be banned.

Freedom & privacy

Privacy and freedom are lost through the use of private data and the surveillance of citizens, whether by the state or by private companies. The use of AI to monitor, track and identify citizens from facial or other personal attributes is unprecedented in any civilisation and is quite different from the use of other biometrics such as fingerprints. We believe that the European Commission and many other countries are well aware of these dangers, and urgent action is required to avoid mass surveillance being normalised.

Although it does not involve AI, the deployment of Covid-19 tracking apps has brought this prospect even closer. We urge an outright ban on the state's use of AI-based surveillance technologies. An even greater level of surveillance has already been established in the private sector through Big Tech's use of a user's browsing data, shopping activity and a host of other data gatherers such as FitBit health monitors. Much of humanity has already lost its freedom and autonomy! GDPR legislation needs strengthening to prevent the extraction and use of personal data. We believe that the practice of companies providing free services or products in exchange for data should be banned unless explicit and informed consent is given.

Moral agency

In assigning moral agency to artefacts such as autonomous weapons and self-driving vehicles, humans are losing their own moral agency. We would wish to see this banned, with a requirement that human decision-making is retained wherever life is at risk.

Dignity of work

AI systems, including robotics, are already changing the workplace and displacing jobs. We believe that there is human dignity in work, and that the provision of alternative work must be a condition of job replacement by AI systems and robotics, except where such systems preserve life by carrying out hazardous tasks.

Loss of reality

Through the overuse of augmented and virtual reality systems there is a danger that we lose a sense of what is real, and that people will become addicted to such immersive technology. Research is needed in this area to provide more empirical evidence and to inform potential health warnings for the use of such devices. Strong human oversight and a cautious approach to application development are required to protect humanity from the harms of users losing touch with the real world and real relationships.
