Digital rights and the protection of the right to asylum in the Charter of the European Union

The right to asylum, as delineated in Article 18 of the Charter of Fundamental Rights of the European Union (EU) (‘the Charter’), does not guarantee asylum to every individual who seeks it. Instead, it articulates that everyone is entitled to have their application for international protection examined in line with international and EU law. This principle is reinforced by Article 19 of the Charter, which strictly prohibits collective expulsions and forbids the removal, expulsion or extradition of any person ‘to a State where there is a serious risk that he or she would be subjected to the death penalty, torture or other inhuman or degrading treatment or punishment’.

This essay was originally published by the Digital Freedom Fund in Digital Rights are Charter Rights: Essay Series.


Over the past two decades, asylum proceedings in the EU have been increasingly infused with digital technologies. The majority of these developments were initiated with the aim of controlling, monitoring and policing asylum seekers, and preventing their arrival in the EU. However, some civil society initiatives have also endeavoured to leverage digital technologies as a means of assisting individuals with their applications or safeguarding people from pushbacks.

Despite their significant implications, digital rights are frequently overlooked in asylum proceedings by legal practitioners, asylum seekers and civil society actors. These rights are seldom given priority, especially when individuals face potential detention or deportation. But authorities have remained resolute in their drive to increase the deployment and use of digital technologies, data and artificial intelligence (AI), with the dual objective of limiting the entry of asylum seekers into EU territory and evaluating the claims of those who do submit an application. The right to asylum is now inextricably linked to digital technologies. This article explores the intricate relationship between the two and examines how digital rights can be leveraged to protect the rights of asylum seekers.

The right to privacy: Safeguarding asylum seekers against invasive technology and ‘junk science’

The right to privacy, enshrined in Article 7 of the Charter, is designed to guard against unwarranted, unnecessary and disproportionate invasions into people’s private lives. However, it can be curtailed by public authorities in accordance with the principle of proportionality as articulated in Article 52. For instance, within the EU, all passport applicants are obligated to provide their fingerprints to authorities for more accurate identification, notwithstanding that it ‘is not decisive’ that this method is ‘not wholly reliable’.

Within the context of asylum claims, where authorities often endeavour to amass as much information as possible about each applicant, the protection of privacy is paramount. This is particularly relevant given the huge volumes of digital data now available on individuals. The EU’s highest court has acknowledged the prevention of illegal entry into the EU as an objective of general interest. This stance requires asylum seekers to compromise their privacy for a chance to secure protection. The question then arises of how far authorities should be able to probe into an applicant’s private life.

One of the primary objectives of authorities when evaluating asylum claims is to verify the identity of the individuals and the veracity of their claims. While some identity features – such as fingerprints – are straightforward for authorities to collect, others are more challenging to obtain. Age and sexual identity are two such examples. It is often impossible to validate an asylum seeker’s claim of being a minor or identifying as homosexual through documentation. Yet these factors can significantly influence the final decision, as well as the conduct of interviews and the individual’s accommodation.

Public authorities have long sought a definitive test that would separate the wheat from the chaff. Before the Court of Justice of the European Union (CJEU) imposed limitations on national practices in 2014, asylum seekers were subjected to deeply private and sordid questioning during attempts to validate their story. Dutch authorities, for example, often suggested that applicants bring their own porn video to their asylum hearings as evidence of their claimed sexual orientation. Though this was officially a choice, Advocate General Sharpston entertained ‘serious doubts [that the] vulnerable party in the procedure of applying for refugee status, could really be deemed to have given fully free and informed consent to the competent national authorities in such circumstances’, particularly given the power dynamics at play. The CJEU eventually prohibited this practice in the ABC ruling, citing infringements on human dignity (Article 1 of the Charter) and the right to private life (Article 7).

National asylum authorities have resorted to ‘junk science’ in their search for a truth serum to identify individuals deserving protection. The 2018 case F v Hungary, which examined the use of projective personality tests to determine an individual’s sexuality, was particularly contentious. The CJEU declared that such a test may be accepted only if it is ‘based on sufficiently reliable methods and principles in the light of the standards recognised by the international scientific community’. In assessing an individual’s sexuality, projective personality tests fall dramatically short of meeting these standards. The Court also highlighted in its ruling that ‘consent is not necessarily given freely, being de facto imposed under the pressure of the circumstances in which applicants for international protection find themselves’.

More recently, national courts have encountered instances where asylum authorities have requested applicants’ phones to extract and examine stored data for evidence supporting the individual’s claims. In Germany, a court ruled this practice illegal unless less intrusive alternatives had been considered. The judges made clear that the use of new technologies must be both necessary and suited to the intended purpose.

Looking forward, it is plausible that authorities might resort to AI to ascertain an individual’s identity. However, assertions that machine vision technologies can determine an individual’s sexuality owe more to pseudoscience than to any credible scientific method. The EU’s AI Act, currently under negotiation, fails to adequately address and prevent the potential harms arising from the use of AI in the context of migration. As a result, legal challenges rooted in the right to privacy will remain crucial in defining the boundaries of acceptable digital practices within asylum procedures.

The right to individual data protection: A prerequisite for an effective remedy against automated and semi-automated decision-making

The EU has established a mille-feuille of databases designed to identify all individuals who either seek to enter, or do enter, the EU. These information systems are intended to support migration and police authorities in their decision-making concerning individuals, such as on their right to enter or to stay pending an asylum decision. Article 8(2) of the Charter confers upon any individual whose data has been collected by a European authority the right to the protection of that data. This includes the right to access data stored about them and to rectify or delete any incorrect data.

Asylum seekers are forced to surrender ever-increasing amounts of personal information. The latest Eurodac system will collect the facial images and personal information of asylum seekers (and other foreign nationals) as young as six years old. National authorities collect and exchange vast quantities of individuals’ personal data, yet those individuals largely remain unaware of this until the data is used as the basis for a decision on their case.

While the surge in new and expanded databases is purported to assist decision-making, these systems cannot serve as the sole source of information for a decision. In the 2006 case Commission v Spain, the CJEU ruled that authorities should not make automated decisions based solely on information stored in a European information system. Decisions must rest on an individual assessment of the person’s situation, including an evaluation of the legal grounds for denying entry.

Nevertheless, the practice of denying entry to and deporting individuals perceived as a risk to national security persists, with states often refusing to provide access to the reasons for those decisions. In 2020, the CJEU clarified that an individual has the right to obtain at least a minimum of reasons for a refusal of entry into the Union. Article 47 of the Charter, which enshrines the principle of equality of arms, requires national authorities to disclose the state that shared the information used as the basis for the decision, as well as the specific grounds for the risk assessment. This disclosure allows applicants to seek an effective remedy against the decision. Similarly, under Article 8(2) of the Charter, the right of access serves as a ‘gatekeeper enabling data subjects to take further action’, such as requesting the removal or rectification of wrongful accusations that impact their right to a fair trial.

Despite these provisions, access to information is far from uniformly respected by member states. All too often, asylum seekers find that ‘secret’ evidence is being used against them. In some instances, the country from which a person has fled is the one that supplies the data on which the authorities base their decision. Even though data sharing with a third country should adhere to EU data protection standards, including the prohibition on using information obtained through torture, this is not adequately monitored in practice.

The risk of national authorities relying on inaccurate or illicitly obtained data has been amplified by the implementation of the latest information systems regulation and the Europol Regulation. Moreover, data protection standards for asylum seekers fall short of those provided to EU citizens. This was exemplified by the recent ‘Processing of Personal Data for Risk Analysis’ (PeDRA) scandal, in which Frontex proposed the collection of intrusive personal data, flagrantly violating data protection rights. At the same time, the European Data Protection Supervisor (EDPS) contended that the rules governing the agency are vague regarding the ‘conditions or limits for sharing data with other agencies, states and third countries, and on available remedies for individuals’.

As the EDPS pointed out, ‘Privacy and data protection are part of the human rights too often suspended at the borders of the European Union’. This sentiment underscores a recurring theme in asylum, migration and border regulation, illustrating the tendency to view certain migrant groups as security concerns and undeserving of the protections afforded to citizens or other categories of foreign nationals.

Digital asylum rights: A call for increased safeguards amidst the digitalisation of procedures

Digital technologies are often deployed by public authorities with the expectation that they will enhance efficiency and mitigate or eliminate the biases that emerge from human decision-making. However, studies of the impacts of these technologies frequently show the exact opposite. Issues of discrimination and racism persist, yet they become entwined within the complexity of technical systems, making it increasingly challenging to establish when and how rights violations occur.

The EU’s current legislative negotiations are set to further expand the use of digital technologies in asylum and migration procedures. Nevertheless, these negotiations also present opportunities for enhanced safeguards. The proposed Screening Regulation potentially offers an avenue for bolstering the protection of asylum seekers’ right to privacy, through the inclusion of an independent mechanism designed to monitor the protection of individuals’ fundamental rights during their identification by border authorities. However, the regulation is yet to be approved, and it will ultimately fall to the Fundamental Rights Agency and to member states to clarify how this new mechanism will operate.

The digitalisation of asylum and immigration proceedings is poised to become ever more deeply entrenched in the years to come. It is therefore of paramount importance to deepen understanding of privacy, data protection and other digital rights among asylum seekers, migrants and migration activists, legal professionals and non-governmental organisations.

Author: Romain Lanneau


