EU: Business deregulation plans will undermine data protection rights

More than 120 organisations, including Statewatch, are calling on the EU to keep the General Data Protection Regulation in place, as the European Commission announces plans to remove certain provisions of the law in the name of cutting 'red tape' for businesses. The signatories express concern that the proposed changes "could instead roll back key accountability safeguards and with them, the accountability principle itself."


Image: LIBER Europe, CC BY 2.0


The letter was coordinated by European Digital Rights.

19 May 2025

Reopening the GDPR is a threat to rights, accountability, and the future of EU digital policy

Dear Executive Vice-President Virkkunen, Dear Commissioner McGrath,

We write as civil society organisations, academics, companies, trade unions, experts and others alarmed by a growing risk: that the most important digital rights law seems set to be quietly unravelled. We are gravely concerned about ongoing proposals to reopen the General Data Protection Regulation (GDPR), including changes expected as part of the fourth omnibus package, and mounting rumours that the GDPR will be further reopened in subsequent initiatives later this year or beyond.

The GDPR is more than a Regulation. It is the backbone of the EU’s digital rulebook, a hard-fought legislative achievement that sets high standards and safeguards people’s dignity in a data-driven world. Its impact reaches far beyond the EU’s borders, influencing digital governance globally.

Proposals to amend certain provisions in order to support small and medium-sized companies, increase legal certainty and strengthen enforcement are good in theory. However, we are concerned that the proposed changes, unsupported by any evidence, risk missing the mark of genuine simplification, and could instead roll back key accountability safeguards and with them, the accountability principle itself. In practice, they could allow some companies to avoid keeping records of data processing (even when handling special categories of data) purely based on staff headcount or turnover.

This shift undermines what is often called the GDPR’s ‘risk-based approach’, a mechanism for calibrating obligations according to the potential harm to people’s rights and freedoms, not company size. More fundamentally, it could erode the Regulation’s original foundation as a rights-based instrument grounded in the recognition of personal data protection as a fundamental right. Data rights do not become less important when the controller is smaller, and people’s vulnerability to harm does not shrink accordingly.

While competitiveness is important, using it to justify exemptions from core protections sends a worrying message: that people’s rights are expendable when economic interests are at stake. But sustainable competitiveness depends on trust, accountability, and fairness, not on lowering standards. It also relies on other factors that have nothing to do with regulation: long-term investment, robust infrastructures and coherent enforcement. The GDPR, which is technologically neutral, supports innovation precisely by ensuring that people’s rights are respected and that businesses operate on a level playing field. Many companies and Data Protection Officers (DPOs), including those of us signing this letter, do not support reopening the GDPR. On the contrary, there is broad recognition that obligations such as those under Article 30 help ensure compliance and foster responsible data practices.

In our experience, deregulatory efforts rarely stop at ‘technical adjustments.’ Once reopened, the GDPR could become vulnerable to broader deregulatory demands. Many such pressures are already visible, including calls to weaken rules on consent with no effective safeguards for users, or legitimise invasive uses of personal data for AI training.

We also cannot ignore the geopolitical context. Over the past years, calls from foreign commercial and political actors to loosen the EU’s digital protections have consistently started with attempts to weaken the GDPR, a strategy now extended to the entire EU tech rulebook, including the DSA, the DMA and the AI Act – and already underway for corporate accountability and environmental justice. Weakening the GDPR would also harm the EU’s credibility. The Regulation is still widely cited as a benchmark for rights-based digital governance. Undermining it would send a signal that the EU is willing to abandon its own standards under pressure, further eroding trust in its digital policies.

The GDPR is presented by some as an obstacle to aggressive data extraction models that rely on opacity, manipulation, and disregard for rights. Those presenting it this way are often the same actors who work to evade meaningful enforcement. Undermining the GDPR would not only weaken protections for people in the EU; it would send a signal globally that rights-based regulation is negotiable under pressure.

We share the concern that the current compliance model can feel burdensome, especially for smaller entities acting in good faith. But weakening legal protections is not the answer. Instead, the EU needs to invest in real enforcement of existing rules against repeat offenders, while improving guidance, access to tools, and proportional compliance support for smaller actors.

We urge the European Commission to:

  • Reject any reopening of the GDPR, no matter how limited it may appear, and reaffirm the Regulation’s integrity as a foundation of EU digital law;
  • Recognise that current implementation challenges can be addressed through effective enforcement and clarity, not deregulation;
  • Continue to support compliance mechanisms and legal certainty, not by rewriting the law but by ensuring greater support and assistance, especially for smaller entities;
  • Resist external and internal pressures that seek to trade away people’s rights in the name of competitiveness or trade.

The GDPR was designed to protect people in the face of growing digital power asymmetries, which disproportionately harm communities that have been systematically marginalised for decades. It is not broken, but the pressure to break it is real. Reopening it now would risk turning back the clock on hard-won rights.

We remain at your disposal for dialogue and urge you to stand firm in defence of fundamental rights.

Sincerely,

Organisations:

European Digital Rights (EDRi)
Access Now
AI Forensics
AlgorithmWatch
Alternatif Bilisim
Amnesty International
ARTICLE 19
Aspiration
ATTAC España
Attac Österreich
Austrian Federal Chamber of Labour (AK EUROPA)
Avaaz
Balanced Economy Project
Bits of Freedom
Bizoneo
Centar za građanske inicijative Poreč
Centre for Democracy and Technology Europe (CDT Europe)
Centre for Peace Studies
Civil Liberties Union for Europe (Liberties)
Compliance Buro
Corporate Europe Observatory (CEO)
Cryptee
CTRL Matters
Danes je nov dan, Inštitut za druga vprašanja
Defend Democracy
Deutsche Vereinigung für Datenschutz e.V. (DVD)
Digitale Gesellschaft
Digital Intimacy Coalition
Digitalcourage
Electronic Frontier Finland - Effi ry
Electronic Frontier Norway
Electronic Privacy Information Center (EPIC)
Element
epicenter.works - for digital rights
Ekō
EKPIZO
European Center for Not-for-Profit Law (ECNL)
European Environmental Bureau (EEB)
European Federation of Public Service Unions (EPSU)
European Network Against Racism (ENAR)
Fair Vote UK
Federación de Consumidores y Usuarios (CECU)
Federation of German Consumer Organisations (Verbraucherzentrale Bundesverband - vzbv)
Foundation the London Story
Glitch
Global Forum for Media Development (GFMD)
Global Health Advocates (GHA)
Global Witness
Goebel Consult (Information-Security and Privacy Consulting for SMB)
Health Action International
Hermes Center
Hostsharing eG
IFEX
Iuridicum Remedium (IuRe)
IT-Pol
Lie Detectors
lolongo
Media Diversity Institute
Mozilla
New School of the Anthropocene
Norwegian Consumer Council (Forbrukerrådet)
noyb - European Center for Digital Rights
Observatorio de Trabajo, Algoritmo y Sociedad
Open Rights Group
Panoptykon Foundation
People vs Big Tech
Platform for International Cooperation on Undocumented Migrants (PICUM)
Politicode, data protection consultancy
Politiscope
Privacy First
Privacy International
Proton
Public Citizen
SHARE Foundation
Skyline International for Human Rights (SIHR)
Statewatch
SUPERRR Lab
The Swedish Consumers' Association
Transatlantic Consumer Dialogue (TACD)
Tuta Mail
Volkshilfe Österreich
VoxPublic
Vrijschrift.org
Waag Futurelab

Individuals:

Aditya Tannu, Data Privacy Consultant
Anastasia Karagianni, Doctoral Student, Law, Science, Technology & Society (LSTS) Research Group, VUB
Anella Buković, Data Protection Lawyer
Dr. Asli Telli
Beata Faracik, President of the Board, Polish Institute for Human Rights and Business
Conor Hogan
Cristiana Santos, Utrecht University
Professor Douwe Korff
Dr. Heleen Janssen, Assistant Professor of Information Law
LL.M. Flora Rebello Arduini, Senior Human Rights Strategist, Fellow at International Panel on the Information Environment (IPIE)
Gail Chalmin, DPO
Guido Gorgoni, Aggregate Professor of Digital Citizenship and Law, University of Padua
Harshvardhan J. Pandit, AI Accountability Lab (AIAL), Trinity College Dublin
Professor Ian Brown
Igor Barlek, European Association of Data Protection Professionals (EADPP) Board member Y2021, International Association of Privacy Professionals (IAPP) member, Certified Information Privacy Professional / Europe (CIPP/E)
Dr. Irene Kamara, Assistant Professor, TILT, Tilburg Law School
Jacobo Ponte, Sustainability Advisor. Action Aid Spain board member.
Dr Joanna Mazur, University of Warsaw
Jug Puljizevic, GDPR-MEDIA
Kaiti Milona, Naturefriends Greece
Kim Bjørn Jensen, Privacy Advisor
Kristina Irion, Associate Professor, University of Amsterdam
Leandro Ucciferri, lawyer and digital rights advocate
Dr. Lisette Mustert, LL.M, Assistant Professor of Administrative Law, Utrecht University
Dr. Lorenzo Dalla Corte, Assistant Professor in Data Protection and Cybersecurity Law
Dr. Magdalena Brewczynska, postdoctoral researcher at Tilburg Institute for Law, Technology, and Society (TILT)
Dr. Marco Almada, University of Luxembourg
Marco Giraudo, PhD
Maria Magierska, Maastricht University, European University Institute
Mario Guglielmetti, alumnus SSSUP S.Anna, Pisa; LLM College of Europe
Michael Thomas, PhD
Nicole Gross, Associate Professor in Business & Society
Dr. Paško Bilić, Chair of the Centre for Sociology of Media and Digital Society, Institute for Development and International Relations
Dr. Ronald Leenes, full professor of regulation by technology
Sarah Tas, Assistant Professor of Public Law, Maastricht University
Sophie Stalla-Bourdillon, Co-director Brussels Privacy Hub
Xavier Brandao, independent expert on digital threats


 
