Exceptions, loopholes and carve-outs: Presidency wants “internal security needs” recognized in EU digital policies

The growing number of EU digital policies should “benefit” justice and home affairs actors whilst “addressing and minimizing the associated risks,” the Swedish Presidency of the Council argues in a recent discussion paper. The Council’s internal security committee, COSI, should continue to “monitor and discuss” relevant legal proposals to create “a positive narrative… on the justice and internal security needs related to technological development and digitalization,” says the document.


The proposed laws in question are the Artificial Intelligence (AI) Act, the Regulation to prevent and combat child sexual abuse (CSA), the new e-evidence rules on cross-border data-gathering, the e-Privacy Regulation, the Media Freedom Act and the Regulation on a European digital identity system.

The document (pdf), circulated by the Presidency on 17 February, argues:

“National coordination processes, and the consolidation of national positions between sometimes differing views, play a key role, and should ensure that internal security sector considerations are channelled into the working fora leading the negotiations on the various legislative proposals.”

The paper notes that it will be “crucial to closely follow” the negotiations on the AI Act, particularly as it is expected that “the European Parliament’s position will diverge significantly from the Council mandate on a range of issues, including crucial ones (e.g. Article 5 on the ban of real-time remote biometric identification).”

A European Commission presentation published by Statewatch in 2021 noted that the AI Act aims “to decrease administrative burden on home affairs authorities in order not to hamper innovation,” but there has been substantial pushback from civil society and, subsequently, MEPs against the loopholes and carve-outs in the proposal regarding the use of advanced technologies for policing and migration.

Organisations including Statewatch have called for bans on the use of predictive law enforcement technologies, profiling systems and biometric mass surveillance (“real-time remote biometric identification”), as well as numerous safeguards to prevent discriminatory and dangerous uses of AI technology.

The Council, on the other hand, wants the opposite, having sought to ease the use of biometric mass surveillance, loosen potential restrictions on law enforcement agencies, and increase secrecy.

Since May 2022 there have been eight meetings of the Council’s Law Enforcement Working Party (LEWP) on the controversial CSA Regulation, which more than 70 civil society organisations (including Statewatch) have said should be withdrawn:

“The proposed CSA Regulation has made a political decision to consider scanning and surveillance technologies safe despite widespread expert opinion to the contrary. If passed, this law will turn the internet into a space that is dangerous for everyone’s privacy, security and free expression. This includes the very children that this legislation aims to protect.”

The Council thinks otherwise, and following the LEWP meetings “the Presidency tabled compromise proposals concerning removal orders, blocking orders and delisting orders,” says the Swedish paper.

The proposed e-evidence rules are expected to be adopted shortly (pdf), with the Presidency noting that: “Negotiations were time-consuming and challenging but the final result should give law enforcement and judicial authorities an important tool to fight crime more effectively.”

As European Digital Rights have noted, the aim of the rules is “to cut short current judicial processes,” in order to simplify law enforcement access to electronic data held in another member state. The Council and Parliament “have failed to build a framework that provides sufficient safeguards and remains bulletproof against abuses,” says the organisation, of which Statewatch is a member.

The proposed e-Privacy Regulation – formally known as the proposal for a Regulation on the respect for private life and the protection of personal data in electronic communications – was published more than six years ago, and just over two years ago the Council adopted its negotiating mandate.

“As far as JAI [Justice et affaires intérieures, i.e. justice and home affairs] is concerned, the Council mandate includes important access to electronic evidence and data retention aspects,” says the Presidency’s note, covering issues such as “data processing for law enforcement and public security purposes,” an “explicit provision” on the retention of telecommunications data, and “Exceptions to the obligations and rights provided for in the instrument”.

It appears that secret “trilogue” discussions between the Parliament and Council have reached something of an impasse as regards data retention. The Presidency’s note says the Parliament wishes to leave discussions on the matter until the end of the process, a point with which the Council disagrees, “and no compromise is found to date… At present the negotiations are on standby.”

As regards the proposed European Media Freedom Act (published in September 2022) and the European digital identity proposal (on which the Council adopted its position in December 2022), the Swedish Presidency’s note provides a brief overview of the aims of each measure and the parliamentary procedure so far, but makes no comment on what has been achieved or is sought for internal security purposes.


Further reading

06 December 2022

Joint statement: The EU Artificial Intelligence Act must protect people on the move

Joint statement signed by over 160 organisations and 29 individuals, in the run-up to votes in the European Parliament on the position to be taken in negotiations with the Council of the EU.

03 October 2022

EU: Discussion on encryption ponders "retention of vulnerabilities and exploitation by the authorities"

At a recent event hosted by Europol's Innovation Hub, participants discussed questions relating to encrypted data and the ability of law enforcement authorities to access digital information. One issue raised was a possible "EU Vulnerability Management Policy for Internal Security," which could allow for "temporary retention of vulnerabilities and their exploitation by the relevant authorities." In effect, this would mean identifying weaknesses in software and, rather than informing the software developers of the problem, exploiting it for law enforcement purposes.

29 September 2022

EU: AI Act: Council Presidency seeks more secrecy over police use of AI technology

The Czech Presidency of the Council has inserted new provisions into the proposed AI Act that would make it possible to greatly limit the transparency obligations placed on law enforcement authorities using "artificial intelligence" technologies. A new "specific carve-out for sensitive operational data" has been added to a number of articles. If the provisions survive the negotiations, the question then becomes: what exactly counts as "sensitive operational data"? And does the carve-out concern just the data itself, or the algorithms and systems it feeds as well?

 
