EU: AI Act: Council Presidency seeks more secrecy over police use of AI technology


The Czech Presidency of the Council has inserted new provisions into the proposed AI Act that would make it possible to greatly limit the transparency obligations placed on law enforcement authorities using "artificial intelligence" technologies. A new "specific carve-out for sensitive operational data" has been added to a number of articles. If the provisions survive the negotiations, the question then becomes: what exactly counts as "sensitive operational data"? And does the carve-out concern just the data itself, or the algorithms and systems it feeds as well?

Support our work: become a Friend of Statewatch from as little as £1/€1 per month.

The third compromise text (pdf) was circulated last week and is due to be discussed in the Council's Telecoms Working Party today.

Amongst the new provisions is a "specific carve-out for sensitive operational data of users of AI systems which are law enforcement authorities," which "has been made explicit in Articles 29(4), 47(2), 53(5), 54(1)(j), 61(2) and 70(2)."

Article 29(4)

The "sensitive operational data" of law enforcement authorities is no longer included under obligations for users to report to providers or distributors of AI systems when there are "reasons to consider that the use in accordance with the instructions of use may result in the AI system presenting a risk within the meaning of Article 65(1)," concerning health, safety and fundamental rights.

Article 47(2)

An obligation for market surveillance authorities to inform the Commission of any derogation from the conformity assessment procedure - which aims to ensure that an AI system complies with the law before it is used - "shall not cover sensitive operational data in relation to the activities of law enforcement authorities."

Article 53(5)

Excludes "sensitive operational data in relation to the activities of law enforcement, border control, immigration or asylum authorities" from the obligation for national authorities to publish annual reports on "AI regulatory sandboxes" (which could perhaps be referred to as 'AI playgrounds').

Article 54(1)(j)

A requirement for those using "regulatory sandboxes" to publish "a short summary of the AI project developed in the sandbox, its objectives and expected results" no longer applies to "sensitive operational data in relation to the activities of law enforcement, border control, immigration or asylum authorities."

Article 61(2)

The same "sensitive operational data" as described above is excluded from scrutiny through "post-market monitoring", which is intended to ensure compliance with the law.

Article 70(2)

The same "sensitive operational data" is excluded from obligations to exchange information between the authorities responsible for implementing and enforcing the AI Act.

This is far from the first time the Council has sought to weaken safeguards over law enforcement use in the proposed AI Act - previous changes have excluded certain uses of technology and extended others.

Civil society and human rights groups, including Statewatch, have called on EU institutions multiple times to ensure that human rights come first in the AI Act, to prohibit remote biometric identification, and to ban predictive profiling in criminal justice.

The Presidency compromise text also contains changes to the following sections:

  • Title IA - General purpose AI systems
  • Title III, Chapter IV - Notifying authorities and notified bodies
  • Title III, Chapter V - Standards, conformity assessment, certificates, registration
  • Title IV - Transparency obligations for providers and users of certain AI systems
  • Title V - Measures in support of innovation
  • Title VI - Governance, Chapter I - European Artificial Intelligence Board
  • Title VII - EU database for high-risk AI systems listed in Annex III
  • Title VIII - Post-market monitoring, information sharing, market surveillance
  • Title X - Confidentiality and penalties
  • Title XII - Final provisions


Image: Victoria Pickering, CC BY-NC-ND 2.0

