EU: Copyright Directive: Article 13 does not "necessarily" imply automated censorship, says Council Legal Service


13.10.17


The saga over the potential introduction of mandatory upload filters through the EU's proposed Copyright Directive continues, this time with "general and preliminary" considerations from the Council Legal Service (CLS) on a number of questions raised by Member States. Amongst many other things, the CLS considers that: "It does not flow though from the proposed Article 13 that the measures envisaged would be necessarily filtering systems..."

See: Contribution of the Legal Service to the Working Party on Intellectual Property: Proposal for a Directive of the European Parliament and of the Council on copyright in the Digital Single Market (doc.12254/16) - legal issues on Article 13 and recital 38 of the proposal (13140/17, LIMITE, 11 October 2017, pdf):

"Various delegations considered that the possibility to use mechanisms such as content recognition technologies (or filtering systems) raises the issue of the compatibility of the use of those mechanisms with Articles 8 (protection of personal data), 11 (freedom of expression and information) and 16 (freedom to conduct a business) of the Charter. More specifically, those delegations found that the proposed Article 13 interferes both with the rights of the ISSPs (Article 16) and with those of the users of their services (Articles 8 and 11) in the following way : on the one hand, it places a burden on the ISSPs while exercising their economic activity and, on the other hand, it restricts the freedom of expression and information of the users of their services by preventing them from benefiting from the possibility to upload /enjoy protected content; it also interferes with their right to protection of their personal data, to the extent that mechanisms such as content recognition technologies could lead to their identification. Those concerns stemmed from the judgments in the Netlog and Scarlet Extended cases, where the Court found, under the specific and particular circumstances of those cases, that the injunction to use the filtering system at stake could not respect the requirement that a fair balance be struck between the fundamental rights which were competing in those cases (and which were the same as in the present proposal)." [emphasis added]

See: Civil society urges EU institutions to stop the “censorship machine” in the copyright proposal (EDRi, link)


