Anjum Shabbir
7th June 2021
Data, Tech & IP

Op-Ed: “The EU Commission’s Guidance on Article 17 of the Copyright in the Digital Single Market Directive” by Bernd Justin Jütte and Christophe Geiger

Some clarifications and limited safeguards for fundamental rights, but insufficient to save the provision from annulment by the Court of Justice

The Directive on Copyright in the Digital Single Market (CDSM Directive) was adopted in May 2019 and the deadline for transposition expires on 7 June 2021. One of its more controversial provisions is Article 17, which changes the liability rules for so-called online-content sharing service providers (OCSSPs). These are online platforms whose main activity is the storage of large amounts of copyright protected works uploaded by their users. Article 17 consists of a number of crucial elements. First, it assigns primary liability to OCSSPs when users upload works protected by copyright onto their platforms. As a result, such platforms must obtain authorisation for the relevant uploads from rightholders. In other words, they must obtain and pay for licences for content uploaded by their users. Second, failing this obligation, OCSSPs must ensure that works uploaded without prior authorisation are made unavailable on their services and remove works upon notification by the rightholder in an expeditious manner. Article 17 provides further that users of such platforms should not be restricted in uploading lawful content, including content for which they have obtained authorisation, or content which is subject to a copyright exception or limitation (acts for which authorisation is not required).

Recognising this complexity, the EU legislator had charged the Commission with organising a stakeholder dialogue to discuss it with rightsholders, OCSSPs, user organisations and other stakeholders, and to draft, based on those discussions, guidance on the application of Article 17.

After an eventful stakeholder dialogue and a first indication of the content of the guidelines published in July 2020, the Commission published its guidance on Article 17 on 4 June 2021, just a weekend before the transposition deadline. In parallel, an action for annulment (C-401/19) of certain parts of Article 17 is pending before the Court of Justice, in which the Polish Government claims that Article 17(4) violates the right to freedom of expression enshrined in Article 11 of the Charter of Fundamental Rights of the European Union.

In anticipation of the Court’s decision, academics had voiced serious concerns that Article 17 violates the fundamental right to freedom of expression, or even other fundamental rights for that matter. The guidance was thus eagerly awaited to see whether the Commission would propose a fundamental-rights-compliant interpretation of Article 17 and thus save the provision from possible annulment by the Court. As a probable sign of the importance of the then still pending guidance for the final understanding of the Directive, Advocate General Saugmandsgaard Øe – whose Opinion in the Polish annulment proceedings was scheduled for the end of April – postponed his Opinion to 15 July.

It was hoped the Commission’s guidance would provide clarity, particularly on the seemingly irreconcilable tension between the ‘best efforts’ obligation created by Article 17(4) and the obligation of result defined in Article 17(9). The obligation for OCSSPs to demonstrate best efforts to prevent infringing user uploads without being obliged to generally monitor all uploads made on their platforms (Article 17(8)), combined with the guarantee that lawful uploads remain available, thereby safeguarding the rights of users stemming from certain limitations and exceptions as guaranteed by Article 17(7) and Article 17(9), raises significant fundamental rights problems.

While the guidance does not recommend a particular technology to fulfil the obligations incurred by OCSSPs, it acknowledges the use of content recognition tools – automated filtering – as one of the most used technologies in this regard. Interestingly, the Commission explicitly concedes that these technologies cannot distinguish between lawful and unlawful uses. As a result, the use of such technologies would certainly lead to ex-ante blocking of lawful content, which, as the Polish Government argues, constitutes a violation of the right to freedom of expression. This does not, however, lead the Commission to recommend a ban on these technologies, but rather to reduce their negative impact on users’ rights by limiting automated blocking to ‘manifestly infringing uploads’. The Commission had already defended a similar position (followed in this reading by the Council of the EU and the European Parliament) during the hearing before the Court of Justice, namely that preventive ex-ante blocking should only be used in relation to ‘manifestly infringing content’. The Commission then explains in various examples when an upload can be considered as such, for example when there is an ‘exact match of an entire work or of significant proportions of a work’. These criteria seem interesting, and the Commission certainly makes an effort to fine-tune its approach. How these nuances can be implemented by automated filtering technology is, however, very uncertain, if at all possible. Moreover, a quantitative approach to detecting infringing content does not do justice to the legal approach, and there are sufficient examples where the upload of an entire work is necessary for quotation or review purposes (such as entire military reports, validated by the Court of Justice in Funke Medien (C-496/17)).
The Commission admits that ‘the identification of manifestly infringing content and other content by automated means does not represent a legal assessment of the legitimacy of an upload, including whether it is covered by an exception’. However, it is exactly this that is problematic from a freedom of expression perspective. Automated filtering based on a quantitative ‘manifestly illegal’ standard will inevitably prevent perfectly legal uses, which does not comply with the necessary fundamental rights safeguards. This does not necessarily invalidate the ‘manifestly illegal’ approach, but it shows how complex this assessment is, calling for an independent evaluation by a third party. Human review, too, often presented (including in the guidance) as providing more safeguards for legitimate uses, is of only limited effectiveness. Even the most knowledgeable copyright law professor cannot correctly and expeditiously assess the legality of a use across 27 different unharmonised copyright systems in the EU.

Additional new elements introduced in the guidance further change the Commission’s position on ex-ante blocking and filtering of content and, as a result, its appreciation of the balance of fundamental rights and the proportionality of the interference with at least two particular fundamental rights concerned.

First, the Commission opens a backdoor for ex-ante blocking and filtering of content ‘whose availability could cause significant harm’ to rightholders. This refers to time-sensitive content, that is, content that has a certain economic value in a specific window (examples mentioned are pre-releases of films or music and highlights of recent sports events). Rightholders can ‘earmark’ such content with the effect that the detection of uploads containing this content, or parts thereof, would require particular care and diligence in the application of best efforts. The Commission suggests, for example, a rapid ex-ante human review when the content is earmarked. However, the Commission is rather cautious, explaining that this review may be undertaken ‘when proportionate, possible and practicable’. It is uncertain whether earmarking will lead to a presumption for the platforms that the content is manifestly illegal, and thus potentially to over-blocking of all earmarked content. Perhaps it would have been more accurate to include time-sensitive content in the assessment of the ‘manifestly illegal’ criteria? It would seem legitimate for pre-released music or films to fall under such standards and to leave the rest untouched, perhaps using earmarking as an information tool for rightholders and platforms without any effect on the duty of care. Again, this shows that these criteria, however interesting, need further (independent) monitoring to fully comply with fundamental rights. Admittedly, earmarked content should also be ‘limited to cases of high risks of significant economic harm, which ought to be properly justified by rightholders’, a limitation which might marginally alleviate the concerns of users.

Second, an important role is assigned to OCSSPs in moderating content. For earmarked content, OCSSPs have to conduct an ex-ante human review and, if necessary, reinstate an upload if it is found to be lawful. But also in other circumstances, for example when a user contests ex-ante blocking, or when a rightholder initiates an ex-post complaint (with a potential counterclaim by the concerned user), OCSSPs are the first instance of institutional adjudication. Although Article 17(9) provides that users should have access to out-of-court redress mechanisms and access to the courts of the Member States, crucial decisions will be made by private operators and, in addition, at their expense. As a result, OCSSPs incur a double burden: the primary liability requiring them to obtain authorisation for uploads made by their users, and the duty to moderate, technologically and quasi-judicially, between rightholders and users.

From a more policy-oriented perspective, it is to be regretted that the guidance comes too late. And it is unfortunate, or one might even argue comical, that the Commission provides guidance on the application of Article 17 just a summer weekend before the deadline for its transposition expires. This must be seen against the background that the current implementation across the Member States is incoherent at best, and that the challenge against Article 17(4), which might bring down the entire provision, is still pending (as mentioned, the Opinion of the Advocate General is set to be published on 15 July, having already been postponed once). The Opinion itself, and even more so the judgment, can now give a more authoritative clarification and constitute an opportunity to highlight obvious flaws in Article 17 CDSM Directive from a fundamental rights perspective. The guidance is certainly a useful document containing very important clarifications. It introduces nuances and new safeguards for user rights (but admittedly also new uncertainties). However, this document is not sufficient to address the numerous concerns raised by Article 17 from a fundamental rights perspective and thus will likely not save it from annulment by the Court of Justice. The Commission seems conscious of this, as it introduces the guidance stressing that the document might need to be reviewed following the judgment by the Court.

The guidelines are not binding, and transposition laws in Germany and Finland have taken a more considerate path. This raises another problem, of which the Court in the Polish challenge should also be mindful: does Article 17 CDSM Directive achieve its aim to harmonise the laws of the Member States and provide desperately needed legal certainty and a level playing field for users and rightholders in the EU? It should not be forgotten that, under the EU Treaties, the mandate of the EU is to legislate in order to harmonise; if a provision does not lead to harmonisation, the competence of the EU to adopt it is questionable. This is very important, since the ground of lack of competence of the EU will have to be raised by the Court ex officio.


Bernd Justin Jütte is Assistant Professor in Intellectual Property Law at the Sutherland School of Law, University College Dublin and Senior Researcher at Vytautas Magnus University Kaunas.

Christophe Geiger is Professor of Law at the Centre for International Intellectual Property Studies (CEIPI), University of Strasbourg (France) and Affiliated Senior Researcher at the Max Planck Institute for Innovation and Competition (Munich, Germany).

The authors recently published ‘Platform Liability Under Art. 17 of the Copyright in the Digital Single Market Directive, Automated Filtering and Fundamental Rights: An Impossible Match‘, GRUR Int (forthcoming 2021).

