Faculty of Law
To Filter or Not to Filter: Compliance of Article 17
DSM with Freedom of Expression and the Arts
Kaan Saritas
Student Number: 12601314 E-mail: [email protected]
Master’s Thesis LLM International and European Law: European Union Law
Supervisor: Prof. Nikolaos Lavranos
Abstract
Article 17 DSM, with its aim of closing the value gap between rightholders and big online content-sharing service providers (OCSSPs) such as YouTube, has led to many heated debates about the free internet in all its diversity of content. It is feared that the upcoming mandatory upload filters will eliminate freedom of expression within the webspace. In this thesis the argument is made that Article 17 DSM leaves just enough leeway for the Member States to introduce counterbalancing measures to strike a fair balance between intellectual property rights and the fundamental rights of the users. The most important counterbalancing measures are limiting upload filters to prima facie copyright infringements and introducing a counter-liability scheme for the over-blocking of blatantly lawful content.
Table of contents
A. INTRODUCTION ... 1
B. INTERPRETATION OF ARTICLE 17 OF DIRECTIVE 2019/790 ... 3
I. Safeguarding intellectual property ... 4
1. Paragraph 1 – licensing ... 4
2. Paragraph 4 – filtering ... 5
a) Subparagraph (a) – best efforts to obtain authorisation ... 6
b) Obligation to use an upload filter ... 7
c) Filtering intensity ... 9
d) Leeway for Member States ... 10
II. Safeguarding freedom of expression and the arts ... 11
1. Paragraph 7 ... 11
2. Paragraph 9 ... 13
III. Interim Conclusion ... 16
C. FUNDAMENTAL RIGHTS PART ... 18
I. Freedom of expression ... 18
1. Scope ... 19
2. Limits and exceptions ... 20
II. Freedom of the arts ... 22
D. STRIKING A BALANCE? ... 24
I. ContentID ... 24
II. Two examples ... 24
III. Hypothetical case considering Article 17 DSM ... 27
Bibliography ... V
List of abbreviations ... IX
A. Introduction
“The text of the DSM Directive nowhere mentions YouTube, but anyone versed in the political economy of digital copyright knows that Article 17 was designed specifically to make YouTube pay. The important question in the wake of Article 17’s adoption is who else will pay – and in what ways.”1
1 On 17 April 2019, the European Union (EU) adopted Directive 2019/790 on copyright and related rights in the Digital Single Market2 (hereafter DSM), the first major update to the EU copyright regime since 2001.3 Its most controversial provision is the newly introduced Article 17 DSM, which establishes a strict liability regime for online content-sharing service providers (OCSSPs) such as YouTube. An online petition against it, which reached over five million signatures,4 could not prevent its adoption. Since then, many voices have predicted the downfall of the free internet as we know it, fearing that Article 17 DSM will introduce a censorship-machinery.5 Heated discussions have been and are still being held in the Member States on the best possible way to implement the DSM into national law.6 Immediately after the adoption of the DSM, the German political party ‘CDU’ declared that it would abstain from introducing upload filters in the German national implementation.7 Whether such an implementation is possible depends on the leeway the Directive leaves to the Member States. This thesis is dedicated to working out whether such leeway exists to limit the use of upload filters at national level or at least to remedy their presumed negative impact on the fundamental rights of the users of OCSSPs. After thorough analysis, ultimately, the research question of whether Article 17 DSM – as the opening quote suggests – inevitably leads
1 A Bridy, ‘The Price of Closing the 'Value Gap': How the Music Industry Hacked EU Copyright Reform’ (2019) 22 JETLaw 323, 325.
2 Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright
and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC [2019] OJ L 130.
3 T Spoerri, 'On Upload-Filters and Other Competitive Advantages for Big Tech Companies under Article 17 of the Directive on Copyright in the Digital Single Market' (2019) 10(2) JIPITEC 173, 174.
4 ‘Stop the censorship-machinery! Save the Internet!’ (change.org)
<www.change.org/p/european-parliament-stop-the-censorship-machinery-save-the-internet> accessed 18 June 2020.
5 ibid.
6 Spoerri (n3) [1].
7 ‘Kompromiss zum Urheberrecht: Keine Uploadfilter!’ (CDU, 15.03.2019)
to an infringement of the freedom of expression and the arts of users of OCSSPs will be answered.
2 The first section of this master's thesis deals with Article 17 DSM and what exactly it demands from the Member States. For this purpose, the relevant paragraphs of Article 17 DSM are introduced and then elaborated on. To that end, interpretations of the paragraphs by various scholars are presented. The section is then concluded with what Article 17 DSM demands from the Member States and therefore ultimately from the OCSSPs. Section one is partly descriptive in nature, as it contains the opinions of various scholars, and partly evaluative, as it also contains the author's own interpretive choices among those opinions. The second section elaborates on the freedom of expression and the arts, especially in the context of online content, and on derogations from those freedoms.
3 Based on those two sections, the third section analyses the research question of whether Article 17 DSM in its final form leads to an infringement of the fundamental rights of freedom of expression and freedom of the arts. For this purpose, YouTube's ContentID is first introduced and, by means of two examples, its current bias is shown. Throughout the whole thesis, YouTube and its filtering system are referred to, as they are the main reason Article 17 DSM was adopted in its present form. The main part of the third section is dedicated to a mock example which showcases how ContentID will have to work in order for YouTube to evade the financial liability imposed by Article 17 DSM. The possible leeway of the Member States in the implementation is factored into the solving of the mock example and ultimately leads to an answer to the research question. This whole subchapter is evaluative in nature and brings the two previous main sections together to ultimately answer whether Article 17 DSM must be declared invalid due to an inevitable infringement of the freedom of expression and the arts, as these constitute general principles of EU law and are guaranteed by the Charter of Fundamental Rights of the European Union.
B. Interpretation of Article 17 of Directive 2019/790
4 The rationale8 behind the adoption of Article 17 DSM was the alleged9 existence of a value gap between (European) content creators and rightholders on the one side, who in the view of the EU legislator are not remunerated fairly, and the big (American) OCSSPs10 such as YouTube, Vimeo and Facebook on the other side.11 Article 17 is supposed to remedy this imbalance.12 However, there has been major criticism of Article 17 by lawyers, researchers, non-governmental organisations, observers and citizens alike.13 Some Member States joined the criticism, with the Republic of Poland ultimately initiating an action against the European Parliament and the Council of the EU demanding the annulment of Article 17(4) subparagraphs (b) and (c) or, in the alternative, the annulment of Article 17 in its entirety.14
5 This chapter introduces the important paragraphs of Article 17 and elaborates on scholars' interpretations of them. Where diverging interpretations are available, the most plausible one shall be chosen. First, paragraphs 1 and 4, which are aimed at protecting the intellectual property of rightholders, shall be analysed. For this purpose, paragraphs 5 and 8 will be used to determine the scope of the obligations of the OCSSPs. Paragraphs 7 and 9, which deal with the protection of the fundamental rights of the users of OCSSPs, shall be introduced subsequently.
8 For a wide overview of the rationale of Article 17 DSM see Spoerri (n3) 175-176.
9 For criticism about the lack of empirical evidence of such a ‘value gap’ and how this term has been
pushed by the music and entertainment industry, see G Frosio, ‘To Filter or Not to Filter? That Is the Question in EU Copyright Reform’ (2017), 36(2) Cardozo Arts & Entertainment Law Journal 331, 331-334; agreeing Spoerri (n3) [9]; Bridy (n 1) 326-328.
10 For a definition of OCSSPs see Article 2(6) DSM.
11 P Samuelson, ‘Europe's controversial digital copyright directive finalized’ (2019) 62(11) Commun
ACM 24, 24.
12 For an overview of how Article 17 DSM (and the amendments to the Directive in general) came to be
see JP Quintais, ‘The New Copyright in the Digital Single Market Directive: A Critical Look’ (2020) 1
EIPR (forthcoming), 2-3.
13 Letter from various stakeholders to Union representatives (29 January 2019)
<https://edri.org/files/copyright/20190122_Open_Letter_Council-final.pdf> accessed 18 July 2020; Letter from ‘Civil Liberties Union for Europe (Liberties)’ to President Juncker (20 May 2019) <https://edri.org/files/copyright/20190517-EDRI_copyright_open_letter.pdf> accessed 18 July 2020.
I. Safeguarding intellectual property
1. Paragraph 1 – licensing
6 In its final version, paragraph 1 reads as follows [shortened and highlighted in bold by the author]:
“Member States shall provide that an [OCSSP] performs an act of communication to the public or an act of making available to the public for the purposes of this Directive when it gives the public access to copyright-protected works or other protected subject matter uploaded by its users.
An online content-sharing service provider shall therefore obtain an authorisation from the rightholders referred to in Article 3(1) and (2) of Directive 2001/29/EC, for instance by concluding a licensing agreement, in order to communicate to the public or make available to the public works or other subject matter.”
7 The provision is divided into two subparagraphs which deal with two different issues. The first subparagraph gives OCSSPs an active role when it comes to copyright infringements, as it deems them to be performing “an act of communication to the public.” The perception of OCSSPs thus shifts from ‘mere-conduits’ to ‘gate-keepers’15 or
even active perpetrators,16 even if they do not have any knowledge of infringing content on their platform.17
8 Subparagraph 2 then introduces the primary escape route18 of OCSSPs from liability through the obligation to obtain authorisation for copyrighted work which is uploaded on their platforms by their users. The provision itself suggests licensing agreements. This obligation to obtain authorisation entails two separate obligations: the duty of the OCSSPs to check their platforms for content that might need licensing and the (soft) obligation to obtain licences for such detected copyrighted content.19 For the former, Spindler claims that from a systematic point of view of Article 17, the OCSSPs must
15 For an analysis of how the global perception of online service providers changed gradually from ‘mere-conduits’ to ‘gate-keepers’ of uploaded content see G Frosio and S Mendis, ‘Monitoring and Filtering: European Reform or Global Trend?’ in G Frosio (ed), The Oxford Handbook of Online Intermediary Liability (Oxford University Press, 2019) (forthcoming) <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3450194> accessed 18 June 2020.
16 G Spindler ‘Art. 17 DSM-RL und dessen Vereinbarkeit mit primärem Europarecht’ (2020) 2 GRUR
253, 255.
17 M Senftleben, ‘Institutionalized Algorithmic Enforcement – The Pros and Cons of the EU Approach to
UGC Platform Liability’ (2020) 14 Florida International University Law Review, 5.
18 M Senftleben, ‘Bermuda Triangle – Licensing, Filtering and Privileging User-Generated Content Under the New Directive on Copyright in the Digital Single Market’ (2019) <https://ssrn.com/abstract=3367219> accessed 18 June 2020, 3.
undertake an extensive analysis of their content to determine what kind of licences are to be obtained.20 Dealing with the latter, Spindler argues that there is only an obligation to make a sincere effort to obtain licences; in light of the principle of proportionality in paragraph 5, neither the rightholders nor the OCSSPs can be forced to conclude a licensing agreement – a rule which the Member States cannot amend.21 He also argues that it is not enough – especially in relation to non-professional or amateur copyrighted work – to simply offer monetisation to the rightholders without some degree of prior attempts to obtain licences.22
2. Paragraph 4 – filtering
9 This paragraph deals with the exemption of OCSSPs from liability where no authorisation (or rather licence) can be obtained from the rightholders. It is the heart of Article 17 DSM, since it is undisputed that it will be virtually impossible to obtain licences from every European rightholder.23
10 Due to massive protests throughout Europe, the EU legislator saw itself forced to change the wording of Article 17 insofar as to remove an explicit reference to ‘content recognition technologies’24 in paragraph 4.25 However, several critics see this only as a cosmetic change and believe paragraph 4, or rather Article 17 read as a whole, still demands an upload filter from the OCSSPs.26 The wording of the provision itself does
not specify in detail what kind of measures need to be taken by the OCSSPs.27 The final
adopted version of paragraph 4 of Article 17 reads as follows [highlighted in bold by the author]:
“If no authorisation is granted, [OCSSPs] shall be liable for unauthorised acts of communication to the public, including making available to the public, of copyright-protected works and other subject matter, unless the service providers demonstrate that they have:
(a) made best efforts to obtain an authorisation, and
(b) made, in accordance with high industry standards of professional diligence, best efforts to ensure the unavailability of specific works and other subject matter for which the rightholders have provided the service providers with the relevant and necessary information; and in any event
(c) acted expeditiously, upon receiving a sufficiently substantiated notice from the rightholders, to disable access to, or to remove from their websites, the notified works or other subject matter, and made best efforts to prevent their future uploads in accordance with point (b).”
20 Spindler (n16) 255.
21 ibid, 255.
22 G Spindler, ‘Upload-Filter: Umsetzungsoptionen zu Art. 17 DSM-RL’ (2020) 36(1) Computer und Recht 50 [11, 13].
23 See Samuelson (n11) 27; M Lambrecht, ‘Free Speech by Design - Algorithmic Protection of Exceptions and Limitations in the Copyright DSM Directive’ (2020) 11(1) JIPITEC (forthcoming), 6.
24 The Commission’s proposal explicitly included the term ‘content recognition technologies’ in Article 13 (now adopted as Article 17), see Proposal for a Directive of the European Parliament and of the Council on copyright in the Digital Single Market, COM/2016/0593 final - 2016/0280 (COD).
25 J Reda, ‘EU copyright reform: Our fight was not in vain’ (Julia Reda, 18 April 2019) <https://juliareda.eu/2019/04/not-in-vain/> accessed 3 April 2020.
26 Lambrecht (n23) 6.
27 Spindler (n16) 254.
a) Subparagraph (a) – best efforts to obtain authorisation
11 Volksmann points out that the obligation to obtain licences is in itself an impossible endeavour, since it will not be possible for OCSSPs to acquire licences for all eventual uses of copyrighted work by their users. This is, among other reasons, due to the vast number of different types of works that can be uploaded by users; and even though there are big copyright collecting societies which hold considerable numbers of licences, those licences are still fragmented territorially.28 Nonetheless, it remains to be determined what the term ‘best efforts’ to obtain an authorisation encompasses.
12 As briefly addressed above,29 the obligation to make best efforts to obtain authorisation
also includes, as a first step, the obligation of OCSSPs to scan their whole database for copyrighted content to figure out which licences need to be obtained. Volksmann argues – with reference to paragraph 8’s prohibition of general monitoring, recital 66 and the fact that the term “authorisation” must incorporate subsequent30 consent by the rightholder – that paragraph 4(a) cannot be understood as demanding from OCSSPs an ex ante scan of their whole database. She understands this obligation to be triggered only after receiving the necessary and relevant information from the rightholders.31 She further argues that this interpretation of subparagraph (a) is supported by a
28 C Volksmann, ‘Art. 17 Urh-RL und die Upload-Filter: verschärfte Störerhaftung oder das Ende der
Freiheit im Internet’ (2019) 35(6) Computer und Recht 376 [18]; for further reasons about the sheer impossibility to obtain all-encompassing licences see Senftleben (n17) 5-6.
29 See margin no 8.
30 After a copyright violation.
31 Volksmann (n28) [21-24]; see also Pravemann (n42) 786-787: he comes to the same conclusion, however with a slightly different argument. He points out that any information from anyone triggers the obligation to obtain an authorisation.
teleological interpretation in light of the primary aim of Article 17 in general to promote the conclusion of licences between rightholders and OCSSPs,32 suggesting that the liability scheme is just a secondary aim of the provision.
13 However, Spindler convincingly argues that the wording of paragraph 4 requires from the OCSSPs a full analysis of their database, and therefore the use of an ex ante (upload) filter. He rejects opinions which interpret this obligation as beginning only after the reception of necessary and relevant information from the rightholders.33 According to Spindler, the system of paragraph 4 is twofold, and the notice-and-take-down/stay-down obligation only starts after the failure of the OCSSPs to comply with the licensing obligation.34 Therefore, recital 66 can only be consulted as an interpretational aid for paragraph 4(b) and (c), not for (a).35 Any limitation of the all-encompassing obligation to scan their databases for copyrighted content would have needed to be set out in the provision itself; according to Spindler, any other interpretation of subparagraph (a) would therefore violate the twofold system within paragraph 4.36 He then compares the required filtering system to the one at issue in the SABAM v Netlog37 case (hereafter Netlog case) and concludes that it is not compatible with the general monitoring prohibition in paragraph 8, meaning that subparagraph (a) does not strike a fair balance between the fundamental rights involved and is therefore not in compliance with EU law.38
b) Obligation to use an upload filter
14 When it comes to the obligation for upload filters, Spoerri focuses especially on paragraph 4(b) which demands from OCSSPs preventive measures “to ensure the unavailability [of copyrighted works] for which the rightholders have provided the service providers with the relevant and necessary information.”39 These measures
cannot be realistically achieved without the use of filtering technologies.40
15 He interprets the term ‘best efforts’ used in subparagraph (b) in context of recital 66 of the DSM which demands that OCSSPs take “all the steps that would be taken by a
32 Volksmann (n28) [23]. 33 Spindler (n16) 259. 34 ibid, 259.
35 ibid, 259. 36 ibid, 255.
37 Case C-360/10 Belgische Vereniging van Auteurs, Componisten en Uitgevers CVBA (SABAM) v Netlog
NV [2012] ECLI:EU:C:2012:85 [38].
38 Spindler (n16) 259. 39 Article 17(4)(b) DSM.
diligent operator to achieve the result of preventing the availability of unauthorised works […], taking into account best industry practices and the effectiveness of the steps taken in light of all relevant factors and developments, as well as the principle of proportionality”41 in order to relieve themselves from liability. According to him, given such an understanding of ‘best efforts’, coupled with the financial threat this special liability mechanism poses, big OCSSPs like YouTube and Facebook will improve their already elaborate filtering technologies even further, and these will then become the new industry standard. Similarly effective filtering technologies will then be expected from all OCSSPs. Therefore, in Spoerri’s opinion, paragraph 4, and especially subparagraph (b), cannot be understood any differently than as imposing an obligation to use filtering technologies.42
16 Volksmann also assumes that upload filters will be unavoidable and that there is no
leeway within Article 17 for the Member States to circumvent their use in their national implementations. As the use of broad filtering technologies by OCSSPs to escape the resulting liability is undesirable from a fundamental rights point of view, Volksmann advocates a rather strict interpretation of paragraph 4 that leaves some leeway to the Member States in the implementation.43 She interprets paragraph 4 systematically in conjunction with paragraph 8, which states that “[t]he application of this Article shall not lead to any general monitoring obligation.”44 Therefore, paragraph 4(b) must be understood in such a way that the obligation to filter is only triggered once the rightholders have provided the OCSSPs with sufficient information about the copyrighted material.45 This interpretation is strengthened by a teleological interpretation in
conjunction with recital 66.46 It should be remarked here that Volksmann does not seem to distinguish between subparagraphs (a) and (b) and speaks predominantly of the obligation resulting from subparagraph (a); however, it is quite clear from her argumentation that her interpretation extends to subparagraph (b).
41 Recital 66 paragraph 2 DSM.
42 Spoerri (n3) [17-19]; see also Bridy (n 1) 353: she argues that even though the provision itself is silent
on the fact, upload filters/content recognition technologies are still demanded by it. She supports her argument by claiming that the whole Article 17 DSM was adopted with the aim to make ContentID – YouTube’s filtering technology – accessible to all rightholders or rather to give them a stronger position in licensing negotiations (see also page 357); see also T Pravemann, ‘Art. 17 der Richtlinie zum Urheberrecht im digitalen Binnenmarkt: Eine Analyse der neuen europäischen Haftungsregelung für Diensteanbieter für das Teilen von Online-Inhalten’ (2019) 8 GRUR 783, 784.
43 Volksmann (n28) [31-32]. 44 Article 17(8) DSM. 45 Volksmann (n28) [21-22]. 46 ibid [23-27].
17 Spindler argues along the same lines and states that, in conjunction with recital 66, paragraph 4(b) can only be understood in such a way that the filtering obligation only begins once relevant and necessary information is provided by the rightholders, meaning that without such information the OCSSPs need not act.47 He concludes, therefore, that paragraph 4(b) cannot be seen as a “general obligation to use upload filters”, but rather as an extended notice-and-take-down and notice-and-stay-down obligation, as established by Article 14 of the e-Commerce-Directive.48 Pravemann calls the obligation in subparagraph (b) notice-and-prevent, since the OCSSPs are required to prevent even initial uploads of copyrighted content as soon as necessary and relevant information is given by rightholders.49 It is interesting here to consider Bridy’s point that a stay-down obligation necessarily entails the use of an upload filter that continuously monitors all content, representing a “general monitoring under any natural definition of general.”50 Spindler further points out that the Directive does not clarify in what way this information needs to be provided by the rightholders and that the OCSSPs have no obligation to check the notifications for their lawfulness.51 In a position paper, the Verbraucherzentrale Bundesverband also recognises this problem and therefore demands that Germany remedy it in its national implementation.52 Public copyright registers are proposed as a suitable means to counteract malicious claims by so-called copyright trolls.53 This issue is not to be underestimated, since malicious copyright claims can significantly impair the fundamental rights of the persons affected, as eg YouTube tends to block first and ask later.54
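The mechanics of such a notice-and-stay-down obligation can be illustrated with a purely hypothetical sketch. No real platform's system is modelled here; the class and method names, and the use of an exact hash in place of the perceptual fingerprints real systems employ, are the author's illustrative assumptions:

```python
# Hypothetical sketch of a notice-and-stay-down store. Once a rightholder
# submits "relevant and necessary information" -- modelled here as a
# content fingerprint -- every subsequent upload is checked against it,
# so a work that was taken down once cannot reappear.
import hashlib


class StayDownRegistry:
    def __init__(self) -> None:
        self.blocked_fingerprints: set[str] = set()

    @staticmethod
    def fingerprint(content: bytes) -> str:
        # Real systems use perceptual fingerprints that tolerate small
        # alterations; an exact SHA-256 hash stands in for them here.
        return hashlib.sha256(content).hexdigest()

    def register_notice(self, reference_work: bytes) -> None:
        """Rightholder notice: remember the notified work."""
        self.blocked_fingerprints.add(self.fingerprint(reference_work))

    def upload_allowed(self, upload: bytes) -> bool:
        """Runs on *every* upload, notified or not -- which is why
        critics describe the result as continuous general monitoring."""
        return self.fingerprint(upload) not in self.blocked_fingerprints


registry = StayDownRegistry()
registry.register_notice(b"song.mp3 master recording")
print(registry.upload_allowed(b"song.mp3 master recording"))  # False: stay-down
print(registry.upload_allowed(b"unrelated home video"))       # True
```

The decisive point of the sketch is that the check in `upload_allowed` is performed on every upload, not only on notified ones – precisely the feature Bridy characterises as general monitoring.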
c) Filtering intensity
18 Lambrecht argues that paragraph 4 will aggravate the power imbalance between copyright holders and users of OCSSPs even further,55 since the term ‘best efforts’ is too vague and paragraph 4 imposes direct strict liability on OCSSPs. This will lead
47 Spindler (n16) 255-256. 48 ibid, 255-256.
49 Pravemann (n42) 786; Spindler and Pravemann talk essentially about the same obligation, they just name it differently.
50 Bridy (n1) 354-355; see also Spoerri (n3) [16] making similar arguments. 51 Spindler (n16) 256.
52 Bundesverband der Verbraucherzentralen und Verbraucherverbände, ‘Stellungnahme zur Umsetzung
der EU-Richtlinien im Urheberrecht’ (2019), 6-7.
53 ibid, 6-7.
54 See margin nos 53-58.
55 See Lambrecht (n23) 11: stating that there has been a power imbalance and some takedown request
them to filter content overzealously so as to prevent costly litigation, meaning that in some cases OCSSPs might resort to automatically blocking any content that bears any resemblance to copyrighted work, thereby disregarding the exceptions and limitations to copyright protection.56 Spoerri also argues that OCSSPs might be tempted to over-block user-uploaded content in order to be sure to evade the financial risks of paragraph 4.57 Senftleben likewise points out that the focus of the proportionality test lies on cost and efficiency factors, which will likely lead to cheap and unsophisticated (upload) filters and hence to over-blocking.58
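Why cheap, unsophisticated filters tend to over-block can be illustrated with a deliberately simplified sketch. The segment model and the 50% threshold are the author's illustrative assumptions, not a description of any real filtering technology: a parody typically reuses large parts of the protected work, so a filter that measures only the share of reused material cannot separate it from a verbatim copy.

```python
# Hypothetical sketch: a "cheap" filter that blocks on reuse share alone.
def similarity(upload: set, reference: set) -> float:
    """Share of the reference work's segments reused in the upload."""
    if not reference:
        return 0.0
    return len(upload & reference) / len(reference)


def cheap_filter(upload: set, reference: set, threshold: float = 0.5) -> str:
    # Blocks purely on quantitative overlap; no assessment of context,
    # purpose or transformative character is attempted.
    return "block" if similarity(upload, reference) >= threshold else "allow"


# A protected song, modelled abstractly as a set of segments.
original = {"verse1", "verse2", "chorus", "melody", "beat"}
verbatim_copy = original
parody = {"verse1_reworded", "chorus", "melody", "beat", "new_punchline"}

print(cheap_filter(verbatim_copy, original))  # block -- the intended case
print(cheap_filter(parody, original))         # block -- lawful parody over-blocked
```

Both the verbatim copy and the parody exceed the threshold and are blocked; distinguishing them would require exactly the kind of context-sensitive, case-by-case assessment that paragraphs 7 and 9 presuppose.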
19 The Verbraucherzentrale Bundesverband proposes to remedy those issues by creating incentives that would stop OCSSPs from over-blocking, one suggestion being to introduce fines for OCSSPs that wrongfully block legal content59 – in effect a counterpart liability regime for wrongful blocking that strengthens the position of users.
d) Leeway for Member States
20 Spindler argues that Member States cannot concretise the standard of effort the OCSSPs need to meet, eg by stating that fingerprinting in itself is already enough and that OCSSPs need not make any effort beyond that. This would be incompatible with paragraph 4, since it explicitly refers to high industry standards and does not establish specific obligations, but relies in a general way on the necessary efforts taken by OCSSPs.60 Furthermore, automated filtering mechanisms cannot be excluded at the level of national implementation, since paragraph 9 explicitly speaks of human review in the redress mechanism and, argumentum a contrario, automated mechanisms must therefore be possible under paragraph 4(b).61
21 Finally, the effect that the results of the stakeholder dialogues provided for in paragraph 10 will have on the Member States' leeway in implementing paragraph 4 needs to be established. Even though Commission guidelines do not have a legally binding character, the Member States must take them into account when implementing the
56 Lambrecht (n23) 12. 57 Spoerri (n3) [38]. 58 Senftleben (n18) 10. 59 Verbraucherzentrale (n52) 7-8. 60 Spindler (n22) [9, 18]. 61 ibid [19].
DSM into national law.62 So far, no guidelines have been published, and for now the stakeholder dialogues have been cancelled due to the COVID-19 pandemic.63
II. Safeguarding freedom of expression and the arts
22 It is essential to analyse the procedural safeguards (exceptions and limitations) set out in Article 17 DSM in order to assess its compliance with fundamental rights.64
1. Paragraph 7
23 The mandatory exceptions and limitations to copyright granted in paragraph 7 were much awaited, since the InfoSoc-Directive65 did not harmonise this area and, until now, freedom of expression had to be safeguarded by remedial judgments of the Court of Justice of the European Union (CJEU).66 According to Spoerri, paragraph 7 aims to remedy the problem of over-blocking which is indirectly encouraged by paragraph 4.67 It reads as follows [shortened and highlighted in bold by the author]:
“The cooperation […] shall not result in the prevention of the availability of works or other subject matter uploaded by users, which do not infringe copyright and related rights, including where such works or other subject matter are covered by an exception or
limitation.
Member States shall ensure that users in each Member State are able to rely on any of the
following existing exceptions or limitations when uploading and making available content
generated by users on online content-sharing services:
(a) quotation, criticism, review;
(b) use for the purpose of caricature, parody or pastiche.”
24 In November 2019, a large group of European academics (Quintais et al.) published recommendations on how to interpret and consequently implement Article 17 in a way that safeguards the freedoms and rights of users of OCSSPs as far as possible. They concentrated on the exceptions and limitations provided in paragraphs 7 and 9 of Article 17, since the licensing and preventive obligations in paragraphs 1 and 4 must be
62 Spindler (n22) [32].
63 Past stakeholder dialogues can be watched at
<https://ec.europa.eu/digital-single-market/en/newsroom-agenda/event/copyright> accessed 15 July 2020.
64 Volksmann (n28) [43].
65 Directive 2001/29/EC of the European Parliament and of the Council of 22 May 2001 on the harmonisation of certain aspects of copyright and related rights in the information society [2001] OJ L 167.
66 Lambrecht (n23) 9-10.
67 Spoerri (n3) [38]: however, he believes that this will not be enough, especially considering the
interpreted in light of them. According to these academics, the two subparagraphs of paragraph 7 must be dealt with individually, with the first subparagraph providing a general clause and the second subparagraph providing a specific clause on exceptions and limitations, together constituting user rights/freedoms which need to be protected by the Member States.68 The general clause in subparagraph 1 opens the door for the
exceptions and limitations provided for in Article 5 of the InfoSoc-Directive, under the caveat that they are already implemented in national law.69 It is appropriate to mention here that Article 5 InfoSoc-Directive does not contain a de minimis limit that would place private, low-level use of copyrighted material outside the scope of Article 17 DSM. Nor can such a de minimis limit be introduced at national level, since this would not be in accordance with EU law.70
25 According to Quintais et al., the specific clause contains a closed set of mandatory71 exceptions and limitations “for all acts of uploading or making available by users on OCSSP platforms.”72 Those mandatory exceptions to the copyright of rightholders are quotation, criticism, review, caricature, parody and pastiche. They were awarded a special status because, as recital 70 paragraph 1 DSM declares, they are especially important for striking a balance between the fundamental rights involved.73 These specific exceptions and limitations should be seen as autonomous concepts of EU law and therefore interpreted in the light of the CJEU case law which already exists for the exceptions in Article 5 of the InfoSoc-Directive.74 It is for the Member States in their national implementations to establish those user freedoms so as to safeguard them effectively.75
26 In the cases of Painer and Deckmyn, the Court stated that ‘quotation’ and ‘parody’ needed to be interpreted in such a way – that is, not too strictly – that they retain their effectiveness for striking a balance between freedom of expression and intellectual
68 JP Quintais and G Frosio et al., ‘Safeguarding User Freedoms in Implementing Article 17 of the
Copyright in the Digital Single Market Directive: Recommendations From European Academics’ (2019) 10(3) JIPITEC 276 [7-11].
69 Quintais and Frosio et al. (n68) [10]; see also Spindler (n16) 257.
70 C Solmecke, ‘Stellungnahme von Rechtsanwalt Christian Solmecke, LL.M.’ (2019)
<https://www.wbs-law.de/wp-content/uploads/2019/09/Stellungnahme-RA-Solmecke-zur-Umsetzung-von-Artikel-17-der-Urheberrechtsrichtlinie-1.pdf> accessed 18 June 2020 [5] and footnote 2 with further sources.
71 See recital 70 paragraph 1 where the exceptions and limitations are also called mandatory.
72 Quintais and Frosio et al. (n68) [13].
73 ibid [11].
74 ibid [11].
75 ibid [7-19].
property rights.76 The same must hold true for the other limitations.77 Special focus should be given to ‘pastiche’, which has so far not been used effectively in the Member States. It can be used to cover user-generated ‘remixes’ and could thereby resemble the ‘fair use’ doctrine of United States copyright law.78
27 However, there remains the pressing problem that filtering technologies are unable to distinguish reliably between content that infringes copyright and content that is lawful because it falls within the scope of an exception, such as a parody.79 Paragraph 9 tries to address this issue.
2. Paragraph 9
28 Paragraph 9 demands the implementation of procedural safeguards to protect the user freedoms established in paragraph 7, bridging the gap between (automated) copyright protection and the (manual) safeguarding of users’ fundamental freedoms,80 since ordinarily any reliance on a limitation of or exception to a rightholder’s copyright must be decided in a legal review on a case-by-case basis.81 Paragraph 9 reads as follows [shortened and highlighted in bold by the author]:
“Member States shall provide that online content-sharing service providers put in place an
effective and expeditious complaint and redress mechanism that is available to users of
their services in the event of disputes over the disabling of access to, or the removal of, works or other subject matter uploaded by them.
Where rightholders request to have access to their specific works or other subject matter disabled or to have those works or other subject matter removed, they shall duly justify the
reasons for their requests. Complaints submitted under the mechanism provided for in the
first subparagraph shall be processed without undue delay, and decisions to disable access to or remove uploaded content shall be subject to human review. […]
This Directive shall in no way affect legitimate uses, such as uses under exceptions or limitations provided for in Union law[...].
[...]”
76 Case C-145/10 Painer [2011] ECLI:EU:C:2011:798 [132]; Case C-201/13 Deckmyn and
Vrijheidsfonds [2014] ECLI:EU:C:2014:2132 [26]; see also Senftleben (n 18) 12.
77 Senftleben (n18) 13.
78 ibid, 17: the comparison of fair use is not made explicitly here, however by the context of the article at
large this fact can be deduced.
79 Verbraucherzentrale (n52) [5].
80 Spindler (n16) 257.
29 This paragraph leaves significant leeway to the Member States in its implementation. Quintais et al. suggest that the Member States use this leeway in such a way that the user freedoms are safeguarded to a high degree. They therefore recommend that Member States restrict the filtering and blocking of uploaded content to cases of “prima facie copyright infringements”82, meaning that only uploaded material that is identical or equivalent83 (which needs to be interpreted strictly) to protected material is to be blocked or filtered out automatically at the upload stage.84 It is suggested that OCSSPs then provide an easily accessible form to contest any such automatic blocking.85 The blocking is then only to be maintained if the uploader does not contest it.86
30 For uploaded material that partially matches copyrighted material provided by the rightholder, the authors recommend that OCSSPs offer the uploader the possibility to declare that the uploaded material falls under an exception.87 Any such declaration by the uploader shall then be regarded as initiating the ‘complaint mechanism’ of paragraph 9, meaning that the rightholder must duly justify their request to remove or disable the content concerned.88 This is then subject to human review, presumably by so-called content managers, in which the legal status and the fate of the uploaded material is decided.89 In doing so, Quintais et al. argue, OCSSPs would comply with their new obligations under Article 17 and the user freedoms would be safeguarded to a high degree.90 Solmecke argues along the same lines and proposes that Member States, in their national implementations, oblige OCSSPs to provide a ‘button’ of some sort within the upload procedure with which the uploader can indicate that an exception or limitation applies to the uploaded content. This would then automatically initiate the redress mechanism without prior blocking.91
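The implementation recommended by Quintais et al. and Solmecke can be summarised as a simple decision flow. The following Python sketch is purely illustrative: the names, the ‘match ratio’ measure and the threshold are the author’s assumptions for the purpose of illustration and appear neither in Article 17 DSM nor in the cited proposals; the treatment of uncontested partial matches in particular is an assumption.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Decision(Enum):
    PUBLISH = auto()
    BLOCK = auto()          # contestable via an easily accessible form
    HUMAN_REVIEW = auto()   # the redress mechanism of paragraph 9

@dataclass
class Upload:
    match_ratio: float      # overlap with a rightholder's reference file (0.0-1.0)
    claims_exception: bool  # the 'button': uploader invokes eg quotation or parody

def handle_upload(upload: Upload, prima_facie_threshold: float = 0.99) -> Decision:
    # No match with any reference file: nothing to decide.
    if upload.match_ratio == 0.0:
        return Decision.PUBLISH
    # The 'button' pressed: the declaration initiates the paragraph 9
    # complaint mechanism, with human review and no prior blocking.
    if upload.claims_exception:
        return Decision.HUMAN_REVIEW
    # Only identical/equivalent material ('prima facie infringement',
    # to be interpreted strictly) is blocked automatically at upload.
    if upload.match_ratio >= prima_facie_threshold:
        return Decision.BLOCK
    # Partial matches without a claimed exception are assumed here to stay
    # online, leaving it to the rightholder to duly justify a removal request.
    return Decision.PUBLISH
```

The crucial design choice of this flow is that automatic blocking is the exception rather than the default: anything short of a near-identical copy reaches either publication or a human reviewer.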
82 Quintais and Frosio et al. (n68) [22].
83 For one way of defining prima facie infringement see COMMUNIA, ‘Article 17 Stakeholder Dialogue
input paper: Ensuring the protection of users’ rights in the Article 17 implementation guidelines’ (communia-association, 31 March 2020) <https://www.communia-association.org/wp-content/uploads/2020/04/COMMUNIA_stakeholder_dialogue_input.pdf> accessed 20 July 2020, 6.
84 Quintais and Frosio et al. (n68) [22].
85 COMMUNIA (n83) 8-9.
86 ibid, 8-9.
87 Quintais and Frosio et al. (n68) [26-27].
88 ibid [29].
89 ibid [30].
90 ibid [31].
31 According to Spoerri, the redress mechanism in paragraph 9 is needed because the weighing92 of the freedom of expression and the freedom of the arts against the right to (intellectual) property will regularly be decided in favour of the latter, at least by a duly diligent OCSSP,93 therefore leading to the over-blocking of content.94 Adding to this the mistakenly flagged content, the number of cases subject to the redress mechanism is potentially immense,95 overwhelmingly so if one keeps in mind that in 2019 more than 500 hours of video material were uploaded to YouTube per minute.96 Spoerri argues that OCSSPs could claim that, considering the principle of proportionality – as stated in paragraph 5 – as well as the financial burden involved, there are only two possible ways to implement this redress mechanism, implying that paragraph 5 limits the way paragraph 9 can be interpreted.97
32 The first possibility relies on reducing human intervention to an absolute minimum due to financial costs, requiring aggressive filtering that at first instance disregards limitations of and exceptions to copyright such as fair use. Only when the uploader actively provides evidence that the upload is protected by an exception or limitation could the OCSSP unblock the – rightfully uploaded – content.98 This interpretation entails the opposite of what Quintais et al. suggest, namely that only prima facie infringements be blocked immediately.
33 The other possibility would be to require uploaders to certify in advance that their uploads do not constitute or include copyrighted material or, alternatively, fall under an exception or limitation. Accordingly, content will be blocked or refused at upload should the uploader decline to provide such certification.99 This is comparable to Solmecke’s proposal above.
34 Ultimately, disputes must be decided by a content manager – as demanded by paragraph 9 – which will create a considerable financial burden for OCSSPs.100 This
92 That this weighing is important can also be seen in recital 70 of the DSM.
93 This is because over-blocking is less risky than blocking only the clear cases, see Senftleben (n18) 10.
94 Spoerri (n3) [38-41].
95 ibid [41].
96 J Hale, ‘More Than 500 Hours of Content Are Now Being Uploaded To YouTube Every Minute’
(tubefilter, 7 May 2019)
<https://www.tubefilter.com/2019/05/07/number-hours-video-uploaded-to-youtube-per-minute/> accessed 18 July 2020.
97 Spoerri (n3) [44].
98 ibid [42].
99 Spoerri (n3) [43].
100 ibid [45].
holds true even for large companies such as YouTube, since they have correspondingly more uploads to deal with.
35 As almost any dispute must be decided by human review, Lambrecht identifies the fact that this review constitutes private adjudication as an issue and therefore regards the persons involved as lacking independence and impartiality.101 Kaye, the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, also emphasises this lack of independence, which constitutes a significant obstacle since the redress mechanism is supposed to remedy potential human rights violations.102 Furthermore, Kaye fears that OCSSPs will not even grant users the level of protection vaguely demanded in paragraph 9,103 for they act primarily in the interest of their stakeholders and might opt for the legally and financially safest route of over-blocking content.104 One possibility to remedy this would be – as stated above – to introduce a counter-liability system for wrongful (over-)blocking105 to counterbalance the strong incentive created by the liability scheme in paragraph 4.
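The incentive structure described above can be made tangible with a back-of-the-envelope expected-cost comparison. All numbers in the following sketch are invented for illustration only; the point is merely that, without counter-liability for wrongful blocking, blocking a doubtful upload is always the financially safest option for an OCSSP.

```python
# Illustrative model of the blocking incentive under paragraph 4.
# All figures are the author's hypothetical assumptions.

def expected_cost_of_hosting(p_infringing: float, liability: float) -> float:
    """Expected damages if a doubtful upload stays online."""
    return p_infringing * liability

def expected_cost_of_blocking(p_lawful: float, counter_liability: float) -> float:
    """Expected cost if the upload is blocked but later turns out to be lawful."""
    return p_lawful * counter_liability

# A doubtful upload: assume a 10% chance it infringes, damages of 10 000 if so.
host = expected_cost_of_hosting(0.10, 10_000)        # expected cost of hosting

# Paragraph 4 regime: wrongfully blocking lawful content costs nothing,
# so blocking always dominates hosting for any doubtful upload.
block_without = expected_cost_of_blocking(0.90, 0)

# With a hypothetical counter-liability of 5 000 for blocking lawful content,
# blocking the doubtful upload is no longer the cheaper default.
block_with = expected_cost_of_blocking(0.90, 5_000)

print(block_without < host < block_with)
```

The model is deliberately crude, but it shows why the literature regards counter-liability as the missing counterweight: it is the only term in the calculation that makes over-blocking costly.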
36 Lambrecht also points out that users rarely appeal against takedown decisions, which in all probability renders the redress mechanism ineffective.106 He further argues that even if the redress mechanism comes into action and triggers human review before preventive measures are taken – which, according to Quintais et al., the Member States have the power to establish in their national implementations107 – it is doubtful that the law will be applied in a proportionate manner, since content managers might be biased by the preceding algorithmic flagging of the content.108
III. Interim Conclusion
37 As has been shown, Article 17 DSM incorporates both the protection of intellectual property of copyright holders and the protection of user rights such as the freedom of expression and the arts. However, the way paragraphs 1 and 4 are worded will inevitably lead to the use of upload filters by OCSSPs, since it is not possible to obtain all licences
101 Lambrecht (n23) 12.
102 D Kaye, Special Rapporteur on the Promotion and Protection of the Right of Freedom of Opinion and
Expression (2018) OL OTH 41/2018, 8.
103 In his report Kaye speaks about Article 13(7) since his report was on the proposal for the Directive.
104 Kaye (n102) 9.
105 See above at margin no 18.
106 Lambrecht (n23) 12.
107 See margin no 29.
from all rightholders. Furthermore, neither the rightholders nor the OCSSPs are required to conclude a licensing agreement. The strict liability of paragraph 4 will in all probability compel big OCSSPs to further develop their already elaborate filters, which will then become the new industry standard. Upload filters can in no way be excluded at national level. Paragraphs 7 and 9 try to soften the blow to the fundamental rights of users by safeguarding mandatory exceptions and limitations to copyright. However, they lack any incentive or financial risk that would force OCSSPs to respect these exceptions and limitations. The literature therefore suggests that Member States use their leeway to strengthen these safeguards in various ways in the national implementation, the most important being to limit preventive blocking to prima facie copyright infringements and otherwise – once the uploader indicates that an exception applies – to initiate the redress mechanism. In order for the content managers in the redress mechanism not to be biased from the very beginning, a financial liability for over-blocking must be introduced.
C.
Fundamental rights part
38 It must first be noted that Article 17 DSM affects a plethora of fundamental rights of various actors. For one, there is the freedom of occupation, freedom to conduct a business and the right to property for the OCSSPs as established in the Charter of Fundamental Rights of the European Union (EUCFR). Further, there is the freedom of expression, freedom of the arts and right to the protection of personal data for the users of OCSSPs. Also, persons who consume content are affected in their freedom of information. Not to be forgotten are the rightholders who are affected in their right to (intellectual) property.109
39 This master’s thesis focuses on the freedom of expression in Article 11 and the freedom of the arts in Article 13 of the Charter of Fundamental Rights of the European Union. Some short remarks will be made on the scope of these two fundamental rights, analysing them in the context of the internet and specifically in the context of videos on YouTube.
40 As the DSM needs to be implemented by the Member States into their national legal order, their implementations fall within the scope of EU law and must not infringe on the rights as guaranteed in the EUCFR.110
I. Freedom of expression
41 The freedom of expression – established in Article 11 EUCFR111 – is of fundamental importance to the functioning of a democracy.112 It nonetheless does not constitute a
non-derogable right and is in fact limited by some EU legislation such as the e-Commerce-Directive113 (especially its take-down-regime in the context of copyright
109 Spindler (n16) 257.
110 L Woods, ‘Article 11: Freedom of Expression and Information.’ in Peers S and Hervey T and Kenner J
and Ward A (eds), The EU Charter of Fundamental Rights: A Commentary (London, 2014) 311 [11.02].
111 Freedom of expression as understood in Article 10 ECHR is also a general principle of the European
Union. See for this: Case C-219/91 Ter Voort [1992] ECR I-5485 [35] and Case C-260/89 Elliniki
Radiophonia Tileorassi AE (ERT) v Dimotiki Etairia Pliroforissis [1991] ECR I-2925 [44].
112 HD Jarass, Charta der Grundrechte der Europäischen Union: Unter Einbeziehung der vom EuGH
entwickelten Grundrechte, der Grundrechtsregelungen der Verträge und der EMRK Kommentar
(3rd edn, CH Beck 2016), Article 11 [4].
113 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal
aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce') [2000] OJ L 178.
protection) or, as in this case, by the DSM.114 Section ‘D’ will deal with the (research) question whether Article 17 DSM lawfully limits the freedom of expression and the arts.
1. Scope
42 First, the scope of the right to freedom of expression has to be analysed. There is no need to elaborate on it fully; the focus will rather be on aspects that are important in the context of the DSM, ie the importance of freedom of expression on the platforms of OCSSPs and specifically the lawfulness of the use of (upload) filtering technologies. ‘Expression’ does not only include speech but any form – including electronic form115 – that can be used116 to impart ideas and information,117 be it political, artistic, serious, frivolous or humorous.118 Both the substance of the ideas and information and the ‘form of communication’ chosen are protected.119 The scope is interpreted dynamically to include changes that come with technological advancements120 and covers, eg, music, advertisements,121 listings on an electronic marketplace (eg eBay),122 blogs and other forms of social media (and the providers of their infrastructure),123 and the distribution and publication of opinions and information on the internet in general.124 This must include videos and other media on YouTube. The term ‘impart’ also suggests a right to choose the place and time of an expression so as to have an impact on the ‘audience’.125 Advocate General Trstenjak – in her Opinion in MSD Sharp & Dohme126 – states that the freedom of expression must include “the transmission of third-party ideas and information”127. It is therefore not inappropriate to include the freedom of communication within the freedom of expression.128
114 Woods (n110) [11.22].
115 Case C-316/09 MSD Sharp & Dohme [2010] ECR I-3249, Opinion of AG Trstenjak [81].
116 By individuals or legal persons.
117 De Haes and Gijsels v Belgium App no 7/1996/626/809/1996/626 (Judgment 24 February 1997) [48];
Woods (n110) [11.27].
118 Woods (n110) [11.28].
119 ibid [11.27].
120 ibid [11.44].
121 Jarass (n112), Article 11 [10].
122 Case C-324/09 L'Oréal and Others [2010] ECR I-6011, Opinion of AG Jääskinen [157].
123 Woods (n110) [11.27] and footnote 60 and the sources cited.
124 C Calliess and M Ruffert (eds), EUV/AEUV: Das Verfassungsrecht der Europäischen Union mit
Europäischer Grundrechtecharta Kommentar (5th edn, CH Beck 2016), Article 11 EUCFR [7].
125 Calliess (n124) Article 11 EUCFR [12].
126 MSD Sharp & Dohme (n115).
127 ibid [81].
43 However, the freedom of expression does not only include the ‘speaker’s’ right to express himself, but also the right of the ‘audience’ to receive (publicly accessible) information expressed by the ‘speaker’129 – freedom of information, so to speak.130 It also
includes the right to receive information on the internet.131
44 Freedom of expression does not only entail negative obligations on the authorities, but also positive obligations.132 The authorities – and, in general, anyone bound by the freedom of expression – are required to protect (or at least not impair) the freedom of expression between private persons.133
2. Limits and exceptions
45 Any directly caused impediment to the protected acts elaborated above is to be seen as a restriction on the freedom of expression and information. Restrictions can also be caused indirectly or by factual measures.134 Any delay in the impartation of information also infringes the freedom of expression.135 Generally, the threshold for assuming a restriction is rather low.136 Restrictions may entail, eg, blocking access to specific content on the internet, the filtering of content,137 take-down notices and licensing requirements138 based on intellectual property rights.139 When discussing restrictions it is important to consider the chilling effect of the measures taken.140 Chilling effects can occur when the obstacles to expressing an opinion are too high or when an expression first has to be authorised. In the context of YouTube, users might refrain from uploading their content due to over-blocking and an unjustifiably restrictive understanding of the exceptions to copyright.141
46 In his Opinion in L’Oréal v eBay, AG Jääskinen stated that the notice-and-take-down system – established in Article 14(1)(b) e-Commerce-Directive – should not
129 Woods (n110) [11.30].
130 Jarass (n112) Article 11 [15].
131 Case C-70/10 Scarlet Extended [2011] ECR I-11959 [50]; Jarass (n112) Article 11 [15].
132 Woods (n110) [11.32].
133 Jarass (n112) Article 11 [24].
134 ibid, Article 11 [20].
135 ibid, Article 11 [20].
136 Calliess (n124) Article 11 EUCFR [28].
137 See eg: Committee of Ministers, Recommendation CM/Rec(2008)6 of the Committee of Ministers to
Member States on measures to promote the respect for freedom of expression and information with regard to Internet filters.
138 Woods (n110) [11.33] and the footnotes 99 and 102 and the cited sources.
139 ibid [11.33].
140 ibid [11.33].
141 Spindler (n16) 258.
unnecessarily infringe on the fundamental freedom of expression.142 The following quote shows how important it is to balance the opposing fundamental freedoms:
“Obviously freedom of expression and information does not permit the infringement of intellectual property rights. […] Nevertheless, […] protection of trade mark proprietor’s rights in the context of electronic commerce may not take forms that would infringe the rights of innocent users of an electronic marketplace or leave the alleged infringer without
due possibilities of opposition and defence. […]”143
47 In Netlog, the use of filtering technologies was described as capable of infringing the freedom of the users of a hosting service provider to receive and impart information.144 One reason being that filtering systems might not adequately distinguish between unlawful and lawful content145 and that such an upload filter – as demanded by SABAM – would not strike a fair balance between the right to intellectual property and the freedom of expression.146 The same reasoning can be found in Scarlet Extended.147
48 However, it is also important to consider the Court’s ruling in Glawischnig-Piesczek148. There, the Court ruled that the blocking of future uploads by any user of content identical or equivalent to already blocked content complies with the general monitoring prohibition in Article 15 e-Commerce-Directive, as long as it is ordered by a national court by way of an injunction for a specific case.149 It must be pointed out, however, that the Court did not review the negative impact on fundamental rights, but only analysed compliance with the general monitoring prohibition. This ruling and its implications must therefore be treated with caution.
49 Any limitation must be prescribed by law and must be sufficiently clear to enable citizens to adequately foresee the consequences and to modify their behaviour accordingly; absolute certainty of the consequences is, however, not required.150 Furthermore, a limitation should not leave the applying authorities with too much leeway.151 Limitations must pursue a legitimate aim; these aims are to be found in Article 10(2)
142 L’Oréal SA v eBay (n122) [155-158].
143 ibid [158].
144 SABAM v Netlog (n37) [48].
145 ibid [50].
146 ibid [51].
147 Scarlet Extended (n131) [50, 52].
148 Case C-18/18 Glawischnig-Piesczek [2019] ECLI:EU:C:2019:821.
149 Glawischnig-Piesczek (n148) [37].
150 European Court of Human Rights, Sunday Times v United Kingdom (1979-80) Series A no 30 [49]: it suffices if the exact consequences are subsequently elaborated by practice.
ECHR.152 Of particular significance – not least because of its wide interpretation153 – is the aim of protecting the rights of others. Finally, the measures taken must pass the proportionality test. They must be suitable, necessary and reasonable in view of the competing interests involved,154 meaning that a restriction cannot “constitute disproportionate and unacceptable interference, impairing the very substance of the rights guaranteed.”155 This calls for a case-by-case analysis in which the opposing interests are weighed against each other, whereby political expression weighs more heavily than content serving mere entertainment.156 Proportionality is assessed differently when it comes to the legislative choices of the Union, with the focus on whether a fair balance has been struck between the involved fundamental rights in the drafting of the provision.157
II. Freedom of the arts
50 Freedom of the arts in Article 13 EUCFR is to be seen as lex specialis to the freedom of expression in Article 11.158 Thus, most of the comments elaborated above in relation to the freedom of expression apply to artistic freedom as well.159 Some suggest that Article 13 merely reinforces specific variants of the freedom of expression.160 Accordingly, only some additional comments specific to artistic freedom will be made here.
51 The term “arts” includes all forms of artistic expression, including visual art such as films161 and is to be interpreted broadly.162 Artistic freedom covers the creation and the
conveyance of art.163 The personal scope encompasses the artist himself, but also (legal) persons which make art available to the public, a typical example being cinemas.164
152 In conjunction with Article 52(3) EUCFR.
153 Jarass (n112) Article 11 [31].
154 ibid, Article 11 [33-36]; see also Case C-283/11 Sky Österreich [2013] ECLI:EU:C:2013:28 [49-50].
155 Case C-112/00 Schmidberger [2003] ECR I-5659 [80].
156 Jarass (n112) Article 11 [34-35].
157 cp Woods (n110) [11.61].
158 Jarass (n112) Article 13 [4].
159 This includes the requirements for limitations of artistic freedom.
160 D Sayers, ‘Article 13: Freedom of the Arts and Sciences’ in Peers S and Hervey T and Kenner J and
Ward A (eds), The EU Charter of Fundamental Rights: A Commentary (London, 2014) 379 [13.02].
161 Sayers (n160) [13.48].
162 Jarass (n112) Article 13 [5]; there is not yet a Union definition of art, see Calliess (n124) Article 13
EUCFR [3].
163 Jarass (n112) Article 13 [5].
164 ibid, Article 13 [9].
52 When it comes to restrictions of the freedom of the arts, the European Court of Human Rights factors in the medium involved, the audience and the impact of the art in order to assess lawfulness.165 Otherwise, the considerations outlined above for the freedom of expression apply.
165 Sayers (n160) [13.11]; in general it must be remarked that typically artistic freedom is restrained due to
D.
Striking a balance?
I. ContentID
53 First, the functioning and limits of YouTube’s own filtering mechanism ‘ContentID’ shall be briefly outlined. This is important considering that the Commission’s impact assessment was made from a point of view that assumed filtering technologies to be quite capable.166 ContentID, which can be regarded as possibly the most sophisticated and expensive167 filtering technology168, creates a digital fingerprint of any uploaded content and compares it with a list of reference fingerprints provided by the rightholders.169 When the system detects a (partial) match, the copyright holder is automatically notified and the video is claimed for the rightholder. The rightholder can then choose either to block the video or to claim its revenue.170 For this to work, “all content from all users [must be monitored] all the time”171. It must be kept in mind, however, that ContentID – as elaborate as it is – essentially only checks for duplicates and, consequently, cannot reliably distinguish between lawful and unlawful copying.172 The shortcomings of ContentID can also be seen in the fact that, eg, a video of white noise was automatically flagged for copyright infringement173 and birds chirping in the background of a video were recognised as a copyright-protected audio track.174
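The core limitation described above – that fingerprint matching detects duplication, not the legal status of a use – can be illustrated with a drastically simplified sketch. Real systems like ContentID rely on robust perceptual audio/video hashes whose details are not public; the chunk-hashing approach, chunk size and threshold below are purely the author’s illustrative assumptions.

```python
import hashlib

def fingerprint(data: bytes, chunk_size: int = 8) -> set:
    """Split content into fixed-size chunks and hash each one."""
    return {
        hashlib.sha256(data[i:i + chunk_size]).hexdigest()
        for i in range(0, len(data), chunk_size)
    }

def match_ratio(upload: bytes, reference: bytes) -> float:
    """Fraction of the reference's chunks that reappear in the upload."""
    ref, up = fingerprint(reference), fingerprint(upload)
    return len(ref & up) / len(ref) if ref else 0.0

def flag(upload: bytes, references: list, threshold: float = 0.3) -> bool:
    """Flag the upload if it (partially) matches any reference work."""
    return any(match_ratio(upload, ref) >= threshold for ref in references)

original = b"0123456789abcdef" * 4
parody = original + b" plus transformative commentary"
# The 'parody' reuses the original wholesale, so the matcher flags it:
# nothing in the pipeline can tell a lawful parody from outright piracy.
```

Whether the reuse is a quotation, a parody or plain infringement is simply not a question the matcher can answer, which is precisely why paragraph 9 demands human review.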
II. Two examples
54 The capability of ContentID is not known to the public.175 It will be shown here, based on two examples, that with the current system of YouTube, videos are taken down
166 Bridy (n1) 347.
167 See P Sawers, ‘YouTube: We’ve invested $100 million in Content ID and paid over $3 billion to
rightsholder’ (venturebeat, 7 November 2018) <https://venturebeat.com/2018/11/07/youtube-weve-invested-100-million-in-content-id-and-paid-over-3-billion-to-rightsholders/> accessed 18 June 2020: according to which Google spent $100 million on ContentID.
168 Spoerri (n3) [23].
169 Bridy (n 1) 330; see also YouTube Creators, ‘YouTube Content ID’ (YouTube, 28 September 2010)
<https://www.youtube.com/watch?v=9g2U12SsRns&feature=emb_title> accessed 18 June 2020.
170 Bridy (n1) 330.
171 ibid, 344.
172 ibid, 346.
173 C Baraniuk, ‘White noise video on YouTube hit by five copyright claims’ (BBC, 5 January 2018)
<https://www.bbc.com/news/technology-42580523> accessed 15 July 2020.
174 N Messieh, ‘A copyright claim on chirping birds highlights the flaws of YouTube’s automated system’
(thenextweb, 27 February 2012) <https://thenextweb.com/google/2012/02/27/a-copyright-claim-on-chirping-birds-highlights-the-flaws-of-youtubes-automated-system/> accessed 15 July 2020.
175 D Keller, ‘Facebook Filters, Fundamental Rights, and the CJEU’s Glawischnig-Piesczek Ruling’
automatically because the financial threat that copyright holders pose to OCSSPs is given more weight in the balancing of intellectual property rights and user rights – or rather, there is no balancing at all. As the quote at the beginning of this thesis shows, Article 17 DSM has the distinct aim of making rightholders benefit from ContentID, which makes it necessary to consider how it currently functions.
55 The first example concerns a computer building video guide176 uploaded to YouTube on 13 September 2018 by ‘The Verge’, a technology news website. The building guide was riddled with errors, resulting in various tech-YouTubers177 reacting to it with their own videos.178 On 15 September, the YouTuber Kyle posted on his channel ‘Bitwit’ a reaction video that contained the vast majority of the original video by ‘The Verge’ as a greenscreen background, overlaid by a smaller video of Kyle in the right-hand corner. The ‘original’ video in the background was paused several times in order to be commented on and criticised by Kyle, who drew attention to some of the mistakes that had been made.179 Surprisingly enough,180 this video was not caught or flagged by ContentID as infringing the copyright of ‘The Verge’.181 However, it was removed on 12 February 2019 after a take-down request by Vox Media, the parent company of ‘The Verge’,182 with Kyle receiving a strike on his channel. After a counter-notification by Kyle, the video was back online on 14 February 2019.183 It is important to mention here that this example does not deal with an automated upload filter, but rather with an automated mechanism – a filter, if you will – that immediately and without a warning
176 The original video was deleted by the uploader. However, quite a few re-uploads by other users can
still be found on YouTube (which of course are a prime example of copyright infringements).
177 YouTuber that focus on the latest technological news.
178 See K May, ‘Vox Media Attempts To Scrub Internet Of The Verge PC Build Video By Going After
YouTubers’ (WCCFtech, 13 February 2019) <https://wccftech.com/vox-media-attempts-to-scrub-internet-of-the-verge-pc-build-video-by-going-after-youtubers/> accessed 18 June 2020: where the author provides a list of some of the reaction videos.
179 See video for an impression: Bitwit, ‘LYLE REACTS TO THE VERGE's PC BUILD VIDEO’
(YouTube, 15 September 2018) <https://www.youtube.com/watch?v=0vmQOO4WLI4&t=423s> accessed 18 June 2020; also see TB Lee, ‘Vox lawyers briefly censored YouTubers who mocked The Verge’s bad PC build advice’ (arsTECHNICA, 19 February 2019) <https://arstechnica.com/tech-policy/2019/02/the-verge-briefly-censored-youtubers-who-mocked-its-bad-pc-building-advice/> accessed 18 June 2020.
180 This should not be attributed to the system’s ability to distinguish between lawful and unlawful content
but rather its inability to even flag partly matching content.
181 There is no source to that since ContentID is non-transparent. However, it is clear that it was not seen
as infringing on copyright since no action was taken at the time of upload.
182 Kyle (Twitter, 13 February 2019) <https://twitter.com/bitwitkyle/status/1095578571244879872>
accessed 18 June 2020.
183 Kyle (Twitter, 14 February 2019) <https://twitter.com/bitwitkyle/status/1095941247124963331>