
Online jihadi content combat: How serving public interest could ease the privatization of freedom of expression

Jesse Trommel

Leiden University

Faculty of Governance and Global Affairs
MSc Crisis and Security Management
Master thesis

Supervised by: Dr. J. Kortekaas
Written by: Jesse Trommel
Student number: s1154516
Date of submission: 10-06-2018
Word count: 24734 [incl. tables]


Table of contents

List of Tables ... 3

List of Abbreviations ... 4

1.0 Introduction ... 5

1.1 What is the problem? ... 5

1.2 The societal relevance ... 6

1.3 The academic relevance ... 8

1.4 Research goals and corresponding questions ... 9

1.5 Reading guide ... 10

2.0 Content governance through social network sites ... 11

2.1 What is considered a social network site? ... 11

2.2 Facebook and YouTube as social network sites ... 13

2.3 The privatization of freedom of expression ... 14

2.4 How to influence this privatization ... 17

3.0 Methodology ... 21

3.1 Feasibility of actors and incentives ... 21

3.2 Operationalization of the privatization of freedom of expression ... 22

3.3 Research method ... 24

3.4 Case selection ... 24

3.5 Data collection ... 26

4.0 The desired jihadi content combat criteria ... 30

4.1 The initial phase of the EU’s desired criteria ... 30

4.2 Difficulties with combatting online jihadi content ... 32

4.3 Building pressure towards jihadi content combat ... 34

4.4 Operationalization of the desired criteria ... 37

4.5 Partial conclusion on the desired criteria ... 40

5.0 The jihadi content combat policies and techniques ... 42

5.1 How algorithms can strengthen radicalisation ... 42

5.2 YouTube’s redirect method ... 44

5.3 Cooperation through the copyright method ... 48


6.0 Influencing the privatization of freedom of expression ... 54

6.1 A joint approach in removing jihadi content ... 54

6.2 The limitations in transparency and insight ... 58

6.3 The importance of the freedom of expression criteria ... 61

6.4 Measuring the actual influence ... 63

7.0 Conclusion and reflections ... 67

Reference list ... 70

With special gratitude to my supervisor, Dr. J. Kortekaas, for the guidance and support throughout this thesis process, and to second reader Prof. dr. B. van den Berg for the additional feedback halfway through.

Furthermore, I would like to honour the inventor of coffee for keeping me awake on those early mornings, and ice cream manufacturer OLA for saving me from melting away on those incredibly hot days.


List of Tables

Table 1 Indicators on how to influence the privatization of freedom of expression
Table 2 Case Study Protocol
Table 3 Labels of the different desired jihadi content combat criteria
Table 4 The EU's desired jihadi content combat criteria
Table 5 Meeting the EU's desired criteria: Removal
Table 6 Meeting the EU's desired criteria: Cooperation
Table 7 Meeting the EU's desired criteria: Innovation
Table 8 Meeting the EU's desired criteria: Counter-narrative
Table 9 Meeting the EU's desired criteria: Transparency
Table 10 Meeting the EU's desired criteria: Maintaining freedom of expression
Table 11 Indicators on how to influence the privatization of freedom of expression [rephrased]


List of Abbreviations

EC European Commission

EP European Parliament

EU European Union

RAN Radicalisation Awareness Network

IRU European Union Internet Referral Unit

UGC User-Generated Content

ISIS Islamic State in Syria and Iraq

SNS Social Network Site

SNSs Social Network Sites

UN United Nations

NATO North Atlantic Treaty Organization

NCTV National Coordinator for Security and Counterterrorism in the Netherlands


1.0 Introduction

1.1 What is the problem?

“Online platforms are becoming people's main gateway to information, so they have a responsibility to provide a secure environment for their users. What is illegal offline is also illegal online. While several platforms have been removing more illegal content than ever before – showing that self-regulation can work – we still need to react faster against terrorist propaganda and other illegal content which is a serious threat to our citizens' security, safety and fundamental rights” (Ansip, as cited in European Commission 2018).

On 1 March 2018, this statement by Andrus Ansip, the Vice-President of the European Commission [EC] for the Digital Single Market, was released. It summarizes a major security challenge with which governments and society have to deal: combatting illegal content on the highly fragmented and privatized worldwide web. The severity of this challenge became publicly known when especially the terrorist organization Islamic State in Syria and Iraq [ISIS, also known as IS or ISIL] started using Social Network Sites [SNSs, also known as social media platforms] like Facebook, Twitter, Instagram and YouTube to share their jihadi propaganda material with the rest of the world. Their propaganda message and call to join the jihad proved to be successful, since over 42,000 foreigners travelled to the conflict area in Syria and Iraq to fight alongside one of the involved groups. More than 5,000 of them came from Europe, most of whom eventually joined the terrorist organization ISIS, and to a lesser extent Jabhat al-Nusra (Radicalisation Awareness Network [RAN] 2017, 15). The European Union [EU] and its Member States searched for answers to the question why 5,000 of their citizens left European soil to fight in an armed conflict in the Middle East. Among many other explanations that are beyond the scope of this research, they focussed on the online propaganda campaigns that especially ISIS had set up. The EU concluded that the international dimension of the worldwide web and its SNSs has made radicalisation a cross-border phenomenon (EC 2014, 2-3). To reduce this threat, the EC started discussing a swift removal of this jihadi propaganda by the SNSs (EC 2014, 8-9), and the European Parliament [EP] called for an effective strategy to combat this threat (European Parliament 2015, 10-11). However, the two largest SNSs, Facebook and YouTube, at first hesitated to actively cooperate with governments in the removal of this content (RAN 2015). In the course of 2016 they eventually acknowledged that a more active role was necessary to curb the problem of online radicalisation, which started the development and implementation of new jihadi content combat policies and techniques (Brookings Institution 2016; Facebook 2016; Google in Europe 2016; Solon 2016; Yadron & Wong 2016). These policies and techniques are specifically referred to as 'combat' policies and techniques, since they go beyond just the removal of jihadi content. Chapter 5 will elaborate on the relevant jihadi content combat policies and techniques, of which removal is also a part.

However, as Ansip's statement indicates, the jihadi content combat policies and techniques developed and implemented by SNSs appeared insufficient to satisfy the EU. In September 2017, the EC already stated that "online platforms should […] adopt effective proactive measures to detect and remove illegal content online and not only limit themselves to reacting to notices which they receive…" (EC 2017d, 10). This specific demand for proactive measures in content removal on SNSs brings some ambiguities with it. The first concerns the jihadi content combat policies and techniques developed and implemented by Facebook and YouTube, which were already 'on board' with governments in the jihadi content combat since 2016 (Facebook 2016; Google in Europe 2016). The EC's statement makes it appear as if these policies and techniques are not meeting its desired jihadi content combat criteria. Secondly, having SNSs proactively removing content themselves, instead of merely complying with a legal order, raises questions regarding the freedom of expression principle within the EU and its Member States. Since there is little academic and public knowledge on the jihadi content combat policies and techniques developed and implemented by SNSs, these ambiguities have not been clarified yet. Therefore, this thesis is aimed at gaining insight into the jihadi content combat policies and techniques of SNSs, and what these mean for the freedom of expression on such platforms. Gaining such insight is highly relevant from both a societal and academic perspective.

1.2 The societal relevance

Research backs up the part of Ansip's statement that "Online platforms are becoming people's main gateway to information…" (Ansip, as cited in European Commission 2018). Because of this, these platforms also play an important role within society. This applies to SNSs in general, but especially to the two largest SNSs: Facebook and YouTube. The latter has passed 1.5 billion logged-in viewers a month. This means 20% of the world's population is watching YouTube videos, on average for over an hour a day (Wojcicki 2017a). Facebook even exceeds this by having 2.13 billion monthly active users, of whom 1.4 billion log in on a daily basis (Facebook 2018). In addition to the huge number of users, SNSs are increasingly being used as a news source as well. 68% of adult Facebook members in the U.S. used the platform to gather at least some news. On YouTube this number is much lower: 32% in 2017. However, YouTube has seen a significant increase in being used as a news source, since this percentage was still 21% in 2016 (Shearer & Gottfried 2017). Foster (2012) already concluded at an early stage that large SNSs and the increase in their use as news sources enlarge their ability to, whether intentionally or not, influence their users and thereby society as a whole. This ability has only grown in the years that followed, since both the user numbers and the usage of SNSs as news sources grew enormously. Despite being used as news sources, the SNSs themselves do not want to be treated as media companies (Ulanoff 2014), and they lack a system of checks and balances on the news of which they are facilitating the dissemination (Napoli 2015, 758).

Due to these numbers, it is safe to say that the development of SNSs has ensured that part of people's social life currently takes place online. Multiple Science and Technology Studies scholars also argue that "the development of new technologies will ultimately change the nature of society and the way humans conduct their lives" (Bruggeman 2011, 159). This theory is called 'technological determinism' (Niculescu Dinca 2016, 41-42). It is based on the idea that "for most of us, most of the time, the technologies we use every day are of mysterious origin and design. We have no idea whence they came and possibly even less idea how they actually work. We simply adapt ourselves to their requirements and hope that they continue to function in the predictable and expected ways promised by those who sold them to us" (Wyatt 2008, 169). This is especially the case with control technologies, such as the developed jihadi content combat techniques, because, as Bruggeman explains, these admittedly carry a potential danger of the misuse of power and information, but have become more acceptable than ever due to the war on terror. Having SNSs actively removing terrorist propaganda from their platforms has actually been demanded by the EU, but it is unclear whether the developed techniques can be [mis]used for other types of content as well. Therefore, gaining an understanding of the jihadi content combat policies and techniques of our SNSs seems important from a societal perspective.

However, the actual influence of SNSs on society is being debated both publicly and academically. Research by Pauwels and Schils (2016) suggested that the impact of radicalisation through SNSs is still mediated by factors from the offline world. The debate is also displayed in the discussion regarding the influence of fake news on SNSs' users (see e.g. Allcott & Gentzkow 2017; Sunstein 2018), and in Facebook's recent data scandal regarding Cambridge Analytica, which revolves around whether Facebook's leaked user data influenced the 2016 U.S. presidential elections (Bowcott & Hern 2018; Cadwalladr & Graham-Harrison 2018; Meredith 2018; Shubber et al. 2018). Although questions regarding the actual influence of SNSs are highly relevant as well, they are beyond the scope of this research; this debate merely serves to indicate the important role SNSs have taken in society. This important role applies specifically to Facebook and YouTube. Partly due to their enormous numbers of users, Facebook and YouTube can be seen as the leading platforms in the social media market. This research will therefore focus specifically on the jihadi content combat policies and techniques of Facebook and YouTube. Section 3.4 will provide a more elaborate explanation for this choice.

1.3 The academic relevance

Despite the potential societal impact of Facebook and YouTube, knowledge about their jihadi content combat policies and techniques is lacking as well. However, awareness of the necessity of having the SNSs actively support jihadi content combat has partly been fuelled by research into the online activities of mainly ISIS. This included, among other things, mapping the quantitative use of the Internet by terrorists (Gill et al. 2017), their delivery mechanisms (Huey 2015), and case studies of their use in different countries (von Behr 2013). Others delved into particular platforms like Twitter (Berger & Morgan 2015; Klausen 2015) and Telegram (Bloom, Tiflati, & Horgan 2017), to gain a better insight into ISIS's use of them. Building on this, Conway et al. (2017) focussed on how disruption policies affect ISIS and their online traffic, and the EP requested a study to map and analyse the different disruption policies against online jihadi propaganda, in an attempt to maximize the efficiency and effectiveness of the EU's counter-terrorism strategy (Reed, Ingram, & Whittaker 2017). But these studies lack an in-depth analysis of the policies and techniques that have been implemented, and a link to the influence of these policies and techniques on freedom of expression.

However, such a link between freedom of expression and the governance of content on SNSs in general has been researched extensively before. Content governance refers to who decides what content is illegal and what content is accepted and belongs to freedom of expression. Briefly explained, most scholars argue that the rise of the Internet and its SNSs has caused a switch in information intermediation from the public to the private sector. The SNSs themselves currently decide, within the boundaries of local law, what acceptable content and acceptable speech on their platform is (see e.g. DeNardis 2014; DeNardis & Hackl 2015; Napoli 2015). DeNardis and Hackl (2015) call this the 'privatization of freedom of expression,' and emphasize the governance power it provides the SNSs. Other scholars agree to a lesser extent with this argument, by elaborating on the available pressure tools that external actors can use to influence content governance on SNSs (Mueller 2015; Marlin-Bennett & Thornton 2012). These theories regarding the tension around the privatization of freedom of expression by SNSs were developed prior to the implementation of the jihadi content combat policies and techniques. However, these jihadi content combat policies and techniques are par excellence a subject that can add to the privatization of freedom of expression question. Therefore, research into the influence of these jihadi content combat policies and techniques on the privatization of freedom of expression can add to the body of knowledge in two fields: firstly, by gaining insight into the underexplored phenomenon of the jihadi content combat policies and techniques themselves, and secondly, by placing these insights in the existing theoretical framework on content governance by SNSs, which will be further explained in Chapter 2.

1.4 Research goals and corresponding questions

Because this thesis aims to add to the existing theory on content governance by SNSs, the first goal is to examine the phenomenon of the privatization of freedom of expression and how this can still be influenced by outside actors. After the theoretical framework on the privatization of freedom of expression has been set up, the case of online jihadi content combat will be discussed. The mentioned statement from Andrus Ansip [Vice-President of the EC for the Digital Single Market] that "self-regulation can work - [but] we still need to react faster against terrorist propaganda" (European Commission 2018a) raises two questions: what does the EU consider a sufficient reaction, and to what extent do the developed and implemented jihadi content combat policies of Facebook and YouTube meet that? To answer these questions it will first be necessary to examine what the EU's desired criteria for the jihadi content combat policies and techniques of the SNSs are, and in what context these have been drafted. Secondly, insight into how jihadi content is combatted by Facebook and YouTube will make it possible to compare the EU's desired jihadi content combat criteria with these jihadi content combat policies and techniques. Through this comparison and insight into the jihadi content combat policies and techniques, it is possible to judge the influence that the developed and implemented jihadi content combat policies and techniques of Facebook and YouTube can have on the phenomenon of the privatization of freedom of expression. These research goals are reflected in the following research questions:

To what extent do the jihadi content combat policies and techniques from Facebook and YouTube meet the desired criteria from the European Union, and thereby influence the privatization of freedom of expression?

o What is privatization of freedom of expression and how can it be influenced?

o What are the desired criteria regarding jihadi content combat policies and techniques from the European Union towards the Social Network Sites?

o What jihadi content combat policies and techniques did Facebook and YouTube develop and implement?

o To what extent do the jihadi content combat policies and techniques from Facebook and YouTube meet the desired criteria from the European Union?

o To what extent do the jihadi content combat policies and techniques from Facebook and YouTube influence the privatization of freedom of expression?

1.5 Reading guide

The sub-questions listed in the previous section will also be the guideline of this thesis. Chapter 2 will start by explaining what social network sites are, and why Facebook and YouTube are part of this sector. Subsequently, Chapter 2 will elaborate on how the development of these SNSs has caused the privatization of freedom of expression, and how this can be influenced by outside actors. Chapter 3 will operationalize the theories on the privatization of freedom of expression presented in Chapter 2, and explain the research method and other methodological choices that have been made. Chapter 4 will start by examining the desired jihadi content combat criteria from the EU, and operationalize these criteria, in order to make them suitable to be used as guidelines for researching the jihadi content combat policies and techniques of Facebook and YouTube, which will be done in Chapter 5. Chapter 6 will further analyse to what extent the jihadi content combat policies and techniques of Facebook and YouTube meet the desired jihadi content combat criteria from the European Union. As will be explained in the methodology in Chapter 3, this analysis is necessary to be able to answer the final sub-question: to what extent do the jihadi content combat policies and techniques from Facebook and YouTube influence the privatization of freedom of expression? Chapter 7 will combine these sub-answers to conclude with an answer to the main research question, and reflect on the conducted research. But before we get there, Chapter 2 will explain how the development of SNSs has caused the privatization of freedom of expression.


2.0 Content governance through social network sites

2.1 What is considered a social network site?

First of all, the concepts of social network sites and social media platforms are often used interchangeably in both society and the academic world, even though both concepts have been researched extensively. In addition to this interchangeability, SNSs or social media platforms still lack a straightforward and generally accepted definition. This is mainly due to two reasons. Firstly, the incredible speed at which technology and the SNSs themselves are developing, expanding, and evolving causes a lack of a status quo to which a definition can be assigned (Obar, Zube & Lampe 2012, 7). More and more different sorts of applications are being integrated into the platforms, by which SNSs expand their role on both the Internet and society in general (Roosendaal, Fennell, & van den Berg 2012, 78-79). The platforms therefore continue to change on a day-by-day basis. Secondly, SNSs use a number of techniques and forms of communication that have been around for a while already (Obar, Zube & Lampe 2012, 7). For example, Facebook integrated the Messenger function into its platform, which is an important feature for a user's ability to network. But one could argue this is an evolved version of SMS or even e-mail, two older forms of communication. The integration of such older features, which are not necessarily social network features, makes it harder to define SNSs specifically.

The attempt to formulate a definition was more or less started by Boyd and Ellison (2007). They spoke of SNSs, and defined them as "web-based services that allow individuals to (1) construct a public or semi-public profile within a bounded system, (2) articulate a list of other users with whom they share a connection, and (3) view and traverse their list of connections and those made by others within the system. The nature and nomenclature of these connections may vary from site to site" (Boyd & Ellison 2007, 211). They were explicitly speaking about Social Network Sites since they observed that users of SNSs were primarily connecting and communicating with people who were already part of their offline social network. When SNSs became mainstream in 2003, Boyd and Ellison also noticed an increase in the popularity of User-Generated Content [UGC]. Websites that focussed more on media sharing were adopting more and more components from SNSs. They even named YouTube as an example, since it turned into a video sharing platform where users can also connect by following each other's channels (Boyd & Ellison 2007, 216). Despite the integration of UGC and media sharing within SNSs, Boyd and Ellison did not yet consider naming these platforms social media.

A few years later this social media term became more common with the further development of 'Web 2.0.' This is a term that describes a combination of software developments and techniques that created the ability to add animation, interactivity, and audio/video streams to web pages; the ability to publish frequently updated content in a standardized format; and the possibility to update web content without changing the display and behaviour of the whole page. Kaplan and Haenlein (2010, 60-61) argue that these Web 2.0 developments caused the popularity of content creation and the subsequent interactions between the users creating it. UGC and its corresponding interactions therefore shifted the vocabulary more towards the term 'social media,' which they define as "a group of Internet-based applications that build on the ideological and technological foundations of Web 2.0, and that allow the creation and exchange of User-Generated Content" (Kaplan and Haenlein 2010, 61). However, by doing so they also include Wikipedia in the social media spectrum (Kaplan and Haenlein 2010, 61), a platform that does not facilitate building an online social network and its interactions.

The difference in term usage by the authors can be explained by the time and subject of writing. Boyd and Ellison (2007) wrote about the history of the phenomenon, when the incorporation of media elements was at an early stage. Kaplan and Haenlein (2010) focussed in their article on the challenges and opportunities of the media element on social media for companies. Due to this focus on the media element, it makes more sense to name the phenomenon social media. Obar et al. (2012, 8) also noticed that both definitions lack the component the other is focussing on, without presenting an alternative themselves. Obar later argues that the one thing that generally characterizes social media platforms is the need for UGC, whereby users are connected with each other through their interactions (Obar & Wildman 2015, 745-747). This incorporates both the element of building a social network and the sharing of UGC, but it is still not a definition. Prior to Obar, Hogan and Quan-Haase (2010) had already given up on presenting a definition. They argue that all sorts of media already have a social element in them. They emphasized that social media should be discussed by its primary affordance: "two-way interaction with an audience, beyond any specific recipient" (Hogan and Quan-Haase 2010, 310). The lack of a specific recipient applies to both the network and media functions of SNSs, since UGC is created for all of a user's connections.

A working definition that incorporates both the media and network characteristics will therefore be ideal. DeNardis and Hackl provide one by defining social media as "platforms that provide three specific technological affordances: 1. the intermediation of user-generated content; 2. the possibility of interactivity among users and direct engagement with content; and 3. the ability for an individual to articulate network connections with other users" (DeNardis & Hackl 2015, 762). Since the mentioned authors agree that most platforms have incorporated network and media elements, this research acknowledges this definition and sees Social Network Sites and Social Media as equivalent concepts, on the reservation that both the network and media elements are present on the platforms. Both elements are also present in the online strategy of jihadi organisations like ISIS and Al-Qaeda. As Section 1.1 explained, they use media elements to recruit people for their jihadi network. Since the final goal of this strategy is to gain more support and sympathizers through networking, this thesis will use the concept of Social Network Sites. But to what extent do Facebook and YouTube meet the specific elements of the SNS definition of DeNardis and Hackl?

2.2 Facebook and YouTube as social network sites

Facebook started as a private network for Harvard students to support each other in getting their degree. At that time, in 2004, Facebook's focus was mainly on the creation of user profiles. Due to the success at Harvard, the SNS expanded to other schools and created the ability for outside developers to build applications with which users could perform other tasks and personalize their profiles further, for example by adding their favourite movies to their profile (Boyd & Ellison 2007, 214-218). In 2006, Facebook decided to open its platform to the entire world, and dropped the school-only format.1 By doing so it started focussing more on the intermediation of UGC by introducing the 'status update' and 'news feed,' which enable users to interact with each other through their status updates. This created the opportunity to engage directly with each other's content, by which Facebook made it easy to maintain social relationships between 'friends,' and lowered the barrier for 'friends of friends' to get connected with each other as well (Ellison et al. 2014, 856-859). Facebook's algorithms are responsible for intermediating in the stack of UGC, and filter out what is relevant to a user's personal interests (Mosseri 2016). Due to these developments, Facebook currently includes all three characteristics defined by DeNardis and Hackl (2015), and may therefore rightly be called a SNS or social media platform. Where Facebook started purely as a social network site, the focus on UGC and the news feed resulted in a proper balance between the network and media characteristics. As Facebook's user numbers grew, so did the integration of many more features. These include Video Calling, direct messaging via Facebook Messenger, Livestreaming and a Marketplace.2 However, the integration of these features does not harm the platform's classification as an SNS, since they only facilitate the spread of UGC and the ability to network on the platform.

1 Facebook. Our History; September 26, 2006. Retrieved on 28.03.2018 from:
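To make the notion of algorithmic intermediation in a news feed more concrete, the fragment below gives a minimal, purely illustrative sketch of interest-based ranking. It is not Facebook's actual News Feed algorithm, and all names and scoring weights in it are hypothetical; it only shows the general idea that an algorithm, rather than the user or a chronological order, decides which UGC surfaces first.

```python
# Illustrative sketch only: NOT Facebook's News Feed algorithm, just a toy
# model of the idea that an algorithm intermediates the stack of UGC and
# surfaces what it deems relevant to a specific user.
from dataclasses import dataclass


@dataclass
class Post:
    author: str
    topics: set   # e.g. {"football", "politics"}
    likes: int


def rank_feed(posts, user_interests, user_friends):
    """Order posts by a naive relevance score instead of chronologically."""
    def score(post):
        topic_overlap = len(post.topics & user_interests)   # interest match
        social_bonus = 2.0 if post.author in user_friends else 0.0
        popularity = post.likes / 100.0                      # mild popularity boost
        return topic_overlap + social_bonus + popularity
    return sorted(posts, key=score, reverse=True)


feed = rank_feed(
    [Post("alice", {"football"}, 40), Post("bob", {"cooking"}, 300)],
    user_interests={"football"},
    user_friends={"alice"},
)
print([p.author for p in feed])   # ['alice', 'bob']
```

The point of the toy example is merely that whichever weights such a system uses, the platform, not the user, decides what becomes visible.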

YouTube was founded with a focus on sharing videos, which has remained the platform's primary function since the Google takeover in 2006. However, it did implement many social network features around the video sharing function, which blurred the border towards becoming an SNS itself (Boyd & Ellison 2007, 216; Wattenhofer, Wattenhofer & Zhu 2012). YouTube has always been based on user-generated video content, but the incorporation of a social experience is seen as the ingredient of its success. It created the opportunity for user-content-user interactions by allowing comments on videos, through which users can directly engage with the content and its creator. This is in addition to the ability to leave comments on a specific user channel, which is a user's personal page to which others can connect by 'subscribing' to it (Wattenhofer, Wattenhofer & Zhu 2012). This is how users can set up networks via YouTube. Content intermediation is done by algorithms that present recommended videos to users. These recommendations are based on what YouTube's algorithms think is in the user's interest, mainly based on previously watched videos (Covington, Adams & Sargin 2016). Although translating a user's YouTube network into an offline one is harder due to the lack of private messaging, users still have the ability to connect to one another via comments on each other's content and channels. YouTube started with a focus on UGC intermediation, but became successful by incorporating network characteristics through enabling interactivity on UGC and the channels creating it. Because of this, YouTube also meets the three characteristics that have been set up by DeNardis and Hackl (2015). A detailed explanation of the case selection of Facebook and YouTube will be presented in Section 3.4.
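The recommendation mechanism referenced above can likewise be illustrated with a deliberately simplified sketch. The system described by Covington, Adams and Sargin (2016) is a large-scale neural network; the toy function below, with hypothetical video identifiers and tags, only demonstrates the underlying feedback idea that previously watched content steers what is recommended next, the same mechanism Section 5.1 returns to when discussing how algorithms can strengthen radicalisation.

```python
# Toy sketch of watch-history-based recommendation; not YouTube's actual
# system, only an illustration of how earlier viewing steers later suggestions.
from collections import Counter


def recommend(watch_history, candidates, top_n=3):
    """Score unseen candidate videos by tag overlap with the watch history."""
    # Build a simple interest profile from the tags of previously watched videos.
    interest_profile = Counter(tag for tags in watch_history.values() for tag in tags)

    def score(video_id):
        return sum(interest_profile[tag] for tag in candidates[video_id])

    unseen = [v for v in candidates if v not in watch_history]
    return sorted(unseen, key=score, reverse=True)[:top_n]


history = {"v1": {"news", "conflict"}, "v2": {"conflict", "speeches"}}
pool = {"v3": {"conflict", "speeches"}, "v4": {"cats"}, "v5": {"news"}}
print(recommend(history, pool))   # ['v3', 'v5', 'v4'] – similar content keeps ranking first
```

Even in this crude form, the loop is visible: the more a user watches of a certain type of content, the more of that type is recommended, which is precisely why such intermediation matters for the jihadi content discussed in this thesis.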

2.3 The privatization of freedom of expression

Long before Facebook, YouTube or the phenomenon of an SNS existed, the Internet on which they run was a military experiment run by the U.S. government. After it was transformed into a civilian network, the U.S. started exporting the Internet in the mid-1990s (Naughton 2016). Over the years it developed into a worldwide web with a highly fragmented infrastructure across the public and private sectors. This fragmentation makes Internet governance challenging, and it is often seen as a multistakeholder governance model (DeNardis and Raymond 2013, 2-3). The UN therefore felt it necessary to develop a working definition of Internet governance, which indicates the complexity and resulted in the following: "Internet governance is the development and application by governments, the private sector and civil society, in their respective roles, of shared principles, norms, rules, decision-making procedures, and programmes that shape the evolution and use of the Internet" (de Bossey 2005, 4).

The rise of SNSs and their UGC is such an evolution of the Internet that had to be governed, since there is no central authority that globally decides what content is allowed and what should be removed from these platforms. SNSs must still obey the law in the countries they are operating in, because "what is illegal offline is also illegal online" (EC 2018a), but they can determine their own boundaries within these laws and the 'grey area' between legal and illegal content. Governments can usually not remove information from SNSs, unless the sites are hosted on government servers, which causes content governance to be mostly executed by the SNSs themselves (DeNardis 2014, 213). Napoli (2015, 753) also argues that the decisions made by the SNSs themselves have become leading in content governance, because governmental regulatory jurisdictions are harder to define and enforce on the Internet. This has caused a lack of clear applicability and relevance of these laws online. Lievens, Valcke and Valgaeren (2011) already concluded at an early stage that proper content governance on SNSs requires a shift in regulation from top-down and command-and-control to a certain self- and co-regulation that involves different actors. They claim that imposed legislation is often obsolete in comparison to the high-speed development of SNSs. Self- and co-regulation therefore allow legal instruments to adapt better to this development speed.

However, the self-regulation of content removal by the SNSs brings some other issues to the table. These platforms now have to walk a fine line between pushing back against governmental removal requests that are considered unreasonable, and removing content that is highly undesirable to most of their users. By doing so, the SNSs take on an arbitrator role when it comes to judging the admissibility of content, and are practically carrying out the law enforcement function online. This phenomenon is known as 'delegated censorship,' because governments handed their law enforcement function regarding online content over to the SNSs by letting them settle it largely themselves. Through this delegated censorship, governments granted these private sector companies a lot of Internet governance power (DeNardis 2014, 213-219). DeNardis (2014) argues that their law enforcement function is increasing: transparency reports show a growing number of governmental removal requests, and this upward spiral has not yet been broken. Prior to 2016, YouTube [which is featured in Google's transparency report, since it is part of that company] received fewer than 2,500 worldwide governmental content removal requests every six months. However, the second half of 2016 and the first half of 2017 show an increase to 11,000 and 13,000 governmental content removal requests towards YouTube.3 More recent numbers have not been published yet, but Facebook noticed a similar increase in content removal requests since 2015 (Sonderby 2015).

Despite the increasing amount of requests, SNSs can decide not to comply with a content removal request, for example when the request is considered unreasonable and not in violation of the law (DeNardis 2014, 213-219). SNSs have the ability to decide for themselves. This is illustrated by a recent case in which a Russian court requested YouTube and Instagram to remove videos and photos posted by the Russian opposition figurehead Alexei Navalny. Both platforms were threatened with the blocking of access to their services in Russia if they decided not to comply. The content in question was about a Kremlin-linked oligarch being accused of bribing a governmental representative on his luxury yacht (Bennetts 2018). Instagram eventually decided to remove the requested content, to the fury of Navalny himself, who responded by saying that "Instagram decided to comply with Russian illegal censorship requests" [Tweet].4 On the other hand, YouTube decided not to remove the video,5 and the Russian government did not live up to its access-blocking threats (Troianovski 2018). This illustrates the law enforcement function and corresponding content governance power of each individual SNS, whereby they are the ones that decide what acceptable speech on their platform is. Due to the huge popularity of the SNSs, which makes them the place where communication nowadays takes place, these decisions are increasingly taken over by society in general as well. DeNardis (2014, 12-13) therefore claims that a transition in content governance from the public to the private sector has been ensured. She later calls these developments the 'privatization of freedom of expression' (DeNardis & Hackl 2015, 766-769). When we put these arguments into one definition, the privatization of freedom of expression is: the switch in online content governance from the public to the private sector, whereby Social Network Sites have taken over the governmental law enforcement function online, and thereby decide what acceptable content and acceptable speech on the Internet is (DeNardis 2014; DeNardis & Hackl 2015; Napoli 2015).

3 Google. Transparency Report: Government requests to remove content. Retrieved on 29/04/2018 from: https://transparencyreport.google.com/government-removals/overview?removal_requests=group_by:products&lu=removal_requests
4 Retrieved from: https://twitter.com/navalny/status/964107913831899137
5 It can be found at: https://www.youtube.com/watch?v=RQZr2NgKPiU

2.4 How to influence this privatization

This privatization of freedom of expression is strengthened by a high degree of liability protection of the SNSs against illegal UGC on their platforms. SNSs are, often undesirably, facilitating the spread of negative content like bullying, gossip, rioting and terrorist propaganda, but are protected against prosecution for this spread by U.S. and EU law (Balkin 2008). The development of this law had two seemingly contradictory goals. Firstly, to protect SNSs that do not intervene in their users' content, to promote freedom of expression on their platform. Secondly, to protect the SNSs that do intervene in their users' content against being identified as publishers. When platforms intervene in their content, they will usually be seen as publishers by law. Publishers are liable for the content on their platform, but mediators of UGC are protected against this liability. This law protects the SNSs that remove content that is not clearly illegal against being classified as publishers instead of mediators, to prevent them from becoming liable for their UGC (Mueller 2015, 805). The law thereby prevents SNSs from ending up in an endless amount of lawsuits over a third party's content (Balkin 2008, 107-108). In the EU, article 14 of the directive on electronic commerce states that:

“Member States shall ensure that the service provider is not liable for the information stored at the request of a recipient of the service, on condition that:

(a) The provider does not have actual knowledge of illegal activity or information and, as regards claims for damages, is not aware of facts or circumstances from which the illegal activity or information is apparent; or

(b) The provider, upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the information” (Directive 2000/31/EC).

Although the law dates from 2000, when Facebook and YouTube did not yet exist, it does apply to them nowadays, since SNSs store information at the request of the users of these platforms, namely their UGC. Due to this, SNSs cannot be held liable for illegal content, as long as they are not aware of its existence on their platforms or of the circumstances that make certain content illegal, and as long as a court order that obligates them to remove content identified as illegal is followed.

However, practice shows that SNSs are actually being held publicly responsible for the content of their users. When governments notice unacceptable speech on a certain platform, they demand removal and often also make an effort to get the ability to track and monitor the users posting and sharing it. Mueller claims that by doing so, these governments operate under the assumption that if they had had those capabilities beforehand, they would have prevented the bad things from happening, meaning the spread of illegal content (Mueller 2015, 806-807). He argues that governments and public opinion hold the delusion that the human activities we see online can also be controlled by online means. Mueller calls the problem of this delusion 'the fallacy of displaced control,' whereby governments and public opinion focus more on the pursuit of controlling the tool used to express something than on punishing the users responsible for their unacceptable behaviour.6 Due to this, SNSs are increasingly pressured by governments and public opinion to execute Internet governance control (Mueller 2015, 806-807). This is illustrated, for example, by the EC's statement from September 2017, in which it argues that "online platforms should […] adopt effective proactive measures to detect and remove illegal content online and not only limit themselves to reacting to notices which they receive…" (EC 2017d, 10). The EC thereby focuses on the jihadi propaganda tool, which is an SNS hosting 'illegal' content on its platform, instead of on punishing the users posting the content. Whether punishing the ISIS sympathizers spreading their extremist propaganda is a feasible alternative to demanding removal from SNSs is questionable, but this is also beyond the scope of this thesis.

The extent to which an SNS grants desires of content removal depends on the balance of powers and the coincidence of these desires from different involved actors. These actors can be governments, civil society, advertisers and investors, site owners, site users, and others that do not fit one of these categories. The pressure tools these actors have to enforce their fallacy of displaced control are: legal requirements, economic incentives and incentives of social mores; the customs, conventions, standards, and values of the society in which SNSs operate (Marlin-Bennett and Thornton 2012, 494). The legal requirements within a country often form the framework of a company's 'terms of service' within that country. These terms of service are basically the private contractual agreements with the platform's users, which they must obey to be allowed to use the platform (DeNardis 2014, 219). They can be supplemented with local or general economic incentives and incentives of social mores as well, but because each country has different laws, the content on SNSs also differs per country.

6 Whether punishing the ISIS sympathizers spreading their extremist propaganda is a feasible alternative to demanding removal from SNSs is a question that is not going to be discussed here, since it is beyond the scope of this thesis.

Economic incentives usually come from advertisers and investors, who are hugely important to the revenue of SNSs. Intervening in content by creating rules of behaviour could make the platform more attractive for [potential] advertisements and investments, which is an economic incentive to do so. For example, some advertisers have complained to Google and YouTube that their advertisements appeared next to hate speech videos that were promoting extremist views. This obviously does not contribute to a positive image for these companies, which is generally the purpose of advertising. They therefore pressured Google and YouTube to remove such hate speech videos if they wanted to keep them on board as advertisers (Solon 2017; Hughes 2017). Removing videos with hate speech can also be placed in a broader perspective due to incentives of social mores. For example, half of the U.S. states have not included cyberbullying in their broader bullying laws, which means it is not necessarily an illegal act (Hinduja & Patchin 2016). Despite this, Facebook7 and YouTube8 incorporated a ban on cyberbullying in their terms of service across the whole USA, since it is socially considered unacceptable behaviour.9

In addition, the incentives of social mores or economic incentives can be backed up by legal requirements. Going back to the statement regarding proactive measures: when this is turned into legislation, Directive 2000/31/EC on the liability exemption will be changed dramatically, and SNSs will have to implement proactive measures instead of just waiting for a legal order to remove illegal content. Obviously, such legal requirements can only come from legislative powers. Incentives of social mores can originate from all possible involved actors mentioned above, but generally require cooperation between multiple actors to be honoured. Economic incentives are primarily fuelled by advertisers and investors, while external legal requirements come from governments or intergovernmental organizations. Collective indignation and a combination of incentives and actors will always put more pressure on SNSs to start removing certain content (Marlin-Bennett & Thornton 2012). The degree to which the SNSs obey these external requests influences the privatization of freedom of expression these platforms have obtained. When governmental or civil society requests are granted regularly, the privatization of freedom of expression will weaken. However, researching all the mentioned actors and incentives is not feasible for this thesis. The upcoming Chapter 3 will therefore elaborate on why this thesis will only focus on the pressure tools that have been used by the EU.

7 Facebook. Community Standards. Retrieved on 28.03.2018 from: https://www.facebook.com/communitystandards
8 Google. Policies, reporting, and enforcement: YouTube policies. Retrieved on 28.03.2018 from: https://support.google.com/youtube/answer/2802268?hl=en
9 Whether the ban on cyberbullying is actually being enforced by the SNSs is another discussion beyond the scope of this thesis.


3.0 Methodology

3.1 Feasibility of actors and incentives

Although content removal can be pressured by multiple actors and through external legal requirements, economic incentives and/or incentives of social mores, not all actors have to be transparent about their communications with SNSs regarding content removal requests. For example, when advertisers were upset about having their commercials played in the middle of an ISIS promotion video on YouTube, they eventually reached out to the media to increase the pressure on the SNSs. But there is no insight into any communication that took place between advertisers and YouTube before the problem became public (Solon 2017; Hughes 2017). Getting access to the scope and true motives behind both the withdrawal of existing advertisers and the deterrence of potential advertisers requires a serious amount of inside information that has not been recorded in publicly available documents. Due to the limited scope and available time for this research, examining the economic incentives regarding the development of jihadi content combat policies and techniques will not be feasible.

Part of this reasoning also applies to incentives of social mores, which can be stimulated by all actors. Measuring the removal demands posed by civil society, SNS users, and site owners entails similar transparency issues that harm the feasibility. But there are more difficulties. Going back to the UN-developed definition of Internet governance, civil society is, alongside governments and the private sector, one of the three stakeholders that shape the evolution of the Internet (de Bossey 2005, 4). However, in practice civil society suffers from inadequate participation in the Internet governance process. It is known for having a lack of consensus on Internet governance issues, which creates difficulties in representing the wide range of opinions. The problems with coordination and representation cause civil society to suffer from legitimacy challenges in the decision-making process regarding Internet governance issues (Carr 2015, 656-659). In practice this means the other stakeholders can be selective in incorporating the interests and desires of civil society in policy decisions, while not being contested or held accountable by civil society in any significant way (Franklin 2013, 157). The same applies to the SNSs' users, who also suffer from a lack of consensus and representation. In addition, site owners are often hard to reach and tend to discuss content removal issues behind closed doors. This means that researching the incentives of social mores from civil society, site owners, site users, advertisers and investors regarding jihadi content combat also entails feasibility problems.

Therefore, this research will solely focus on the legal requirements and incentives of social mores that have been used by the EU. In contrast with other actors, EU institutions like the EP and EC have transparency as a core value. This means that the desired criteria regarding jihadi content combat on SNSs have been documented and are publicly available. Therefore these desired criteria and the corresponding pressure tools that have been used to enforce them are feasible to research. The EU can pressure the SNSs to incorporate the EU's desired criteria in their jihadi content combat policies and techniques through incentives of social mores or by imposing legal requirements. Besides the EU, individual governments and other intergovernmental institutions, like the UN or NATO, can use similar strategies. However, due to limitations in the scope of the research, this thesis focusses solely on those from the EU. Because the EU includes elements of both an intergovernmental and a supranational institution, it should be representative of the desired jihadi content combat criteria of its Member States. Content removal requests from individual governments will always be there, but this thesis is focussed on the desired jihadi content combat criteria in general. Germany is an exception in this case, since it actually has a law in force that obligates SNSs to remove illegal content within 24 hours after notification, while no such law has been implemented in the EU in general (Oltermann 2018). For the rest, however, the EU will likely be the leading body pressuring SNSs towards jihadi content combat.

3.2 Operationalization of the privatization of freedom of expression

This EU focus obviously has consequences for the operationalization of the influence on the privatization of freedom of expression. As mentioned, the EU's desired jihadi content combat criteria can either be backed up by legal requirements or derive solely from incentives of social mores. This means the privatization of freedom of expression can be weakened in two ways: by having Facebook and YouTube obey the EU's desired criteria that have been backed up by legal requirements, or obey the desired criteria that derive solely from incentives of social mores. On the other hand, Facebook and YouTube can also decide to push back and ignore the EU's desired jihadi content combat criteria when these are considered unreasonable. By doing so, both companies make sure the privatization of freedom of expression is maintained, since they remain the ones determining the boundaries of freedom of expression. Maintaining the privatization of freedom of expression is done by ignoring the EU's desired criteria that derive from incentives of social mores, or those that have been backed up by legal requirements. The latter will most likely draw them into a lawsuit, although the described case in Russia has proven otherwise. In addition to obeying or ignoring the EU's desired criteria, there is a third option that maintains the privatization of freedom of expression. Going back to the definition, the privatization of freedom of expression includes that the SNSs are the ones that largely decide what acceptable content is. This means that interventions in content whose removal has not been demanded, content that is for example situated in the 'grey area' of illegality, also add to the SNSs' law enforcement function and thereby strengthen the privatization of freedom of expression. Table 1 provides an overview of the indicators on whether the privatization of freedom of expression has been weakened or maintained.

Table 1: Indicators on how to influence the privatization of freedom of expression

Theory: Privatization of freedom of expression (DeNardis & Hackl 2015)

Definition: The switch in online content governance from the public to the private sector, whereby Social Network Sites have taken over the governmental law enforcement function online, and thereby decide what acceptable content and acceptable speech on the Internet is (DeNardis 2014; DeNardis & Hackl 2015; Napoli 2015).

Indicators on how the privatization of freedom of expression can be influenced:

Weakening the privatization of freedom of expression through:
o Obeying the EU's desired criteria that have been backed up by legal requirements.
o Obeying the EU's desired criteria that are derived from incentives of social mores.

Maintaining the privatization of freedom of expression by:
o Ignoring the EU's desired criteria that have been backed up by legal requirements.
o Ignoring the EU's desired criteria that are derived from incentives of social mores.
o Intervening in content in ways that have not necessarily been desired by the EU.
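Purely as an illustration of the decision logic captured in Table 1, the hypothetical sketch below makes the mapping from an EU criterion and an SNS response onto the two possible outcomes explicit. The names in it are illustrative constructions rather than part of the operationalization itself; in the thesis the indicators are applied qualitatively in Chapter 6, not through code.

```python
# Hypothetical formalization of the Table 1 indicators, written only to make
# the decision logic explicit; it is not part of the research method itself.
from enum import Enum


class CriterionBasis(Enum):
    LEGAL_REQUIREMENT = "backed up by legal requirements"
    SOCIAL_MORES = "derived from incentives of social mores"


class PlatformResponse(Enum):
    OBEY = "obeys the EU's desired criterion"
    IGNORE = "ignores the EU's desired criterion"
    BEYOND = "intervenes in content not desired by the EU"


def effect_on_privatization(basis, response):
    """Map an (EU criterion, SNS response) pair onto the two outcomes of Table 1."""
    if response is PlatformResponse.OBEY:
        return f"weakens the privatization of freedom of expression (criterion {basis.value})"
    # Ignoring a criterion, or going beyond what the EU asked for, keeps the
    # decision power with the SNS itself.
    return f"maintains the privatization of freedom of expression (criterion {basis.value})"


print(effect_on_privatization(CriterionBasis.LEGAL_REQUIREMENT, PlatformResponse.OBEY))
print(effect_on_privatization(CriterionBasis.SOCIAL_MORES, PlatformResponse.BEYOND))
```

The sketch also makes visible why the basis of a criterion matters for the analysis even though both forms of obedience weaken the privatization: it distinguishes the sub-indicators under which an observed policy or technique is later classified.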

3.3 Research method

One of the research goals formulated in Sections 1.3 and 1.4 is to add knowledge to the existing theory on the privatization of freedom of expression. Since this theory forms the foundation of this thesis, after which the influence on it through specific cases will be examined, this constitutes deductive research. Other goals are to gain insight into the EU's desired jihadi content combat criteria, and into the jihadi content combat policies and techniques developed and implemented by Facebook and YouTube. According to Yin (2003), gaining in-depth understanding of an under-researched phenomenon can best be done through a qualitative case study. Yin also argues that case studies suit topics that lack extensive research. The existence of a knowledge gap on the jihadi content combat policies and techniques has already been identified in Section 1.3. Therefore, the most suitable way to gain this insight is through qualitative content analysis of both the EU's desired criteria and the policies and techniques of the two SNSs.

In addition, Baxter and Jack (2008, 544-545) argue that content analysis of a wide variety of documents is best suited to obtaining contextual insight and understanding a complex and underexposed phenomenon. The context is important in two ways. Firstly, to gain understanding of how the EU's desired jihadi content combat criteria came into being. Chapter 4 will explicate how the EU increased its pressure on SNSs to actively combat jihadi content on their platforms, and in what context. Secondly, SNSs are known for innovating their techniques very rapidly with little transparency and few pre-announcements, which complicates social media studies in general (Ellison & Boyd 2013, 17). Therefore the desired jihadi content combat criteria from the EU will provide the context in which the development and implementation of the jihadi content combat policies and techniques of Facebook and YouTube can be understood. Since both the EU's desired criteria and the jihadi content combat policies and techniques of Facebook and YouTube are approached as context-dependent and under-researched phenomena that lack insight, qualitative content analysis will be the preferred research method of this thesis.

3.4 Case selection

This qualitative content analysis will be conducted on the cases of Facebook and YouTube. As previously indicated, the numbers of monthly active users of both Facebook [2.1 billion] and YouTube [1.5 billion] are enormous, as is the gap to the number of users on other SNSs, as displayed in Figure 1. Because Facebook and YouTube are leaders in the SNS market, they have the greatest resources to invest in expertise and the development of new policies and techniques themselves (Facebook 2017b). In addition, YouTube's development potential is supported by Google, because YouTube is part of the Google branch of parent company Alphabet Inc. Since the giant search engine faces similar issues with online jihadi content, they cooperated from the beginning in the development of jihadi content combat policies and techniques (Walker 2017). Because of their resources, Facebook and YouTube are leading the development of these policies and techniques for the SNS market, and share information with other categories of information intermediaries as well (Facebook 2017b; Walker 2017; Solon 2016). This has already resulted in the replication of Facebook's and YouTube's jihadi content combat techniques by other platforms.

This leading position is confirmed by a quick scan of the two other large SNSs. Instagram is the third-largest platform with 800 million monthly users (Kallas 2018), but has been owned by Facebook since 2012. Its policies and techniques regarding jihadi content combat are therefore in line with those of Facebook (Facebook 2017b), which undermines the added value of studying Instagram as a separate case. The gap towards Twitter, the fourth-largest platform with around 300 million users, is huge (Kallas 2018). Twitter has been of significant importance to ISIS and other jihadi organizations in spreading their propaganda globally, in the psychological warfare against their local enemies, and in driving traffic to other SNSs like Facebook and YouTube (Klausen 2015, 17-18). But despite Twitter's importance to the jihadists, the platform is less interesting for this research. Twitter uses the same content removal method as Facebook and YouTube [which will be explained in Section 5.3], and further relies predominantly on its own human reviewing teams (Twitter Inc. 2016a; Twitter Inc. 2016b). Twitter does not use any additional techniques to combat jihadi content on its platform, and the platform is much smaller than Facebook and YouTube, which results in less impact on the privatization of freedom of expression. Therefore, taking Twitter into account as a separate case would add little value to this research. Facebook and YouTube are the ones at the forefront of the development of jihadi content combat policies and techniques, and have the greatest impact on society due to their number of users. Because other large SNSs [and other Internet companies, like search engine Bing (Walker 2017b)] are adopting the jihadi content combat techniques of Facebook and YouTube, the results of studying these two cases will to a large extent be generalizable to the whole SNS market. Therefore, this research will focus on the jihadi content combat policies and techniques of both Facebook and YouTube.

3.5 Data collection

To be able to examine both the desired jihadi content combat criteria from the EU and the jihadi content combat policies and techniques from Facebook and YouTube, data will solely be gathered from publicly available documents and statements. Within these documents, the terms 'terrorist content' and 'extremist content' are used interchangeably. Therefore it is important to clarify that in this thesis the terms 'jihadi content,' 'terrorist content' and 'extremist content' allude to the same type of content. For the EU's desired criteria, the sources will be press releases, communications between different EU bodies [European Commission, European Parliament and European Council], (proposed) legislation from the EP, and statements from individual commissioners or the EC in general. Some of the examined documents address tackling online hate speech in general, instead of being specifically aimed at jihadi or terrorist content [as the EU generally names it], because in those documents the authors either dedicated a specific section to combatting terrorist content online or combined the two issues in the formulation of their desired criteria, for example in EC 2015, EC 2016g, EC 2017c, EC 2017d, and EC 2018a. The publicly available sources that have been examined for desired jihadi content combat criteria were released between January 2014 and September 2017. September 2017 marks a change in the EU's attitude, with a proposal to weaken the liability exemption that SNSs enjoy, as explained in Section 2.4. In its statement regarding this, the EC claimed it will "carefully monitor progress made by the online platforms over the next months and assess whether additional measures are needed" (EC 2017d, 20). If additional legislative measures were considered necessary, the EC stated that these would be created by May 2018. However, at the time of writing this thesis, no indication of such legislation has been found. Therefore, the statement from September 2017 is chosen as the end point for this thesis, which examines whether Facebook and YouTube met the EU's previously desired criteria. Chapter 4 will elaborate further on the timeframe in which the EU's desired jihadi content combat criteria have been formulated.

Eventually, the EU’s desired jihadi content combat criteria that can been found in this timeframe will be categorized. These different categories will further serve as guidelines in the search to what jihadi content combat policies and techniques Facebook and YouTube have developed and implemented. By doing so, the analysis of these policies and techniques will be more focussed towards answering the first part of the main research question: to what extent do the jihadi content combat policies and techniques from Facebook and YouTube meet the desired criteria from the European Union? Jihadi content combat policies and techniques contain policies and techniques that have been developed, transformed, extended or adjusted to tackle the specific problem of jihadi content on the platforms. Criteria for the researched documents will therefore be that these include policies and techniques that are aimed at combatting jihadi content. Again, data has also been gained from documents on tackling online hate speech in general, since some policies and techniques that have originally been designed to tackle hate speech are extended or remodelled to tackle the specific jihadi content problems. Chapter 5 will elaborate on this in detail. Table 2 provides the detailed case study protocol that will be followed to find an answer to the question: to what extent do the jihadi content combat policies and

techniques from Facebook and YouTube meet the desired criteria from the European Union, and thereby influence the privatization of freedom of expression?



Table 2: Case Study Protocol

Step 1  Close read statements, legislative documents and policy documents from the EC, EP and Council on combatting jihadi content online, and elaborate on the context in which these have been drafted.

Step 2  Categorize the found desired jihadi content combat criteria in an overview table and assign labels to the identified categories.

Step 3  Use these categories as guidelines in order to close read statements and policy documents from [companies affiliated with] Facebook and YouTube on their developed and implemented jihadi content combat policies and techniques.

Step 4  Categorize the jihadi content combat policies and techniques from Facebook and YouTube, and elaborate on the ones that correspond with the EU's desired criteria.

Step 5  Analyse whether each individual desired jihadi content combat criterion from the EU has been met by the jihadi content combat policies and techniques from Facebook and YouTube separately.

Step 6  Analyse the extent to which the privatization of freedom of expression has been influenced by the degree to which the desired jihadi content combat criteria have been met by the jihadi content combat policies and techniques from Facebook and YouTube.
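Purely as a schematic illustration of how the matching in Steps 2 to 5 could be organised, the sketch below [in Python] tabulates which criterion categories are covered by which policies per platform. All category labels, criteria and policy names in it are hypothetical placeholders, not findings of this thesis; the actual analysis in the following chapters is carried out qualitatively on the documents themselves.

# Illustrative sketch only: the category labels, criteria and policy names below are
# hypothetical placeholders, not findings of this thesis.

# Step 2: desired criteria grouped under assigned category labels.
eu_criteria = {
    "Category A": ["example desired criterion 1", "example desired criterion 2"],
    "Category B": ["example desired criterion 3"],
}

# Step 4: policies and techniques identified per platform, tagged with the same labels.
platform_policies = {
    "Facebook": {"Category A": ["example technique X"], "Category B": []},
    "YouTube": {"Category A": ["example technique X"], "Category B": ["example technique Y"]},
}

# Step 5: per platform, check which criterion categories are met by at least one
# corresponding policy or technique.
def coverage(platform):
    policies = platform_policies.get(platform, {})
    return {category: bool(policies.get(category)) for category in eu_criteria}

for platform in platform_policies:
    print(platform, coverage(platform))

Representing the protocol this way makes explicit that a criterion category only counts as met for a platform when at least one corresponding policy or technique has been identified for it.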

Because SNSs have a reputation for implementing innovations in their techniques with little transparency (Ellison & Boyd 2013, 17), solely conducting content analysis of their jihadi content combat policies and techniques through publicly available statements and documents carries a risk. It is possible that these SNSs present only half the truth in these statements and leave things out. Conducting interviews with representatives from Facebook and YouTube is not likely to resolve this issue either, since journalistic research has already shown that employees working on jihadi content combat are hard to reach and pre-instructed (Hopkins 2017; Kreling, Modderkolk & Duin 2018; The Guardian 2017). Therefore, there is no other option than to rely on the publicly available statements and documents. This also applies to the desired jihadi content combat criteria from the EU. It could very well be possible that additional desired criteria have been made known to SNSs within the EU Internet Forum or other closed-door meetings. The EU Internet Forum documents are not complete, since many passages are rendered illegible (see e.g. EDRi 2015). Therefore the pressure and desires behind the scenes cannot be taken into account either. This shortcoming is also reflected in the lack of desired criteria from actors other than the EU, as already explained in Section 3.1.


4.0 The desired jihadi content combat criteria

4.1 The initial phase of the EU’s desired criteria

Jihadi terrorists using the Internet for recruiting, setting up networks among sympathizers, mobilization, psychological warfare, online indoctrination, and various operational purposes is not something new. Al Qaeda already did so during the U.S. invasion of Iraq in 2003 (Weimann & Von Knop 2008, 886-887), and the potential danger of it was emphasized again with the outbreak of the civil war in Syria in 2011. Although ISIS's predecessor was not yet a major power in that conflict in 2013 (John 2015), the EC already acknowledged the potential danger of online propaganda (Malmström 2013), since the number of European foreign fighters leaving to fight in the Syrian civil war was on the rise (Barrett 2014, 10-11). Because people outside the conflict zone were influenced by messaging from inside the conflict zone, the EC argued that threats regarding violent extremist activities were not solely coming from centralised and hierarchical organisations, but also increasingly from radicalized individuals, so-called 'lone wolves.' In particular, the usage of SNSs and the Internet in general by extremists is highlighted, combined with the acknowledgement that traditional law enforcement techniques are not sufficient to counter this threat (EC 2014).

Therefore, the EC argued that a "positive and carefully focused message needs to be spread […] to offer vulnerable Internet users an easily accessible alternative to terrorist propaganda" (EC 2014, 8-9). In addition, more cooperation with these SNSs on getting extremist propaganda removed is necessary. This cooperation with the private sector should create ways "to make it easier for the public to flag offensive or potentially illegal material" and "to make available easily accessible alternative messages that stimulate critical thinking" (EC 2014, 8-9). With this, the EC acknowledged the online jihadi propaganda threat and made the first official statement on desired criteria to combat it. Similar criteria had already been developed in 2012 by the Radicalisation Awareness Network [RAN] (RAN Prevent 2012, 9; RAN VVT 2012, 4). The RAN is an EC-launched network that connects people throughout Europe who are involved in radicalisation prevention, in order to facilitate discussion and the spread of best practices among the different actors (EC 2013a). However, the RAN-developed criteria cannot be seen as official EU desired jihadi content combat criteria, because the expressed views are purely those of the RAN working group and not an official position of the European Commission (RAN VVT 2012, 1). Therefore, January 2014 is the starting point in the search for the EU's official desired jihadi content combat criteria.
