Erasmus Mundus Master's in Journalism, Media and Globalization

The propaganda devices of the Internet Research Agency: revisiting the theories of 20th century

Master’s Thesis

Sabīne Bērziņa
Student number: 12846929
Date of completion: 25-05-2020
Supervisor: Tom Dobber

Abstract

This study examines how the propaganda devices, or techniques, first described between (and in the aftermath of) the two world wars have been used by Russia's Internet Research Agency on Twitter. The analysis focuses on tweets about Europe.

The results show that the view of Europe the Internet Research Agency was spreading was filled with fear appeals – a device that includes posting atrocity stories and references to threats – making it the most commonly used propaganda technique. Setting a climate of conflict also requires assigning who is to be seen as good and who as evil, so this study examined what roles – victim, villain, or savior – different actors were assigned in the tweets. The trolls most frequently sought to point out villains, a role given to Brexit, terrorists, and refugees. The savior role was used rarely, but when it was, Poland and various right-wing actors filled it, while Africa, France, and regular people were presented as victims.

This research contributes to bridging the gap between often forgotten propaganda theories and current affairs. It also adds to the knowledge about the content spread by the Internet Research Agency, as previous studies have mostly focused on tweets written in the context of the US presidential election.

Introduction

Due to several recent events of global significance – for example, the Russian invasion of Ukraine, the previous and upcoming US presidential elections, and the outbreak of the coronavirus – the word "propaganda" is making headlines again. At times it even becomes the central focus when these topics are discussed in the media. However, scholars warn that despite the interest, the subject is in many ways still not well understood. Research spans several disciplines; therefore, there is a lack of consistently used definitions, bodies of literature, and techniques to study it (Baines & O'Shaughnessy, 2014; Silverstein, 1987). Besides that, propaganda also keeps evolving as the media landscape changes (Lukito, 2020; Sparkes-Vian, 2019).

Although the first attempts to systemically study propaganda can be found in ancient Greece (Jowett & O'Donnell, 2018), there was a significant rise of interest in and attempts to study political propaganda between (and after) the two world wars (Severin & Tankard, 2001). Later, the concept was traded for a more extensive focus on honest and dishonest practices of persuasion in advertising and on attitude change in general (Jowett & O'Donnell, 2018). Now the topic is prominent again, but to what extent the theoretical ideas from the last century are still relevant is largely unknown. This thesis is an attempt to bridge these theories about how propaganda works with a recent attempt at a state-led political propaganda campaign on social media.

In 2017, the US Office of the Director of National Intelligence declared that a Russian organization called the Internet Research Agency (IRA) helped elect the US president, Donald Trump, by conducting information operations on social media (Background to "Assessing Russian Activities and Intentions in Recent US Elections": The Analytic Process and Cyber Incident Attribution, 2017). A year later, Twitter identified and released a database of IRA-created content (Twitter, 2018). It was made available to researchers and is used in this study as well.

Because of the availability of pre-identified social media content that was part of the campaign (Twitter, 2018), IRA propaganda has been studied in a variety of aspects. However, in relation to the scandals surrounding election interference in the US, academic research on the IRA has also primarily focused on the US (for example, Bastos & Farkas, 2019; Linvill, Boatwright, Grant, & Warren, 2019; Xia et al., 2019a).

The study makes no claim about the effectiveness of those messages. It is often doubted whether the Russian content could actually have had as much impact as is often perceived (Bail et al., 2020; Benkler, Farris, & Roberts, 2018; McCombie, Uhlmann, & Morrison, 2020). Yet studying it is not irrelevant. While the impact of IRA involvement in the US election is still being questioned, the scope and ambition of the campaign took many by surprise (Sanovich, 2017). In addition, there are other examples where the influence of Russian disinformation was more visible. For example, in the context of the war in Ukraine, the Kremlin first denied its participation in the invasion of Crimea, stating it was a civil war (Schrek, 2019). Although Russian president Vladimir Putin later openly talked about the involvement (Schrek, 2019), the initial denial managed to cause confusion (Galeotti, 2015).

This study is an attempt both to contribute to filling the gaps in knowledge of IRA operations, by studying the campaign on the basis of the previous century's theoretical discussion of propaganda tactics, and to expand knowledge of this particular campaign, by studying IRA tweets mentioning Europe.

To do so, a research question was posed:

RQ: What propaganda techniques are used by the IRA in tweets about Europe, and to what extent?

Theoretical framework

Propaganda: definitions and characteristics

Firstly, it is important to address what is meant by propaganda. The definition used in this thesis comes from Jowett and O'Donnell, who describe propaganda as "the deliberate, systematic attempt to shape perceptions, manipulate cognitions, and direct behavior to achieve a response that furthers the desired intent of the propagandist" (Jowett & O'Donnell, 2018, p. 7). This definition excludes unintentionally false communication and interpersonal or small-group communication (like parents consciously or unconsciously using untrue information to make their children behave a certain way).

Propagating something is an organized, purposeful, conscious action to influence the masses. Propagandists are people and organizations that promote their interests, at times at the expense of their audience; most importantly, the attempts at persuasion are not made with the well-being of the audience in mind, and the propagandist does not have to believe their own message (Jowett & O'Donnell, 2018). The propagandist also often does not want their identity to be known (Jowett & O'Donnell, 2018). A state campaign to spread messages using false identities fits the criteria. In relation to Russia, the contradictory nature of the content of some of those messages, which will be discussed in more detail below, suggests that there were no particular things the propagandists believed in and wanted to spread out of conviction.

In the context of this thesis, it is also important to note that this definition of propaganda, like several other definitions (Benkler et al., 2018; Doob, 1948), does not include characteristics or specific techniques that would allow one to recognize it and decide whether a piece of content is propaganda. However, the question of its characteristics is no less important than drawing clear lines between what propaganda is and is not. What a person or a group does when they aim to systemically manipulate is a central part of the research question of this thesis.

This is also a question that is very difficult to answer consistently, since there are many propaganda practices and many disciplines studying them (Silverstein, 1987; Sparkes-Vian, 2019). Given its societal relevance, an attempt to reduce the fragmentation of propaganda studies is still important – even more so because the broad use of the internet means that propaganda strategies must adapt and change (Lukito, 2020). The theories require updating and re-testing, too.

The circumstances described as welcoming to propaganda by Jacques Ellul in Propaganda: The Formation of Men's Attitudes, one of the classic texts on the topic, originally published in 1962, make such a study more relevant. He writes that propaganda can be most effective when a man is in a mass, yet alienated (Ellul, 1973). Ellul also theorizes that effective propaganda requires an environment from which the individual hardly ever emerges and in which he does not find additional reference points that might change his thinking. Although abstract, both ideas evoke phenomena tied to communication on social media, such as echo chambers that drive polarization and radicalization (Benkler et al., 2018; Wojcieszak, 2010).

Ellul wrote that, for the purpose of creating an environment one never emerges from, complete control of the media is necessary (Ellul, 1973). However, the times are changing and so are the rules. Although the traditional media in Russia are strictly regulated, social media are less so. To reach social media users, Russia adopted such propaganda tools as trolls and bots early on (more on this in the next section) (Sanovich, 2017).

The use of fake identities, of course, is not new in propaganda. During World War II, British and American officers called openly attributed propaganda (obviously coming from official sources) white propaganda, and propaganda whose source was concealed black propaganda (Doob, 1948). The social media environment opens many new possibilities for the wide use of black propaganda: everyone can easily say anything while pretending to be anyone. IRA tweets are always black propaganda – the trolls never reveal their real identities and intent. Some IRA trolls set up elaborate personas imitating people in the US, talking about their lives and pets in between political opinion posts (Xia et al., 2019b).

Some of the texts about propaganda from the last century used in this thesis were written with wartime propaganda in mind (for example, Doob, 1948; Lasswell, 1927). Researchers nonetheless use these texts to inform their studies and theories about peacetime political propaganda (Conway, Grabe, & Grieves, 2007; Severin & Tankard, 2001), but there has been less attention to studying and theorizing the differences. Ellul theorized that propaganda is preceded by pre-propaganda – attempts to modify opinions are preceded by less direct attempts to set the climate, create ambiguities without obvious aggression, and spread useful stereotypes (Ellul, 1973). That is something Russia has been accused of doing with its propaganda campaigns in the last couple of years (Benkler et al., 2018).

The Internet Research Agency (IRA)

Coordinated domestic Russian propaganda efforts, previously called "Internet brigades", were described in academic literature as early as 2010 (Deibert & Rohozinski, 2010), and even earlier, in 2003, in the media (Benkler et al., 2018). There are reports of military engagement in cyberspace during the Russian-Georgian war in 2008 (Deibert & Rohozinski, 2010) and of other initiatives, like massive blogging campaigns, to confuse, harass, and discredit certain people (Benkler et al., 2018). The campaigns are described as attempts to make it difficult to tell truth from lie rather than to promote a particular viewpoint (Averin, 2018; Benkler et al., 2018; Sanovich, 2017).

The New York Times interviewed Platon Mamatov, the head of a public relations company, who claimed to have run a workplace from 2008 to 2013 with about 20 to 30 people employed to generate such content (Chen, 2015). Such organizations are now frequently referred to as troll farms or troll factories (trolls are users who construct the identity of someone wishing to be part of a particular group, but whose real intentions are causing disruption and exacerbating conflict (Hardaker, 2010)). Mamatov stated that they fulfilled tasks for members of the Russian president's party (Chen, 2015). The same report mentions that in 2015 a large troll farm in St. Petersburg, the IRA, employed about 400 people tasked with targeting different communities, including people in the US.

In its indictment against the IRA for election interference, the US claims that the organization created social media accounts of fictitious US people, attempted to make them opinion leaders, and targeted several groups of people with different and contradictory interests and opinions. For example, some of the fictitious online groups were called Secured Borders, United Muslims of America, and Army of Jesus. Twitter, YouTube, Facebook, and Instagram were listed as platforms where the group operated (Department of Justice, 2018).

Because Twitter gathered those messages into a database and allowed everyone to download and use it, Twitter has become the most studied platform of IRA content. Although these studies also confirmed that the group sought to exacerbate existing conflicts and divides and played the roles of many different people (Bastos & Farkas, 2019; McCombie et al., 2020), it would be wrong to think that the group did not promote a particular candidate in the US presidential election. The posts on Twitter have been found to be more favorable to Donald Trump – trolls from both sides of the political spectrum were involved, but left-leaning trolls spent less time attacking Trump and supporting Clinton than right-wing trolls spent supporting Trump and attacking Clinton (Linvill et al., 2019). It has also been found that the most conservative individuals have been most exposed to such content (Hjorth & Adler-Nissen, 2019).

However, the US was not the only country targeted by the IRA in the recent campaign, as can be seen from the database (Twitter, 2018). Although the content of these tweets is less studied, it is known, for example, that some IRA users wrote German-language posts with anti-refugee content while others made content supportive of Chancellor Angela Merkel (Dawson & Innes, 2019). However, as the authors of the study note, this was in the summer of 2016, when she faced pressure over her migration policy and calls for resignation. The paper also discussed tactics to attract followers, like following and unfollowing different accounts to get their attention.

Another study, about German- and English-language accounts, focused on troll behavior like retweeting each other's messages to amplify content and studied the use of some hashtags (like #eu, #merkel, and #refugee) (Llewellyn, Cram, Hill, & Favero, 2019). From the analysis of such hashtags it was found that, pre-Brexit, the IRA was tweeting pro-leave messages more actively than pro-remain ones. The authors conclude that it looks as if the trolls were instructed to tweet about Brexit en masse (Llewellyn et al., 2019). This still leaves space to study Europe-related tweets, especially their content, without limiting attention to specific personalities, topics, or hashtags.

The tactics

As the studies on the IRA's Europe content have focused on select hashtags and particular topics, just as the US studies have focused primarily on presidential candidates and political identity groups, there is very little that can be said about the overall tactics used. It is hard to tell how consciously the messages were crafted, but media reports show that IRA employees were not working on the content individually or guided by intuition: they coordinated efforts for at least some attacks and were given instructions from higher-ups about what to post (Chen, 2015; Lukito, 2020; MacFarquhar, 2018; Meduza, 2017; Shuster & Ifraimova, 2018). It is therefore a worthy task to try to grasp the common, most frequently used elements of the campaign.

The war on “them”

One of the main principles of driving state conflict is to present it as idealism-driven, Harold D. Lasswell wrote in a book examining propaganda during World War I (Lasswell, 1927). To lessen the population's psychological resistance to war, the opponent must be seen as an enemy and an aggressor. He writes that ethnocentrism makes it possible to reinterpret conflicts as a struggle to preserve and propagate a certain type of civilization – the angle can be racial, religious, or almost anything else. Lasswell even describes twists on this tactic, such as German tailors declaring war on the immoral, decadent fashion of Paris (Lasswell, 1927).

Actors playing roles are needed to advance such normative ideas (Conway et al., 2007). People seek "scapegoats" and "messiahs" at the center of these narratives – scapegoats got the people into the disagreeable circumstances, while messiahs get them out (Lasswell, 1927).

More recently, Kathleen Hall Jamieson has described "us-versus-them" narratives made to scare the audience with cultural and economic changes benefiting "them" (Jamieson, 2018). An analysis of American populist talk radio shows also noted that hosts frequently use this tactic to provide simple explanations of listeners' problems and a target to direct their anger at (Kay, Ziegelmueller, & Minch, 1998). The struggles between good and bad often revolve around a third group – the victims – which represents a society and its members (Conway et al., 2007).

Therefore, these ideas are important to examine in this study as well. The answers to the following questions will help identify the actors and the roles they take in the worldview that the IRA is spreading:

RQ1: Who are the main actors in IRA tweets about Europe?

RQ2: What roles are the actors in IRA tweets assigned?

Spreading fear

To create a climate of danger, an aggressor is needed. Fear appeal is therefore considered another classical component of wartime propaganda (Conway et al., 2007). Lasswell wrote that atrocity stories create indignation. Emphasis is put on acts that would generate outrage, like the "wounding of women, children, old people, priests and nuns, upon sexual enormities, mutilated prisoners and mutilated non-combatants" (Lasswell, 1927, p. 82). For example, it has been found that a significant part of the early propaganda material of the terrorist group Al-Qaeda focused on injured or dead children (Baines & O'Shaughnessy, 2014).

Studies show that fear appeals are especially effective when paired with information about actions to take in order to reduce the threat (Jowett & O'Donnell, 2018). Jacques Ellul (1973), who analyzed propaganda after the world wars, noted that one of the main goals of a propagandist is to turn diffuse, vague opinions into active expressions of them. Action makes the propaganda effects irreversible – the person then needs the ideas to rationalize the actions (Ellul, 1973). Even liking, sharing, and commenting increases commitment to a particular sentiment (Jamieson, 2018). Therefore, the study will seek to answer how often fear appeals are used and how often they are accompanied by a call for action:

RQ3: To what extent is fear appeal invoked in the tweets?

The seven propaganda devices

The Institute for Propaganda Analysis (IPA) is considered one of the most influential groups to study propaganda between the two world wars (Conway et al., 2007; Severin & Tankard, 2001). It became known for a publication describing seven propaganda tactics – name calling, glittering generalities, transfer, testimonial, plain folks, card stacking, and bandwagoning (Lee & Lee, 1939).

Name calling refers to giving something or someone a negative label that becomes associated with it, so that people do not take a deeper interest in the subject (Lee & Lee, 1939). For example, in the study about populist radio talk show hosts, Rush Limbaugh was found to often resort to this device, labelling people as, for example, "Butch Reno", "Lesbian Donna Shalala", or "CommuNazi" (Kay et al., 1998). There is also a tactic that achieves the opposite effect by assigning positive labels; the IPA called this device glittering generality. For example, in a study of the devices used by Fox News personality Bill O'Reilly, he was found to label the Iraq war coalition forces "good guys" (Conway et al., 2007).

Another device – transfer – involves using the reputation (good or bad) of a person, event, or concept and comparing it to another to give it an emotional coloring (a vivid example would be comparing something to the September 11 acts of terror). Testimonial is using someone who is either trusted or distrusted to help endorse or reject another person or idea (Lee & Lee, 1939). Two other related tactics are plain folks and bandwagoning. Plain folks means identifying something as belonging to the regular people as opposed to the elite (Lee & Lee, 1939); for example, O'Reilly used it to refer to himself as "your humble correspondent" and sometimes said "we Americans" (Conway et al., 2007). Bandwagoning invites others to hold a certain opinion or engage in some action because of its popularity (Lee & Lee, 1939). Ellul (1973) described such tactics evoking a majority effect as an essential means of propaganda campaigns, and Jamieson (2018) notes that the perception that one's community supports something increases the chances that it will be uncritically accepted.

These devices were first used to analyze the speeches of a German Nazi party supporter in the U.S., the Catholic priest Charles Coughlin (Lee & Lee, 1939). Although it is often disputed how many of the tactics described are actually effective, they are still used at times (Allen, 2002; Baines & O'Shaughnessy, 2014; Conway et al., 2007; Kay et al., 1998; Severin & Tankard, 2001). This study does not deal with measuring effects, so the list is used here too.

RQ5: To what extent are the propaganda devices identified by the IPA used in IRA tweets about Europe?

Methods

The research questions and the hypothesis were tested using quantitative content analysis of tweets about Europe (N = 1530) from the IRA database made public by Twitter (Twitter, 2018). Overall, the database contains 8,768,632 IRA tweets in various languages. Only English-language tweets were analyzed in this study. To select the tweets about Europe, the column with the text of the tweet was filtered for tweets containing the word "Europe". That left a population of 6123 tweets sent from March 2014 to May 2018. A quarter of those tweets were sampled using simple random sampling.
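
As an illustration, the selection step can be reproduced in a few lines of pandas. This is a minimal sketch: the file name is illustrative, and the column names tweet_language and tweet_text are assumptions based on Twitter's published dataset.

```python
import pandas as pd

# Load the Twitter IRA release (file and column names assumed, not verified here).
tweets = pd.read_csv("ira_tweets_csv_hashed.csv", low_memory=False)

# Keep English-language tweets whose text contains the word "Europe".
english = tweets[tweets["tweet_language"] == "en"]
europe = english[english["tweet_text"].str.contains("Europe", regex=False, na=False)]

# Simple random sample of a quarter of the 6123-tweet population (n = 1530).
sample = europe.sample(n=1530, random_state=42)
sample.to_csv("europe_sample.csv", index=False)
```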

The sample is limited because only one word was used in the search query (tweets do not have to mention the word “Europe” to be about Europe). However, I considered such a method for selecting tweets to be the most straightforward and reasonably comprehensive.

Additionally, the IRA database did not allow the coder to see whether a tweet was part of a thread. If a tweet was a reply, the original tweet could not always be found in the database, since trolls did not interact only with other trolls. Therefore, it was decided that each tweet would always be treated separately.

Tweets that did not actually discuss Europe (like this one – "Back in the gym tomorrow???? @Oswaldo_Rascon @Haley36977826 @filthyRichMYDEA @smeyer_12 @jayyellaa @EuropeanGatsby @meganlatterner @BBSleoo") or tweets that were incomprehensible (like this one – "V_of_Europe SlicksTweetz TheHoneybee_ PGHowie2") were dismissed and replaced with another tweet – 77 tweets were replaced altogether. However, tweets were used for analysis as long as they were understandable enough to derive some meaning from, even if there was an error in the text (like this tweet – "RT @asamjulian: Pope Francis says Europe is in danger of "falling apart." This comes after his repeated calls for countries to accept more…").

To tie this study to previous research, the operationalization was chosen to be similar to the study by Conway, Grabe, and Grieves about the propaganda techniques used by Bill O'Reilly (Conway et al., 2007). Some things were added, like the question of whether any calls for action appear in the tweets.

To answer the research questions that arose from the theories on propaganda devices described previously, this study measures ten variables (the codebook with detailed instructions can be found in Appendix A):

1) Actors and their roles

Emergent coding (Blair, 2015) was used to detect actors in the tweets. Particular people, groups of people, states, organizations, etc. were treated as actors. First, they were coded exactly as mentioned in the tweet; in the second round of coding, actors were sorted into larger groups (for example, actors such as migrants, refugees, and Muslims, or the EU and European leaders, were categorized together). To allow a detailed view of how actors were represented, a separate group was formed for an actor if it was mentioned in at least 15 tweets, meaning it was relevant enough to appear in at least 1% of the tweets. For example, there is a group called "Germany" that includes Germany the country, cities in Germany, and German politicians except Angela Merkel, who was mentioned frequently enough to form a group of her own. The group called "Merkel" included the actors Merkel and Angela Merkel. Other European states that were not mentioned as frequently as Germany became part of the group called "European states/cities". The full list of actors and, where needed, more information on how actors were grouped can be found in Appendix B.
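
The second coding round can be mirrored programmatically as a mapping over the first-round labels. The sketch below is illustrative only: the file and column names are assumptions, and the mapping shows a small fragment of the full grouping documented in Appendix B (the thesis folds infrequent actors into broader named categories rather than an "Other" bucket).

```python
import pandas as pd

# First-round actor labels -> larger groups (illustrative fragment of Appendix B).
GROUPS = {
    "Merkel": "Merkel", "Angela Merkel": "Merkel",
    "Germany": "Germany", "Berlin": "Germany",
    "refugees": "Refugees/Muslims/Islam", "migrants": "Refugees/Muslims/Islam",
    "Muslims": "Refugees/Muslims/Islam", "Islam": "Refugees/Muslims/Islam",
}

actors = pd.read_csv("coding_sheet_1.csv")  # one row per actor mention (assumed layout)
actors["group"] = actors["actor"].map(GROUPS).fillna("Other")

# A group stands on its own only if it appears in at least 15 tweets (~1% of 1530).
tweets_per_group = actors.groupby("group")["tweet_id"].nunique()
print(tweets_per_group[tweets_per_group >= 15].sort_values(ascending=False))
```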

It was also measured what roles the actors were assigned – victim, savior, and villain (or messiah and scapegoat, as described by Lasswell (1927)). A victim is an actor presented as suffering, or in danger of suffering, from some evil force; a villain is an actor that puts something or someone in danger or is described as bad, dangerous, immoral, and to be feared; and a savior is someone presented as either making or being able to make the situation better. The roles are defined this way in the previously mentioned study by Conway, Grabe, and Grieves (2007).

These roles can manifest in a variety of ways – Europe was sometimes placed in the victim role because its values were supposedly being destroyed by immigrants; other times it was the victim of the bad policy of a villain, the EU. "Iron lady Maggie Thatcher" was described by the IRA as needed now because of her "crushing socialism", and was therefore categorized in the savior role. When none of these roles applied, the role was coded as "0".

As opposed to all other RQs, the unit of data collection here was an actor mentioned in the tweet, not the tweet itself, because one tweet could contain mentions of several actors each taking a different role.

2) Fear appeals

The presence or absence of fear appeals in a tweet was measured. Those are atrocity stories, talk about threats or conflicts, claims that the readers or a certain group of people are in danger, or discussions of hypothetical threats like attacks that could take place. This is similar to the coding definitions in a study about the use of fear appeals in political speeches about terrorism (De Castella, McGarty, & Musgrove, 2009). This variable, like the variables that follow, was coded "0" for absent or "1" for present.

3) Calls for action

Since, as explained previously, fear appeals are demonstrated to be more effective when there are suggestions of what should be done, the presence or absence of such calls for action was also measured. The study by Conway, Grabe, and Grieves (2007) did not include this variable.

Encouragements of actions such as donating and protesting were understood to indicate the presence of this device; and because liking and sharing content online can be seen as increasing commitment to a particular sentiment (Jamieson, 2018), invitations to like and share were also considered calls for action. However, this variable was not limited to calls addressed to the audience. For example, saying that some EU country should stop admitting refugees was also considered a call for action.

4) Seven propaganda devices

The presence or absence of the propaganda devices listed by the Institute for Propaganda Analysis (Lee & Lee, 1939) and also used in the study about Bill O'Reilly's propaganda techniques (Conway et al., 2007) was measured. The detection of these devices follows the way it was done in those studies.

4.1. Name calling – giving something a negative label to make the audience reject it without examining evidence. For example, name calling is using such tags or nicknames as "un-American", "femi-nazi", or "globalist", if it is meant as something negative.

4.2. Glittering generality – giving something a positive label to make the audience accept it without examining evidence. In a way, this device is the opposite of name calling – giving something a positive tag or a nickname like “patriot army”.

4.3. Transfer – using the status of an idea or a person and transferring it to another. An example of this would be comparing an event to another historical event to invite associations of how tragic it is.

4.4. Testimonial – using a respected or hated person to endorse or reject something, for example, when an expert calls an idea or policy either good or bad.

4.5. Plain folks – identifying something as one of the people, not elite, or emphasizing how someone or something is a part of the elite (like calling something elitist or calling a politician “just a regular guy”).

4.6. Bandwagon – suggesting that because of the high approval rate of something, others should hold the same opinion or consider taking the same action.

Another device described by the Institute for Propaganda Analysis is called card stacking. It means the selective use of facts, lies, and half-truths (Lee & Lee, 1939). This device was not measured in this study because fact-checking all the tweets would not be feasible.

Reliability

The tweets were coded manually by one coder. To address the reliability issues that could arise while coding the texts, another coder was asked to code one-tenth (153) of the tweets. Intercoder reliability was measured by calculating Krippendorff's alpha. Most tests resulted in sufficiently high values: roles (α = .83), fear appeals (α = .91), call for action (α = .92), name calling (α = .86), glittering generality (α = .85), transfer (α = .83), testimonial (α = .70), plain folks (α = .72). The only variable with a low score was bandwagon (α = .32), and it was therefore excluded from further analysis. It was a very rarely used device, appearing in only 17 tweets.
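
The thesis does not say which tool computed the coefficients; as an illustration, Krippendorff's alpha for one nominal variable can be obtained with the open-source krippendorff Python package, with one row per coder and one column per coded tweet.

```python
import numpy as np
import krippendorff  # pip install krippendorff

# Toy reliability data: 2 coders x 10 tweets for one nominal 0/1 variable
# (e.g., fear appeal present/absent); np.nan marks a missing code.
reliability_data = np.array([
    [1, 0, 0, 1, 1, 0, 1, 0, 0, 1],       # coder 1
    [1, 0, 0, 1, 1, 0, 1, 1, 0, np.nan],  # coder 2
])

alpha = krippendorff.alpha(reliability_data=reliability_data,
                           level_of_measurement="nominal")
print(f"Krippendorff's alpha: {alpha:.2f}")
```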

Results

Overall, 1530 IRA tweets and the 3426 actors mentioned in them were analyzed. The analysis shows that the IRA devoted the most space to tweets about refugees, Muslims, and Islam, usually placing them in the role of villains accused of causing violence and disruption. This is also reflected in the most commonly used propaganda device – fear appeals.

Actors and roles

The first research question (RQ1) asked who the main actors are in IRA tweets about Europe. Given that the tweets were selected by filtering for those containing the word "Europe", it is unsurprising that Europe was also the most commonly mentioned actor: of all actors counted, 23.9% were Europe. The other popular actors are a little more telling of the IRA's topics of interest – the group containing actors such as refugees, migrants, Muslims, Islam, and Arabs made up 11.6% of the actors mentioned. The US and related actors, such as US institutions and politicians, made up 6.3% of all actors, the EU and its institutions or leaders 6.2%, and Europeans/regular people 4.4%. A full summary listing how many times each group of actors was mentioned and what each group consists of can be found in Appendix B.

Most of the time, the roles of villain, victim, or savior were not employed. Overall, 17.6% of actors were assigned the role of villain and 12.8% that of victim, while only 0.9% of actors were described as saviors.

There are differences between the roles that different groups of actors were put in. For example, one of the groups placed in the savior role most often was Poland (n = 15), assigned it in 20% of the tweets mentioning it. An example is a tweet saying that Poles know Islam because their 17th-century king Jan Sobieski already saved Europe from Islam once. Right-wing parties, organizations, and politicians (n = 60) – actors like Viktor Orban, Marine Le Pen, Pegida, and the far-right in general – took this role in 11.7% of tweets; for example, one tweet claimed that Europe will fall to ISIS if France does not elect Marine Le Pen. The actor with the next highest rate (7.8%) of being described as a savior was Donald Trump (n = 64), for example, when he was said to be good for the US and the world because he would not encourage immigration.

Brexit (n = 17), when described as disrupting Europe, terrorists (n = 104), and the actor group that included refugees, migrants, and Muslims (n = 398) were talked about as villains most often – in 70.6%, 64.4%, and 56.5% of the tweets, respectively. Africa (n = 42) and France (n = 42), the latter usually with an emphasis on how it has suffered from terror attacks, were painted as victims in 31% and 33.3% of tweets, respectively. The full list of the actor groups and how frequently they were assigned each role can be found in Appendix C.

When comparing all the EU member states that were mentioned frequently enough to form an actor group of their own, it can be seen that there are distinct villains and victims (χ² (18) = 62.56, p < .001). For example, 20% of the tweets assigned Poland to a savior role, compared to 4.7% of tweets mentioning the UK (n = 64) and 2.5% of tweets mentioning Germany (n = 40). As can be seen in Table 1, no other states were ever placed in this role. Greece (n = 60) and Sweden (n = 18) became villains most frequently, while France was assigned the victim role more often than the others.

Table 1

Roles assigned to EU member states/regions (role shares as % of tweets mentioning the actor)

Actor     Savior   Villain   Victim   None    n
Belgium   0        2.9       20.3     76.5    34
France    0        11.9      33.3     54.8    42
Germany   2.5      10        12.5     75      40
Greece    0        25        1.7      73.3    60
Poland    20       0         6.7      73.3    24
Sweden    0        22.2      22.2     55.6    18
UK        4.7      3.1       18.8     73.4    64

When the comparison of roles was made between the group consisting of actors like refugees, Muslims, and Islam (n = 398), Europeans/regular people (n = 152), and the EU/EU institutions (n = 213), the differences were again distinct (χ² (6) = 237.56, p < .001; a computational sketch of this test follows Table 2). Table 2 illustrates that the other actors do not compare to refugees, Muslims, and Islam when it comes to being assigned the role of villain. Among these categories, Europeans and regular people became victims most often – 27.6% of tweets placed them in this role.

Table 2

Roles assigned to refugees, regular people, and EU/EU institutions (role shares as % of tweets mentioning the actor)

Actor                      Savior   Villain   Victim   None    n
EU/EU institutions         0.5      15.5      3.8      80.3    213
Europeans/regular people   1.3      7.9       27.6     63.2    152
Refugees/Muslims/Islam     0.3      56.6      3.3      39.9    398
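
As a sketch, this comparison is an ordinary chi-squared test of independence on a table of cell counts. The counts below are reconstructed from the Table 2 percentages and group sizes (rounded, hence an approximation, since the exact cell counts are not published), and SciPy's test should land close to the reported χ² (6) = 237.56.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Approximate cell counts reconstructed from Table 2 (rows: EU/EU institutions,
# Europeans/regular people, Refugees/Muslims/Islam; columns: savior, villain,
# victim, none). These counts are assumptions, not published data.
counts = np.array([
    [1,  33,   8, 171],   # n = 213
    [2,  12,  42,  96],   # n = 152
    [1, 225,  13, 159],   # n = 398
])

chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3g}")
```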

Fear appeals and calls for action

The data on the most frequently used propaganda devices shows that fear appeals and calls for action were widely used. There is no single device that could be called the backbone of the IRA strategy: nearly half of the tweets did not use any of the devices studied in this research, just as most (68.7%) of the actors mentioned in the tweets were not assigned any of the roles analyzed. However, if any device comes close to being a backbone, it is fear appeals – atrocity stories and talk of threats – used in 34.5% of the tweets. For example, some tweets mentioned Islamic terrorists murdering children or claimed that Brexit is unraveling Europe.

Table 3

The share of tweets where each propaganda tactic was used

Tactic                  % of tweets   n
Fear appeals            34.5          528
Call for action         11.4          175
Name calling            7.8           119
Glittering generality   1.4           21
Transfer                9.7           148
Testimonial             11.7          179

Call for action was used in a little over 11% of all IRA tweets. It included appeals for people to join a patriot army, for European countries to leave the corrupt EU, calls to ban Islam, and calls to "wake up".

To measure whether tweets with a call for action were usually tied to fear appeals, as stated in the hypothesis, a chi-squared test was performed. The results show that the association between the use of fear appeals and calls for action is significant (χ² (1) = 24.95, p < .001). However, the association was very weak (tau = .02, p < .001).
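
A sketch of this test from the coded 0/1 variables is shown below. The file and column names are illustrative, and the effect-size measure is implemented as Goodman and Kruskal's tau – an assumption about which tau statistic the thesis reports (its significance test is omitted here).

```python
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency

def goodman_kruskal_tau(table: np.ndarray) -> float:
    """Goodman and Kruskal's tau for predicting the column variable from the rows."""
    n = table.sum()
    col_p = table.sum(axis=0) / n                   # marginal column proportions
    v_y = 1 - np.sum(col_p ** 2)                    # variation of the column variable
    row_totals = table.sum(axis=1, keepdims=True)
    v_y_given_x = 1 - np.sum(table ** 2 / row_totals) / n  # expected conditional variation
    return (v_y - v_y_given_x) / v_y

# Hypothetical coded data: one row per tweet, 0/1 columns "fear" and "action".
coded = pd.read_csv("coding_sheet_2.csv")
table = pd.crosstab(coded["fear"], coded["action"]).to_numpy()

# correction=False assumes the reported chi-square used no Yates' correction.
chi2, p, dof, _ = chi2_contingency(table, correction=False)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3g}, tau = {goodman_kruskal_tau(table):.3f}")
```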

Devices of the Institute for Propaganda Analysis

Table 3 also shows how frequently the other devices were used. The second most popular device was testimonial, meaning using someone with either a good or a bad reputation to make something more or less appealing with their testimony. It included, for example, the Dalai Lama saying that Europe has taken too many migrants, Trump saying that “grumbling European phonies” should “stop being deadbeats,” and Jean-Claude Juncker saying that Europe is divided.

Transfer was used in nearly a tenth of the tweets. Often these were tweets about a leak that never actually happened (Polityuk, 2014) at the Zaporizhzhya nuclear power plant in Ukraine. Despite this, it was tweeted about en masse, with comparisons to the Fukushima nuclear disaster. Name calling, like describing the March of Europe as a "butt-hurt march" and Ukraine as "Nukraine", was used in 7.8% of tweets.

Glittering generality is the opposite of name calling, as it attaches positive labels; examples are "common sense conservatism" and calling Margaret Thatcher the "iron lady". Plain folks means emphasizing that something comes from or is popular among the regular people, not the elite, or conversely that something is part of the elite, as can be seen in this tweet – "Merkel is a traitor to the European people's Like Clinton to the Americans. #Munich https://t.co/LZndh4v8kr". These devices were used rarely relative to the others, in below 3% of the tweets.

Discussion and conclusion

The so-called troll factories and the internationally broadcasting state-owned Russian media are a couple of the reasons why studies of propaganda have recently regained interest. Although there have been attempts to systemically study propaganda for a long while, especially in the aftermath of the First and Second World Wars, there have not been attempts to connect this knowledge with recent uses of propaganda. This master's thesis attempts to take the first steps in bridging that gap, using this knowledge to research Twitter propaganda by Russia's IRA.

Several classical devices were studied: what actors are used and what roles are assigned to them, in order to see who is presented as good and who as evil; whether tweets try to appeal to people's fears and whether they invite people to action; and whether there is name calling or attempts to categorize people and ideas in terms of what belongs to the regular people and what to the elite (Conway et al., 2007; Doob, 1948; Ellul, 1973; Lasswell, 1927; Lee & Lee, 1939). Overall, these devices cannot be discounted as useless, although they did not help explain about half of the tweets studied here. This study showed that they can be defined in a reliable way and used in other propaganda studies, but the list needs additions. Therefore, a qualitative study trying to find common motives in IRA rhetoric could be useful. Other suggestions for further research follow below.

It was found that IRA tweets mostly featured actors that were assigned neither the role of victim, nor villain, nor savior. This is due to the large number of tweets that spread various news stories. Some featured sports or entertainment news, perhaps as an attempt at building trust with the audience (an account that only posts news stories on particular topics might not look authentic (Xia et al., 2019a)) or at attracting an audience that is not particularly interested in politics (Kriel & Pavliuc, 2019). Others tweeted and retweeted hard-news stories where the posted fragment did not instantly reveal the attitude toward, or role placement of, the subjects of the story (an example would be a tweet such as this – "Europe turns to Morocco in Paris attacks investigation #world #news"). A large share of tweets that seemingly have nothing to do with furthering the IRA's agenda has been observed in other studies as well (Linvill et al., 2019).

Nonetheless, this does not necessarily mean that these news stories do not represent a particular outlook on the world, which could be seen by studying the agenda of the posts. For example, it has been found that accounts imitating people who hold left-wing ideas posted such content more frequently than a right-wing account that had more political content (Linvill et al., 2019).

Further research could also focus on accounts that try to create fake personas instead of just tweeting news stories. For example, Xia, Lukito, Zhang, Wells, Kim, and Tong (2019) did a case study of a highly successful account posing as a US woman, Jenna Abrams, which was used across multiple platforms and attempted to perform personal authenticity by sharing personal opinions and life stories. More focus on this phenomenon – like the "opinions" of such fake online personas and the success of the information spread in this way – would be both interesting and relevant to study. If accounts like this are seen as less suspicious by different members of the audience and are more successful at helping spread propaganda messages, they might be used more frequently in the future.

The roles of villain and victim were used for 18% and 13% of the actors, respectively, and savior for less than one percent (Poland and right-wing political players were placed in this category most frequently). This shows that, when it comes to tweets about Europe, the IRA sought to place blame and point out victims rather than show a path to follow by emphasizing who the "good guys" are. Very few role players being categorized as heroic was also found in the study about the propaganda devices used by Bill O'Reilly (Conway et al., 2007).

The most frequent villains were Brexit, threatening the unity of Europe, terrorists, and a collective of actors like refugees, Muslims, migrants, Arabs, and Islam. Africa and France were described as victims most frequently, followed by a group of actors describing regular people, like Europeans, sons and daughters, and children. Overall, the tweets were more supportive of right-wing players and followed their agenda, which falls in line with other recent studies on the IRA (Badawy, Addawood, Lerman, & Ferrara, 2019; Linvill et al., 2019).

Harold Lasswell (1927), writing about wartime propaganda, described such a division into who is evil and who is the victim as necessary to paint a conflict as an idealism-driven attempt to preserve a civilization. This, according to Lasswell, helps overcome the psychological resistance of the population.

Further, the research sought to see which other propaganda devices were used and how frequently. For comparison, the study about the propaganda devices used by Bill O'Reilly (Conway et al., 2007) concluded that the backbone of his communication strategy was name calling, the attaching of negative labels or nicknames, which he used on average 8.8 times a minute. For the IRA, fear appeals – spreading atrocity stories and talking about risks and dangers of physical harm and to culture – were the most popular device, used in more than a third of the tweets. A large share of tweets about crime and security has been noted in other studies about the IRA as well (Bastos & Farkas, 2019). The creation of a climate of fear is seen as another essential component of creating resentment (Lasswell, 1927).

In theory, fear appeal is described as more effective when it recommends an action that would reduce the threat (Jowett & O'Donnell, 2018), because action turns ideas into something important to hold onto to justify prior behavior (Ellul, 1973). However, this logic was not followed by the IRA. Although some tweets with fear appeals also called for action, the study showed that the connection between these two propaganda devices is weak, meaning that they were mostly not used in the same tweets.

Using well-known people to testify for or against something was the second most common device. One reason could be that cues such as what different leaders think can sometimes help strong partisans understand which ideas they should adopt and justify to preserve their identity (Bakker, Lelkes, & Malka, 2019). Given the common use of the device, another useful element to study in the future would be who takes the role of an expert in the tweets. This would allow finding out whom the IRA seeks to give voice to and on what occasions. For example, the former president of the European Commission, Jean-Claude Juncker, was only used to testify that Europe is getting more divided.

Another commonly used device was transfer, found in nearly a tenth of all tweets. The device means comparing one thing to another to invoke a certain evaluation and emotion (like comparing one event to another, tragic event) (Lee & Lee, 1939). As mentioned previously, nearly half of those tweets discussed a leak at a Ukrainian nuclear power plant that officials say never actually took place (Polityuk, 2014). This made-up accident was then compared to Fukushima. What attracts attention is that this is not the first time IRA accounts spread information about a disaster that never took place: The New York Times has previously reported on the IRA mass-tweeting made-up information about an explosion causing a toxic hazard in Louisiana (Chen, 2015). Although none of those stories spread widely enough to do significant damage, such experiments are noteworthy. The successful spread of misinformation and disinformation about the coronavirus (BBC, 2020) and, for example, its alleged relation to 5G mobile technology (Gallagher, 2020) shows that in certain contexts campaigns about health hazards can become successful.

Overall, none of the devices researched in this study were left unused. However, glittering generality was the least popular. The device is essentially the opposite of another, more frequently used device – name calling. This might be related to how rarely tweets mentioned actors in a savior role. The IRA did not attach many positive labels to people or ideas, again indicating that rather than hinting that Europeans should follow particular leaders or ideas, the organization sought to indicate who the enemies are.

It would be interesting to study whether that changes at any point and how the use of other tactics develops. However, this would require Twitter to keep detecting and publishing more Russian troll activity, which has not happened on as large a scale as in October 2018, when the database studied here was published.

References

Allen, A. (2002). Just whose side is God on? British Journalism Review, 13(4), 41–49. https://doi.org/10.1177/095647480201300406

Averin, A. (2018). Russia and its many truths (pp. 59–67). NATO Strategic Communications Centre of Excellence.

Background to “Assessing Russian Activities and Intentions in Recent US Elections”: The Analytic Process and Cyber Incident Attribution. (2017).

Badawy, A., Addawood, A., Lerman, K., & Ferrara, E. (2019). Characterizing the 2016 Russian IRA influence campaign. Social Network Analysis and Mining, 9(1). https://doi.org/10.1007/s13278-019-0578-6

Bail, C. A., Guay, B., Maloney, E., Combs, A., Sunshine Hillygus, D., Merhout, F., … Volfovsky, A. (2020). Assessing the Russian Internet Research Agency's impact on the political attitudes and behaviors of American Twitter users in late 2017. Proceedings of the National Academy of Sciences of the United States of America, 117(1), 243–250. https://doi.org/10.1073/pnas.1906420116

Baines, P. R., & O’Shaughnessy, N. J. (2014). Al-Qaeda messaging evolution and positioning, 1998–2008: Propaganda analysis revisited. Public Relations Inquiry, 3(2), 163–191. https://doi.org/10.1177/2046147X14536723

Bakker, B., Lelkes, Y., & Malka, A. (2019). Understanding Partisan Cue Receptivity: Tests of Predictions from the Bounded Rationality and Expressive Utility Perspectives. The Journal of Politics. https://doi.org/10.1086/707616

Bastos, M., & Farkas, J. (2019). "Donald Trump Is My President!": The Internet Research Agency Propaganda Machine. Social Media + Society, 5(3). https://doi.org/10.1177/2056305119865466

BBC. (2020). Ofcom: Covid-19 5G theories are "most common" misinformation. Retrieved May 22, 2020, from BBC website: https://www.bbc.com/news/technology-52370616

Benkler, Y., Farris, R., & Roberts, H. (2018). Network Propaganda (Vol. 1). https://doi.org/10.1093/oso/9780190923624.001.0001

Blair, E. (2015). A reflexive exploration of two qualitative data coding techniques. Journal of Methods and Measurement in the Social Sciences, 6(1). https://doi.org/10.2458/v6i1.18772

Chen, A. (2015). The Agency. Retrieved April 4, 2020, from The New York Times website: https://www.nytimes.com/2015/06/07/magazine/the-agency.html

Conway, M., Grabe, M. E., & Grieves, K. (2007). Villains, victims and the virtuous in Bill O'Reilly's "No-Spin Zone." Journalism Studies, 8(2), 197–223. https://doi.org/10.1080/14616700601148820

Dawson, A., & Innes, M. (2019). How Russia's Internet Research Agency built its disinformation campaign. Political Quarterly, 90(2), 245–256. https://doi.org/10.1111/1467-923X.12690

De Castella, K., McGarty, C., & Musgrove, L. (2009). Fear Appeals in Political Rhetoric about Terrorism: An Analysis of Speeches by Australian Prime Minister Howard. Political Psychology, 30(1), 1–26. https://doi.org/10.1111/j.1467-9221.2008.00678.x

Deibert, R., & Rohozinski, R. (2010). Introducing Next-Generation Information Access Controls. In J. Palfrey (Ed.), Access Controlled: The Shaping of Power, Rights, and Rule in Cyberspace. The MIT Press. https://doi.org/10.7551/mitpress/8551.001.0001

Department of Justice. (2018). United States of America v. Internet Research Agency, United States District Court for the District of Columbia. Retrieved from https://www.justice.gov/file/1035477/download

Doob, L. W. (1948). Public Opinion and Propaganda. In American Political Science Review (Vol. 42). https://doi.org/10.2307/1950145

Ellul, J. (1973). Propaganda: The formation of men’s attitudes. Vintage Books.

Galeotti, M. (2015). "Hybrid war" and "little green men": How it works, and how it doesn't. In Ukraine and Russia: People, Politics, Propaganda and Perspectives. Retrieved from http://www.e-ir.info/publications/

Gallagher, R. (2020). 5G coronavirus conspiracy theory driven by coordinated effort. Retrieved May 22, 2020, from Al Jazeera website: https://www.aljazeera.com/ajimpact/5g-coronavirus-conspiracy-theory-driven-coordinated-effort-200410182740380.html

Hardaker, C. (2010). Trolling in asynchronous computer-mediated communication: From user discussions to academic definitions. Journal of Politeness Research, 6, 215–242. https://doi.org/10.1515/JPLR.2010.011

Hjorth, F., & Adler-Nissen, R. (2019). Ideological Asymmetry in the Reach of Pro-Russian Digital Disinformation to United States Audiences. Journal of Communication, 69(2), 168–192. https://doi.org/10.1093/joc/jqz006

Jamieson, K. H. (2018). Cyberwar: How Russian Hackers and Trolls Helped Elect a President: What We Don't, Can't and Do Know. Oxford University Press.

Jowett, G., & O’Donnell, V. (2018). Propaganda & persuasion. SAGE Publications.

Kay, J., Ziegelmueller, G. W., & Minch, K. M. (1998). From Coughlin to contemporary talk radio: Fallacies & propaganda in American populist radio. Journal of Radio Studies, 5(1). https://doi.org/10.1080/19376529809384526

Kriel, C., & Pavliuc, A. (2019). Reverse engineering Russian Internet Research Agency tactics through network analysis. Retrieved May 17, 2020, from Defence Strategic Communications website: https://stratcomcoe.org/ckriel-apavliuc-reverse-engineering-russian-internet-research-agency-tactics-through-network

Lasswell, H. D. (1927). Propaganda Technique in the World War.

Lee, A. M., & Lee, E. B. (1939). The Fine Art of Propaganda: A Study of Father Coughlin's Speeches.

Linvill, D., Boatwright, B., Grant, W., & Warren, P. (2019). "THE RUSSIANS ARE HACKING MY BRAIN!" Investigating Russia's Internet Research Agency Twitter tactics during the 2016 United States presidential campaign. Computers in Human Behavior, 99, 292–300. https://doi.org/10.1016/j.chb.2019.05.027

Llewellyn, C., Cram, L., Hill, R. L., & Favero, A. (2019). For Whom the Bell Trolls: Shifting Troll Behaviour in the Twitter Brexit Debate. Journal of Common Market Studies. https://doi.org/10.1111/jcms.12882

Lukito, J. (2020). Coordinating a Multi-Platform Disinformation Campaign: Internet Research Agency Activity on Three U.S. Social Media Platforms, 2015 to 2017. Political Communication, 37(2), 238–255. https://doi.org/10.1080/10584609.2019.1661889

MacFarquhar, N. (2018). Inside the Russian troll factory: Zombies and a breakneck pace. Retrieved April 5, 2020, from The New York Times website: https://www.nytimes.com/2018/02/18/world/europe/russia-troll-factory.html

McCombie, S., Uhlmann, A. J., & Morrison, S. (2020). The US 2016 presidential election & Russia's troll farms. Intelligence and National Security, 35(1), 95–114. https://doi.org/10.1080/02684527.2019.1673940

Meduza. (2017). An ex St. Petersburg 'troll' speaks out: Russian independent TV network interviews former troll at the Internet Research Agency. Retrieved April 5, 2020, from Meduza website: https://meduza.io/en/feature/2017/10/15/an-ex-st-petersburg-troll-speaks-out

Polityuk, P. (2014). Ukraine denies radioactive leak on Zaporizhzhya nuclear plant. Retrieved May 17, 2020, from Reuters website: https://www.reuters.com/article/us-ukraine-crisis-nuclear/ukraine-denies-radioactive-leak-on-zaporizhzhya-nuclear-plant-idUSKBN0K81GO20141230

Sanovich, S. (2017). Computational Propaganda in Russia: The Origins of Digital Misinformation. Retrieved from https://comprop.oii.ox.ac.uk/wp-content/uploads/sites/89/2017/06/Comprop-Russia.pdf

Schrek, C. (2019). From "Not Us" to "Why Hide It?": How Russia denied its Crimea invasion, then admitted it. Retrieved April 15, 2020, from RFERL website: https://www.rferl.org/a/from-not-us-to-why-hide-it-how-russia-denied-its-crimea-invasion-then-admitted-it/29791806.html

Severin, W. J., & Tankard, J. W. (2001). Communication Theories: Origins, Methods and Uses in the Mass Media.

Shuster, S., & Ifraimova, S. (2018). How Russia Spreads Fake News, Explained by a Former Troll | Time. Retrieved April 5, 2020, from Time website: https://time.com/5168202/russia-troll-internet-research-agency/

Silverstein, B. (1987). Toward a Science of Propaganda. Political Psychology, 8(1), 49. https://doi.org/10.2307/3790986

Sparkes-Vian, C. (2019). Digital Propaganda: The Tyranny of Ignorance. Critical Sociology, 45(3), 393–409. https://doi.org/10.1177/0896920517754241

Twitter. (2018). Elections integrity. Retrieved January 26, 2020, from https://about.twitter.com/en_us/values/elections-integrity.html#data

Wojcieszak, M. (2010). ‘Don’t talk to me’: effects of ideologically homogeneous online groups and politically dissimilar offline ties on extremism. New Media & Society, 12(4), 637–655. https://doi.org/10.1177/1461444809342775

Xia, Y., Lukito, J., Zhang, Y., Wells, C., Kim, S. J., & Tong, C. (2019a). Disinformation, performed: Self-presentation of a Russian IRA account on Twitter. Information, Communication & Society.

Appendix A

CODEBOOK

Unit of Data Collection: A tweet or an actor mentioned in a tweet
Coders: Sabine Berzina and Cady Rasmussen

The variables are measured within the tweets. There are ten variables, but a separate coding sheet was made for the first two variables because the units of analysis differ.

If the tweet is incomprehensible or does not actually concern Europe, code all the variables with 999.

Examples:

1) Back in the gym tomorrow???? @Oswaldo_Rascon @Haley36977826

@filthyRichMYDEA @smeyer_12 @jayyellaa @EuropeanGatsby @meganlatterner @BBSleoo;

2) V_of_Europe SlicksTweetz TheHoneybee_ PGHowie2;

3) RT @jjliberty: Best Dog Blog on earth ck it out --&gt;&gt; Europe Part 3: Chef Crusoe's Italian Dessert &amp; Lake Como http://t.co/FOLjTS8gZ0.
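To make this missing-value convention concrete, here is a minimal Python sketch of how units coded 999 might be excluded before analysis. It is purely illustrative: the record layout and field names such as tweet_id, fear and action are assumptions, not part of the coding instrument.

```python
# Minimal sketch, assuming coded tweets are stored as dictionaries.
# Units coded 999 (incomprehensible or not about Europe) are dropped.
MISSING = 999

coded_tweets = [
    {"tweet_id": 1, "fear": 1, "action": 0},
    {"tweet_id": 2, "fear": MISSING, "action": MISSING},  # coded 999
    {"tweet_id": 3, "fear": 0, "action": 1},
]

# Keep only tweets in which no variable carries the 999 code.
analyzable = [
    tweet for tweet in coded_tweets
    if all(value != MISSING for key, value in tweet.items() if key != "tweet_id")
]

print(len(analyzable))  # 2
```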

Coding sheet 1

Each actor in a tweet detected in step one becomes the unit of analysis.

1. Actor (V: actor).

Each actor mentioned in the tweet should be listed in the coding sheet. Actors can be particular people, groups of people, states, organizations, etc. – for example, the US, refugees, migrants, or Angela Merkel. Hashtags or parts of hashtags can also be actors, but usernames cannot.

Later, actors will be re-coded into larger groups (for example, the group "Merkel" consists of the actors Angela Merkel and Merkel). The full lists of groups and the actors they consist of can be found in Appendix B.


If there are no actors in the tweet, code it as 0.

2. Role (V: role). Each actor becomes a unit of data analysis and is assigned a role.

2.1. Savior (an actor that is presented as making or capable of making the situation better) -> coded as 1

2.2. Villain (an actor presented as being bad, causing fear, doing something harmful or unacceptable, or posing danger to someone or something) -> coded as 2

2.3. Victim (an actor presented as facing difficulties or fear, being marginalized, or suffering the consequences of some event or action) -> coded as 3

2.4. None of those or hard to understand -> coded as 0

Examples:

1) The roles in this tweet – "@sammydurrani Do you even know what's going on in Europe? Have you heard about Muslim neighborhoods in London for instance?" – should be coded as 0 (actors – Europe, Muslims, London). The tweet text does not give enough context to determine the roles with certainty (no evaluation of the Muslim neighborhoods is given).

2) “Austria has received more asylum applications than births. Europe is turning into caliphate. https://t.co/LUaXoCAYVR https://t.co/7VJ6zKOplf”

Actors: Asylum applicants (villain); Austria (victim); Europe (victim).

3) “The outcome of white supremacy. Syria, once beautiful and prosperous country is in tatters now. Just like Africa after European colonization https://t.co/HHtvYoOYPJ” Actors: Syria (victim); Africa (victim); Europeans (villain)

4) “Canada confirms will give troops for eastern European NATO force #world #news” Actors: Canada and Eastern European NATO force, roles coded as 0.

5) “#Brussels #IslamKills WHAT IS HAPPENING WITH MUSLIM REFUGEES IN EUROPE? READ WHAT'S GOING ON WAKE UP USA”

Actors: Brussels (0); Islam (villain); Refugees (0); Europe (0); USA (0).

6) “RT @geertwilderspvv: BOMBSHELL NEXIT POLL: More than half of Dutch voters now want to LEAVE the European Union https://t.co/YYREnHRh9r”


Actors: Dutch voters and the EU, both coded as 0 because the tweet by itself does not make it possible to determine the attitude towards these actors.

Coding Sheet 1 example

Tweet ID | Tweet text | Actor           | Role
1        | text       | Name of actor 1 | 1
1        | text       | Name of actor 2 | 0
1        | text       | Name of actor 3 | 3
2        | text       | Name of actor 5 | 2
3        | text       | Name of actor 5 | 2
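A hedged sketch of how such a long-format sheet might be processed is given below. The "Merkel" group follows the recoding example given under variable 1; the "Refugees" group label and all field names are illustrative assumptions rather than part of the codebook.

```python
from collections import Counter

# Role codes from variable 2 of the codebook.
SAVIOR, VILLAIN, VICTIM, NONE = 1, 2, 3, 0

# Coding Sheet 1 in long format: one row per actor mentioned in a tweet,
# so the same tweet ID can appear on several rows.
sheet1 = [
    {"tweet_id": 1, "actor": "Angela Merkel", "role": NONE},
    {"tweet_id": 1, "actor": "refugees",      "role": VILLAIN},
    {"tweet_id": 2, "actor": "Merkel",        "role": VICTIM},
]

# Fragment of the Appendix B recoding: raw actor labels are collapsed
# into larger groups ("Refugees" is an assumed group label).
groups = {"Angela Merkel": "Merkel", "Merkel": "Merkel", "refugees": "Refugees"}

# Count how often each group is coded as villain.
villain_counts = Counter(
    groups.get(row["actor"], row["actor"])
    for row in sheet1 if row["role"] == VILLAIN
)
print(villain_counts)  # Counter({'Refugees': 1})
```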

Coding sheet 2

A tweet is the unit of analysis.

Important – several devices can be used in one tweet and therefore one tweet can be placed in several categories.

3. Fear appeal (V: fear). Fear appeals are stories designed to cause indignation.

For example, Harold D. Lasswell (1927) wrote that wartime propaganda often puts the accent on atrocity stories that cause outrage, such as the wounding of women and children. Fear appeals can also be used by people like parents and teachers, for example, to get others to act in a certain way (Jowett & O'Donnell, 2018).

Present if:

a. the tweet talks about a threat or a conflict;

b. the tweet mentions how some situation places the audience in danger;

c. the tweet mentions how a group of people places the audience in danger;

d. the tweet discusses how society is changing for the worse;

e. the tweet mentions atrocities or cruel behavior.


3.1. Present -> coded as 1
3.2. Absent -> coded as 0

Examples:

1) “RT @YoungDems4Trump: Radical Muslim migrants attack and torch a bus in France whilst screaming "Allahu Akbar". Welcome to occupied Europe h…”

2) “Today you don't give a fuck about the Muslims, Europe. Tomorrow they will come to your house and rape your children https://t.co/duCbUXg9Uw”

3) Tweets about how a group of people is destroying culture;

4) “RT @DanielleRyanJ: Political correctness is destroying Europe. Johnson is right. Tolerance of intolerance is the ultimate in stupidity. htt…”

4. Call for action (V: action).

Present if any action to take is recommended. A tweet counts as a call for action regardless of who is the target of the message – the audience, the EU or anyone else. Even vague hints at how to behave (for example, “don’t accept this situation!”) are considered a call for action.

4.1. Present -> coded as 1
4.2. Absent -> coded as 0

Examples: following an account, donating to a campaign, expressing an opinion publicly, standing up for or retweeting something.

Presence or absence will also be detected for the following propaganda devices listed by the Institute for Propaganda Analysis (Lee & Lee, 1939). The original names of the devices have been kept.

5. Name calling (V: name calling). Name calling means giving something a negative label to make the audience reject it without examining evidence.

5.1. Present -> coded as 1
5.2. Absent -> coded as 0


Examples: labelling something as un-American or anti-European, labelling a tax a death tax, calling someone or something a femi-nazi, commie, multiculturalist, globalist or liberal in a negative context, or describing asylum seekers as invaders, illegal immigrants or a Muslim mob. Just stating an opinion (e.g., saying that something is bad or silly) is not considered name calling. For example, saying that the March of Europe is useless would not qualify as name calling; however, there are tweets describing it as "the butt-hurt march", which would be name calling.

6. Glittering generality (V: glittering generality). Glittering generality means giving something a positive label to make the audience accept it without examining evidence.

6.1. Present -> coded as 1
6.2. Absent -> coded as 0

Examples: calling a group of people good guys, or a militant group a patriot army.

7. Transfer (V: transfer). It means invoking the status of an idea or a person by comparing something else to it, in order to trigger certain associations.

7.1. Present -> coded as 1
7.2. Absent -> coded as 0

Examples:

1) comparing another event to September 11 to invoke the image of a tragedy or comparing a war to the Iraq War to invoke a particular image.

2) “#StopIslam The refugees will bring horror and terror to our country, look at Europe #IslamKills”

3) “That’s awful!! Radioactive leak had taken place at the Europe's largest nuclear plant, Zaporizhzhya. Trash! #FukushimaAgain”


8. Testimonial (V: testimonial). It means using a respected or a hated person to endorse or reject something.

8.1. Present -> coded as 1
8.2. Absent -> coded as 0

Examples: reference to a person who called another person’s idea brilliant or horrible.

9. Plain folks (V: folks). It means identifying someone or something (or the author of the tweet) as one of the people rather than the elite, or emphasizing that someone or something belongs not to the people but to the elite.

9.1. Present -> coded as 1
9.2. Absent -> coded as 0

Examples: talking about the interests of regular people, calling something elitist, using descriptions such as "we, Europeans" or "we, Americans", or stating that Europeans are shocked by something.

10. Bandwagon (V: bandwagon). For the purposes of this study, bandwagon means suggesting that because something enjoys high approval, others should hold the same opinion. As mentioned previously, several propaganda devices can be present in a single tweet, so there could potentially be overlap with the plain folks category. However, it is important to note that the bandwagon category goes further than merely separating regular people and their needs and opinions from the elite – it seeks to encourage others to follow the opinion of the mass. This does not necessarily mean only the opinion of the people; it also includes, for example, saying that most experts believe something and that it should therefore be believed by others too.

10.1. Present -> coded as 1
10.2. Absent -> coded as 0


Examples: saying that no sane person could object to something or that all Europeans / all people want something, or calling on people to be on the right side of history.

Coding sheet 2 example

Tweet ID | Tweet text | Fear | Action | Name calling | Glittering generality | Transfer | Testimonial | Folks | Bandwagon
1        | Text       | 1    | 0      | 0            | 0                     | 0        | 0           | 0     | 1
2        | text       | 0    | 0      | 0            | 0                     | 0        | 0           | 1     | 0
3        | text       | 1    | 1      | 1            | 0                     | 0        | 0           | 0     | 0
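Because the eight devices are coded as independent binary variables, per-device frequencies can simply be tallied column by column. Below is a minimal Python sketch using the three example rows above; the column names are illustrative assumptions, not part of the coding instrument.

```python
# Each tweet carries eight independent presence/absence codes, so the
# shares across devices can sum to more than 100%.
DEVICES = ["fear", "action", "name_calling", "glittering_generality",
           "transfer", "testimonial", "folks", "bandwagon"]

sheet2 = [
    {"fear": 1, "action": 0, "name_calling": 0, "glittering_generality": 0,
     "transfer": 0, "testimonial": 0, "folks": 0, "bandwagon": 1},
    {"fear": 0, "action": 0, "name_calling": 0, "glittering_generality": 0,
     "transfer": 0, "testimonial": 0, "folks": 1, "bandwagon": 0},
    {"fear": 1, "action": 1, "name_calling": 1, "glittering_generality": 0,
     "transfer": 0, "testimonial": 0, "folks": 0, "bandwagon": 0},
]

# Share of tweets in which each device is present.
for device in DEVICES:
    share = sum(row[device] for row in sheet2) / len(sheet2)
    print(f"{device}: {share:.0%}")
```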

Appendix B

Groups of actors

1. Africa (42 actors):

Africa, African, African education, African land, African moors, African people, African women, Africans, Ghana, N Africa, Northern Africa, Senegal, Somali, Gambia, Great African civilization, Japan, Nigeria, African men

2. Asia (35 actors):

Asia, Asia stocks, Asian shares, Asian stocks, Asians, Beijing, Bhutan, China, Chinese shares, Hong Kong, North Korea, Pakistan, Pakistanis, Tibet

3. Athletes/sports organizations (42 actors):

European athletics, Briton Willett, European Games, Lynx, Katarina Elhotova, European fans, Ryder Cup, European Ryder Cup team, African Soccer Players, Spanish clubs, Torrance, World Cup, European Soccer Clubs, European Soccer, Louis Smith, Lionel Messi, Fisher, FIFA head, Ronaldo, Knox, Serena Williams, Willett, Olympic official, European Sports Week, Rory McIlroy, Ciganda, Hull, Paris European champion, Federov, European Athletics chief, European Soccer Clubs, 2026 World Cup, Darren Clarke, European clubs association, FIFA, Lee, Charles Hill, Levy

4. Belgium (34 actors): Belgium, Brussels

5. Brexit (17 actors) – when Brexit was described in the context of doing something like unraveling Europe.

6. Businesses (55 actors):

Aeroflot, Airlines, Altice, Amazon, Amgen cholesterol drug, Apple, Calogon Carbon, Companies, EU companies, European firms, European grocery giants, European oil and gas majors, European parcel firms, European pharma industry, Exxon Mobil, Gazprom, Giant, Gilead, Google, Heineken, Hewlett-Packard, HSBC Holdings, Kia Motors, LNG, Malaysia Airlines, Netflix, Novartis, Renault, SAP, Tim Cook, Uber, Visa, Visa Europe, Vodafone, Volkswagen, VW, Walgreens, Wall street, Yahoo European Entertainment, Youtube, Facebook

7. Christians:

Bishop, Catholic priest, Catholics, Christians, European Christians, Jesuit monk, Pope, Pope Francis, The Christian Slavic Peoples Of Eastern Europe, The Pope

8. Eastern Europe (16 actors):

East Europe leaders, Eastern Europe, Eastern European NATO force, Eastern European state broadcaster, Eastern Europeans

9. EU/EU institutions (213 actors) – includes EU, its institutions and leaders:

Council of Europe, Donald Tusk, EP, EU, EU council, EU countries, EU digital chief, EU flag, EU leaders, EU sanctions, EU superstate, EU-27, Euro, European leaders, European Space Agency, European antitrust regulators, European army, European authorities, European Central Bank, European chamber, European Commission, European Commission president, European Council, European Council president, European court, European Court of Justice, European creditors, European diplomats, European flight safety authority, European government, European judges, European lawmakers, European leaders, European officials, European parliament, European powers, European regulators, European Union, European Union foreign ministers, European Union leaders, Europol, Eurozone, Faceless bureaucrats in Brussels, Finance ministers, Former European commissioner, Jean Claude Juncker, Juncker, President of the European Parliament, Pro-Europe rally, The European Parliament, Tusk, Vestager, Vice President of European Commission, Federica Mogherini

10. Europe (820 actors) – includes actors that might refer to Europe the continent not the EU: Europe, European politicians, Europe leaders, European bank, European monarchy, European royals, European Zionist lapdog politicians

11. European economy/stocks/markets (40 actors):

European markets, European shares, European stocks, Europe’s economy, stocks, European bonds and gold, European corporate debt, European entertainment stocks

12. European states/cities (100 actors) – including countries as actors if they were not mentioned frequently enough to form a group of their own:

Amsterdam, Austria, Baltics, Barcelona, Bulgaria, Bulgarian militia, Central Europeans, Copenhagen, Countries, Czech delegation, Czech Republic, Czechs, Denmark, Dutch counterterrorism official, Dutch intelligence, European capitals, European cities, European countries, European nation, European neighborhoods, European streets, Finland, Gothenburg, Hungarian Intelligence Expert, Hungarians, Hungary, Hungary police, Italian PM, Italian police,
