Speaking (Un)Truth? Politicians’ Moral Rhetoric and Communicative (Un)Truthfulness on Facebook

Academic year: 2021

Share "Speaking (Un)Truth? : politicians’ Moral Rhetoric and Communicative (Un)Truthfulness on Facebook"

Copied!
50
0
0

Bezig met laden.... (Bekijk nu de volledige tekst)

Hele tekst


Speaking (Un)Truth?

Politicians’ Moral Rhetoric and Communicative (Un)Truthfulness on Facebook

Sophie Minihold 11354151

Master’s thesis

Graduate School of Communication

Research Master’s programme Communication Science

Supervised by Dr. Michael Hameleers

28/06/2019


Abstract

Today’s polarized online media environment may pave the way for politicians to spread communicative (un)truthfulness. Although research has begun to conceptualize the nature and political consequences of discourses surrounding mis- and disinformation, comprehensive insights into their scope on Social Network Sites (SNSs), and into how citizens receive this rhetoric, are lacking. To advance this knowledge, this article draws on an extensive Automated Content Analysis (N = 11,911) of Austrian party leaders’ Facebook postings from 2009 to 2019. Three dictionaries of (un)truthful communication were compiled from the extant literature and an inductive pre-study: (1) the epistemic status of truth (reality), (2) accusations of unintended untruthfulness (misinformation), and (3) attributions of intended misleading (disinformation). In line with the predictions, these discourses of (un)truthfulness were surrounded by a moral tone expressing judgements of “right” and “wrong”. Disinformation discourse is more likely to be linked to moral rhetoric than the construction of reality or misinformation discourse. Interestingly, the results suggest that the party leader of the extreme right-wing FPÖ uses more accusations of intended untruthfulness, but fewer reality and misinformation discourses, than other party leaders. The FPÖ is, however, more likely than other parties to pair the latter two discourses with moral rhetoric. Facebook postings about the epistemic status of truth, with and without moral words, trigger user engagement in the form of likes. These results provide novel insights into the dominance of discourses of truth and morality on social media.

Keywords: (un)truthful communication, reality construction, misinformation, disinformation, moral rhetoric, Austria, FPÖ, Automated Content Analysis, Facebook


Speaking (Un)Truth?

Politicians’ Moral Rhetoric and Communicative (Un)Truthfulness on Facebook

The current “Post-Truth” era (Oxford Dictionary, 2016) is characterized by a fraught relationship between politicians and the media. Omnipresent in that regard is the buzzword “Fake News”, which refers to untruthful communication or to attributions of untruthfulness to (political) actors. An example from Austria in 2018 illustrates this phenomenon. The former Vice-Chancellor of Austria and former party leader of the right-wing populist FPÖ, Heinz-Christian Strache, discredited the Austrian public broadcaster (ORF) and its employees on Facebook. Strache posted a photo mimicking an ORF advertisement, featuring the prominent news anchor Armin Wolf and the text: “There is a place where lies become news. It’s the ORF. The best of fake news, lies and propaganda, pseudo-culture and forced licensing fees.” (Mandlik, 2018). Strache later apologized for this statement, but nevertheless referred to the satirical character of his posting, thereby legitimizing his attributions of blame to the media and political opponents. Politicians labelling certain media outlets as untruthful is not a recent phenomenon but rather fits into a long tradition in political communication (Katsirea, 2019). The literal term “Fake News” can be traced back to the beginning of World War I (Sunstein et al., 2018), and during World War II its German counterpart “Lügenpresse” was used to denounce critical journalists and delegitimize unpatriotic media outlets (Blasius, 2015). Today, both terms are experiencing a revival in the political communication of Austrian political parties. However, this dilemma of politicians accusing the media of untruthful reporting is not confined to Austria. The most famous case is Donald Trump, who popularized the term during the 2016 presidential elections in the United States (Allcott & Gentzkow, 2017); other political figures from, among other countries, Syria, Venezuela, Russia, Myanmar (Erlanger, 2017), Germany (Schwarz, 2016), and the Netherlands (van der Pol & van den Ven, 2018) followed his example. Thus, this study argues that this practice of delegitimizing media reports is a worrying, international trend in politicians’ political communication.

These practices of weakening the status of the media and the legitimacy of journalism with terms related to “Fake News” are both a cause and an outcome of the present “Post-Truth” climate. There is a growing public and scientific debate on the epistemic status of “truth”. In the current era of “post-factual” relativism (Van Aelst et al., 2017), facts are considered challengeable and are hence regarded as having the same status as opinions. Likewise, personal beliefs and emotions are at times given more weight than facts (Rochlin, 2017). Some argue that searching for the “truth” in public discourse is becoming irrelevant (Berghel, 2017); others are concerned that we are on the way to a dystopia in which experts are labelled untrustworthy if they report facts that contradict the views of some (Lewandowsky, Ecker, & Cook, 2017). How did we end up in this post-factual era? Scholars argue that fragmented media environments contribute. Much public discourse happens online, where anyone can disseminate untruthful communication alongside verified, accurate information (Van Aelst et al., 2017). This means that journalistic gatekeepers can be avoided, and different purveyors of untruthful communication gain direct access to the public.

Despite increasing scholarly attention to mis- and disinformation, we lack empirical data on the discourse surrounding claims of (un)truthful communication on Social Network Sites (SNSs): Who delegitimizes whom, and in what ways? The main goal of this study is therefore to provide an overview of the discourse in which both “Fake News” and “Post-Truth” have been embedded by relevant actors who can influence the formation of public opinion: politicians. This study places a special focus on political moralizing on SNSs, which can be used as a means of enhancing influence.

On SNSs like Twitter and Facebook, users select content of interest to themselves and, additionally, algorithms recommend content which might interest them and is supposedly in line with their beliefs. The latter might decrease the diversity of information users encounter online and reinforce previously held views (Pariser, 2011). Some even worry about users’ capability to engage in moral thinking online because of this algorithmic pre-selection of what is “right” or “wrong” (Peters, 2018). Following this line of thought, if SNS users are exposed to politicians who share their political standpoints and who use moral rhetoric to distinguish for them what is “right” or “wrong”, “good” or “bad”, they might be persuaded by the politician’s message (Clifford, Jerit, Rainey, & Motyl, 2015). However, the extent to which self-selected or algorithmic personalization has an effect is debated; some argue that empirical evidence for the existence of so-called “filter bubbles” is limited (Borgesius et al., 2016).

The implications of the “Post-Truth” era for society are manifold. The “Post-Truth” era and moral rhetoric are both said to increase political polarization (Van Aelst et al., 2017). This means that cleavages between policy supporters and opponents grow because it becomes increasingly difficult for them to find common ground or to accept compromise (Anderson et al., 2014). Public discourse becomes especially polarized online, where it is easy for like-minded people to connect and reinforce each other’s views in so-called “echo chambers” (Del Vicario et al., 2016). On Facebook, an echo chamber might be cultivated on the official pages of politicians, who disseminate similarly framed information and, as a consequence, shape people’s perceptions of issues (Lecheler & De Vreese, 2011).

Another implication is decreased media trust, as the Reuters Digital News Report shows. In Austria, overall trust in the news dropped by four percentage points in 2018, to 41.0 % (Newman, Fletcher, Kalogeropoulos, Levy, & Nielsen, 2018). News consumption on SNSs increased while news consumption via TV decreased (European Commission, 2018). However, the “Fake News” dilemma does not only affect media institutions; it also sets the tone for the public discourse on the truthfulness of politicians. Furthermore, there might be differences between political actors in their use of moral rhetoric and claims of untruthful communication on SNSs. An emphasis on morality is more common among populist actors (Mudde, 2004), who contribute to the erosion of trust in established media by, for example, truth deflection (Dahlgren, 2018). This study aims to gather information about the extent to which moral rhetoric is used in these discourses by Austrian politicians.

Against this backdrop, an explorative approach is used to investigate (1) to what extent the rhetorical construction of (un)truthful communication takes place in the Facebook postings of Austrian politicians, (2) whether and to what extent reality, mis-, and disinformation discourses are linked to the use of moral rhetoric, (3) whether and to what extent right-wing populist parties communicate in this regard, and (4) how and in what ways user engagement on Facebook is triggered by these discourses.

The Public Debate on the Epistemic Status of Knowledge

“Post-Truth” and “Fake News” are popular labels in the current public debate on knowledge and characterize the media-politics relationship worldwide. However, both terms are socially and politically charged because they are simplifying and attention-grabbing (Corner, 2017) as well as ambiguous in meaning (Jasanoff & Simmet, 2017; Tandoc Jr, Wei Lim, & Ling, 2018). So far, research on the rhetoric surrounding both concepts is lacking. To advance our understanding of the discourse in which debates about (un)truth are held online, this study first highlights the public debate on knowledge, then offers insights on political moralizing, and explains the role of different political actors in constructing this rhetoric.

“Post-Truth” References in Politicians’ Online Discourse

“Post-Truth” can refer to nostalgia for a time when the public could comfortably rely on government officials and experts to hold authority over facts (Marres, 2018). More recently, a shift has taken place: experts who try to hold authority over facts may be labelled untrustworthy if their claims threaten the world view of powerful others (Lewandowsky et al., 2017). The debate over knowledge and “truth” has long been a political power game (Foucault, 1970; McIntyre, 2018). Indicatively, politicians influence voters by expressing opinions on information rather than trying to convince them with facts (Paccagnella, 2018; Rochlin, 2017). Some argue that emotion and personal beliefs are the new form of evidence (Rochlin, 2017). The Oxford Dictionary likewise defines “Post-Truth” as “relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief” (2016, para. 1). By relying on opinions rather than facts, knowledge has become debatable in public discourse. This is the main issue of “post-factual” relativism: it becomes increasingly difficult to find common factual ground for discussions (Van Aelst et al., 2017).

To fully understand the current debate on the status and shared understanding of knowledge, a thorough analysis of the political climate in which it flourishes is necessary. This thesis argues that (1) a lost common perception of reality, (2) increased political polarization, (3) a fragmented media landscape, (4) the discourse on “Fake News”, and (5) decreased trust in (media) institutions contribute to the emergence and perpetuation of the “Post-Truth” world.

First, the debate on what is knowledge and what is not can be traced back to a loss of shared reality (Arendt, 1955). According to Arendt, the perfect preparation for convincing people of a certain Weltanschauung is to rid them of their own sense of “true” and “false”. The Atlantic summarized this dilemma as “nothing seems true and everything seems possible” (Friedman, 2017, para. 13). Once the line between fact and fiction is blurred, the stage is set for politicians to offer their versions of reality to the public.

Second, political polarization is an important societal trend (Lewandowsky et al., 2017) that fuels the emergence of online (un)truthfulness discourses (Humprecht, 2018). Its definition is debated in the literature (Boxell, Gentzkow, & Shapiro, 2017; Prior, 2013); however, political polarization can generally be described as a growing gap between partisans of opposing political parties with different policy preferences. The concept has been extensively researched in the US, with a focus on Republicans versus Democrats, but less so in countries with multi-party systems. Research in the US shows that partisans are more likely to trust co-partisans (Iyengar & Westwood, 2015). This means that a partisan bias might influence voters in assessing the credibility of information if it is disseminated by a trusted politician. In Austria, a multi-party democracy, the literature suggests polarization between the right-wing populist Freedom Party of Austria (FPÖ) and all other parties (Meffert & Gschwend, 2010). However, more recent and thorough analyses of political polarization are lacking.

Third, a fragmented media landscape with increased use of online media contributes to a growing relativism because everyone can disseminate information, whereas algorithms do not filter out untrue content (Van Aelst et al., 2017). In Austria, 85.0 % of the population have access to the Internet according to the 2018 Reuters Digital News Report. Two thirds of them (63.0 %) use Facebook, which makes it the third most often used Social Network Site (SNS) after WhatsApp (67.0 %) and YouTube (66.0 %); only 12.0 % use Twitter (Peters, 2018). In comparison, Germans use Facebook less (52.0 %) and the Dutch (65.0 %) more often than Austrians.

Scholars have questioned the “truth carrying capacity” (Del Vicario et al., 2016, p. 564) of SNSs like Facebook because of “filter bubbles” (Pariser, 2011), in which algorithms select content based on the user’s previous online behaviour, and echo chambers (Peters, 2018), in which previously held beliefs are reinforced by like-minded people. Others argue that neither echo chambers on SNSs (Gentzkow & Shapiro, 2011) nor “filter bubbles” (Borgesius et al., 2016) are as effective as anticipated. However, Facebook has been shown to foster selective exposure among its users (Bakshy, Messing, & Adamic, 2015), and the combination of expressing opinions online and polarized networks, two key characteristics of echo chambers, has been shown to contribute to the circulation of false information (Törnberg, 2018). Furthermore, anyone can post content online, which means that the traditional gatekeepers who verified information prior to publication are circumvented (Ernst, Engesser, Büchel, Blassnig, & Esser, 2017; Wendler, 2014).

Two implications of this relativistic climate have been identified. First, trust in institutions is declining (Bennett & Livingston, 2018). The 2018 Eurobarometer report investigated the level of trust in news sources across 28 EU Member States. Austrians hardly trust information or news accessed on SNSs (17.0 %), which places them last EU-wide (European Commission, 2018). While overall trust in the media decreased, Austrians indicate that they trust the media they use themselves more (55.0 %) (Newman et al., 2018). The numbers are similar worldwide. For Austria, the report concludes that the decreased trust might be connected to the recent divisive elections centred around the FPÖ, the increased use of social media, and the influence of politicians on public opinion formation. In line with that argument, politicians can contribute to the decrease of trust in the media (Ladd, 2010, 2012). One way for politicians to influence public opinion about (un)truthful communication is to undermine the status of the media through verbal attacks such as blame attribution (Iyengar, 1991). However, we lack research on how political actors construct online discourse around knowledge and “truth”. Therefore, the first research question covers online discourse on reality construction:

RQ1: To what extent do Austrian politicians make use of “reality” discourse in their Facebook postings?

(Un)truthful Communication: Attributing Mis- and Disinformation on SNSs

The second implication concerns the discourse surrounding “Fake News”. This ambiguous term captures concepts related to untruthful communication regardless of the communicator and the context in which it is used (Corner, 2017; Tandoc Jr et al., 2018). In the scientific literature, the term has been operationalized as advertising, fabrication, parody, manipulation, satire, or propaganda (Tandoc Jr et al., 2018). Other scholars argue that the differentiation between misinformation and disinformation best captures the complexity of the issue (Wardle, 2017). Misinformation is false information that has been shared unintentionally (Floridi, 2005; Thorson, 2016; Wardle, 2017), while disinformation is false information that has been created and disseminated deliberately (Bennett & Livingston, 2018; Floridi, 2005; Wardle, 2017). In that regard, it is irrelevant whether the communicator is affiliated with media institutions. It does, however, matter which labels are used to describe information, because these labels carry different meanings. Politicians can choose from an abundance of words to express themselves. If they use categories such as mis- or disinformation to comment on (media) content, they signal the motivation behind its spread, how their voters should perceive the content, and whether and how to counter it. Consequently, politicians can create a crisis of legitimacy because they can deny or ascribe credibility to certain information. This means that politicians set the atmosphere in which public discourse happens. This paper therefore focuses on untruthful communication labelled as misinformation and disinformation.

First steps have been taken to automatically detect misinformation and disinformation online (summarized in Søe, 2018). However, these research projects focused on the algorithmic differentiation of true versus false information, while a crucial difference, the deliberate or unintentional spread of false information, has been neglected (Søe, 2018). To investigate the discourse of misinformation and disinformation online, a different approach is needed. More specifically, we need to understand how these different references to untruthfulness are constructed and attributed. Therefore, based on the literature and an inductive qualitative content analysis, word lists were created to identify the salience of the discourses in which the concepts of misinformation and disinformation are embedded in politicians’ online communication. The second research question therefore explores:

RQ2: To what extent and in what ways do Austrian politicians make use of discourse evolving around misinformation and disinformation in their Facebook postings?

Moral Rhetoric in Communicated (Un)Truthfulness

Some scholars argue that the reinforcement of prejudices online restrains people from engaging in moral thinking themselves (Peters, 2018). The task of taking a moral stance is left to trusted others in their SNS bubble, for example politicians. Moral rhetoric is widely used in political discourse across all parties to signal to voters what is right or wrong, good or bad, and this tactic has been shown to be persuasive (Clifford et al., 2015) as well as polarizing (Anderson et al., 2014). Political moralizing influences public opinion because it links political issues to deeply held beliefs and signals how they ought to be perceived (Anderson et al., 2014). Moral Foundations Theory (MFT) shows that political moralizing does more than signal “good” or “bad”. Its Moral Foundations Dictionary consists of five categories, each of which includes words related to virtues and vices: care/harm, fairness/cheating, loyalty/betrayal, authority/subversion, and sanctity/degradation (Graham et al., 2012). Research in the US demonstrated that liberals focus mostly on the dimensions of Care and Fairness when making moral judgements. While the category care/harm focuses on the protection and care of individuals, fairness/cheating is concerned with cooperation and mutuality, but also cheating. Politicians who want to reach US conservatives should instead use moral discourse involving words from the remaining three categories, Loyalty, Authority, and Sanctity (Graham, Haidt, & Nosek, 2009).

The category loyalty/betrayal covers patriotic motivations and self-sacrifice for the group, but also betrayal and the disloyal behaviour of allies. Authority/subversion conglomerates words relating to (dis)respect of and (dis)obedience towards authorities. Finally, the sanctity/degradation category covers virtues like sacredness and beauty, while its vices concern pollution and degradation (Sagi & Dehghani, 2014). Moral rhetoric is assumed to be used in discourses around (un)truthful communication because it is a powerful political tool through which the political attitudes of voters can be fuelled and steered in the direction the politician intends (Anderson et al., 2014). Guiding the formation of public opinion in a specific way seems to be part of the intention behind reality, mis-, and disinformation discourses, because these discourses are about taking sides. Especially the mobilizing effect of political moralization (Jung, 2018) might be useful when contradicting information exists and politicians want to convince voters of the “true reality”. However, research on the link between political moralizing and (un)truthful political online communication has not yet been done. To fill this gap, the following hypothesis is forwarded:

H1: If politicians engage in discourse on reality construction, misinformation, or disinformation, they are more likely to use moral rhetoric than when their discourse lacks references to reality construction, misinformation, and disinformation.

Unlike misinformation, disinformation refers to intentionally misleading information. When contributors to disinformation discourse highlight this intention behind the spread of false content, sides are taken. It is argued that this judgement of who is right and who is wrong is linked to moral rhetoric more strongly than discourses of misinformation or reality construction, because the distinction between good and bad is the key pillar of moral rhetoric. Hence, hypothesis two is formulated:

H2: If politicians engage in discourse on disinformation, they are more likely to use moral rhetoric than in discourses on misinformation.

Right-Wing Extremists and their (Un)Truthfulness Discourses

Research on MFT shows that, depending on people’s political affiliation, different moral values are deemed important. Politicians therefore need to adapt their rhetoric accordingly to reach their voter base (Graham et al., 2009). While research in the US continuously highlights the difference between liberals and conservatives when it comes to political moralizing, one study in the European context differentiates between centre-left/right, mainstream, and extreme parties using GALTAN dimensions (Hooghe et al., 2010). They concluded that ideologically extreme parties moralize political issues more than non-extreme parties when debating EU treaties (Wendler, 2014). However, more empirical data is needed to expand our knowledge of the use of moral rhetoric by political actors with different political leanings.

The right-wing FPÖ, classified as ideologically extreme, follows a right-wing populist political discourse (Mudde, 2000). Morality is a key characteristic of the political discourse of populist parties, as they divide society into good versus bad people (Mudde, 2004). Wendler (2014) has shown that the FPÖ indeed moralizes more than other Austrian parties. Furthermore, populists are said to benefit from the current “Post-Truth” climate, as their very basis is built on opposing the elite’s version of reality whenever it counters the party’s narrative (Waisbord, 2018). Therefore, it is assumed that they contribute to the discourse surrounding “truth” and “fake”. Thus, the following hypothesis is put forward:

H3: Right-wing extremist parties such as the FPÖ use (a) moral rhetoric and (b) reality construction, (c) mis-, and (d) disinformation discourse more often in their Facebook postings compared to other political parties in Austria.

User Engagement in Postings featuring Morality and (Un)Truthfulness

Reinforcement of previously held beliefs can take the form of an “illusory truth effect” (Goldstein, Hasher, & Toppino, 1977): the more often people are exposed to a certain claim, the more likely they are to believe it is true, even if it is false. If citizens are continuously exposed to politicians’ discourse on reality, mis-, and disinformation in their Facebook newsfeed, they might adopt the discourse and express agreement with it. This is even more likely if the discourse is paired with moral rhetoric, which might serve as a means to process information more easily because it suggests how the information ought to be perceived. Easily understood information is more likely to be judged true (Reber & Schwarz, 1999). Indeed, Twitter research on politicians has shown that tweets including moral rhetoric or emotional cues have a greater likelihood of being shared (Brady, Wills, Jost, Tucker, & Van Bavel, 2017). As research on this topic is lacking, the following hypothesis explores user engagement with Facebook postings featuring (un)truthfulness and morality:

H4: If politicians make use of moral rhetoric in their discourse on reality construction, mis-, and disinformation in Facebook postings, they trigger more user engagement than in postings including discourse on reality construction, mis-, and disinformation without moral rhetoric.

Method

Data collection and sample

This research draws on an extensive Automated Content Analysis (ACA) of the Facebook postings of Austrian political party leaders, covering June 2009 through May 2019 (N = 11,911). The Austrian context is suitable for analysis because of the popularity of Facebook among politicians as a means to reach voters. The publicly available data were collected from the Facebook pages of party leaders using the data extraction tool Netvizz (Rieder, 2013). First, using Python 3.7.3 (Rossum, 1995), the data were organized in a pandas data frame (McKinney, 2010), using the glob module to find path names matching a specified pattern. Then, the content of the Facebook postings was cleaned of stop words (using the Natural Language Toolkit, NLTK) and punctuation, and non-ASCII characters were converted to their ASCII equivalents (Pedregosa et al., 2011). New variables were added, such as party membership and the politician’s name. Subsequently, a dictionary-based approach was applied. The data were analyzed in IBM SPSS Statistics 26 and visualized in R 3.5.1 (R: A Language and Environment for Statistical Computing, 2010).
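The preprocessing steps described above can be sketched roughly as follows. This is a hypothetical reconstruction, not the thesis code: the glob pattern, the tab separator, the `post_message` column name, and the fallback stop-word list are assumptions about the Netvizz export format.

```python
import glob
import string
import unicodedata

import pandas as pd

try:
    # The thesis reports using NLTK for stop-word removal.
    from nltk.corpus import stopwords
    GERMAN_STOPWORDS = set(stopwords.words("german"))
except (ImportError, LookupError):
    # Tiny illustrative fallback if the NLTK corpus is unavailable.
    GERMAN_STOPWORDS = {"der", "die", "das", "ist", "eine", "und", "ein"}


def load_posts(pattern: str) -> pd.DataFrame:
    """Concatenate all Netvizz exports matching a glob pattern into one frame."""
    files = glob.glob(pattern)  # e.g. "netvizz/*_fullstats.csv" (assumed layout)
    return pd.concat((pd.read_csv(f, sep="\t") for f in files), ignore_index=True)


def clean_text(text: str) -> str:
    """Lowercase, strip punctuation, drop stop words, transliterate non-ASCII."""
    text = text.lower().translate(str.maketrans("", "", string.punctuation))
    tokens = [t for t in text.split() if t not in GERMAN_STOPWORDS]
    # NFKD decomposition followed by ASCII encoding maps e.g. "ö" to "o".
    return (unicodedata.normalize("NFKD", " ".join(tokens))
            .encode("ascii", "ignore").decode("ascii"))
```

Applied to a loaded data frame, this would amount to something like `posts["clean"] = posts["post_message"].fillna("").map(clean_text)`, after which party membership and the politician’s name can be added as extra columns.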

Measures

Word lists. The dictionaries used to identify the discourses in which the epistemic status of knowledge (75 words), misinformation (83 words), and disinformation (108 words) are embedded were compiled from the literature (e.g. Van Aelst et al., 2017; Tucker & Guess, 2018) and an inductive pre-study. For the pre-study, the Facebook postings of Austrian party leaders during the National Council election period of 15 July to 15 October 2017 were investigated.

The word list used to identify moral rhetoric was based on the Moral Foundations Dictionary (Graham et al., 2009), comprises 789 entries, and was translated into German. Two rounds of reliability testing were conducted, during which all word lists (see Appendix A) were refined by replacing overly generic words with more specific ones. Validity was established once the Python script no longer produced false positives.

To assess the reliability of the code that automatically detected relevant words, a random sample of 114 units of analysis was coded manually. After adjusting the word lists, the inter-coder reliability test was repeated with a random sample of 100 units of analysis. Comparing automatically and manually coded units of analysis, the word lists referring to the epistemic status of knowledge (α = .97, percentage agreement = 97.3 %), misinformation (α = .95, percentage agreement = 98.3 %), disinformation (α = .79, percentage agreement = 96.9 %), and moral rhetoric (α = .97, percentage agreement = 98.3 %) showed high inter-coder reliability (a Krippendorff’s α of .67 is a common benchmark in content analysis, with higher values indicating better reliability; Hayes & Krippendorff, 2011; Krippendorff, 2004).
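The comparison of automatically and manually coded units can be illustrated with a small sketch. The Krippendorff’s α implementation below (nominal data, two coders, no missing values) is a generic textbook formulation, not the script used in the thesis.

```python
from collections import Counter


def percentage_agreement(coder_a, coder_b):
    """Share of units on which both coders assigned the same value."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)


def krippendorff_alpha(coder_a, coder_b):
    """Krippendorff's alpha for nominal data, two coders, no missing values."""
    n = len(coder_a)
    coincidences = Counter()
    for a, b in zip(coder_a, coder_b):
        coincidences[(a, b)] += 1  # each unit contributes both ordered pairs
        coincidences[(b, a)] += 1
    totals = Counter()
    for (value, _), count in coincidences.items():
        totals[value] += count
    pairable = 2 * n
    observed = sum(v for (c, k), v in coincidences.items() if c != k) / pairable
    expected = sum(totals[c] * totals[k]
                   for c in totals for k in totals if c != k) / (pairable * (pairable - 1))
    return 1.0 if expected == 0 else 1 - observed / expected
```

The automatic codes (from the dictionary script) and the manual codes for the same random sample would simply be passed in as two equal-length lists of 0/1 values.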

Using a dictionary-based approach means that each string of text in the Facebook postings was checked for the presence of items from each word list using regular expression operations. The word counts were then stored as variables for the respective word lists. For the analysis, binary variables were constructed for the epistemic status of knowledge, mis-, and disinformation, as well as for moral rhetoric, to indicate the presence or absence of the respective words.
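The dictionary matching can be sketched as follows; the three miniature word lists here are illustrative stand-ins for the full dictionaries in Appendix A, which are not reproduced in the text.

```python
import re

# Illustrative mini-dictionaries (regular expressions); the real lists comprise
# 75 (reality), 83 (misinformation), and 108 (disinformation) entries.
DICTIONARIES = {
    "reality":        [r"wahrheit\w*", r"fakt\w*", r"realit\w*"],
    "misinformation": [r"falschmeldung\w*", r"irrtum\w*"],
    "disinformation": [r"fake\s?news", r"l(?:ü|ue)ge\w*", r"propaganda"],
}


def code_posting(text: str) -> dict:
    """Count dictionary hits per discourse and derive binary presence variables."""
    text = text.lower()
    coded = {}
    for discourse, patterns in DICTIONARIES.items():
        hits = sum(len(re.findall(pattern, text)) for pattern in patterns)
        coded[discourse + "_count"] = hits             # word-count variable
        coded[discourse + "_present"] = int(hits > 0)  # binary variable
    return coded
```

Running `code_posting` over every cleaned posting yields exactly the count and presence/absence variables the analysis relies on.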

Political parties. Only leaders of National Council parties who had a public Facebook page during their term were included in the data collection, because their statements might carry more weight compared to those of other political parties. This includes Christian Kern and Pamela Rendi-Wagner from the Social Democratic Party (SPÖ), Reinhold Mitterlehner and Sebastian Kurz from the Austrian People’s Party (ÖVP), Heinz-Christian Strache from the Freedom Party of Austria (FPÖ), Matthias Strolz and Beate Meinl-Reisinger from the New Austria and Liberal Forum (NEOS), and Peter Pilz as well as Maria Stern from Liste Pilz/JETZT. Almost half of the Facebook postings are from Strache (n = 5,460 out of N = 11,911), who already pioneered using Facebook as a party leader in 2009. Others joined the SNS years later and post less frequently.

User engagement. Facebook users have various means of reacting to a posting. With “likes” and reactions (“love”, “haha”, “sad”, “wow”, “angry”, and “thankful”) they show minimum-effort engagement, while writing comments takes more time and is therefore a stronger sign of involvement, as are “shares”, which are incorporated as postings on users’ own Facebook walls. A principal component analysis (PCA) of the four items shows that only one component has an Eigenvalue above 1 (2.12), and there is a clear point of inflexion after this component in the scree plot. All items correlate positively with the first component; the variable “comments” has the strongest association (factor loading = .82). However, the reliability of the scale is poor, Cronbach’s α = .53, and deleting an item would not make the scale more reliable. Therefore, the different means of engagement, namely the number of likes (M = 1333.76, SD = 2454.04), shares (M = 280.28, SD = 2434.17), comments (M = 200.65, SD = 389.62), and reactions (M = 99.29, SD = 390.86), will be analysed separately to detect differences.
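The scale diagnostics reported above (component Eigenvalues and Cronbach’s α) can be reproduced generically with NumPy; this is a sketch of the statistics, not the SPSS procedure the thesis used, and any matrix passed in here is assumed toy data rather than the engagement metrics.

```python
import numpy as np


def pca_eigenvalues(items: np.ndarray) -> np.ndarray:
    """Eigenvalues of the item correlation matrix, sorted descending.
    Kaiser's criterion retains components with an Eigenvalue above 1."""
    corr = np.corrcoef(items, rowvar=False)
    return np.sort(np.linalg.eigvalsh(corr))[::-1]


def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a units-by-items matrix."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)
```

A four-column matrix of likes, shares, comments, and reactions would be passed to both functions; an α of .53, as found here, is usually taken to justify analysing the items separately rather than as one scale.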

Controls. Included control variables were the identification number of Facebook

postings, the ID of the publisher, the publication date, the type of posting such as status, video, photo, or link, as well as the words per posting.

Results

To give an overview of the data, Table 1 presents the relation between moral rhetoric and the discourses on reality construction, mis-, and disinformation in Facebook postings by party leaders in Austria. While almost half of the analysed Facebook postings (n = 5750) contained at least one reference to morality, a fifth of the postings referred to (un)truthfulness (n = 2175). Overall, we can infer that morality is significantly, albeit with small effect sizes, associated with all three discourses of (un)truthfulness.

Table 1. Crosstabs for (Un)Truthfulness Discourses and Morality in Facebook Postings

Discourse                Reality                    Misinformation             Disinformation
Morality                 present       absent       present      absent       present      absent
relative % (counts)      8.8 % (1046)  3.7 % (436)  2.3 % (271)  0.4 % (43)   2.7 % (318)  0.5 % (61)
χ² (df)                  337.3 (1)                  186.8 (1)                 199.0 (1)
ϕ                        0.17***                    0.13***                   0.13***

Note: N = 11,911; *** p < 0.001; χ² = Chi-square; df = degrees of freedom; ϕ = Phi.
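The χ² and ϕ statistics in Table 1 follow from the standard 2×2 formulas; the sketch below reproduces the reality column, with the two cells not printed in the table derived from its margins (morality present: n = 5750; N = 11,911):

```python
from math import sqrt

def chi2_phi(a, b, c, d):
    """Chi-square statistic and phi coefficient for a 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    return chi2, sqrt(chi2 / n)

# Reality discourse x morality; 4704 and 5725 are derived from the margins
chi2, phi = chi2_phi(1046, 436, 4704, 5725)
print(round(chi2, 1), round(phi, 2))  # → 337.3 0.17
```

The computation reproduces the values reported for the reality column in Table 1.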

Moral Rhetoric in Political (Un)Truthfulness Discourses

To answer H1 and H2, binary logistic regressions were conducted in which the associations between morality and the discourses on reality construction, misinformation, and disinformation, as well as all three combined, were estimated. Both regressions controlled for the number of words per Facebook posting. First, it was hypothesized that if politicians engage in discourses on reality construction, misinformation, or disinformation, they are more likely to moralize the discourse (H1). As can be seen in Table 2, the likelihood that moral rhetoric is used is significantly higher in all three discourses. These results are supportive of H1. Merging the discourses on reality construction, mis-, and disinformation into one binary variable, it becomes apparent that postings with these discourses are indeed more likely to contain moral rhetoric than postings without them (b = 0.41, standard error (SE) = 0.07, p < 0.001, odds ratio = 1.50, 95% confidence interval (CI) = (1.32, 1.70)).
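Because the models are logistic, every odds ratio and percentage interpretation in this section is a transformation of a logit coefficient; a minimal sketch using the coefficients from Table 2 (small deviations from the printed ORs stem from coefficient rounding):

```python
from math import exp

def odds_ratio(b):
    """Convert a logit coefficient into an odds ratio."""
    return exp(b)

def pct_change_in_odds(b):
    """Percentage change in the odds of the outcome (here: moral rhetoric)."""
    return (exp(b) - 1) * 100

# Coefficients reported in Table 2 for the three discourses
for label, b in [("Reality", 0.19), ("Misinfo", 0.74), ("Disinfo", 0.76)]:
    print(label, round(odds_ratio(b), 2), round(pct_change_in_odds(b), 1))
```

For example, exp(0.19) ≈ 1.21, i.e. roughly 21 % higher odds of moral rhetoric in reality construction discourses.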

Table 2. The Use of Moral Rhetoric in (Un)Truthfulness Discourses

IV: discourses   B (SE)            OR     95 % C.I.
(Constant)       -1.60 (0.04)***
Reality           0.19 (0.07)**    1.21   (1.06, 1.40)
Misinfo           0.74 (0.19)***   2.09   (1.43, 3.05)
Disinfo           0.76 (0.16)***   2.13   (1.54, 2.94)
Nr of words       0.08 (0.00)***   1.08   (1.08, 1.09)
Nagelkerke R²     0.32
χ² (df)           3206.76 (4)***

Note: DV: morality; SE: standard error; CI: confidence interval; OR: odds ratio; df: degrees of freedom. Two-tailed tests. Unstandardized regression weights. SEs reported between parentheses. *p < 0.05; **p < 0.01; and ***p < 0.001.

Going into more detail, H2 postulates that politicians who engage in disinformation discourse are more likely to moralize it than politicians who partake in reality or misinformation discourses. H2 was confirmed: the likelihood of moral rhetoric being used is highest in disinformation discourses (b = 0.76, SE = 0.16, p < 0.001, odds ratio = 2.13, 95% confidence interval (CI) = (1.54, 2.94); see Table 2). This means that the odds of a posting containing moral rhetoric are 113.0 % higher when it features a disinformation discourse than when it does not. The difference from misinformation discourses is minimal (4.0 percentage points), while the difference from reality construction discourses is considerable (92.0 percentage points) (see Figure 1).

Figure 1. Odds Ratios demonstrating the probability of moral rhetoric being used in reality

construction, mis-, and disinformation discourses in Facebook postings by Austrian political party leaders. Dots represent odds ratios and lines represent 95 % confidence intervals. N = 11,911.


Right-Wing Extremism, Moral Rhetoric and (Un)Truthfulness Discourses

A binary logistic regression analysis was conducted which focussed on the FPÖ and its use of discourses on reality, mis-, and disinformation, morality, as well as the interactions of

(un)truthfulness discourses with moral rhetoric. H3 hypothesized that the FPÖ has a higher likelihood of engaging in all three (un)truthfulness discourses as well as of using moral words. Controlling for words per Facebook post, the analysis indicates that the FPÖ is 109.4 % more likely to engage in disinformation discourses in its Facebook postings compared to other political party leaders in Austria.

No significant differences for the use of reality construction and misinformation were found (see Figure 2). Interestingly, the FPÖ party leader is 28.0 % less likely to use moral rhetoric in his Facebook postings than other party leaders. However, FPÖ leader Strache is 55.0 % more likely to combine moral rhetoric and reality construction, and 219.0 % more likely to link postings on misinformation with moral words, than any other party leader. For disinformation, however, the interaction with morality is not significant. Furthermore, the FPÖ party leader is more likely to write shorter Facebook postings than other party leaders (b = -0.01, SE = 0.00, p < 0.001, OR = 0.99, 95% CI = (0.99, 1.00)). The FPÖ is more likely to engage in disinformation discourses compared to other parties but does not engage more frequently in reality construction or misinformation discourses. These results are partly supportive of H3. Moral rhetoric, however, significantly and positively moderates the relationship for reality construction and misinformation (see Table 3). This means that even though the FPÖ party leader does not use reality construction and misinformation more often than others, he combines these discourses significantly more often with moral rhetoric than other party leaders do. This resonates with the stronger Manichean discourse in right-wing populism, where references to (un)truthfulness are paired with morality. The contrary is the case for disinformation discourses: on Facebook, he is not more likely to pair them with moral words.
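The moderation terms in Table 3 are simple products of binary indicators; the sketch below (hypothetical codings, not the actual corpus) shows how such a design matrix can be assembled:

```python
# Each posting carries binary codings; interaction terms are element-wise products.
postings = [
    {"reality": 1, "misinfo": 0, "disinfo": 0, "morality": 1, "n_words": 42},
    {"reality": 0, "misinfo": 1, "disinfo": 0, "morality": 0, "n_words": 17},
]

def design_row(p):
    """Predictors for the FPÖ model: main effects plus discourse x morality terms."""
    return [
        p["reality"], p["misinfo"], p["disinfo"], p["morality"],
        p["reality"] * p["morality"],   # Reality x Morality
        p["misinfo"] * p["morality"],   # Misinfo. x Morality
        p["disinfo"] * p["morality"],   # Disinfo. x Morality
        p["n_words"],
    ]

X = [design_row(p) for p in postings]
print(X[0])  # → [1, 0, 0, 1, 1, 0, 0, 42]
```

The interaction column is 1 only for postings that contain both the discourse and moral rhetoric, which is exactly the combination the moderation hypotheses test.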

Figure 2. Odds Ratios demonstrating the probability of the FPÖ using reality construction, mis-, and disinformation discourses as well as morality, and the interactions of these

discourses with morality, in Facebook postings. Dots represent odds ratios and lines represent 95 % confidence intervals. N = 11,911.


The Impact of Moral Rhetoric and (Un)Truthfulness Discourses on Engagement Scores

H4 assumed that if politicians make use of moral rhetoric in their discourses of reality

construction, mis-, and disinformation in Facebook postings, they receive more user engagement than for postings without moral rhetoric. To test this assumption, a MANOVA was employed. As Figures 3 and 4 show, likes are by far the most popular form of user engagement with a politician’s Facebook post. Compared to mis-, and disinformation discourses, only postings featuring reality construction receive significantly more likes and reactions. If politicians combine this discourse with moral rhetoric, likes are the only form of user engagement that is triggered significantly. The results partly confirm H4: moral rhetoric does not automatically lead to more user engagement in the form of likes, reactions, shares or comments in mis-, and disinformation discourses, but it significantly increases the number of likes in reality construction discourses (see Table 4).
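The cell statistics underlying Table 4 and Figures 3–4 are per-group means with standard errors of the mean; a sketch on toy like counts (illustrative values only, not the corpus):

```python
from math import sqrt
from statistics import mean, stdev
from collections import defaultdict

# Toy postings: (discourse present?, morality present?, number of likes)
postings = [(1, 1, 1500), (1, 1, 1400), (1, 0, 1200), (1, 0, 1100),
            (0, 1, 900), (0, 1, 1000), (0, 0, 700), (0, 0, 800)]

# Group likes into the four discourse x morality cells
cells = defaultdict(list)
for discourse, morality, likes in postings:
    cells[(discourse, morality)].append(likes)

for key, likes in sorted(cells.items()):
    se = stdev(likes) / sqrt(len(likes))  # standard error of the cell mean
    print(key, round(mean(likes), 1), round(se, 1))
```

The same grouping, applied per engagement measure, produces the Mean (SE) cells that the MANOVA then compares across conditions.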

Table 3. The Use of (Un)Truthfulness Discourses by the FPÖ compared to other Political Parties

IV: discourses        B (SE)            OR     95 % C.I.
(Constant)             0.07 (0.03)*     1.07
Reality               -0.15 (0.10)      0.86   (0.71, 1.05)
Misinformation        -0.27 (0.34)      0.76   (0.39, 1.47)
Disinformation         0.74 (0.29)*     2.09   (1.19, 3.70)
Morality              -0.33 (0.04)***   0.72   (0.67, 0.78)
Reality x Morality     0.44 (0.12)***   1.55   (1.22, 1.98)
Misinfo. x Morality    1.16 (0.37)**    3.19   (1.56, 6.52)
Disinfo. x Morality   -0.17 (0.32)      0.85   (0.46, 1.58)
Nr. of words          -0.01 (0.00)***   0.99   (0.99, 1.00)
Nagelkerke R²          0.024
χ² (df)                464.16 (8)***

Note: DV: party membership; SE: standard error; C.I.: confidence interval; OR: odds ratio; df: degrees of freedom. Two-tailed tests. Unstandardized regression weights. SEs reported between parentheses. *p < 0.05; **p < 0.01; and ***p < 0.001.


User Engagement in Discourses With Morality on Facebook

Figure 3. Grouped bar charts demonstrating user engagement with Facebook postings in which morality was used, split by discourse. Bars represent mean scores and lines the standard error. N = 11,911.

User Engagement in Discourses Without Morality on Facebook

Figure 4. Grouped bar charts demonstrating user engagement with Facebook postings in which morality was not used, split by discourse. Bars represent mean scores and lines represent the standard error. N = 11,911.


Table 4. Testing the Effects of Reality, Misinformation, and Disinformation Discourses and the Moderation of Moral Rhetoric on User Engagement in Facebook Postings; including MANOVA Results

Cell entries are Mean (SE), followed by the 95 % C.I.

Reality discourse
  absent, no morality:   likes 1176.48 (209.98), (764.88, 1588.07);  reactions 118.60 (33.46), (53.00, 184.19);  comments 214.35 (33.39), (148.89, 279.81);  shares 235.09 (208.74), (-174.06, 644.25)
  absent, morality:      likes 1482.44 (94.16), (1297.86, 1667.01);  reactions 156.16 (15.01), (126.75, 185.58); comments 249.94 (14.98), (220.56, 279.29);  shares 342.10 (93.60), (158.62, 525.58)
  present, no morality:  likes 1527.56 (234.44), (1068.03, 1987.10); reactions 156.88 (37.36), (83.65, 230.12);  comments 238.67 (37.28), (165.58, 311.75);  shares 433.44 (233.05), (-23.38, 890.26)
  present, morality:     likes 1483.03 (11.99), (1261.53, 1704.52);  reactions 179.54 (18.01), (144.24, 214.84); comments 258.30 (17.97), (223.08, 293.53);  shares 352.87 (112.33), (132.69, 573.05)
  F (η²), reality:             likes 5.421* (0.0004); reactions 6.563** (0.0005); comments 1.851; shares 1.940
  F (η²), reality x morality:  likes 5.512* (0.0004); reactions 0.392; comments 0.451; shares 1.598

Misinformation discourse
  absent, no morality:   likes 1479.93 (176.06), (1134.82, 1825.03); reactions 102.45 (28.06), (47.45, 157.45);  comments 224.36 (28.00), (169.48, 279.24);  shares 376.76 (175.02), (33.69, 719.82)
  absent, morality:      likes 1337.47 (83.97), (1172.87, 1502.07);  reactions 140.22 (13.38), (113.99, 166.46); comments 216.13 (13.36), (189.95, 242.30);  shares 306.32 (83.48), (142.69, 469.94)
  present, no morality:  likes 1224.12 (376.45), (486.21, 1962.02);  reactions 173.03 (59.99), (55.43, 290.63);  comments 228.66 (59.87), (111.31, 346.01);  shares 291.78 (374.22), (-441.75, 1025.31)
  present, morality:     likes 1627.99 (157.32), (1319.63, 1936.36); reactions 195.48 (25.07), (146.34, 244.63); comments 292.12 (25.02), (243.08, 341.16);  shares 388.66 (156.39), (82.12, 695.20)
  F (η²), misinformation:       likes 0.006; reactions 3.271; comments 1.337; shares 0.000
  F (η²), misinfo. x morality:  likes 1.573; reactions 0.049; comments 1.071; shares 0.149

Disinformation discourse
  absent, no morality:   likes 1276.89 (208.50), (868.19, 1685.59);  reactions 130.32 (33.23), (65.19, 195.45);  comments 203.12 (33.16), (138.12, 268.11);  shares 321.13 (207.27), (-85.15, 727.41)
  absent, morality:      likes 1559.70 (90.46), (1382.39, 1737.02);  reactions 151.41 (14.42), (123.15, 179.67); comments 247.27 (14.39), (219.07, 275.47);  shares 320.27 (89.92), (144.01, 496.53)
  present, no morality:  likes 1427.15 (324.87), (790.36, 2063.94);  reactions 145.16 (51.77), (43.67, 246.65);  comments 249.90 (54.67), (148.63, 351.17);  shares 347.40 (322.94), (-285.62, 980.42)
  present, morality:     likes 1405.76 (146.33), (1118.93, 1692.60); reactions 184.29 (23.32), (138.58, 230.01); comments 260.97 (23.27), (215.35, 306.59);  shares 374.70 (145.47), (89.57, 659.84)
  F (η²), disinformation:       likes 0.000; reactions 1.045; comments 0.669; shares 0.313
  F (η²), disinfo. x morality:  likes 0.648; reactions 0.048; comments 0.093; shares 0.006

Note: N = 11,911; *p < 0.05; **p < 0.01; and ***p < 0.001.


Discussion and Conclusion

In recent years, the buzzwords “Fake News” and “Post-Truth” have been shaping public debates on- and offline. These catchy umbrella terms simplify complex matters (Corner, 2017) into an opposition between right and wrong, and serve as a collection of messages with

ambiguous meanings referring to (un)truthful communication (Jasanoff & Simmet, 2017; Tandoc Jr et al., 2018). Politicians worldwide use them when they lash out at media institutions for reporting against their notion of “truth”, or at political opponents. The internationally discussed (e.g. Murphy, 2018) Facebook posting of the former FPÖ party leader and former Vice-Chancellor Heinz-Christian Strache, which accused the public broadcaster ORF of spreading propaganda, fits into a long FPÖ tradition of accusing the ORF of subjective reporting and a lack of independence.

However, what is considered “truth” and independence is very much subjective, as the Ibiza scandal of May 2019, which rocked Austria to its core, showed. While this study was conducted, a leaked video showed Strache offering business deals to a foreign investor in order, among other things, to transform the Austrian media landscape as Viktor Orbán has done in Hungary. In the video, Strache discussed how installing government-compliant journalists and favourable reporting about the FPÖ in Austria’s largest newspaper, Kronen Zeitung, would gain his party power. Consequently, Strache resigned from his positions as party leader and Vice-Chancellor. Even though Strache had openly opposed possibly politically tinged media institutions throughout the years by criticising the ORF and calling it Rotfunk8 (red radio), he seemed to be in favour of such colouring when he got to choose the colour himself. The Ibiza scandal exemplifies that debates about “truth” and independence are power games for politicians (Foucault, 1970), and that the judgement of “right” or “wrong” (reporting) depends on the side the critics are taking. Specifically, politicians can delegitimize opposing information channels by accusing them of deceiving the people. This study looked at this problematic phenomenon from a meta-level and investigated the rhetorical construction of (un)truthful communication in the online discourse of Austrian politicians. In other words, how do different politicians use labels to identify truthfulness and to attribute dishonesty and inaccuracy to others?

This thesis responds to the increase in scholarly attention for “Post-Truth” and “Fake News” by investigating to what extent Austrian politicians contribute to the public debate on the epistemic status of knowledge and the attribution of mis- and disinformation. The aim of the paper was threefold: (1) to map whether, and if so, to what extent, party leaders use (un)truthful communication on Facebook, (2) to investigate possible links to moral rhetoric, and (3) to

differentiate among politicians with varying ideologies. To that end, an explorative approach using an Automated Content Analysis (ACA) was employed.

As (un)truthfulness discourses revolve around “truth” judgements, their relationship with morality, a means for politicians to send “right” and “wrong” signals to their voters, has been investigated. While most research on moral rhetoric stems from America’s two-party system, this study focussed on Austria’s multi-party system. Austria’s FPÖ received special attention due to its right-wing populist rhetoric, which is typically based on morality (Krzyzanowski, 2012).

Results provide insight into the understudied discourses surrounding the deliberate and undeliberate spread of false information, without aiming to verify or falsify content as done in previous studies (Søe, 2018). This study purely looked at the occurrence or absence of rhetorical cues. Overall, this study showed that (un)truthfulness discourses are clearly linked to moral rhetoric, which might be the case because moral-emotional appeals are given more weight than facts (Rochlin, 2017). The data reveal that party leaders most often speak about reality and truth in their Facebook postings, and less about misinformation (for example nonsense) or disinformation (such as propaganda). In total numbers, postings on the discursive construction of reality paired with moral words account for the biggest share, followed by moral words linked to disinformation or misinformation.

Delving deeper into the link between morality and (un)truthfulness discourses, the analysis supports the assumption that politicians’ moral judgements are most salient when they create postings about disinformation. This is in line with the notion that morality is an important tool for politicians because it mobilizes voters (Jung, 2018). Furthermore, due to its

persuasiveness (Anderson et al., 2014), moral rhetoric is an effective means to guide the process of public opinion formation. This allows politicians to steer voters in an intended direction. If politicians integrate the category disinformation into their Facebook postings, they use strong judgements of untruthfulness. By using labels of intentionally spread false information, they make their position very clear, which makes it easy for voters to know which side to take. Because the category

misinformation is similar to disinformation but carries less strong judgements of intentional falsehood, it is not surprising that misinformation is also often paired with morality, while morality is less salient in reality construction.

Another goal was to shed light on the difference between right-wing extremist parties and others regarding their creation of, and participation in, discourses on (un)truthfulness. Strache from the FPÖ clearly engages more often in strong judgements of disinformation than other party leaders. This is in line with the notion that creating discourse concerning conspiracies is a rhetorical strategy of the extreme right (Wodak, 2011). Strache’s use of moral words in postings, however, is less salient. Even the combination of disinformation and morality is not significantly different from that of other party leaders. This finding might overall seem contrary to the conclusion that ideologically extreme parties moralize more than non-extreme parties (Wendler, 2014). However, taking a closer look at which (un)truthfulness discourses are paired with morality on SNSs, it becomes apparent that the combinations of moral judgements with reality construction and with misinformation are more prevalent on Strache’s Facebook account. This supports the notion that right-wing populists like the FPÖ tend to be bound together by “common sense appeals to a presupposed shared knowledge of ‘the people’” (Wodak, 2011, para. 10).

This study is the first to provide insights into user engagement with (un)truthfulness discourses on Facebook. The empirical evidence shows that likes are the most popular form of user engagement. Liking a posting is easy, not time-consuming, and requires only a low effort of participation (Hille & Bakker, 2013). If politicians want to trigger more user engagement in the form of likes and reactions, they should focus on postings in which they discuss the “truth”. Pairing postings on reality construction with morality leads to more likes but does not increase other forms of user engagement. This finding partly confirms previous research which found that moral rhetoric in tweets triggers user engagement (Brady et al., 2017).

This thesis has some limitations. First and foremost, this paper employed a dictionary-based method which used a pre-defined word list and counted occurrences. It is possible that unanticipated but relevant words related to (un)truthfulness occurred in the data but were not recognized because they were not part of the dictionary. Future studies interested in visibility analysis could expand the research with lists of media institutions, journalists, or the names of opposition politicians to investigate who is delegitimized or criticised. Furthermore, basic counting led to descriptive findings rather than more in-depth conclusions. This approach was, however, necessary to gather novel insights into online (un)truthfulness discourses. Nevertheless, future research should consider the context in which keywords occur. To refine the findings, supervised machine learning would be desirable as it increases efficiency, makes findings more easily reproducible, and contributes to transparency (Boumans & Trilling, 2016).
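The dictionary-based coding whose limitations are discussed above boils down to keyword matching; the sketch below illustrates the principle with a handful of hypothetical German terms (the validated thesis dictionaries were far more extensive):

```python
import re

# Illustrative keyword lists only; not the actual, validated dictionaries
dictionaries = {
    "reality": ["wahrheit", "realität", "fakten"],
    "misinformation": ["unsinn", "falschmeldung"],
    "disinformation": ["propaganda", "lüge", "fake news"],
}

def code_posting(text):
    """Return 1/0 per discourse: present if any keyword occurs in the posting.
    Prefix matching (word-boundary on the left only) also catches inflected forms."""
    text = text.lower()
    return {label: int(any(re.search(r"\b" + re.escape(kw), text) for kw in kws))
            for label, kws in dictionaries.items()}

print(code_posting("Die Wahrheit über diese Propaganda!"))
# → {'reality': 1, 'misinformation': 0, 'disinformation': 1}
```

As the limitation notes, any relevant term absent from the lists simply goes uncounted, which is why supervised approaches would be a natural next step.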

Secondly, as this is a case study on Austria, the reliable dictionaries should be tested in the context of other German-speaking countries to increase validity. Future studies should aim for a comparison of European countries to gather insights into how the extreme right shapes its online discourses on (un)truthfulness. The literature shows that extreme right-wing parties generally differentiate between “us” and “them” (Mudde, 2004) with the goal of rhetorically creating “the guilty other” and common enemies; characteristics which are relevant when trying to convince voters of the “true reality”. Interestingly, in Germany a shift to a softer discourse can be seen among the radical right (Schellenberg, 2013), integrating subtle exclusionary rhetoric by using words like Heimat (homeland) (Wodak, 2012) which are part of the Moral Foundations Theory (Graham et al., 2012). Future research on political online discourse is required because SNSs provide right-wing populists with an instrument to reach voters directly by circumventing gatekeepers (Engesser, Ernst, Esser, & Büchel, 2017), allowing them to frame their (un)truthful discourse as they wish.

Despite these limitations, the study provided foundational empirical evidence for the presence and scope of attributions of (un)truthful communication by Austrian party leaders on Facebook. Especially valuable is that this paper covers the full ten-year Facebook presence of Strache, a representative figure of the FPÖ’s far-right extremist and populist ideology, during his time as party leader, as well as his rise and fall as the Vice-Chancellor of Austria.

Democracies worldwide are faced with a climate of distrust regarding the production of knowledge. It is a worrying trend that emotional appeals are rewarded with increased attention and that delegitimizing practices are accepted political strategies. Counter-strategies need to follow a long-term approach (Dahlgren, 2018). The discursive distinction between the (un)deliberate spread of false information, as made in this research, is a first essential step to decelerate the growth of the current epistemic crisis. In this light, this thesis enhanced the understanding of how politicians contribute to the current “Post-Truth” and “Fake News” era.

References

Allcott, H., & Gentzkow, M. (2017). Social Media and Fake News in the 2016 Election. Journal of Economic Perspectives, 31(2), 211–236. https://doi.org/10.1257/jep.31.2.211

Anderson, S. E., Potoski, M., DeGolia, A., Gromet, D., Sherman, D., & Van Boven, L. (2014). Mobilization, Polarization, and Compromise: The Effect of Political Moralizing on Climate Change Politics. APSA 2014 Annual Meeting Paper, 1–23.

Arendt, H. (1955). Elemente und Ursprünge totaler Herrschaft (K. Jaspers, Ed.). Frankfurt am Main: Europäische Verlagsanstalt.

Bakshy, E., Messing, S., & Adamic, L. A. (2015). Exposure to ideologically diverse news and opinion on Facebook. Science, 348(6239), 1130–1133.

Bennett, W. L., & Livingston, S. (2018). The disinformation order: Disruptive communication and the decline of democratic institutions. European Journal of Communication, 33(2), 122–139. https://doi.org/10.1177/0267323118760317

Berghel, H. (2017). Lies, damn lies, and fake news. Computer, 50(2), 80–85. https://doi.org/10.1109/MC.2017.56

Blasius, R. (2015, January 13). Von der Journaille zur Lügenpresse. Frankfurter Allgemeine Zeitung. Retrieved from http://www.faz.net/aktuell/gesellschaft/unwort-des-jahres-eine-kleine-geschichte-der-luegenpresse-13367848.html


Borgesius, F. J. Z., Trilling, D., Möller, J., Bodo, B., De Vreese, C. H., & Helberger, N. (2016). Should we worry about filter bubbles? Internet Policy Review, 5(1), 1–16.

https://doi.org/10.14763/2016.1.401

Boumans, J. W., & Trilling, D. (2016). Taking stock of the toolkit: An overview of relevant automated content analysis approaches and techniques for digital journalism scholars. Digital Journalism, 4(1), 8–23. https://doi.org/10.1080/21670811.2015.1096598

Boxell, L., Gentzkow, M., & Shapiro, J. (2017). Is the Internet Causing Political Polarization? Evidence from Demographics. https://doi.org/10.3386/w23258

Brady, W. J., Wills, J. A., Jost, J. T., Tucker, J. A., & Van Bavel, J. J. (2017). Emotion shapes the diffusion of moralized content in social networks. Proceedings of the National Academy of Sciences, 114(28), 7313–7318. https://doi.org/10.1073/pnas.1618923114

Clifford, S., Jerit, J., Rainey, C., & Motyl, M. (2015). Moral Concerns and Policy Attitudes: Investigating the Influence of Elite Rhetoric. Political Communication, 32(2), 229–248. https://doi.org/10.1080/10584609.2014.944320

Corner, J. (2017). Fake news, post-truth and media–political change. Media, Culture and Society, 39(7), 1100–1107. https://doi.org/10.1177/0163443717726743

Dahlgren, P. (2018). Media, Knowledge and Trust: The Deepening Epistemic Crisis of Democracy. Javnost, 25(1–2), 20–27. https://doi.org/10.1080/13183222.2018.1418819

Del Vicario, M., Vivaldo, G., Bessi, A., Zollo, F., Scala, A., Caldarelli, G., & Quattrociocchi, W. (2016). Echo Chambers: Emotional Contagion and Group Polarization on Facebook. Scientific Reports, 6, 1–12. https://doi.org/10.1038/srep37825

Engesser, S., Ernst, N., Esser, F., & Büchel, F. (2017). Populism and social media: How politicians spread a fragmented ideology. Information, Communication & Society, 20(8), 1109–1126. https://doi.org/10.1080/1369118X.2016.1207697

Erlanger, S. (2017, December 12). ‘Fake News,’ Trump’s Obsession, Is Now a Cudgel for Strongmen. The New York Times. Retrieved from https://www.nytimes.com/2017/12/12/world/europe/trump-fake-news-dictators.html

Ernst, N., Engesser, S., Büchel, F., Blassnig, S., & Esser, F. (2017). Extreme parties and populism: An analysis of Facebook and Twitter across six countries. Information, Communication and Society, 20(9), 1347–1364. https://doi.org/10.1080/1369118X.2017.1329333

European Commission. (2018). Flash Eurobarometer 464 Report. Retrieved from

https://ec.europa.eu/commfrontoffice/publicopinion/index.cfm/ResultDoc/download/Docum entKy/82797

Floridi, L. (2005). Semantic Conceptions of Information. In Stanford Encyclopedia of Philosophy. Retrieved from http://plato.stanford.edu/entries/information-semantic/

Foucault, M. (1970). The archaeology of knowledge. Social Science Information, 9(1), 175–185. https://doi.org/10.1177/053901847000900108

Friedman, U. (2017, December 23). The Real-World Consequences of ‘Fake News.’ The Atlantic. Retrieved from https://www.theatlantic.com/international/archive/2017/12/trump-world-fake-news/548888/

Gentzkow, M., & Shapiro, J. M. (2011). Ideological Segregation Online and Offline. The Quarterly Journal of Economics, 126(4), 1799–1839.

https://doi.org/10.1093/qje/qjr044

Goldstein, D., Hasher, L., & Toppino, T. (1977). Frequency and the Conference of Referential Validity. Journal of Verbal Learning and Verbal Behavior, (16), 107–112.


https://doi.org/10.1016/S0022-5371(77)80012-1

Graham, J., Haidt, J., Koleva, S., Motyl, M., Iyer, R., Wojcik, S. P., & Ditto, P. H. (2012). Moral Foundations Theory: The Pragmatic Validity of Moral Pluralism. Advances in Experimental Social Psychology, 47, 55–130.

Graham, J., Haidt, J., & Nosek, B. A. (2009). Liberals and Conservatives Rely on Different Sets of Moral Foundations. Journal of Personality and Social Psychology, 96(5), 1029–1046. https://doi.org/10.1037/a0015141

Hayes, A. F., & Krippendorff, K. (2011). Answering the Call for a Standard Reliability Measure for Coding Data. Communication Methods and Measures, 1(1), 77–89.

https://doi.org/10.1080/19312450709336664

Hille, S., & Bakker, P. (2013). I like news. Searching for the “Holy Grail” of social media: The use of Facebook by Dutch news media and their audiences. European Journal of

Communication, 28(6), 663–680. https://doi.org/10.1177/0267323113497435

Hooghe, L., Bakker, R., Brigevich, A., De Vries, C., Edwards, E., Marks, G., … Vachudova, M. (2010). Reliability and validity of the 2002 and 2006 Chapel Hill expert surveys on party positioning. European Journal of Political Research, 49(5), 687–703.

https://doi.org/10.1111/j.1475-6765.2009.01912.x

Humprecht, E. (2018). Where ‘fake news’ flourishes: a comparison across four Western democracies. Information Communication and Society, 0(0), 1–16.

https://doi.org/10.1080/1369118X.2018.1474241

Iyengar, S. (1991). Is Anyone Responsible? How Television Frames Political Issues. Chicago: The University of Chicago Press. https://doi.org/10.2307/2075856

Iyengar, S., & Westwood, S. J. (2015). Fear and Loathing across Party Lines: New Evidence on Group Polarization. American Journal of Political Science, 59(3), 690–707.

Jasanoff, S., & Simmet, H. R. (2017). No funeral bells: Public reason in a ‘post-truth’ age. Social Studies of Science, 47(5), 751–770. https://doi.org/10.1177/0306312717731936

Jung, J. (2018). The Mobilizing Effect of Parties’ Moral Rhetoric. Washington University, St. Louis.

Katsirea, I. (2019). ‘“Fake news”: reconsidering the value of untruthful expression in the face of regulatory uncertainty.’ Journal of Media Law, 0(0), 1–30.

https://doi.org/10.1080/17577632.2019.1573569

Krippendorff, K. (2004). Reliability in content analysis. Human Communication Research, 30(3), 411–433.

Krzyzanowski, M. (2012). From Anti-Immigration and Nationalist Revisionism to Islamophobia: Continuities and Shifts in Recent Discourses and Patterns of Political Communication of the Freedom Party of Austria (FPÖ). In Right-Wing Populism in Europe: Politics and Discourse (pp. 135–148). https://doi.org/10.5040/9781472544940.ch-009

Ladd, J. (2010). The neglected power of elite opinion leadership to produce antipathy toward the news media: Evidence from a Survey Experiment. Political Behavior, 32(1), 29–50.

https://doi.org/10.1007/s11109-009-9097-x

Ladd, J. (2012). Why Americans Hate the Media and How It Matters. Princeton, NJ: Princeton University Press. https://doi.org/10.2307/j.ctt7spr6

Lecheler, S., & De Vreese, C. H. (2011). Getting Real: The Duration of Framing Effects. Journal of Communication, 61(5), 959–983. https://doi.org/10.1111/j.1460-2466.2011.01580.x

Lewandowsky, S., Ecker, U. K. H., & Cook, J. (2017). Beyond Misinformation: Understanding and Coping with the “Post-Truth” Era. Journal of Applied Research in Memory and Cognition, 6(4), 353–369. https://doi.org/10.1016/j.jarmac.2017.07.008

Mandlik, M. (2018). Strache Tweet gegen ORF. Retrieved June 26, 2019, from Twitter website:

https://twitter.com/MichaelMandlik/status/963452891871801344/photo/1

Marres, N. (2018). Why We Can’t Have Our Facts Back. Engaging Science, Technology, and Society, 4, 423. https://doi.org/10.17351/ests2018.188

McIntyre, L. (2018). Post-Truth. MIT Press.

McKinney, W. (2010). Data Structures for Statistical Computing in Python. Proceedings of the 9th Python in Science Conference, 51–56. Retrieved from http://conference.scipy.org/proceedings/scipy2010/mckinney.html

Meffert, M. F., & Gschwend, T. (2010). Strategic coalition voting: Evidence from Austria. Electoral Studies, 29(3), 339–349. https://doi.org/10.1016/j.electstud.2010.03.005

Mudde, C. (2000). The Ideology of the Extreme Right. Manchester University Press.

Mudde, C. (2004). The Populist Zeitgeist. Government and Opposition, 39(4), 541–563.

Murphy, F. (2018). Austrian far-right leader pledges to take on public broadcaster. Retrieved June 26, 2019, from Reuters website:

https://www.reuters.com/article/us-austria-politics-idUSKCN1FY32Z

Newman, N., Fletcher, R., Kalogeropoulos, A., Levy, D. A. L., & Nielsen, R. K. (2018). Reuters Institute Digital News Report 2018. Reuters Institute for the Study of Journalism, 144. https://doi.org/10.2139/ssrn.2619576

Oxford Dictionaries. (2016). Word of the Year 2016: Post-truth.

Paccagnella, L. (2018). Post-Truth. International Journal of E-Politics, 9(2), 1–13. https://doi.org/10.4018/ijep.2018040101


Books.

Pedregosa, F., Varoquaux, G., Michel, V., Thirion, B., Grisel, O., Blondel, M., … Perrot, M., & Duchesnay, É. (2011). Scikit-learn: Machine Learning in Python. Journal of Machine Learning Research, 12, 2825–2830.

Peters, M. A. (2018). Education in a post-truth world. In Post-Truth, Fake News: Viral Modernity & Higher Education (pp. 145–150). https://doi.org/10.1007/978-981-10-8013-5_12

Prior, M. (2013). Media and Political Polarization. Annual Review of Political Science, 16, 101–127. https://doi.org/10.1146/annurev-polisci-100711-135242

R Core Team. (2010). R: A Language and Environment for Statistical Computing. Retrieved from http://www.r-project.org

Reber, R., & Schwarz, N. (1999). Effects of Perceptual Fluency on Affective Judgments of Truth. Consciousness and Cognition, 8, 338–342.

Rieder, B. (2013). Studying Facebook via data extraction. Proceedings of the 5th Annual ACM Web Science Conference (WebSci ’13), 346–355. https://doi.org/10.1145/2464464.2464475

Rochlin, N. (2017). Fake news: belief in post-truth. Library Hi Tech, 35(3), 386–392. https://doi.org/10.1108/LHT-03-2017-0062

Van Rossum, G. (1995). Python Reference Manual. Amsterdam: Centre for Mathematics and Computer Science.

Sagi, E., & Dehghani, M. (2014). Measuring Moral Rhetoric in Text. Social Science Computer Review, 32(2), 132–144. https://doi.org/10.1177/0894439313506837

Schellenberg, B. (2013). Developments within the Radical Right in Germany: Discourses, Attitudes and Actors. In Right-Wing Populism in Europe: Politics and Discourse (pp. 149–162). https://doi.org/10.5040/9781472544940.ch-010

Schwarz, K. (2016, November 22). Im Netz der Lügen [In the web of lies]. Zeit. Retrieved from https://www.zeit.de/politik/deutschland/2016-11/fake-news-deutschland-geruechte-hoaxmap

Søe, S. O. (2018). Algorithmic detection of misinformation and disinformation: Gricean perspectives. Journal of Documentation, 74(2), 309–332. https://doi.org/10.1108/JD-05-2017-0075

Sunstein, C. R., Lazer, D. M. J., Schudson, M., Benkler, Y., Zittrain, J. L., Thorson, E. A., … Rothschild, D. (2018). The science of fake news. Science, 359(6380), 1094–1096. https://doi.org/10.1126/science.aao2998

Tandoc, E. C., Jr., Lim, Z. W., & Ling, R. (2018). Defining “Fake News.” Digital Journalism, 6(2), 137–153. https://doi.org/10.1080/21670811.2017.1360143

Thorson, E. (2016). Belief Echoes: The Persistent Effects of Corrected Misinformation. Political Communication, 33(3), 460–480. https://doi.org/10.1080/10584609.2015.1102187

Törnberg, P. (2018). Echo chambers and viral misinformation: Modeling fake news as complex contagion. PLoS ONE, 13(9), 1–21. https://doi.org/10.1371/journal.pone.0203958

Tucker, J. A., & Guess, A. (2018). Social Media, Political Polarization, and Political Disinformation: A Review of the Scientific Literature. SSRN, (March), 1–95. https://doi.org/10.2139/ssrn.3144139

Van Aelst, P., Aalberg, T., Berganza, R., Salgado, S., Stępińska, A., Stanyer, J., … Esser, F. (2017). Political communication in a high-choice media environment: a challenge for democracy? Annals of the International Communication Association, 41(1), 3–27. https://doi.org/10.1080/23808985.2017.1288551


Amsterdammer. Retrieved from https://www.groene.nl/artikel/hollands-nepnieuws

Waisbord, S. (2018). The elective affinity between post-truth communication and populist politics. Communication Research and Practice, 4(1), 17–34. https://doi.org/10.1080/22041451.2018.1428928

Wardle, C. (2017). Fake News. It’s Complicated. Retrieved June 26, 2019, from First Draft website: https://medium.com/1st-draft/fake-news-its-complicated-d0f773766c79

Wendler, F. (2014). Justification and political polarization in national parliamentary debates on EU treaty reform. Journal of European Public Policy, 21(4), 549–567. https://doi.org/10.1080/13501763.2014.882388

Wodak, R. (2011). Old and new demagoguery: The rhetoric of exclusion. Retrieved June 26, 2019, from openDemocracy website: http://www.opendemocracy.net/ruth-wodak/old-and-new-demagoguery-rhetoric-of-exclusion

Wodak, R. (2012). ‘Anything Goes!’ – The Haiderization of Europe. In Right-Wing Populism in Europe: Politics and Discourse (pp. 2–38). https://doi.org/10.5040/9781472544940.ch-002


Appendix A: Wordlists

(Un)truthfulness wordlist – German terms and English translations

Reality construction:
glaubwuerdig* – believable
authentisch* – authentic
wahr$ – true
wahrheit – truth
wirklich* – really
ehrlich* – honestly
expert* – expert
fakt – fact
real$ – real
realitaet* – reality
tatsa* – fact
ungelogen – honestly
empirisch* – empirically
beweis* – proof
evidenzbasiert* – evidence-based
erwiesen* – proven
nachpruef* – verifiable
nachweis* – proof
ueberpruef* – verifiable
verifizier* – verify
serioes* – serious
zuverlaessig* – reliable
vertrauen* – trust
gewiss$ – certainly
bestimmt$ – determined
ueberzeugt – convinced
freilich – of course
zweifellos* – certainly
gegebenheit* – -
tatbesta* – fact
sachverhalt – facts
unbestreitbar* – undeniably
unleugbar* – undeniably
unstrittig* – undisputed
unumstritten* – undisputed
unabweisbar* – irrefutably
unwiderleg* – undisputed
zweifelsohne – undoubtedly
unzweifelhaft* – undoubtedly
richtig* – right
echt$ – real
echte$ – real
exakt* – exactly
genau$ – exactly
genaue$ – exact
binsenweisheit* – truism
unfehlbarkeit* – infallibility
aufrichtig* – sincerely
korrekt* – correctly
fehlerfrei* – error-free
ordnungsmaessig* – regular
legitim* – legit
bestaetig* – confirm
bekraeft* – verifying
untermauer* – underpinning
erhaert* – substantiating
begruend* – justification
glaubhaft* – credibly
zeuge* – witness
versichern – assure
bejah* – affirming
affimieren* – affirming
bezeug* – attesting
unbefang* – unprejudiced
unparteiisch* – impartial
unvoreingenommen* – impartially
unverfaelscht* – unadulterated
unverzerrt* – undistorted
bestechend* – captivatingly
ueberzeugend* – convincingly
triftig* – convincingly
garantier* – guaranteed
absolut$ – absolutely
selbstsprechend – self-explanatory
objektiv* – objective
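As an illustration only (this is not the thesis's actual analysis code), the sketch below shows how such a wildcard wordlist could be applied in a dictionary-based automated content analysis. It assumes that `*` marks a word stem to which any suffix may attach, that `$` (or no marker) requires an exact word match, and that posts are lower-cased with umlauts transliterated, as in the list above; the example patterns and sentence are hypothetical.

```python
import re

# Hypothetical subset of the reality-construction dictionary above.
PATTERNS = ["glaubwuerdig*", "wahr$", "fakt", "beweis*"]

def to_regex(entry):
    """Translate a dictionary entry into a word-boundary regex.
    '*' = stem (any suffix allowed), '$' or no marker = exact word."""
    if entry.endswith("*"):
        return r"\b" + re.escape(entry[:-1]) + r"\w*"
    if entry.endswith("$"):
        return r"\b" + re.escape(entry[:-1]) + r"\b"
    return r"\b" + re.escape(entry) + r"\b"

def count_hits(text, patterns=PATTERNS):
    """Count dictionary matches in one (lower-cased) posting."""
    text = text.lower()
    return sum(len(re.findall(to_regex(p), text)) for p in patterns)

post = "Das ist wahr, ein glaubwuerdiger Beweis und ein Fakt."
print(count_hits(post))  # -> 4 (wahr, glaubwuerdiger, beweis, fakt)
```

A posting could then be coded as containing reality-construction discourse whenever its hit count exceeds zero, with the same procedure repeated for the misinformation and disinformation dictionaries.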
