Fact-checking of news on social media and its effect on perceived accuracy and news engagement in the context of media literacy

Master of Arts (MA) in Journalism, Media and Globalisation

Lisa Marie Lechner

Student ID: 1284689

Master’s Thesis

Graduate School of Communication

Master’s programme Communication Science

Supervisor: Dr. J. (Jakob) Ohme

Date of completion: 29th of May 2020

Abstract

This study examines the extent to which the format and evaluation of fact-checks on news posts on social media affect the perceived accuracy and news engagement of users. The study’s special contribution to existing research lies in exploring whether media literacy has a moderating effect on the examined relation. The growing phenomenon of global misinformation and disinformation disseminating online reaffirms the relevance of this study.

By conducting an experiment in which respondents are exposed to a simulated fact-check on a manipulated news post in a Facebook template, the study demonstrates that the fact-checking evaluation (“mostly true” vs. “mostly false”) has a significant effect on perceived accuracy. In addition, the effect of the fact-checking evaluation on perceived accuracy differs significantly between the two fact-checking formats (visual vs. textual). Media literacy has a significant main effect on perceived accuracy, as respondents with low media literacy perceive the news post as significantly more accurate than people with higher media literacy.

This suggests that the application of fact-checking labels to indicate the truth or falsity of information is not required for people with high media literacy, as they are able to assess accuracy without the assistance of fact-checks. In conclusion, the teaching and acquiring of media literacy could become the primary alternative to fact-checking in the future. By including the concept of media literacy in the study on disinformation and fact-checking, the paper contributes to existing research and becomes of vital interest for both academia and society.

Keywords: disinformation, fact-checking, media literacy, perceived accuracy, news

Table of Contents

Introduction
Fact-checking in the Context of Disinformation
Previous Research on Fact-checking, Perceived Accuracy and News Engagement
Research Expansion Through the Concept of Media Literacy
Method
Sample and Manipulation Check
Experimental Design and Procedure
Measures
Perceived accuracy
News engagement
Media literacy
Controls
Results
Conclusion and Discussion

Introduction

In recent years, the dissemination of false information online has posed a growing threat – for instance concerning “the manipulation of elections, damage to public health and failure to recognise the risks of climate change” (Ireton & Posetti, 2018, p. 18).

Disinformation represents a danger to democratic societies by creating hazardous misperceptions and promoting distrust towards public information (Clayton et al., 2019). Not only was disinformation widespread in the 2016 United States (US) presidential election, but also in the 2016 Brexit referendum and the 2019 European Union (EU) elections (Hänska & Bauchowitz, 2017; Lanoszka, 2018). The sharing of misleading information also appears to be intensifying with the 2020 US presidential primary elections and the ongoing global outbreak of the coronavirus (“Fighting disinformation”, n.d.; Korecki, 2019). Most recently, the coronavirus pandemic has evoked a series of misconceptions and false statements that were shared on social media – arguing that holding your breath for more than 10 seconds is an effective test for the virus or that drinking water prevents people from contracting the virus (“World Health Organization refutes viral claims”, 2020).

In the light of the growing so-called “global disinformation order” on digital media platforms (Bradshaw & Howard, 2019), several fact-checking organisations have since been established – aiming to verify information, identify false statements and combat disinformation (Amazeen, 2015; Stencel & Luther, 2020). In April 2020, the Duke Reporters’ Lab counted 237 active fact-checking organisations across 78 countries worldwide – and the numbers keep rising (Stencel & Luther, 2020). Labelling the truth or falsity of articles and statements, fact-checking organisations use different methodological approaches that include a range of evaluation systems such as visual ranking scales or text-based ratings.

The growth of fact-checking organisations is largely driven by the fact that news outlets and social media platforms like Facebook are becoming increasingly dependent on cooperating with third-party fact-checkers to combat the surge of misleading information (Stencel & Luther, 2020; “How is Facebook”, n.d.). Regarding the 2020 US election and the coronavirus pandemic, Facebook and Instagram announced they would more intensively label incorrect content with fact-checking mechanisms to help users assess the accuracy of information (Rosen, Harbath, Gleicher & Leathern, 2019; Rosen, 2020).

At the same time, critics say that fact-checking on its own may not be enough in the fight against disinformation (Bennett & Livingston, 2018; Jolls & Johnsen, 2018). According to Jolls and Johnsen (2018), “citizens who rely solely on fact-checking sites to determine what information is trustworthy are more at risk of manipulation than those who have learned to evaluate and analyse information independently with critical thinking skills acquired through media literacy education” (p. 1403). Based on this statement, it becomes clear that fact-checking and its effects should not be analysed as a separate construct but be explored in connection with media literacy.

However, previous studies on fact-checking have disregarded media literacy as a factor. For instance, prior research has examined the effect of fact-checks on social media users’ behavioural intentions and found that users tend to ignore false news posts without considering correcting them (Tandoc, Lim & Ling, 2019). Other studies revealed that visual rating scales seem more effective in correcting false beliefs and misperceptions than textual fact-checks (Amazeen, Thorson, Muddiman & Graves, 2018; Young, Jamieson, Poulsen & Goldring, 2018). But, based on the assessment of Jolls and Johnsen (2018), it remains unanswered whether media literacy might play a role in how users perceive fact-checking methodologies and engage with verified content.

On the other hand, factors such as perceived accuracy have been comprehensively examined in previous research on the effectiveness of fact-checking. Thus, the perceived accuracy of fact-checked news items is also one of the dependent variables in this paper, as it is of special interest to explore how verification labels can potentially influence people’s perception of what is accurate or not (see Clayton et al., 2019). News engagement is included as the second dependent variable in the study’s framework, as much disinformation seems to be shared on social media regardless of its inaccuracy (Talwar et al., 2019). Hence, it is important to explore whether and to what extent differing fact-checking labels influence people’s engagement with social media news items. Furthermore, the engagement of social media users is “a new form of user-based political participation in the digital sphere that aims to restore an accessible and reasoned public debate” (Porten-Cheé, Kunst & Emmer, 2020, p. 515). Therefore, the extent to which fact-checking can influence digital participation is of great importance with regards to possibly reducing the dissemination of unverified information. Considering the above-mentioned aspects, this paper poses the following research question, which is analysed by means of a survey experiment:

RQ: To what extent do differing fact-checking labels on social media news posts affect the perceived accuracy and news engagement of users, and how is this influenced by the level of media literacy?

Based on the identified research gap, this study furthers existing academic knowledge by investigating the effectiveness of fact-checking while taking into consideration people’s level of media literacy. Additionally, the experiment has an inherent societal relevance, as it is conducted in the context of global events that are threatened by the spread of false or malicious information.

Fact-checking in the Context of Disinformation

Disinformation is generally understood as “intentionally deceptive falsehood” to manipulate people, whereas fake news is “a form of disinformation that mimics the look and feel of news” (Tandoc et al., 2019, p. 3; Ireton & Posetti, 2018). In contrast, misinformation refers to “misleading information created or disseminated without manipulative or malicious intent” (Ireton & Posetti, 2018, p. 7). As argued before, fact-checking organisations play a role in the battle against the spread of such false information. Based on previous studies (Amazeen, Vargo & Hopp, 2019; Graves, 2016; Graves & Glaisyer, 2012; Graves & Konieczna, 2015), this study has developed the following definition of fact-checking:

Fact-checking aims to debunk falsehood and rectify misinformation and disinformation. It is performed by commercial newsrooms and independent organisations that aim to inform and educate, but not necessarily to persuade or influence people’s beliefs. Fact-checkers use differing methodologies such as visual rating scales or text-only evaluations. A distinction is made between traditional, internal fact-checking and external, professional fact-checking. While internal checks aim to correct inaccuracies before publication, external fact-checking focuses on reported speech and the uncovering of false claims.

This paper will exclusively focus on external fact-checking processes. As mentioned earlier, several fact-checking organisations and departments are developing in connection with the rising demand for online verification services. Each individual organisation employs its own methodology – while some fact-checkers develop visual scales to label the extent of an information’s truth or falsehood, others make use of textual explanations. The approaches of US-based fact-checkers illustrate the extent of these differing methodologies. For instance, FactCheck.org (https://www.factcheck.org/) applies a textual fact-checking format, while PolitiFact (https://www.politifact.com/) has developed a visual “Truth-O-Meter” to label the veracity of information. The Washington Post has even established its own fact-checking unit and invented the so-called “Pinocchio Test” that rates accuracy by displaying a numeric range of Pinocchio noses (Kessler, 2017). The list of global fact-checkers continues – from the German investigation centre CORRECTIV (https://correctiv.org/en/) to South African Africa Check (https://africacheck.org/) and Indian Alt News (https://www.altnews.in/). Amongst others, many of these organisations are signatories of the International Fact-Checking Network (IFCN), under which they have agreed to follow a Code of Principles (https://www.poynter.org/ifcn/). With this code, they respect a certain set of standards – for instance the commitment to non-partisanship and the transparency of sources (“The commitments”, n.d.).

Considering the growing importance of fact-checkers for news agencies and social media platforms, it is worth noting that Facebook exclusively collaborates with verified third-party fact-checkers (Lyons, 2018). When organisations rate an article as incorrect, it is displayed further down Facebook’s news feed, reducing the chance of future views by up to 80% (Lyons, 2018). A research report by the University of Oxford confirmed that Facebook remains the social media platform most used for manipulation through false information (Bradshaw & Howard, 2019), and “Facebook is an epicentre of coronavirus misinformation” in the ongoing pandemic (Avaaz, 2020) – underlining the importance of Facebook for research on fact-checking.

Previous Research on Fact-checking, Perceived Accuracy and News Engagement

As mentioned before, perceived accuracy and news engagement were included as the two dependent variables within this study’s framework. The concept of perceived accuracy refers to the respective degree of correctness that recipients ascribe to certain information (Clayton et al., 2019; Garrett, Nisbet & Lynch, 2013). Since fact-checking aims to “debunk falsehood and rectify misinformation and disinformation” (see above), the degree of correctness that people ascribe to information is crucial for their subsequent news engagement.

A survey experiment by Clayton et al. (2019) analysed people’s perception of false story headlines. The researchers found that a specific “Rated false” tag on articles on social media was more effective in reducing people’s perceived accuracy than a general “Disputed” label, which used to be Facebook’s original strategy (Clayton et al., 2019).

Additionally, an experiment by Pennycook, Bear, Collins and Rand (2020) subjected people to news headlines on Facebook that were labelled as “true” or “false”. The researchers found that “participants were more likely to believe true headlines … compared with false headlines” (Pennycook et al., 2020, p. 5). Based on these results, a first hypothesis on fact-checking evaluations was developed for this research project:

H1: Respondents are more likely to perceive a news post as accurate when it is labelled as “mostly true” compared to when it is labelled as “mostly false”.

The concept of news engagement on social media is somewhat more complex and thus required a conceptual definition that was derived from previous research (Ha et al., 2018; Ksiazek, Peer & Lessard, 2016; Park & Kaye, 2018):

The engagement of online users with news on social media platforms is the active involvement in news content for either personal or social purposes. Users apply different levels of engagement that are characterised by the extent to which a certain news content is liked, commented on, shared or even discussed with other people both online and offline.

Regarding this concept, Tandoc et al. (2019) explored the influence of Singaporean social media users’ perception of fake news on their news engagement – showing that most users ignored fake news and did not further engage with the content if the issue was not of strong personal relevance to them. What Tandoc et al. (2019) criticised about their own method was that the study automatically assumed that users were able to differentiate between fake news and true facts. However, the distinction between reliable facts and false information has proven to be difficult for news recipients (McGrew, Ortega, Breakstone & Wineburg, 2017; Tandoc et al., 2019). Therefore, the researchers suggested an experiment for further research that would explore users’ perceptions and behavioural intentions after being exposed to a fact-checked news post (Tandoc et al., 2019), which is the research purpose of this study. Furthermore, the study by Pennycook et al. (2020) found that the intention of users to share a news piece on social media was lower when it was labelled as “false” as compared to “true”. Deriving from previous research, it is expected that social media users show higher engagement with news posts that are labelled as accurate. Thus, a second hypothesis on evaluations of fact-checks was suggested for this study:

H2: Respondents are more likely to engage with a news post when it is labelled as “mostly true” compared to when it is labelled as “mostly false”.

A wide range of other experimental studies have been conducted regarding fact-checking and its influence on perception and engagement. Young et al. (2018) explored the influence of fact-checking formats (print vs. video) “in shaping message interest and belief correction” (p. 49). The overall results revealed that fact-checks in a video format increased respondents’ attention to the message and that the video was more effective in reducing misperceptions than a long-form online article without visualisations on the same topic (Young et al., 2018).

In another experiment, Amazeen et al. (2018) explored the impact of visual rating scales on the correction of false beliefs in political and non-political contexts by assigning participants to the same statement that was either visually or textually labelled as “mostly false”. The results demonstrated that most respondents preferred combined corrections of visual and textual ratings rather than solely textual fact-checks (Amazeen et al., 2018). Adding a visual rating to a textual rectification did indeed enhance the effectiveness in the reduction of misperceptions – however, this was only the case when non-political disinformation was corrected (Amazeen et al., 2018).

Furthermore, Lazard and Atkinson (2015) analysed differences in people’s engagement between the exposure to visual or textual messages by including visual literacy – which refers to advanced “abilities to interpret and create visual materials” – as a moderator (p. 13). They also integrated the Elaboration Likelihood Model (ELM) by Petty and Cacioppo (1986) “to explore whether these different message formats lead to central or peripheral processing” (Lazard & Atkinson, 2015, p. 8). The researchers found that individuals who were exposed to an infographic, which combined visual and textual elements, experienced greater elaboration with a pro-environmental message than those who were exposed to text-only and visual-only conditions (Lazard & Atkinson, 2015). However, the individual level of visual literacy did not have a moderating effect – participants with both low and high levels of visual literacy processed the infographic more thoroughly than the visual- and text-only conditions (Lazard & Atkinson, 2015).

Regarding the ELM, it is important to note that the higher users’ motivation and ability to consume certain content, the more likely they are to thoughtfully process the information (Petty & Cacioppo, 1986). According to Petty and Cacioppo (1986), there are two possible ways of processing information – central and peripheral. Processing a message via the central route leads to a more consistent change of attitudes, while people who process information via the peripheral route focus on visual and non-argument signals that cause a less persistent formation of attitudes (Petty & Cacioppo, 1986). Even though the ELM mainly focuses on persuasion and attitude change – which is not the main purpose of fact-checking (see above) – the fundamentals of central and peripheral processing were applied to the fact-checking formats used in this study and their impact on perception and engagement.

Against the logic of the ELM, Lazard and Atkinson (2015) proved that “visual cues can and are processed as central elements of the message” (p. 27). They suggested that “further studies should build on this study’s insights by testing visual cues in different contexts and exploring which moderators might influence the central processing of individuals” (Lazard & Atkinson, 2015, p. 27).

In line with Young et al. (2018) and Amazeen et al. (2018), one can assume that these findings also hold within the context of this paper – namely, that visual and textual fact-checks make a difference in how people perceive and engage with a fact-checked news post. According to Lazard and Atkinson (2015), visual fact-checks might have a greater effect than textual fact-checks on how accurate respondents perceive a message to be and on how they engage with it. In this regard, it would not matter whether respondents perceived the article as “mostly true” or “mostly false”, as solely the fact-checking format is of interest at this point. This notion led to the development of additional hypotheses on fact-checking formats:

H3: Visual fact-checking scales have a greater impact on how accurate a news post is perceived to be than textual fact-checks.

H4: Visual fact-checking scales have a greater impact on the level of engagement with a news post than textual fact-checks.

As shown above, there is little research that focuses on a combined examination of the fact-checking evaluation (“mostly true” vs. “mostly false”) and the fact-checking format (visual vs. textual). For this reason, the following sub-research question was formulated for this project:

RQ1: Is there an interaction effect between a fact-check’s evaluation (“mostly true” vs. “mostly false”) and its format (visual vs. textual) on perceived accuracy and news engagement?

However, the combination of fact-checking characteristics is not the only additional aspect that this study contributes to previous research. Another add-on is the inclusion of media literacy in the fact-checking context.

Research Expansion Through the Concept of Media Literacy

As explained earlier, other researchers have criticised fact-checking as an inadequate method to fight disinformation (Bennett & Livingston, 2018; Jolls & Johnsen, 2018). As argued by Jolls and Johnsen (2018), recipients who have learned to analyse information through media literacy education are less vulnerable to possible manipulation of their perceived accuracy of information than those who rely exclusively on fact-checks. The importance of media literacy is even stressed at a European level. As the European Commission emphasised in its so-called Action Plan Against Disinformation, “the objective is for the Union and its neighbourhood to become more resilient against disinformation. This requires continuous and sustained efforts to support education and media literacy, journalism, fact-checkers, researchers, and the civil society as a whole” (Action Plan Against Disinformation, 2018, p. 12). As this statement points out, fact-checking is not the only approach to fighting disinformation; the promotion and enhancement of individual media literacy is encouraged as well. Additionally, a field study found that teachers and schoolchildren from Germany, Belgium and Austria agree that media literacy is not sufficiently taught (Lie Detectors, 2019).

A recent study on coronavirus-related misinformation on Facebook showed that 100 examined pieces of misleading content in six languages were shared over 1.7 million times within three months (Avaaz, 2020). Although the false information was detected by third-party fact-checkers, it took up to 22 days for Facebook to add warning labels and downgrade the content (Avaaz, 2020). This is a dangerously slow process, especially since the World Health Organization (WHO) emphasised that “we’re not just fighting an epidemic, we’re fighting an infodemic” and that “fake news spreads faster and more easily than [the] virus…” (World Health Organization, 2020). Given this delay, it is not surprising that false reports about the coronavirus spread rapidly despite Facebook’s cooperation with fact-checkers – another example of why users should not rely solely on external fact-checks and why a certain degree of media literacy is needed if societies hope to increase resilience to disinformation.

Hence, fact-checking should be explored within the context of media literacy, bearing in mind that the verification of facts might not be adequate without considering an individual’s ability to evaluate information through media literacy. This study contributed to research by doing just that.

In the framework of this study, the concept of media literacy was based on previous research (Ashley, Maksl & Craft, 2013; Literat, 2014; Maksl, Ashley & Craft, 2015; Potter, 2004; Vraga & Tully, 2015; Vraga et al., 2015) and is understood as follows:

A person’s level of media literacy is the extent to which an individual can access, understand and communicate in relation to traditional and new media channels. Media literacy enables individuals to critically read media texts and to identify and understand the meaning of media messages. For the highest possible level of media literacy, people should be educated in how to analyse media messages. As a specification, news media literacy relates to knowledge about the construction and consumption of news, emphasising that a critical evaluation of news is required to rate the credibility of the information therein. Furthermore, a high level of news media literacy includes knowledge of media structures and encourages a questioning attitude towards news content. Finally, media literacy is important in a societal context, as it sets the base for an informed citizenship, which is a cornerstone of a functioning democracy.

Due to the centrality of media literacy and its lack in prior research, this paper posed the following sub-research question:

RQ2: Does media literacy have a moderating effect on the relation of a fact-check’s evaluation and format with perceived accuracy and news engagement?

With this research question, the project has set an approach for future studies – highlighting the importance of media literacy in research on disinformation and fact-checking.

Method

In order to investigate the impact of a fact-check’s format and evaluation on perceived accuracy and news engagement in the context of media literacy, an online experiment was designed with the survey software Qualtrics (see Appendix C). Before the survey was fielded, 20 pre-testers were recruited from the personal environment of the researcher in order to generate feedback on the questionnaire and to assess whether the manipulation via the experimental stimuli was successful. For the data collection, convenience sampling with subsequent random assignment to the experimental stimuli was used as the sampling method. The survey was distributed through the researcher’s own personal network on social media channels like Facebook, Instagram and LinkedIn. It was also spread in several public survey groups on Facebook to ensure a wide range of respondents. The chance to win one of two Amazon vouchers worth 10€ served as an incentive to participate. The data was collected under the circumstances of worldwide lockdowns due to the coronavirus pandemic within a two-week period between the 22nd of March and the 6th of April 2020 and gathered N = 263 responses in total.

Sample and Manipulation Check

Before the data was analysed, the data set was screened for research units (respondents) that did not pass the manipulation check in the questionnaire. The manipulation check entailed an open recall and a closed recall question (see Appendix C, Q12 & Q13). In the open recall, respondents were asked to state in their own words whether the news post that they were subjected to contained an indication of its accuracy, and if yes, what kind (e.g. “Yes, the post contained an indication on its accuracy.”). The closed recall prompted respondents to choose the visual stimulus that accompanied the post (e.g. “Pinocchio” or “None of the above”). A filter variable was generated in order to exclude the cases that did not pass the manipulation check and to perform all calculations with respondents who were shown to have correctly perceived the experimental stimulus (e.g. respondents who were exposed to a visual fact-check had to indicate “Pinocchio” in the closed recall, while respondents who were part of the textual condition or the control group had to indicate “None of the above” to be kept in the data set).
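The exclusion logic can be illustrated with a minimal, hypothetical sketch; the thesis performed this step in SPSS, and the file and column names used here (condition, closed_recall) are invented for illustration:

```python
import pandas as pd

# Hypothetical illustration of the manipulation-check filter (SPSS was used in
# the thesis; file and column names are assumptions).
df = pd.read_csv("responses.csv")

# Respondents in the visual conditions had to pick "Pinocchio" in the closed
# recall; the textual conditions and the control group "None of the above".
visual = df["condition"].isin(["visual_true", "visual_false"])
passed = (visual & (df["closed_recall"] == "Pinocchio")) | (
    ~visual & (df["closed_recall"] == "None of the above")
)

df_clean = df[passed]  # all further analyses use the post-manipulation data
print(f"kept {len(df_clean)} of {len(df)} respondents")
```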

Through this approach, n = 49 respondents were excluded from the data set. Table 1 (see Appendix A for all tables) shows the distribution of experimental cases before and after the manipulation check. The comparatively high loss of respondents in the visual groups might not only be random, but systematic – for instance, respondents might have been confused by the similarity between the like button in the manipulated Facebook post and the thumb label as an option in the closed recall. However, the small differences between pre- and post-manipulation data seem negligible (Table 2). Thus, the analyses were performed with the post-manipulation data (N = 214). Thereof, 66.4% (n = 142) of all respondents were female and 33.2% (n = 71) were male. With M = 26.99 (SD = 5.37) years, the average age of all respondents was rather young. Bachelor’s degrees were the most represented level of education at 45.8%, followed by master’s degrees (28.0%) and high school diplomas (22.9%).

Experimental Design and Procedure

The experiment used a 2×2 between-subjects design with an additional control group. With equal probability, all respondents were randomly assigned to one of the four experimental stimulus groups or the control group.

Respondents were exposed to the same news article, which was presented within the simulated template of a Facebook post. The post was manipulated so that respondents saw the news post accompanied by either a visual or a textual fact-check that rated the article as either “mostly true” or “mostly false”. The control group was exposed to the same post without any fact-checking label.

The manipulated news post was based on a real article by “Bloomberg Green” (Rathi & Hodges, 2020). It combined two topical issues – environmentalism and the corona crisis – and explained how the outbreak of the crisis had reduced carbon emissions in China. For the purpose of this paper, the content of the real article was manipulated by changing the original number of 100 million metric tons of reduced carbon emissions to 800 million metric tons and by comparing the number with “what Germany emits in a year” instead of referring to Chile as in the original article (Appendix B.1 – B.5). The news article was chosen because it included two recent topics that interest a broad audience worldwide. In addition, the subject matter and the manipulations were neither exaggerated nor understated, so that the article could, in principle, be classified as both true and false – thus providing an ideal stimulus for manipulation. The original source of the news article was changed to a fictional news source called “News Global Network” to avoid any external effects deriving from source recognition and association. Even though news sources can generally influence the perceived accuracy of an article (Pennycook, Cannon & Rand, 2018), this study adapted the approach used by Clayton et al. (2019) with regards to excluding the original source. Therewith, the danger of a possible third-variable effect was minimised (see Clayton et al., 2019).

The procedure of the questionnaire was as follows (see Appendix C): First, participants had to consent to participate in the study. A preliminary question on general news consumption via social media platforms served as an introduction. Respondents were also asked if they owned a personal Facebook account. Next, they had to indicate how much they agreed or disagreed with statements about their news consumption and behaviour – serving the operationalisation of media literacy as a potential moderator. Respondents were then asked how important they considered a range of international news topics such as environmentalism and the outbreak of the coronavirus. Ultimately, they were randomly assigned to one of the four experimental groups or the control condition.

In the post-manipulation procedure, respondents had to answer questions related to their perceived accuracy of the news post, their motivation to read the full article and their likelihood to engage with the content. Afterwards, they were subjected to the open and closed recall questions as part of the manipulation check, followed by sociodemographic questions. A debriefing section told respondents about the study’s purpose. At the end, respondents were asked whether they would like to participate in the raffle of two Amazon vouchers. Of the original N = 263 participants, n = 118 provided their email address to enter the competition – subsequently, two winners were randomly drawn.

Measures

Perceived accuracy

The perceived accuracy of the news post was included as one of the two dependent variables within this study’s research design. To measure the degree to which respondents considered the article to be accurate, they were asked to indicate their perceived accuracy on a 5-point scale that was based on research by Garrett, Nisbet and Lynch (2013) (1 = Not at all accurate, 5 = Extremely accurate; M = 2.62, SD = .93) (see Appendix C, Q8).

News engagement

News engagement served as the second dependent variable in the paper. Its operationalisation was developed based on previous studies (Ha et al., 2018; Ksiazek et al., 2016; Park & Kaye, 2018). Respondents were asked to rate the likelihood of their engagement with the news post on a 7-point scale (1 = Extremely unlikely, 7 = Extremely likely). The concept was measured by including seven different questions (e.g. “How likely is it that you would give a comment on the news post?”; see Appendix C, Q11). The likelihood of engaging with the news post resulted in an overall index (M = 2.20, SD = .92, Cronbach’s α = .83). The alpha coefficient indicated a high level of internal consistency and good reliability of the questions measuring the concept. The item-total statistics in SPSS suggested that Cronbach’s alpha would not improve if any single question was deleted from the scale.
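For reference, the reported reliability coefficient follows the standard textbook definition of Cronbach’s α for a k-item index; the formula is general and not specific to this thesis:

\[ \alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} s_i^{2}}{s_T^{2}}\right) \]

where \(s_i^{2}\) is the variance of item i and \(s_T^{2}\) the variance of the summed scale. With the k = 7 engagement items, α = .83 indicates that the items share enough variance to be combined into a single index.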

Media literacy

As a possible moderating variable, media literacy was included within the experimental model. Previous literature lacks a shared conceptual and operational definition of this multidimensional construct (Vraga et al., 2015). This led to a novel operationalisation in this paper, which was based on previous studies (Ashley et al., 2013; Literat, 2014; Maksl et al., 2015; Potter, 2004; Vraga & Tully, 2015; Vraga et al., 2015).

Respondents were asked to indicate on a 7-point scale (1 = Strongly disagree, 7 = Strongly agree) how much they agreed with nine different statements relating to their understanding and consumption of news (e.g. “I know how to critically engage with news content.”; see Appendix C, Q4). This resulted in an overall index (M = 5.56, SD = .75, Cronbach’s α = .82).

Controls

Additional measures were added to control for possible confounding influences on perceived accuracy and news engagement. Initially, respondents were asked to indicate their gender, level of education and age. The survey also asked for their use of news on social media per week (0 = Never, 4 = Daily) and whether they had a personal Facebook account. Only n = 3 respondents had never had a personal Facebook account. This small share was nonetheless kept in the data set, since the study was solely distributed online – making it highly likely that the three respondents were present on other social media platforms where news is shared. Additionally, respondents were questioned on their motivation to read the full article on a 5-point scale (1 = Not at all motivated, 5 = Extremely motivated). As a final control variable, participants were questioned on how important they considered a range of news topics (1 = Not at all important, 5 = Extremely important). The outbreak of the coronavirus was the most important issue for all respondents on average, followed by climate change and environmentalism (Table 2).

Results

H1 predicted a higher likelihood of perceiving a news post as accurate when it is labelled as “mostly true” compared to when it is labelled as “mostly false”. A significant but weak effect of the fact-checking evaluation on perceived accuracy was found, F(2, 211) = 13.07, p < .001, η² = .11; Levene’s F(2, 211) = .46, p = .635. The fact-checking evaluation thus explained 11% of the variance in perceived accuracy. Respondents who were exposed to the “mostly true” label perceived the news post as significantly more accurate (M = 2.87, SD = .92) than the “mostly false” group (M = 2.24, SD = .87; Tables 3 & 5). With these results, H1 was accepted.
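The reported η² values can be read as shares of explained variance. For a one-way ANOVA, eta squared is the standard ratio (a general formula, not taken from the thesis):

\[ \eta^{2} = \frac{SS_{\text{between}}}{SS_{\text{total}}} \]

so η² = .11 means that the fact-checking evaluation accounts for 11% of the total variation in perceived accuracy.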

Regarding H2, it was assessed whether users are more likely to engage with a news post when it is labelled as “mostly true” compared to “mostly false”. There was a significant but very weak effect of the fact-checking evaluation on news engagement, F(2, 211) = 5.54, p = .005, η² = .05; Levene’s F(2, 211) = 1.47, p = .232. The results suggested that the likelihood of engaging with the post was higher among respondents who perceived the news post as “mostly true” (M = 2.27, SD = .98) compared to “mostly false” (M = 1.96, SD = .81; Table 3). However, this difference was not statistically significant – instead, the variation in news engagement was due to the significant difference between the “mostly false” and the control group (Table 5). Thus, H2 was not confirmed.

H3 expected visual fact-checks to have a greater impact on respondents’ perceived accuracy than textual fact-checks. Results showed that respondents who were exposed to the visual (M = 2.56, SD = .98) or to the textual fact-check (M = 2.54, SD = .93) perceived the article as slightly to moderately accurate (Table 4). However, the differences between the groups did not prove significant (Table 6), F(2, 211) = 2.11, p = .123, η² = .02; Levene’s F(2, 211) = 2.95, p = .055. Hence, H3 was not confirmed.

H4 assumed that visual fact-checks have a greater impact on news engagement than textual fact-checks. A significant but very weak effect of the fact-checking format on news engagement was found, F(2, 211) = 3.32, p = .038, η² = .03; Levene’s F(2, 211) = .11, p = .892. The results suggested that respondents who were exposed to the visual fact-check showed a greater likelihood of engaging with the post (M = 2.17, SD = .97) than the textual group (M = 2.07, SD = .86; Table 4). Yet, this difference was not statistically significant – instead, the variation in news engagement was due to the significant difference between the textual and the control group (Table 6). Hence, H4 was not confirmed.

To control for possible third-variable effects on the above-examined dependent variables, a hierarchical regression analysed the impact of the control variables on perceived accuracy and news engagement. Both the motivation to read the full news article (p < .001) and the level of education (p = .007) had a significant impact on perceived accuracy (Table 7). Regarding news engagement, the respondents’ motivation again proved to be a significant influence (p < .001), in addition to the importance of the coronavirus issue (p = .035; Table 8). Nonetheless, as randomisation in the experiment aimed to control for individual effects, the control variables were left out of consideration in the further analysis.
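A blockwise (hierarchical) regression of this kind could be reproduced outside SPSS along the following hedged sketch; all variable names and the block composition are assumptions for demonstration, not the thesis’s actual model specification:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical sketch of a hierarchical regression on perceived accuracy;
# column names (accuracy, gender, education, age, motivation, importance_corona)
# are invented for illustration.
df = pd.read_csv("responses_clean.csv")

# Block 1: sociodemographic controls only.
m1 = smf.ols("accuracy ~ C(gender) + education + age", data=df).fit()
# Block 2: adds motivation and issue importance.
m2 = smf.ols(
    "accuracy ~ C(gender) + education + age + motivation + importance_corona",
    data=df,
).fit()

# The R-squared increase between blocks shows the added explanatory value.
print(m1.rsquared, m2.rsquared, m2.rsquared - m1.rsquared)
```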

After the examination of H1 to H4, which focused on the separate impacts of the fact-checking evaluation (“mostly true” vs. “mostly false”) and format (visual vs. textual) on perceived accuracy and news engagement, RQ1 explored whether an interaction effect between these factors occurred. For this purpose, the control group was filtered out of both the fact-checking evaluation and format variables. A two-way analysis of variance (ANOVA) showed that the fact-checking evaluation itself had a highly significant impact on perceived accuracy, explaining 12% of its variation (Table 9). As shown with H1, respondents who saw the post labelled as “mostly true” perceived the article as more accurate (M = 2.87, SE = .10) than respondents in the “mostly false” group (M = 2.24, SE = .10; Table 10). While the fact-checking format itself was not significant, there was a significant but weak interaction effect between the fact-checking evaluation and format, F(3, 164) = 8.37, p = .049, η² = .02 (Table 9). More precisely, the effect of the fact-check’s evaluation as “mostly false” on perceived accuracy differed significantly between the visual and textual format. The perceived accuracy was distinctly lower when respondents were subjected to the visual format (M = 2.10, SE = .14) than to the textual format (M = 2.36, SE = .13; Table 10). Thus, since the textual “mostly false” evaluation resulted in higher perceived accuracy, the visual format was more efficient in informing and educating respondents about the falseness of the manipulated news item.
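A minimal sketch of how such a 2×2 ANOVA could be reproduced follows (the thesis used SPSS; the column names accuracy, evaluation and fmt are invented for illustration):

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.read_csv("responses_clean.csv")
df = df[df["condition"] != "control"]  # RQ1 drops the control group

# Perceived accuracy modelled from evaluation, format, and their interaction.
model = smf.ols("accuracy ~ C(evaluation) * C(fmt)", data=df).fit()
print(anova_lm(model, typ=2))  # F-tests for both main effects and the interaction
```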

In contrast to the unconfirmed H2, the fact-checking evaluation now explained 2.9% of the variance in news engagement after the control group had been removed for the interaction analyses (Table 11). The “mostly true” evaluation evoked higher news engagement among respondents (M = 2.27, SE = .10) than the “mostly false” evaluation (M = 1.96, SE = .10; Table 12). However, neither the fact-checking format itself nor the interaction between evaluation and format had a significant impact on news engagement.

Finally, RQ2 examined whether the respondents’ level of media literacy had a moderating effect on the relations of the fact-checking evaluation and format with the dependent variables. A median split was used to divide respondents into categories of low and high media literacy, with the median of the original variable (Md = 5.56) applied as the cut-off point. Thus, n = 111 (51.9%) of all respondents showed a low level of media literacy (n = 82 excluding the control group for the moderation ANOVA), while n = 103 (48.1%) had a high level of media literacy (n = 86 excluding the control group).
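The median split itself is a one-liner; a hedged sketch, again with invented column names and assuming that scores at or above the median count as “high”:

```python
import numpy as np
import pandas as pd

df = pd.read_csv("responses_clean.csv")

# Split at the sample median of the media-literacy index (Md = 5.56 here).
cutoff = df["media_literacy"].median()
df["literacy_group"] = np.where(df["media_literacy"] >= cutoff, "high", "low")
```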

Both the fact-checking evaluation (p < .001) and media literacy (p = .009) showed a main effect on perceived accuracy (Table 13), with media literacy explaining 4% of the variance in perceived accuracy. Respondents with low media literacy perceived the article as more accurate (M = 2.74, SE = .10) than respondents with a high level of media literacy (M = 2.38, SE = .10; Table 14). Despite its significant main effect, media literacy did not have a significant moderating impact on the relation between the fact-checking evaluation and perceived accuracy. Neither did media literacy have a significant moderating effect on the relation between the fact-checking format and perceived accuracy (Tables 15 & 16).

Concerning news engagement, media literacy had no moderating impact on the relation between the fact-checking evaluation and engagement with the content (Tables 17 & 18), nor on the relation between the fact-checking format and engagement (Tables 19 & 20). Neither did it directly affect news engagement. Concerning RQ2, this study therefore concluded that no moderating effect of media literacy was found. However, despite the low proportion of explained variance in perceived accuracy, the significant main effect of media literacy called for interpretation in the further analysis of the results. In order to answer the overall research question posed at the beginning of this study, the presented findings are discussed in the following and final part of this paper.

Conclusion and Discussion

This study examined the extent to which the format and evaluation of fact-checks on news posts on social media affect the perceived accuracy and news engagement of social media users in connection with their level of media literacy. The experiment was conducted against the backdrop of the ongoing coronavirus pandemic, a time in which the dissemination of false information is becoming an increasing threat to society and the verification of facts in online spheres is more important than ever.

The fact-checking evaluation had a significant effect on both perceived accuracy (H1) and news engagement (RQ1), with the differentiation between “mostly true” and “mostly false” having a greater impact on perceived accuracy than on news engagement. The simulated news post was perceived as significantly more accurate when the fact-check labelled the post as “mostly true” compared to “mostly false”. These findings corroborate the results of earlier studies (Clayton et al., 2019; Pennycook et al., 2020; Tandoc et al., 2019) and show that people’s opinion and perception are influenced by the evaluation of a fact-checker. Against the assumptions of prior studies (Amazeen et al., 2018; Lazard & Atkinson, 2015; Young et al., 2018), the fact-checking formats did not cause any significant variation in perceived accuracy (H3) or news engagement (H4) – suggesting that the format in which the fact-checking evaluation appears does not play a role on its own.

However, a significant interaction effect between the fact-check’s evaluation and format showed that the visual format was more successful in informing respondents about the news post’s falsehood than the textual format. The visual cue seemed to prompt respondents to be even more sceptical in assigning accuracy to the news post. With this result, this study recommends the use of visual fact-checking labels for fact-checking organisations, as they are more effective in revealing falsehood than textual labels – possibly due to the visual stimulation of rating scales and the limited attention paid to textual explanations of why a piece of information is false. This finding is in line with the study by Lazard and Atkinson (2015), which found visual signals to be centrally rather than peripherally processed. With reference to the ELM (Petty & Cacioppo, 1986), which argues that the central processing of information leads to a more consistent formation of attitudes, this study confirms that visual fact-checks are more effective in informing people of an information’s falsehood than textual labels. However, it must be kept in mind that the ELM refers to the formation of attitudes, whereas the goal of fact-checking is not to influence people, but to inform and educate them. Nevertheless, the basic idea of differing ways of processing information was successfully applied to the differences in perceived accuracy between the fact-checking evaluation as “mostly false” and the formats.

Regarding media literacy, no moderating effect on either of the dependent variables was found. However, media literacy directly and significantly affected the perceived accuracy of the news post – a noteworthy finding of this experiment. A lower level of media literacy led to higher perceived accuracy, while respondents with higher media literacy levels perceived the post as less accurate. This suggests that people with higher levels of media literacy are more critical towards possible manipulation of their perceived accuracy and that they evaluate information carefully – a finding in line with the assessment of Jolls and Johnsen (2018). Thus, people with higher levels of media literacy have a higher awareness of the distinction between true and false information – regardless of fact-checking evaluations and the format in which they appear. In contrast, people with lower levels of media literacy are more likely to assign accuracy without deliberate consideration. This implies that the application of fact-checking labels to evaluate an information’s veracity is by no means necessary for people with high media literacy, as they have adequate competences to assess its accuracy. With this result in mind, the teaching and acquisition of media literacy should become the primary supplement or even an alternative to mere fact-checking – not only because teachers and schoolchildren advocate a stronger education in media literacy (Lie Detectors, 2019). Additionally, Kim (2018) highlights “the quality of media use, political and civic engagement, and increase of self-expression” as positive outcomes of applied media literacy (p. 585) – emphasising media literacy’s beneficial effect on society.

Despite not being significantly connected with fact-checking itself, media literacy remains an important factor in the issue of online disinformation. After all, the application of autonomous media literacy skills to differentiate between true and false information becomes even more important with the current coronavirus pandemic (Devlin & Pohjola, 2020).

Limitations, Further Research and Outlook

Certain limitations must, however, be considered when interpreting the results of this study. Firstly, the sample was not representative of social media users worldwide, as no proportionate number of respondents per country was sampled. This limits the generalisability of the findings and the study’s external validity. In comparison, the reliability of all measurements in the survey was high, as the applied metric scales were consistent and could be reapplied in future research (Babbie, 2005). Nevertheless, the limited representativeness of the findings did not have a severe impact on the analysis itself.

Secondly, it is important to note that the experimental stimuli represented an example of misinformation, since the information in the news post was only slightly manipulated. A more extreme disinforming stimulus – for instance the claim that drinking water prevents people from contracting the coronavirus (“World Health Organization refutes viral claims”, 2020) – might have led to more significant results concerning a moderating effect of media literacy. Considering the findings by Amazeen et al. (2018) and Lazard and Atkinson (2015), who emphasised people’s preference for combined visual and textual elements in fact-checks, the results of this study might also have differed if a combined fact-check in an infographic style had been included among the experimental stimuli. Further research is encouraged to replicate this experiment with more radical stimuli and a larger sample size.

Thirdly, media literacy and news engagement were self-reported rather than directly measured. Future studies could capture the media literacy construct more directly by using knowledge questions (e.g. on media systems). News engagement could be captured by including like or share buttons instead of asking respondents to indicate their hypothetical likelihood to engage. This would help to avoid a possible response bias towards socially desirable answers. Further studies could also conduct an eye-tracking study to see how differing fact-checking labels modify people’s examination of news content.

As a final constraint, it is important to note that the experimental groups were not evenly distributed. The comparatively high loss of respondents in the visual stimulus conditions might have happened because of the thumb label as an option in the closed recall question and its misleading similarity to the like button within the manipulated post. To mitigate this failure in the closed recall question, one could have replaced the visual icons with textual options.

If future research manages to address some of the above-mentioned limitations, it may render a more comprehensive answer to an issue of paramount importance to society.

To sum up, this study has contributed to existing research on the effectiveness of fact-checking on perceived accuracy and news engagement by incorporating the concept of media literacy. The paper has also added to future research by developing its own conceptual and operational definition of media literacy, as research is still lacking a common operationalisation. By finding that people with high levels of media literacy had a lower perceived accuracy regardless of the fact-checking evaluation and format, this project underlines the importance of teaching media literacy in the future. Thus, media literacy should not be secondary to external fact-checking, but rather the primary weapon against misinformation and disinformation altogether.

References

Action Plan Against Disinformation (2018). Retrieved from https://eeas.europa.eu/headquarters/headquarters-homepage/54866/action-plan-against-disinformation_en

Amazeen, M. A., Thorson, E., Muddiman, A., & Graves, L. (2018). Correcting Political and Consumer Misperceptions: The Effectiveness and Effects of Rating Scale Versus Contextual Correction Formats. Journalism & Mass Communication Quarterly, 95(1), 28–48. https://doi.org/10.1177/1077699016678186

Amazeen, M. A., Vargo, C. J., & Hopp, T. (2019). Reinforcing attitudes in a gatewatching news era: Individual-level antecedents to sharing fact-checks on social media. Communication Monographs, 86(1), 112–132. https://doi.org/10.1080/03637751.2018.1521984

Ashley, S., Maksl, A., & Craft, S. (2013). Developing a News Media Literacy Scale. Journalism & Mass Communication Educator, 68(1), 7–21.

Avaaz. (2020). How Facebook can Flatten the Curve of the Coronavirus Infodemic. Retrieved from https://secure.avaaz.org/campaign/en/facebook_coronavirus_misinformation/

Babbie, E. (2005). The basics of social research (3rd ed.). Belmont, CA: Thomson Wadsworth.

Bennett, W. L., & Livingston, S. (2018). The disinformation order: Disruptive communication and the decline of democratic institutions. European Journal of Communication, 33(2), 122–139. https://doi.org/10.1177/0267323118760317

Bradshaw, S., & Howard, P. N. (2019). The Global Disinformation Order: 2019 Global Inventory of Organised Social Media Manipulation. The Computational Propaganda Project at the Oxford Internet Institute. Retrieved from https://comprop.oii.ox.ac.uk/research/cybertroops2019/

Clayton, K., Blair, S., Busam, J. A., Forstner, S., Glance, J., Green, G., . . . Nyhan, B. (2019). Real Solutions for Fake News? Measuring the Effectiveness of General Warnings and Fact-Check Tags in Reducing Belief in False Stories on Social Media. Political Behavior, 38(2), 173. https://doi.org/10.1007/s11109-019-09533-0

Devlin, A., & Pohjola, S. (2020, April 30). Critical media literacy to fight fake news. Retrieved from https://eaea.org/2020/04/30/critical-media-literacy-to-fight-fake-news/

Fighting disinformation. (n.d.). Retrieved from https://ec.europa.eu/info/live-work-travel-eu/health/coronavirus-response/fighting-disinformation_en

Garrett, R. K., Nisbet, E. C., & Lynch, E. K. (2013). Undermining the Corrective Effects of Media-Based Political Fact Checking? The Role of Contextual Cues and Naïve Theory. Journal of Communication, 63(4), 617–637. https://doi.org/10.1111/jcom.12038

Graves, L. (2016). Deciding what’s true: The fact-checking movement in American journalism. New York, NY: Columbia University Press.

Graves, L., & Glaisyer, T. (2012). The fact-checking universe in spring 2012: An overview. Washington, DC: New America Foundation.

Graves, L., & Konieczna, M. (2015). Sharing the News: Journalistic Collaboration as Field Repair. International Journal of Communication, 9(19), 1966–1984. Retrieved from https://ijoc.org/index.php/ijoc/article/view/3381

Ha, L., Xu, Y., Yang, C., Wang, F., Yang, L., Abuljadail, M., . . . Gabay, I. (2018). Decline in news content engagement or news medium engagement? A longitudinal analysis of news engagement since the rise of social and mobile media 2009–2012. Journalism, 19(5), 718–739. https://doi.org/10.1177/1464884916667654

How is Facebook addressing false news through third-party fact-checkers? (n.d.). Retrieved from https://www.facebook.com/help/1952307158131536

Hänska, M., & Bauchowitz, S. (2017). Tweeting for Brexit: How Social Media Influenced the Referendum. Retrieved from http://eprints.lse.ac.uk/84614/

Ireton, C., & Posetti, J. (Eds.) (2018). UNESCO series on journalism education. Journalism, "fake news" & disinformation: Handbook for journalism education and training. Paris: United Nations Educational, Scientific and Cultural Organization. Retrieved from http://unesdoc.unesco.org/images/0026/002655/265552E.pdf

Jolls, T., & Johnsen, M. (2018). Media Literacy: A Foundational Skill for Democracy in the 21st Century. Hastings Law Journal, 69(5), 1379–1408. Retrieved from http://www.hastingslawjournal.org/media-literacy-a-foundational-skill-for-democracy-in-the-21st-century/

Kessler, G. (2017, January 1). About The Fact Checker. Retrieved from https://www.washingtonpost.com/politics/2019/01/07/about-fact-checker/

Kim, E.-m. (2018). 30. Media Literacy. In P. M. Napoli (Ed.), Mediated Communication (pp. 585–608). Berlin, Boston: De Gruyter. https://doi.org/10.1515/9783110481129-031

Korecki, N. (2019, February 20). ‘Sustained and ongoing’ disinformation assault targets Dem presidential candidates. Retrieved from https://www.politico.com/story/2019/02/20/2020-candidates-social-media-attack-1176018

Ksiazek, T. B., Peer, L., & Lessard, K. (2016). User engagement with online news: Conceptualizing interactivity and exploring the relationship between online news videos and user comments. New Media & Society, 18(3), 502–520. https://doi.org/10.1177/1461444814545073

Lanoszka, A. (2018). Disinformation in International Politics. SSRN Electronic Journal. Advance online publication. https://doi.org/10.2139/ssrn.3172349

Lazard, A., & Atkinson, L. (2015). Putting Environmental Infographics Center Stage. Science Communication, 37(1), 6–33. https://doi.org/10.1177/1075547014555997

Lie Detectors. (2019). Tackling Disinformation Face to Face: Journalists’ Findings From the Classroom.

Literat, I. (2014). Measuring New Media Literacies: Towards the Development of a Comprehensive Assessment Tool. Journal of Media Literacy Education, 6(1). Retrieved from https://digitalcommons.uri.edu/jmle/vol6/iss1/2/

Lyons, T. (2018, June 14). Hard Questions: How is Facebook’s Fact-Checking Program Working? Retrieved from https://about.fb.com/news/2018/06/hard-questions-fact-checking/

Maksl, A., Ashley, S., & Craft, S. (2015). Measuring News Media Literacy. Journal of Media Literacy Education, 6(3), 29–45.

McGrew, S., Ortega, T., Breakstone, J., & Wineburg, S. (2017). The Challenge That’s Bigger than Fake News: Civic Reasoning in a Social Media Environment. American Educator, 41(3), 4–9. Retrieved from https://eric.ed.gov/?id=EJ1156387

Park, C. S., & Kaye, B. K. (2018). News Engagement on Social Media and Democratic Citizenship: Direct and Moderating Roles of Curatorial News Use in Political Involvement. Journalism & Mass Communication Quarterly, 95(4), 1103–1127. https://doi.org/10.1177/1077699017753149

Pennycook, G., Cannon, T. D., & Rand, D. G. (2018). Prior exposure increases perceived accuracy of fake news. Journal of Experimental Psychology: General, 147(12), 1865–1880. https://doi.org/10.1037/xge0000465

Pennycook, G., Bear, A., Collins, E. T., & Rand, D. G. (2020). The Implied Truth Effect: Attaching Warnings to a Subset of Fake News Headlines Increases Perceived Accuracy of Headlines Without Warnings. Management Science. Advance online publication. https://doi.org/10.1287/mnsc.2019.3478

Petty, R. E., & Cacioppo, J. T. (1986). The Elaboration Likelihood Model of Persuasion. In L. Berkowitz (Ed.), Advances in experimental social psychology (Vol. 19, pp. 123–205). New York: Academic Press. https://doi.org/10.1016/S0065-2601(08)60214-2

Porten-Cheé, P., Kunst, M., & Emmer, M. (2020). Online Civic Intervention: A New Form of Political Participation Under Conditions of a Disruptive Online Discourse. International Journal of Communication, 14(21), 514–534. Retrieved from https://ijoc.org/index.php/ijoc/article/view/10639

Potter, W. J. (2004). Argument for the Need for a Cognitive Theory of Media Literacy. American Behavioral Scientist, 48(2), 266–272.

Rathi, A., & Hodges, J. (2020, February 19). Virus Cuts China’s Carbon Emissions by 100 Million Metric Tons. Retrieved from https://www.bloomberg.com/news/articles/2020-02-19/virus-cuts-china-s-carbon-emissions-by-100-million-metric-tons

Rosen, G., Harbath, K., Gleicher, N., & Leathern, R. (2019, October 21). Helping to Protect the 2020 US Elections. Retrieved from https://about.fb.com/news/2019/10/update-on-election-integrity-efforts/

Rosen, G. (2020, April 16). An Update on Our Work to Keep People Informed and Limit Misinformation About COVID-19. Retrieved from https://about.fb.com/news/2020/04/covid-19-misinfo-update/

Stencel, M., & Luther, J. (2020, April 3). Update: 237 fact-checkers in nearly 80 countries…and counting. Retrieved from https://reporterslab.org/latest-news/

Talwar, S., Dhir, A., Kaur, P., Zafar, N., & Alrasheedy, M. (2019). Why do people share fake news? Associations between the dark side of social media use and fake news sharing behavior. Journal of Retailing and Consumer Services, 51, 72–82. https://doi.org/10.1016/j.jretconser.2019.05.026

Tandoc, E. C., Lim, D., & Ling, R. (2019). Diffusion of disinformation: How social media users respond to fake news and why. Journalism. Advance online publication. https://doi.org/10.1177/1464884919868325

The commitments of the code of principles. (n.d.). Retrieved from https://ifcncodeofprinciples.poynter.org/know-more/the-commitments-of-the-code-of-principles

Vraga, E. K., & Tully, M. (2015). Media Literacy Messages and Hostile Media Perceptions: Processing of Nonpartisan Versus Partisan Political Information. Mass Communication and Society, 18(4), 422–448.

Vraga, E., Tully, M., Kotcher, J., Smithson, A., & Broeckelman-Post, M. (2015). A Multi-Dimensional Approach to Measuring News Media Literacy. Journal of Media Literacy Education, 7(3), 41–53. Retrieved from https://digitalcommons.uri.edu/jmle/vol7/iss3/4/

World Health Organization. (2020). Munich Security Conference. Retrieved from https://www.who.int/dg/speeches/detail/munich-security-conference

World Health Organization refutes viral claims that holding your breath can test for COVID-19. (2020, March 11). Retrieved from https://factcheck.afp.com/world-health-organization-refutes-viral-claims-holding-your-breath-can-test-covid-19

Young, D. G., Jamieson, K. H., Poulsen, S., & Goldring, A. (2018). Fact-Checking Effectiveness as a Function of Format and Tone: Evaluating FactCheck.org and FlackCheck.org. Journalism & Mass Communication Quarterly, 95(1), 49–75. https://doi.org/10.1177/1077699017710453


Appendix A: Tables

Table 1: Distribution of experimental conditions before and after manipulation check

Format     Evaluation      n (before MC)    n (after MC)
Visual     Mostly true     51               39
Visual     Mostly false    54               39
Textual    Mostly true     53               45
Textual    Mostly false    53               45
Control                    52               46

Note. N = 263 before the manipulation check (MC); N = 214 after.

Table 2: Average means of control variables before and after the manipulation check

Control variable                                      M (SD) before MC    M (SD) after MC
Age                                                   27.42 (6.09)        26.99 (5.37)
Social media news use per week                        3.38 (1.09)         3.43 (1.06)
Motivation to read                                    2.60 (1.07)         2.54 (1.04)
Importance: Coronavirus                               4.63 (.60)          4.65 (.58)
Importance: Climate change & environmentalism         4.35 (.74)          4.34 (.75)
Importance: Refugee crisis at Turkish-Greek border    3.74 (1.01)         3.72 (1.02)
Importance: Bushfires across Australia                3.64 (.91)          3.61 (.92)
Importance: US presidential primary elections         3.50 (.89)          3.43 (.91)

Table 3: Average means of dependent variables across the fact-checking evaluation

                 Perceived accuracy         News engagement
Evaluation       M (SD)         n           M (SD)         n
Mostly true      2.87 (.92)     84          2.27 (.98)     84
Mostly false     2.24 (.87)     84          1.96 (.81)     84
Control          2.87 (.83)     46          2.49 (.90)     46

Note. N = 214.


Table 4: Average means of dependent variables across the fact-checking format

                 Perceived accuracy         News engagement
Format           M (SD)         n           M (SD)         n
Visual           2.56 (.98)     78          2.17 (.97)     78
Textual          2.54 (.93)     90          2.07 (.86)     90
Control          2.87 (.83)     46          2.49 (.90)     46

Note. N = 214.

Table 5: Post-hoc tests H1 & H2

                                  Perceived accuracy             News engagement
Condition       Compared with     M difference   95% CI          M difference   95% CI
Mostly true     Mostly false      .63*           [.30, .96]      .30            [-.03, .64]
Mostly true     Control           -.00           [-.39, .39]     -.22           [-.62, .18]
Mostly false    Mostly true       -.63*          [-.96, -.30]    -.30           [-.64, .03]
Mostly false    Control           -.63*          [-1.02, -.24]   -.53*          [-.93, -.13]
Control         Mostly true       .00            [-.39, .39]     .22            [-.18, .62]
Control         Mostly false      .63*           [.24, 1.02]     .53*           [.13, .93]

Note. *The mean difference is significant at the .05 level.

Table 6: Post-hoc tests H3 & H4

                                  Perceived accuracy             News engagement
Condition       Compared with     M difference   95% CI          M difference   95% CI
Visual          Textual           .02            [-.33, .37]     .10            [-.24, .44]
Visual          Control           -.31           [-.72, .11]     -.32           [-.73, .08]
Textual         Visual            -.02           [-.37, .33]     -.10           [-.44, .24]
Textual         Control           -.33           [-.73, .08]     -.42*          [-.82, -.02]
Control         Visual            .31            [-.11, .72]     .32            [-.08, .73]
Control         Textual           .33            [-.08, .73]     .42*           [.02, .82]

Note. *The mean difference is significant at the .05 level.
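Pairwise comparisons like those in Tables 5 and 6 can be reproduced with standard statistical software. The thesis does not document its analysis code, so the following is a minimal sketch of one common post-hoc procedure (Tukey's HSD) in Python with statsmodels; the file and column names (experiment_data.csv, condition, perceived_accuracy, news_engagement) are hypothetical placeholders rather than the original setup.

```python
import pandas as pd
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical file and column names; the original data set is not public.
df = pd.read_csv("experiment_data.csv")

# Pairwise mean differences with 95% CIs across the three evaluation
# conditions (mostly true / mostly false / control), one run per DV.
for dv in ["perceived_accuracy", "news_engagement"]:
    print(dv)
    print(pairwise_tukeyhsd(df[dv], df["condition"], alpha=0.05))
```

The same call with a three-level format factor (visual / textual / control) as the grouping variable would produce the comparisons shown in Table 6.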


Table 7: Summary of hierarchical regression analysis for control variables predicting perceived accuracy

                      Model 1               Model 2               Model 3
Control variable      B     SE (B)  β       B     SE (B)  β       B     SE (B)  β
Gender                -.07  .09     -.06    -.03  .08     -.02    -.02  .08     -.01
Age                   .02   .01     .11     .02   .01     .13     .02   .01     .12
Education             -.23  .08     -.20**  -.22  .08     -.19**  -.21  .08     -.19**
SM use                                      .02   .06     .02     .02   .06     .03
Fb account                                  .10   .23     .03     .12   .23     .04
Motivation to read                          .35   .06     .39***  .34   .06     .38***
Corona                                                            -.10  .11     -.06
Climate                                                           .02   .09     .01
Refugees                                                          -.02  .07     -.02
Bushfires                                                         .05   .07     .05
Elections                                                         -.02  .07     -.02
R²                    .04                   .19                   .19

Note. N = 214; *p < .05; **p < .01; ***p < .001.


Table 8: Summary of hierarchical regression analysis for control variables predicting news engagement

                      Model 1               Model 2               Model 3
Control variable      B     SE (B)  β       B     SE (B)  β       B     SE (B)  β
Gender                -.11  .09     -.09    -.06  .08     -.05    -.05  .08     -.04
Age                   -.01  .01     -.06    .00   .01     -.00    -.01  .01     -.04
Education             -.10  .08     -.09    -.10  .08     -.09    -.10  .08     -.09
SM use                                      .06   .06     .07     .07   .06     .08
Fb account                                  -.09  .22     -.03    -.02  .22     -.01
Motivation to read                          .41   .05     .47***  .41   .06     .46***
Corona                                                            -.21  .10     -.14*
Climate                                                           -.07  .09     -.06
Refugees                                                          -.02  .06     -.02
Bushfires                                                         .09   .07     .09
Elections                                                         .05   .07     .05
R²                    .03                   .26                   .28

Note. N = 214; *p < .05; **p < .01; ***p < .001.
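Tables 7 and 8 enter the control variables in three nested blocks. As a minimal sketch of how such a hierarchical (blockwise) OLS regression could be run, again under the assumption of hypothetical column names:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical file and column names, mirroring the blocks in Tables 7 and 8.
df = pd.read_csv("experiment_data.csv")

block1 = "gender + age + education"
block2 = block1 + " + sm_use + fb_account + motivation_to_read"
block3 = block2 + " + imp_corona + imp_climate + imp_refugees + imp_bushfires + imp_elections"

for dv in ("perceived_accuracy", "news_engagement"):
    for i, rhs in enumerate((block1, block2, block3), start=1):
        fit = smf.ols(f"{dv} ~ {rhs}", data=df).fit()
        # fit.summary() lists B and SE(B) per predictor; standardised betas
        # as reported in the tables require z-scoring all variables first.
        print(f"{dv}, Model {i}: R² = {fit.rsquared:.2f}")
```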

Table 9: Two-way ANOVA for interaction effect between fact-checking evaluation and fact-checking format on perceived accuracy

Source                       df    MS      F       p          η² partial
Model 1 (main effects)
  Evaluation                 1     17.68   22.36   .000***    .120
  Format                     1     .02     .02     .887       .000
Model 2 (with interaction)
  Evaluation * Format        1     3.11    3.93    .049*      .023

Note. N = 168; adjusted R² = .117; Evaluation (0 = Mostly true, 1 = Mostly false); Format (0 = Visual, 1 = Textual).


Table 10: Descriptive statistics for interaction effect between fact-checking evaluation and fact-checking format on perceived accuracy

Source                        M      SE    n
Model 1 (main effects)
  Mostly true                 2.87   .10   84
  Mostly false                2.24   .10   84
  Visual                      2.56   .10   78
  Textual                     2.54   .09   90
Model 2 (interaction cells)
  Mostly true * visual        3.03   .14   39
  Mostly true * textual       2.73   .13   45
  Mostly false * visual       2.10   .14   39
  Mostly false * textual      2.36   .13   45

Note. N = 168.

Table 11: Two-way ANOVA for interaction effect between fact-checking evaluation and fact-checking format on news engagement

Source                       df    MS     F      p        η² partial
Model 1 (main effects)
  Evaluation                 1     4.02   4.93   .028*    .029
  Format                     1     .39    .48    .489     .003
Model 2 (with interaction)
  Evaluation * Format        1     .28    .34    .560     .002

Note. N = 168; adjusted R² = .015; Evaluation (0 = Mostly true, 1 = Mostly false); Format (0 = Visual, 1 = Textual).
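The two-way ANOVAs in Tables 9 and 11 exclude the control group, since it has neither an evaluation nor a format level. A sketch along the same lines, using the same hypothetical column names as above ("fmt" stands in for the format factor):

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.read_csv("experiment_data.csv")       # hypothetical file name
treat = df[df["condition"] != "control"]      # keep the 2 x 2 cells only (N = 168)

# Main effects of evaluation and format plus their interaction (Type II SS).
model = smf.ols("perceived_accuracy ~ C(evaluation) * C(fmt)", data=treat).fit()
aov = anova_lm(model, typ=2)

# Partial eta squared per effect: SS_effect / (SS_effect + SS_residual).
aov["eta_sq_partial"] = aov["sum_sq"] / (aov["sum_sq"] + aov.loc["Residual", "sum_sq"])
print(aov)
```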
