
Misinformation supplied in a lab coat: mental effort as the key to resistance?

A study into the effects of misinformation, source expertise, and need for cognition, on beliefs about and attitudes towards vaccination.

Name: Linde van Rooijen Student ID: 10575839

Master´s Thesis Political Communication and Journalism Graduate School of Communication

Supervisor: dr. R. (Rachid) Azrout Date: 26-06-2020


Abstract

The year 2020 is marked by the rapid spread of COVID-19 and the drastic measures taken worldwide to stop it. This pandemic underlines the potential dangers of the spread of online misinformation: messages containing false information and conspiracies about secret cures and immunizing products find their way across the Internet just as easily as relevant messages containing true information. However, the risks of online misinformation go beyond health communication: it affects news, politics, and therefore, indirectly, the foundations of democracy. The effects of online misinformation on beliefs and attitudes have not yet been researched extensively in the Netherlands, let alone in the context of vaccination from an Elaboration Likelihood Model point of view. This study fills this gap by answering the question of whether, and under what circumstances, misinformation effects are visible, through an online survey-experiment amongst Dutch adults (N = 192). Analyses show no significant effects of information type on vaccination beliefs or attitudes. The results partially point in the expected direction: people exposed to misinformation are slightly more likely to hold negative beliefs about, but less likely to hold negative attitudes towards, vaccination. Source type plays a small yet non-significant moderating role, and the same applies to different levels of Need For Cognition. These results are placed into context, and implications are discussed. Longitudinal future research is advised, to complement the current findings and to learn more about the crucial battle against misinformation and its effects.

Keywords: misinformation, vaccination, Elaboration Likelihood Model, source


Introduction

During a time in which the coronavirus is rapidly spreading throughout the world, the pandemic is an inescapable topic of discussion. Besides all the press conferences, articles, and opinionated (non-)experts commenting on the measures taken, the Internet, as with many big news events, gets filled with misinformation. Misinformation entails the spreading of false facts, such as accusations that China created the virus for economic purposes, or claims about secret cures and immunizing products (Frenkel, Alba, & Zhong, 2020). The WHO declared that it is fighting not only a pandemic, but an 'infodemic' as well, referring to the amount of false information and rumors being spread about the coronavirus (Richtel, 2020).

By default, a democracy can only function based on an informed populace (Allcott, Gentzkow, & Yu, 2019; Kuklinski, Quirk, Jerit, Schwieder, & Rich, 2000; Lewandowsky, Ecker, Seifert, Schwarz, & Cook, 2012). It is not without reason that in 2013, the World Economic Forum placed 'online misinformation' among the top 10 threats facing the world (Del Vicario et al., 2016). The classification of misinformation as a worldwide threat is corroborated in the current health crisis. While the WHO tries to fight unfounded rumors about COVID-19, US President Donald Trump encourages people to drink bleach to kill the virus (Noor, 2020). In the past months, worldwide protests brought together those who see the national lockdowns as a threat to their freedom, while often also opposing the use of a still-to-be-created vaccine against the virus (Bogel-Burroughs, 2020). From a scientific perspective, protesting against a vaccine that suppresses a deadly and economically catastrophic virus is an obvious threat to public health.

In recent years, the anti-vaccination movement has been particularly active online (Kata, 2010; 2012). One prominent aspect of the Internet is the lack of the gate-keeping that exists for established media. With the large amounts of information being uploaded, the online public sphere inevitably gets filled with misinformation, making it an important platform to study (Kata, 2010).

How we process the online (mis)information we are exposed to, and whether it will persuade us, partly depends on the extent to which we elaborate on the message. Following the Elaboration Likelihood Model (ELM) by Petty and Cacioppo (1986), our elaboration follows either a central or a peripheral route towards persuasion. The peripheral route is relatively superficial: "a result of some simple cue in the persuasion context (e.g., an attractive source) that induced change without necessitating scrutiny of the central merits of the issue-relevant information presented" (Petty & Cacioppo, 1986, p. 3). Thus, rather than on actual argumentation, people base their evaluations on cues, such as an attractive source. People generally accept information from sources they consider credible (Hovland & Weiss, 1951), and when they want to know something they lack expertise on, people like to listen to someone they do consider an expert (Lachapelle, Montpetit, & Gauvin, 2014). However, expertise, "the extent to which the source is perceived as being capable of providing correct information" (Bristor, 1990, as cited in Ismagilova, Slade, Rana, & Dwivedi, 2020, p. 2), is not an objective assessment. Most people trust scientific evidence when it comes to complex issues (Mooney & Schuldt, 2008), yet not all of those opposing vaccination consider scientific research credible (Jolley & Douglas, 2014). This raises the question of what role simple cues and the peripheral route of the ELM play in the persuasiveness of (mis)information on vaccination.

The opposite, ‘central’ route in the ELM represents thorough deliberation when evaluating a message, or “a person's careful and thoughtful consideration of the true merits of the information presented in support of an advocacy” (Petty & Cacioppo, 1986, p. 3). In order to ‘follow’ this central route, people need a certain level of ability and motivation to elaborate. These levels are strongly related to people’s Need For Cognition (NFC): their “need to structure relevant situations in meaningful, integrated ways” (Cohen, Stotland, & Wolfe, 1955, p. 291).


As an internal psychological factor related to the ELM, NFC forms an interesting counterpart to the potential external influence of source type.

Not only is it scientifically relevant to see whether misinformation effects differ for people taking various ELM routes; we also see that many studies on online vaccination information are based on content analyses (Nan & Madden, 2012). Few scholars have studied individual cognitive factors as moderators of attitude effects, which leaves a gap to fill by studying the effects of NFC. Moreover, this study contributes to the literature by focusing on a Dutch context. In the Netherlands, misinformation has not been studied extensively yet. The few Dutch misinformation studies are experiments that (like many non-Dutch studies) mostly focus on political themes or news (Roozenbeek & Van der Linden, 2018), or on crimes and witness statements (Van Bergen, Horselenberg, Merckelbach, Jelicic, & Beckers, 2009), rather than on vaccination. Moreover, more attention has been paid to memory than to beliefs and attitudes (Van Bergen et al., 2009). The current worldwide health crisis and the gap that needs to be filled in the Netherlands concerning online misinformation effects in a vaccination context led to the following research question:

RQ: To what extent does misinformation on vaccination affect people’s beliefs about and attitudes towards vaccination, and what role do NFC and source expertise play in this?

Theoretical Framework

The Potential Risks of Misinformation

In order to have beliefs about and attitudes towards topics, people first need to gather information. Lee and Shin (2019) state that "perceiving a message as a truthful statement" is a prerequisite for "any cognitive, affective, and/or behavioral changes to occur" (p. 2). How we assess the truth of information we encounter is often based on four considerations: the extent to which the information is compatible with what the reader believes, to what extent the story is coherent, whether it comes from a credible source, and whether others believe the information to be true (Lewandowsky et al., 2012). Since these evaluations are highly personal, misinformation can be interpreted as 'the truth'. Consequently, based on misinformation, people can form misperceptions: "factual beliefs that are false or contradict the best available evidence in the public domain" (Flynn, Nyhan, & Reifler, 2017, p. 128). Lewandowsky et al. (2012) argued that misinformation might result in stronger effects than ignorance, as those who are ignorant are aware of lacking knowledge, whereas people encountering misinformation believe they are being informed. So contrary to being uninformed, which implies having no factual beliefs whatsoever, people are considered misinformed when they "confidently" (Kuklinski et al., 2000, p. 792) hold beliefs that are factually inaccurate.

The difficulty of resisting misinformation is underlined by the fact that it has proven to be remarkably influential in opinion formation. When picked up as real information, misinformation can have a significant impact on political decisions (Lewandowsky, Ecker, & Cook, 2017). The major political consequences are visible in historical developments such as the fight against HIV and AIDS, people's previous aversion to genetically modified foods, and, currently, climate change (Flynn et al., 2017).

Vaccination Information: The Internet Filled With False Facts

As with most complex topics debated in the media, people have different reasons to doubt the benefits of vaccination.1 Nowadays, the main reasons for scepticism seem to be suspicion of science, belief in alternative medicine, and the notion of parental autonomy and responsibility (Kata, 2010). Some even interpret the promotion of vaccination as a conspiracy (Jolley & Douglas, 2014; 2017). For instance, they believe MMR (measles, mumps and rubella) vaccines cause autism, as concluded in an infamous study by Wakefield (1998). Although this study has been retracted and massively discredited, people still believe the findings. Furthermore, people distrust 'Big Pharma': large pharmaceutical companies supposedly bribing researchers to fake data in order to make money (Jolley & Douglas, 2014).

1 Although religions such as Islam see vaccination as a gift from God, protecting life and contributing to the minimization of harm (Pelčić et al., 2016), other religious communities, such as orthodox protestants, argue that vaccination goes against trust in divine providence (Ruijs et al., 2012). People objecting to vaccination for religious reasons are not the main focus of this study.

People with strong anti-vaccination attitudes show individual similarities, such as thinking they know more about vaccination than they actually do (Motta, Callaghan, & Sylvester, 2018), commonly known as the Dunning-Kruger effect (Kruger & Dunning, 1999). They also tend to be more conspiratorial in general, compared to people who trust vaccinations (Mitra, Counts, & Pennebaker, 2016), and they often share feelings of "powerlessness, disillusionment and mistrust in authorities" (Jolley & Douglas, 2014, p. 1), including mistrust of scientific evidence. As mentioned, conspiracies and other theories based on misinformation can do serious harm to populations, in part because they are difficult to 'combat' with facts. Conspiracists often interpret refutation as additional proof of their convictions, thereby discrediting all counter-evidence and minimizing the possibility of a valuable debate (Sunstein & Vermeule, 2009). Furthermore, the modern-day anti-vaccination movement mostly shares its ideas online, which makes it hard to 'fact-check' the information (Kata, 2010; Lee & Shin, 2019). Without this gate-keeping, people depend on their own previously mentioned 'strategy' (Lewandowsky et al., 2012) to find out whether information is true. This is risky, as studies show that nowadays people are inclined to use the Internet to get their vaccination information and advice, rather than consulting a doctor (Jolley & Douglas, 2017; Lewandowsky et al., 2012). Meanwhile, research shows that studies on vaccination often get misrepresented online: "drawing false conclusions from research, using sources untruthfully, and describing data very selectively" (Kata, 2010, p. 1713). Therefore, those who spread misinformation in the form of anti-vaccination theories can indirectly cause physical harm to others, and ultimately to public health (Mitra et al., 2016).


Fighting misinformation is a complex yet crucial battle to maintain a well-informed populace. Since vaccination is a complicated topic on which not everyone has in-depth knowledge, people logically turn to the Internet for basic information (Song, Zhao, Song, & Zhu, 2019). However, when people get exposed to misinformation on vaccination, it is not easy to recognize and resist these falsehoods (Vraga & Bode, 2018). Exposing people to misinformation on a complex topic like vaccination is therefore likely to affect the development of beliefs about, and subsequently negative attitudes towards, vaccination. This leads to the first hypothesis:

H1: Participants exposed to misinformation about vaccination will show more false beliefs about, and more negative attitudes towards, vaccination than those exposed to real information.

Elaboration Likelihood as the Key to Effective Communication

Scholars like Jolley and Douglas (2014; 2017) and Kata (2010; 2012) make clear that exposure to misinformation on the Internet is hard to circumvent, and that vaccination is a subject complex enough that people want to inform themselves about it, which they do on the Internet. If people are not able to distinguish facts from falsehoods based on the information itself, this might become even more challenging when misinformation is supplied by someone portrayed as a credible expert.

Logically, people do not have the time or motivation to educate themselves on every topic, so they turn to experts as heuristics for opinion formation (Lachapelle et al., 2014). Whether or not an individual trusts the words of an expert mainly depends on the credibility that individual assigns to the expert (Hovland & Weiss, 1951). Hovland and Weiss (1951) laid the basis of source credibility research by showing that perceived source credibility does not necessarily influence the amount of information people remember, yet it does affect the extent to which a source can change people's opinions. As discussed in the Introduction, according to the ELM, the cognitive route that people take (central or peripheral) determines whether and how messages persuade people (Petty & Cacioppo, 1986).

The source of a message is one of many simple cues people can resort to when taking the peripheral route towards persuasion (Petty & Cacioppo, 1986). From this perspective, the message itself is relatively irrelevant, as long as the source comes across as credible. Source credibility has proven to be fundamental in communication research. In her overview of five decades of source credibility research, Pornpitakpan (2004) finds that sources perceived as highly credible are more persuasive (regarding attitudes and behaviors) than those with low credibility. Moreover, she finds that this persuasive strength is explained by both trustworthiness and expertise, though not necessarily equally. These two concepts form two dimensions of source credibility: expertise is seen as "the extent to which a speaker is perceived to be capable of making correct assertions" (2004, p. 244). Trustworthiness is related to expertise, but refers to "the degree of confidence in the communicator's intent to communicate the assertions he considers most valid" (Whitehead, 1968, p. 59). Together, expertise and trustworthiness form a solid predictor of source credibility (Pornpitakpan, 2004).

Even though scientific experts seem to be crucial in the vaccination debate, their influence fully depends on the credibility assigned to them. Hopfer (2012), for instance, showed that the success of behavior-change campaigns (in her case, attempts to get college women vaccinated against HPV) strongly depends on the choice of source. Czapínski and Lewicka (1979, as cited in Pornpitakpan, 2004) found a negative bias when a source had high credibility: the impact of negative information (which, in the current study, is the misinformation message) is significantly higher than that of the positive message. For low-credibility sources, the effect seemed to be the other way around (Pornpitakpan, 2004, p. 253). This is in line with the expected interrelatedness of misinformation and source credibility.


The general consensus on source credibility effects is clear (e.g. Hovland & Weiss, 1951; Ismagilova et al., 2020; Lachapelle et al., 2014)2: if misinformation is likely to negatively affect people’s beliefs and attitudes, high source credibility – through the use of an expert source – is likely to strengthen those effects. A scientific expert generally enhances the perceived credibility of a message, making people more likely to be persuaded:

H2: The effects of misinformation on vaccination beliefs and attitudes, will be stronger when the message is supplied by an expert source, compared to a non-expert source.

Ability and Motivation: Need For Cognition

How people deal with exposure to information is not only affected by the source it comes from; it also depends on personal psychological factors. Petty and Cacioppo (1986) mention ability and motivation as conditions for taking one of the two routes towards persuasion. Whereas source cues can be seen as an external factor in this model, a more 'internal' personal factor concerning this motivation is people's Need For Cognition.

Need For Cognition generally refers to "the tendency for an individual to engage in and enjoy thinking" (Cacioppo & Petty, 1982, p. 116). Whether or not someone is easily persuadable partially depends on the pleasure they take in cognitive effort. People with high levels of NFC are more likely to "engage in effortful processing" (Leding & Antonio, 2019, p. 410). In the context of this study, this means that those with high levels of NFC will be more likely to critically evaluate the message they are exposed to, regardless of what type of information this message contains (Leding & Antonio, 2019). Therefore, following the ELM, we expect the effects of information type to be weaker for those who are high in NFC: the differences in beliefs and attitudes between those exposed to misinformation and information are expected to be smaller for people who score high on NFC. This translates into the following hypothesis:

H3: The effects of misinformation on vaccination beliefs and attitudes, will be weaker when the level of Need For Cognition is high.

Following the ELM, source credibility is likely to interact with NFC, as shown in findings by Lee and Shin (2019) and Nan (2009). Furthermore, Priester and Petty (1995) show that perceived source trustworthiness affects attitude certainty only for those low in NFC. The ELM suggests that those taking the peripheral route evaluate messages based on superficial cues, such as source credibility. They will likely be persuaded when the source seems credible, regardless of the information it supplies. Whether or not an expert source affects someone's message evaluation therefore depends on which cognitive route that person takes. Since the central route is strongly related to NFC, it is expected that those with high NFC will take the central route. In doing so, they will not assign much weight to the source, or to whether they consider it an expert. This leads to the fourth and last hypothesis, suggesting a three-way interaction:

H4: The positive moderating effect of source expertise will be weaker when the level of Need For Cognition is high.

Methods

Research Design and Sample

Since this study tested effects by asking about beliefs, attitudes, and agreement with certain statements, data was gathered through an online survey-experiment. The focus was on a causal relationship, which made an experiment the appropriate way to test it. The stimulus material was self-made, in order to have as much control as possible over the differences between the conditions. The experiment used a 2 x 2 between-subjects design. The first factor covered the type of information presented in the Facebook post, and consisted of two levels: information versus misinformation. The second factor focused on the source of the (mis)information, and also consisted of two levels: expert versus non-expert source.

The study's target group was Dutch adults, and the sample was recruited through online communication, making this a convenience sample. In order to find some 'balance', being aware of the downsides of convenience samples, the survey URL was posted in the Facebook group Vegan Netherlands, containing 18,500 members, some of whom are known to be critical towards vaccination. The questionnaire was active for 15 days; the total sample consisted of 395 participants, of whom 78 dropped out. This resulted in a sample of 317 participants, of whom 125 did not pass an attention check, leaving a valid sample of N = 192. Of this sample, 70.3% was female, and the educational level was generally high: 43.2% of the participants declared to have completed a university degree. The average age was M = 29.58 (SD = 11.53, Minimum = 18.00, Maximum = 63.00). For further relevant descriptive data, see Table 2.

Procedure

For this experiment, participants were randomly assigned to one of four conditions. For the distribution of participants, see Table 1.

Table 1

Distribution of participants over the conditions, before and after the attention check

                           Information                Misinformation
                Pre-check   Post-check      Pre-check   Post-check
Non-Expert         79           47              80          49
Expert             78           41              80          55

Before excluding all participants failing the attention check, all conditions consisted of 78-80 participants. Substantially more people from the information + expert source condition failed the attention check, compared to those in the misinformation + expert condition.


Although seemingly substantial, the drop-out difference between the conditions was non-significant3 and was therefore not likely to affect results.

In order to be certain that random assignment to the conditions had succeeded, a randomization check was conducted for several variables. Randomization across the conditions proved successful: the manipulations were not significantly related to age (F(3, 188) = 0.15, p = .929), gender (χ²(3) = 1.26, p = .739), education level (χ²(9) = 3.24, p = .954), having children (χ²(3) = 0.43, p = .935), institutional trust4 (F(3, 188) = 0.09, p = .968), media trust (F(3, 188) = 0.76, p = .519), or NFC (F(3, 188) = 1.83, p = .143).
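As an illustrative sketch (the actual analyses were run in a statistics package on the raw data), the chi-square statistic used for such categorical checks can be computed by hand from a contingency table. The counts below follow from Table 1 (participants passing versus failing the attention check per condition, pre-check minus post-check); the result reproduces the χ²(3) = 4.39 reported for the attention-check comparison:

```python
# Chi-square test of independence, computed from first principles.
# Rows: experimental conditions; columns: passed / failed the attention check.

def chi_square(table):
    """Return the chi-square statistic for a contingency table (list of rows)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand_total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand_total
            stat += (observed - expected) ** 2 / expected
    return stat

# Passed, failed per condition, derived from Table 1:
# info/non-expert, misinfo/non-expert, info/expert, misinfo/expert
counts = [[47, 32], [49, 31], [41, 37], [55, 25]]
stat = chi_square(counts)
df = (len(counts) - 1) * (len(counts[0]) - 1)
print(f"chi2({df}) = {stat:.2f}")  # chi2(3) = 4.39
```

The statistic sums, over all cells, the squared deviation of the observed count from the count expected under independence, scaled by the expected count.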

When starting the survey-experiment, participants were first informed about the survey and their rights as participants. After declaring informed consent, people were asked for their age and whether or not they had children. Before exposure to the stimulus, people were asked about their trust in certain institutions, followed by a range of questions concerning their Need For Cognition. After this, participants were exposed to one of the four experimental conditions. Subsequently, they were asked about their beliefs about vaccination, as well as their attitudes towards different statements on vaccination. Then, people were asked whether they would like or share the Facebook post, followed by two questions used for a manipulation check. The questionnaire ended with an attention check, followed by some demographic questions. Finally, participants were debriefed and explicitly told about the general scientific stance towards vaccination. In total, participation took 5 to 10 minutes.

Stimulus Material

The basic framework for the experimental stimulus was a Facebook post written by a Dutch woman. In her post, she addresses arguments in favor of or against vaccination, and refers to a specific source from which she got this information.

3 The randomization of passing the attention check was analyzed through a cross tab combined with a Chi-square test, with the experimental condition as nominal independent variable, and passing or failing the attention check as nominal dependent variable. No significant differences were found between experimental conditions when comparing the number of participants passing the attention check, χ²(3) = 4.39, p = .222.

4 For the items used to measure trust, see Appendix D, and for a description of the realization of the scale

Information versus Misinformation

The first factor covered the type of information presented: half of the participants were exposed to misinformation, namely four (scientifically proven false) arguments defending reasons not to vaccinate children. This plea included, for instance, the assumption that children could get autism from vaccinations. For the full set of arguments, see Appendix B. These arguments, based on the main topics of the Dutch vaccination debate (Rijksinstituut voor Gezondheid en Milieu, 2020; RTL Nieuws, 2019), were summed up by the writer, followed by an explicit call not to let your children get vaccinated.

The other half of the sample was placed in the information condition: information that aligned with the scientific consensus about the earlier-mentioned so-called dangers of vaccination. The same arguments as in the misinformation condition were mentioned, yet here they were explicitly refuted. Like the misinformation condition, this condition ended with an explicit call to action, now stating that people should indeed get their children vaccinated.

Expert versus Non-expert source

The second factor in this experiment also consisted of two levels, and was related to the source used to back up the arguments of the author of the Facebook post. In one condition, the source was a personal friend of the woman, called Saskia. It was in no way clear that Saskia had any expertise on the matter discussed, nor where she got her information from. Saskia was an example of a non-expert source, with most likely a rather low level of source credibility. The friend was referred to several times throughout the message.

The second level of this factor showed a source that did qualify as an expert: Van Dijk of the RIVM, the Dutch national authority on vaccinations. Van Dijk was referred to several times, by name and title.

Manipulation Check

To make sure the manipulation of the two factors succeeded, participants were asked two questions. To check the manipulation of Information Type, people were asked what position the author takes in her Facebook post, on a 7-point Likert scale. An independent samples t-test showed that those in the Misinformation condition (M = 1.96, SD = 1.51), reported the author to be significantly and substantially more critical towards vaccinations, compared to those in the Information condition (M = 5.47, SD = 1.43), t(190) = 16.40, p < .001, 95% CI [3.08, 3.93], Cohen’s d = 2.38. This indicated that the manipulation of Information Type was successful.

To make sure the manipulation of the source had succeeded, participants were asked to what extent they considered the cited source to be an expert, on a 7-point scale. An independent samples t-test showed that those in the Expert condition (M = 4.07, SD = 1.96), considered the source to be of significantly and substantially higher expertise compared to those in the Non-Expert condition (M = 1.74, SD = 1.21), t(158.08) = -9.94, p < .001, 95% CI [-2.80, -1.87], Cohen’s d = 1.43.5 This indicated that the manipulation of Source Type was also successful. For the exact questions, see Appendix E.
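As a sketch of the computation behind these manipulation checks, the t statistic, Welch degrees of freedom, and Cohen's d can be reconstructed from the summary statistics alone. The values below use the rounded means and SDs reported above for the source-expertise check (n = 96 per condition, following from Table 1); because the author's analysis used the raw data, the reconstruction differs slightly from the reported t(158.08) = -9.94:

```python
import math

def welch_t(m1, sd1, n1, m2, sd2, n2):
    """Welch's t statistic and degrees of freedom from summary statistics."""
    se1, se2 = sd1**2 / n1, sd2**2 / n2
    t = (m1 - m2) / math.sqrt(se1 + se2)
    df = (se1 + se2) ** 2 / (se1**2 / (n1 - 1) + se2**2 / (n2 - 1))
    return t, df

def cohens_d(m1, sd1, n1, m2, sd2, n2):
    """Cohen's d using the pooled standard deviation."""
    pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled

# Non-Expert vs Expert ratings of the source's expertise (summary stats from the text)
t, df = welch_t(1.74, 1.21, 96, 4.07, 1.96, 96)
d = cohens_d(4.07, 1.96, 96, 1.74, 1.21, 96)
print(f"t({df:.2f}) = {t:.2f}, d = {d:.2f}")
```

Note that the Welch correction (unequal variances) is what yields the fractional degrees of freedom (158.08) reported in the text, rather than the usual n1 + n2 - 2 = 190.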

Variables

Need For Cognition. Participants' Need For Cognition was measured through 9 items. The items were based on the original 34-item Need For Cognition scale by Cacioppo and Petty (1982). This scale was later developed into a shorter 18-item scale (Cacioppo, Petty, & Kao, 1984), after which Tanaka, Panter, and Winborne (1988) identified three subcategories within the scale: cognitive persistence, cognitive complexity, and cognitive confidence. From each subcategory, three questions were chosen (evenly divided between negatively and positively phrased questions), resulting in 9 questions extracted from the different subcategories of the NFC scale. For the 9 items used, see Appendix C. To assess whether the 9 items measured the same construct, a factor analysis was conducted using Principal Axis Factoring. Bartlett's test was significant, p < .001, with KMO = .74. The analysis revealed three factors with an Eigenvalue above 1.00; however, based on the scree plot, only one factor explained a substantial part of the total variance (Eigenvalue = 2.65, 29.41% of the total variance explained). Furthermore, relying on multiple studies by Cacioppo and Petty (e.g. 1982, 1984), all items were meant to measure NFC. Therefore, the decision was made to work with all items as one factor. A reliability analysis resulted in a reasonably reliable score, Cronbach's α = .67 (M = 5.00, SD = 0.73). This score could only be improved by a small fraction by deleting an item; since that item was too relevant to the scale, a scale variable Need For Cognition (NFC) was created with all 9 items. The sample showed a minimum score of 2.56 (whereas the lowest possible score was 1.00) and a maximum score of 7.00, indicating the highest possible level of NFC.

5 An additional comparison between all four conditions in terms of perceived source credibility was made. This comparison shows that, besides the significant difference in credibility between expert and non-expert sources, perceived credibility was also affected by whether the source expressed information or misinformation. Even though the credibility of the expert source was significantly higher overall, the expert was evaluated as more credible in the information condition (M = 4.80, SD = 1.78) than in the misinformation condition (M = 3.53, SD = 1.92). However, the results make clear that the expert was still considered remarkably more credible than the non-expert in both the information (M = 2.02, SD = 1.31) and the misinformation (M = 1.47, SD = 1.04) conditions. These results show that source credibility is not a factor standing on its own: the evaluation is influenced by the views the sources express in the stimulus.
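The Cronbach's alpha used in these reliability analyses can be written out in a few lines: it compares the sum of the individual item variances to the variance of the total score. The sketch below uses a small made-up response matrix (rows are participants, columns are 7-point item scores); it is illustrative only and does not use the actual survey data:

```python
from statistics import variance

def cronbach_alpha(rows):
    """Cronbach's alpha; rows are participants, columns are item scores."""
    k = len(rows[0])                 # number of items in the scale
    items = list(zip(*rows))         # transpose to per-item score lists
    item_vars = sum(variance(item) for item in items)
    total_var = variance([sum(row) for row in rows])
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical 7-point responses from four participants on three items
responses = [
    [6, 5, 6],
    [4, 4, 5],
    [7, 6, 7],
    [3, 3, 4],
]
print(f"alpha = {cronbach_alpha(responses):.2f}")
```

When the items move together (as in this made-up matrix), the total-score variance dwarfs the summed item variances and alpha approaches 1; items that vary independently push alpha towards 0.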

Beliefs about vaccination. Through five statements, described in Appendix F, people’s

beliefs about vaccination were tested. Those statements encompassed themes like vaccines causing autism, and measles being ‘just a flu’. A low mean score on these statements, implied a low belief in scientific evidence, and therefore stronger false beliefs. The factor analysis showed KMO and Bartlett’s Test of Sampling Adequacy were significant, p < .001, KMO = .76. The items loaded on one factor (Eigenvalue = 2.23, explaining 44.64% of total variance), and together formed a reasonably reliable scale (Cronbach’s α = .65, M = 5.59, SD = 0.88).


Removing Item 2 would increase the internal reliability to α = .69, but this small improvement would not justify removing a core statement ("Measles is more than just a flu") from the scale. Therefore, a scale variable Vaccination Beliefs was constructed with all items; the sample scored a minimum of 2.80 (1.00 being the lowest possible score) and a maximum of 7.00, the highest score representing the strongest 'true' beliefs about vaccination.
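The Eigenvalues and percentages of explained variance reported for these factor analyses follow from the eigendecomposition of the items' correlation matrix. The sketch below illustrates this with simulated item scores driven by one latent factor (illustrative only; the thesis used SPSS's Principal Axis Factoring, whereas this sketch inspects raw eigenvalues):

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated responses: 200 respondents x 5 items, all driven by one latent trait
latent = rng.normal(size=(200, 1))
items = latent + 0.8 * rng.normal(size=(200, 5))

R = np.corrcoef(items, rowvar=False)              # 5x5 item correlation matrix
eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]    # Eigenvalues, largest first

# With k items, each eigenvalue accounts for eigenvalue/k of the total variance
explained = eigvals / R.shape[0] * 100
print(eigvals[0], explained[0])
```

Because the items share one latent cause, only the first eigenvalue is clearly above 1.00, mirroring the one-factor pattern found for the scales above.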

Attitude towards vaccination. Next to measuring beliefs about vaccination, the questionnaire asked for participants' attitudes towards six statements on vaccination. These statements related, for instance, to whether governments should require parents to have their children vaccinated, or whether child-care centers should admit children that are not vaccinated. For all items, see Appendix F. Bartlett's test was significant, p < .001, and sampling adequacy was good, KMO = .87. The items loaded on one factor (Eigenvalue = 3.75, explaining 62.45% of the total variance) and formed a good, reliable scale (Cronbach's α = .87, M = 5.58, SD = 1.12). The scale variable Vaccination Attitudes was constructed from all six items, with a minimum sample score of 1.33 (1.00 being the lowest possible score) and a maximum of 7.00, the latter indicating the most positive attitude towards vaccination.

Table 2

Descriptives of numeric variables

                          N      M      SD
Need For Cognition        192    5.00   0.73
Vaccination Beliefs       192    5.59   0.88
Vaccination Attitudes     192    5.58   1.12


Results

Hypothesis 1

To test Hypothesis 1, analyzing the effects of misinformation on beliefs about and attitudes towards vaccination, and Hypothesis 2, focusing on the interaction effects of Source Type and Information Type on the same dependent variables, two two-way ANOVAs were conducted.6 The two-way ANOVA showed that Information Type had no significant effect on Vaccination Beliefs (F(1, 188) = 1.48, p = .226, η² = .01). Those exposed to misinformation showed slightly more false beliefs about vaccination (M = 5.52, SD = 0.91) than those exposed to information (M = 5.68, SD = 0.83).

Information Type had no significant effect on Vaccination Attitudes either (F(1, 188) = 0.18, p = .674, η² < .01). Against expectations, those exposed to misinformation even expressed slightly, though not significantly, more positive attitudes towards vaccination (M = 5.62, SD = 1.17) than those exposed to information (M = 5.54, SD = 1.06). This means no support is found for Hypothesis 1: misinformation does not significantly stimulate false beliefs about, or negative attitudes towards, vaccination.7
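The F-tests above can be reproduced from sums of squares. The sketch below implements a two-way ANOVA for the special case of a balanced 2x2 design with illustrative data (the study's actual cell sizes may be unbalanced, in which case SPSS-style Type III sums of squares would be needed instead):

```python
import numpy as np

def two_way_anova_balanced(cells):
    """F-tests for a balanced 2x2 ANOVA.
    cells[a][b] is the list of scores in condition (a, b)."""
    cells = [[np.asarray(c, float) for c in row] for row in cells]
    n = len(cells[0][0])                           # per-cell sample size (balanced)
    data = np.concatenate([c for row in cells for c in row])
    grand = data.mean()
    row_means = [np.concatenate(row).mean() for row in cells]
    col_means = [np.concatenate([cells[0][b], cells[1][b]]).mean() for b in range(2)]
    cell_means = [[c.mean() for c in row] for row in cells]

    # Sums of squares for main effects, interaction, and error
    ss_a = 2 * n * sum((m - grand) ** 2 for m in row_means)
    ss_b = 2 * n * sum((m - grand) ** 2 for m in col_means)
    ss_cells = n * sum((cell_means[a][b] - grand) ** 2
                       for a in range(2) for b in range(2))
    ss_ab = ss_cells - ss_a - ss_b
    ss_error = sum(((c - c.mean()) ** 2).sum() for row in cells for c in row)

    df_error = 4 * (n - 1)
    ms_error = ss_error / df_error                 # each effect has df = 1
    return {'F_A': ss_a / ms_error, 'F_B': ss_b / ms_error,
            'F_AxB': ss_ab / ms_error, 'df_error': df_error}

# Factor A shifts scores; factor B and the interaction do nothing
res = two_way_anova_balanced([[[1, 2, 3], [1, 2, 3]],
                              [[4, 5, 6], [4, 5, 6]]])
print(res)  # F_A = 27.0; F_B and F_AxB are 0 here
```

The η² values reported above are simply each effect's sum of squares divided by the total sum of squares.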

Hypothesis 2

The same two-way ANOVA showed no significant interaction effect of Information Type and Source Type on Vaccination Beliefs (F(1, 188) = 0.06, p = .801, η² < .01). The mean

6 Levene’s test of equality of variances showed an insignificant result for both Vaccination Beliefs (F(3, 188) = 0.72, p = .543) and Vaccination Attitudes (F(3, 188) = 0.40, p = .755), meaning the assumption of equal variances across groups was met.

7 As an additional analysis, the effects of Information Type on the likelihood to like and share the Facebook post were tested. An independent samples t-test showed that those in the Information condition (M = 1.72, SD = 1.20) were significantly more likely to “like” the post they had just read than those in the Misinformation condition (M = 1.09, SD = 0.40), t(103.02) = 4.70, p < .001, 95% CI [0.36, 0.90], Cohen’s d = 0.70, implying a medium effect of Information Type on the likelihood to like the Facebook post. Those in the Information condition (M = 1.16, SD = 0.50) were almost, yet not significantly, more likely to “share” the post than those in the Misinformation condition (M = 1.04, SD = 0.31), t(139.98) = 1.97, p = .051, 95% CI [-0.00, 0.24], Cohen’s d = 0.29, a small but insignificant effect of Information Type on the likelihood to share the Facebook post. Other analyses focusing on liking and sharing behavior showed no significant results. The first result implies that people are more likely to like (and, though non-significantly, share) a message containing information than a message containing misinformation. As liking and sharing are simple ways of distributing information further through the Internet, they are important factors to consider in the context of misinformation.
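The fractional degrees of freedom in this footnote (t(103.02), t(139.98)) indicate a Welch-type t-test, which does not assume equal variances. The equivalent test in Python, with illustrative 1-7 liking scores rather than the study's data, would be:

```python
from scipy import stats

# Illustrative "like" scores (1-7 scale), not the study's data
information    = [1, 1, 2, 4, 1, 3, 1, 2, 1, 4]
misinformation = [1, 1, 1, 2, 1, 1, 1, 1, 2, 1]

# equal_var=False requests Welch's t-test (unequal-variance correction)
t, p = stats.ttest_ind(information, misinformation, equal_var=False)
print(round(t, 2), round(p, 3))
```

A positive t here means the information group scores higher, matching the direction of the footnoted result.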


difference between Information and Misinformation was even bigger in the Non-Expert condition (MD = 0.19) than in the Expert condition (MD = 0.12), contrary to expectations. Expertise seems to have a small (non-significant) weakening effect on the relation between Information Type and Vaccination Beliefs.

No significant interaction effect was found on Vaccination Attitudes either (F(1, 188) = 1.58, p = .210, η² = .01).8 Interestingly, the mean differences in Vaccination Attitudes appear to be strengthened by expertise (MDexpert = -0.27, MDnon-expert = 0.14). However, this is in the opposite direction of what was expected, as, theoretically, those exposed to misinformation by an expert were assumed to score lowest on Vaccination Attitudes. Yet here, those exposed to misinformation shared by an expert are more positive towards vaccination than those exposed to information. For a visual representation of the results, see Figure 1 and Figure 2. These results show that Hypothesis 2 cannot be confirmed: Source Type does not significantly strengthen the effect of Information Type on Vaccination Beliefs or Attitudes. Perceived expertise does not seem to play a role in the effects of misinformation on people’s views on vaccination.

8 In line with expectations, Source Type showed no significant main effect on Vaccination Beliefs (F(1, 188) = 0.01, p = .904, η² < .01) or Vaccination Attitudes (F(1, 188) = 0.33, p = .568, η² < .01) either, as a source can hardly affect beliefs about a topic without taking into account which message that source distributes on the topic.


Figure 1. Effects of Information Type on Vaccination Beliefs, Moderated by Source Type

[Figure: mean Vaccination Beliefs per condition. Information: Expert 5.67, Non-Expert 5.69; Misinformation: Expert 5.55, Non-Expert 5.50.]

Note. The y-axis of the graphic varies from 5.4 to 6.0, whereas the actual scale of Vaccination Beliefs varied from 1.0 to 7.0. This decision was made in order to provide insight into the relatively small differences between the mean scores.

Figure 2. Effects of Information Type on Vaccination Attitudes, Moderated by Source Type

[Figure: mean Vaccination Attitudes per condition. Information: Expert 5.48, Non-Expert 5.60; Misinformation: Expert 5.76, Non-Expert 5.46.]

Note. The y-axis of the graphic varies from 5.4 to 6.0, whereas the actual scale of Vaccination Attitudes varied from 1.0 to 7.0. This decision was made in order to provide insight into the relatively small differences between the mean scores.


Hypothesis 3

The third hypothesis takes into account participants’ Need For Cognition, and whether it weakens the effect of misinformation on Vaccination Beliefs and Attitudes. To test Hypothesis 3, a moderation analysis was conducted with the PROCESS macro by Hayes (2012), using Model 1 and applying 10,000 bootstrap samples.

As Table 3 shows, the regression model with Information Type and NFC as independent variables and Vaccination Beliefs as dependent variable was significant (F(3, 188) = 3.47, p = .017), although it explained only 5.24% of the total variance in Vaccination Beliefs (R² = .05). There was no significant interaction effect of Information Type and NFC on Vaccination Beliefs (b = -0.02, t = -0.09, p = .932, 95% CI [-0.36, 0.33]). Although weak and far from significant, the general negative effect is in the expected direction. However, as shown in Figure 3 and against expectations, someone scoring high on NFC shows slightly stronger negative marginal effects of Misinformation on Vaccination Beliefs (when NFC = 7.00, b = -0.22) than someone scoring lower on NFC (when NFC = 2.56, b = -0.15).

Table 3 shows similar results for Vaccination Attitudes, for which no significant effects were found, neither for the whole model (p = .931) nor for the interaction of Information Type and NFC (p = .631). In general, the direction of the effect is negative, which is in line with expectations. However, as illustrated in Figure 4, more detailed results show that those scoring low on NFC are more likely to show positive effects of Misinformation on Vaccination Attitudes (when NFC = 2.56, b = 0.34), whereas those scoring high on NFC are more likely to show minor negative effects of Misinformation on Vaccination Attitudes (when NFC = 7.00, b = -0.15), the latter being the only finding in line with expectations regarding Attitudes.
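At its core, PROCESS Model 1 is an OLS regression with a product term, and the conditional (marginal) effects reported for specific NFC values follow from the coefficients as b1 + b3 × moderator. A minimal sketch with simulated data (all variable names and coefficient values below are illustrative, not the study's):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 192
x = rng.integers(0, 2, n).astype(float)    # 0 = information, 1 = misinformation
m = rng.uniform(2.56, 7.0, n)              # moderator, e.g. an NFC-like score
y = 5.6 - 0.1 * x + 0.2 * m + rng.normal(0, 0.8, n)

# Design matrix: intercept, X, M, and the X*M product term
X = np.column_stack([np.ones(n), x, m, x * m])
b, *_ = np.linalg.lstsq(X, y, rcond=None)  # b = [b0, b1, b2, b3]

def conditional_effect(b, m_value):
    """Effect of X on y at a given moderator value: b1 + b3 * m."""
    return b[1] + b[3] * m_value

print(conditional_effect(b, 2.56), conditional_effect(b, 7.0))
```

Evaluating the conditional effect at the sample's minimum and maximum moderator scores is exactly how the b-values at NFC = 2.56 and NFC = 7.00 above are obtained (PROCESS additionally supplies bootstrap confidence intervals).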

Based on these results, Hypothesis 3 cannot be confirmed: NFC does not significantly weaken the effect of Misinformation on Vaccination Beliefs or Attitudes. Whether and how


strongly someone’s beliefs and attitudes are affected by exposure to misinformation cannot be predicted by their level of Need For Cognition.9

9 Interestingly, a significant main effect of NFC was found on Vaccination Beliefs, but not on Vaccination Attitudes. A simple regression analysis with NFC as independent variable and Vaccination Beliefs as dependent variable was significant, F(1, 190) = 8.18, p = .005, although NFC predicts only 4.1% of the variance in Vaccination Beliefs (R² = .04). Surprisingly, the same analysis with Vaccination Attitudes as dependent variable was not significant, F(1, 190) = 0.02, p = .886. This means that NFC is a (relatively weak) positive predictor of true beliefs (b* = 0.20, t = 2.86, p = .005, 95% CI [0.08, 0.41]), yet not of positive attitudes (b* = 0.01, t = 0.14, p = .886, 95% CI [-0.20, 0.24]). These findings imply that, even though often the case in this study, attitudes do not always automatically follow from beliefs. High levels of Need For Cognition correlate with a high level of true beliefs about vaccination, but not necessarily also with positive attitudes towards vaccination.

Table 3

Different models explaining Vaccination Attitudes and Vaccination Beliefs

                                           Model 1          Model 2
Beliefs
Information Type                           -0.11 (-0.13)    0.22 (0.18)
Source Type                                                 1.41 (1.09)
Need For Cognition                         0.26* (2.06)     0.40* (2.24)
Information Type * Source Type                              -0.89 (-0.50)
Information Type * Need For Cognition      -0.02 (-0.09)    -0.09 (-0.39)
Source Type * Need For Cognition                            -0.28 (-1.09)
Information Type * Source Type * NFC                        0.20 (0.58)
R²                                         .05              .06
F                                          3.47*            1.72

Attitudes
Information Type                           0.62 (0.54)      1.21 (0.75)
Source Type                                                 -1.13 (-0.67)
Need For Cognition                         0.07 (0.43)      -0.03 (-0.14)
Information Type * Source Type                              -1.53 (0.67)
Information Type * Need For Cognition      -0.02 (-0.48)    -0.26 (-0.82)
Source Type * Need For Cognition                            0.21 (0.62)
Information Type * Source Type * NFC                        0.38 (0.83)
R²                                         <.01             .03
F                                          0.15             0.92

Note. Entries are coefficients with t-values between parentheses. Model 1 = Hypothesis 3, Model 2 = Hypothesis 4. * p < .05


Figure 3. Marginal Effects of Information Type on Vaccination Beliefs for different values of NFC

[Figure: marginal effect of Information Type on Vaccination Beliefs (y-axis) plotted against Need for Cognition (x-axis, 1 to 7).]

Note. The y-axis of the graphic varies from -1.50 to 1.00. This decision was made in order to provide insight into the relatively small differences between the marginal effects. Effects are shown for NFC-scores of 2.56 to 7.00, as these are the minimum and maximum scores in the sample. The dotted lines are the lower and upper bounds of the 95% confidence intervals.

Figure 4. Marginal Effects of Information Type on Vaccination Attitudes for different values of NFC

[Figure: marginal effect of Information Type on Vaccination Attitudes (y-axis) plotted against Need for Cognition (x-axis, 1 to 7).]

Note. The y-axis of the graphic varies from -1.40 to 1.60. This decision was made in order to provide insight into the relatively small differences between the marginal effects. Effects are shown for NFC-scores of 2.56 to 7.00, as these are the minimum and maximum scores in the sample. The dotted lines are the lower and upper bounds of the 95% confidence intervals.


Hypothesis 4

A three-way interaction was tested to see whether NFC affects the moderation by Source Type of the effects of Information Type on Vaccination Beliefs and Attitudes. The three-way interaction analysis was conducted with the PROCESS macro (Hayes, 2012), using Model 3 and applying 10,000 bootstrap samples. As shown in Table 3, the regression model with Information Type, Source Type, and NFC as independent variables and Vaccination Beliefs as dependent variable was non-significant (F(7, 184) = 1.72, p = .107). Information Type, Source Type, and NFC together explained 6.14% of the total variance in Vaccination Beliefs (R² = .06), and no significant three-way interaction effect was found (b = 0.20, t = 0.58, p = .565, 95% CI [-0.49, 0.89]).

This non-significant yet positive effect is not in the expected direction: NFC seems to strengthen the moderating effect of Source Type rather than weaken it. However, looking at more detailed results, we see different effect directions for the two Source Type conditions: those low in NFC (NFClow) in the Expert condition (NFC = 2.56, b = -0.39) show stronger negative effects of Information Type on Vaccination Beliefs than those low in NFC in the Non-Expert condition (NFC = 2.56, b = -0.02). Meanwhile, those high in NFC (NFChigh) in the Expert condition (NFC = 7, b = 0.09) show weak but positive effects, whereas those high in NFC in the Non-Expert condition show negative effects (NFC = 7, b = -0.43). The expectation was that an increase in NFC would lead to smaller differences in (negative) effects between the expert and non-expert conditions. Except for NFChigh, all effects are indeed negative. Moreover, for NFClow, we see stronger negative effects in the expert than in the non-expert condition, which is in line with expectations. However, the differences between non-expert and expert were smaller for NFClow than for NFChigh. For NFChigh, where one would expect weak negative effects, the effects were even positive, and instead of a small difference between expert and non-expert, the


difference was fairly substantial. The (non-significant) numbers found are largely the opposite of what was expected.

As Table 3 shows, the regression model with Information Type, Source Type, and NFC was non-significant for Vaccination Attitudes as well (p = .494), and no significant three-way interaction effect was found either (p = .410). This non-significant yet positive effect on Attitudes is not in the expected direction either: the results imply that NFC strengthens the moderating effect of Source Type instead of weakening it. However, looking at more detailed results (for visualized results, see Figures 5 and 6), in this model too we see different directions of effects: those low in NFC in the expert condition (NFC = 2.56, b = -0.02) show small negative effects of Information Type on Vaccination Attitudes, whereas those low in NFC in the non-expert condition (NFC = 2.56, b = 0.55) show positive effects. At the same time, those high in NFC in the expert condition (NFC = 7, b = 0.51) show positive effects, while those high in NFC in the non-expert condition (NFC = 7, b = -0.60) show negative effects.

This implies that, against expectations paralleling those for Vaccination Beliefs, the effect differences between the two Source Type conditions were larger for NFChigh than for NFClow. Moreover, the misinformation effects for NFClow were even positive in the non-expert condition, and only slightly negative in the expert condition. Also, unexpectedly, for NFChigh the effect difference between Source Type conditions was quite large. For NFChigh, the effects in the non-expert condition were negative and substantial, meaning that even those high in NFC still ‘fall’ for misinformation delivered by a non-expert source. The expert condition for NFChigh, however, where a weak negative effect was expected, showed positive effects. Concluding, the (non-significant) numbers found for Vaccination Attitudes met hardly any of the expectations. The results give no support for Hypothesis 4: NFC does not significantly weaken the moderating effect of Source Type on the effect of Misinformation on Vaccination Beliefs and Attitudes.
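In the three-way model, the per-condition slopes discussed above combine four coefficients: the conditional effect of Information Type at given Source Type (Z) and NFC (W) values is b1 + b_xz·Z + b_xw·W + b_xzw·Z·W. A small helper illustrating this (coefficient names are generic placeholders, not PROCESS output labels):

```python
def conditional_effect_3way(b1, b_xz, b_xw, b_xzw, z, w):
    """Slope of X on y at moderator values z and w in the model
    y = b0 + b1*X + b2*Z + b3*W + b_xz*X*Z + b_xw*X*W + b_zw*Z*W + b_xzw*X*Z*W."""
    return b1 + b_xz * z + b_xw * w + b_xzw * z * w

# With all interaction terms zero, the slope of X is constant:
print(conditional_effect_3way(0.5, 0.0, 0.0, 0.0, z=1, w=7))  # 0.5
```

Evaluating this expression at each Source Type condition and at the sample's minimum and maximum NFC scores yields the four b-values compared in the paragraphs above.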


Figure 5. Marginal Effects of Information Type on Vaccination Beliefs for different values of NFC and different Source Types

[Figure: marginal effect of Information Type on Vaccination Beliefs (y-axis) plotted against Need for Cognition (x-axis, 1 to 7), with separate lines per Source Type.]

Note. Marginal effects are shown for NFC-scores 2.56 to 7, as those were the minimum and maximum scores in the sample. The black line represents the Expert condition, the grey line the Non-Expert condition. The dotted lines are the lower and upper bounds of the 95% confidence intervals of both conditions.

Figure 6. Marginal Effects of Information Type on Vaccination Attitudes for different values of NFC and different Source Types

[Figure: marginal effect of Information Type on Vaccination Attitudes (y-axis) plotted against Need for Cognition (x-axis, 1 to 7), with separate lines per Source Type.]

Note. Marginal effects are shown for NFC-scores 2.56 to 7, as those were the minimum and maximum scores in the sample. The black line represents the Expert condition, the grey line the Non-Expert condition. The dotted lines are the lower and upper bounds of the 95% confidence intervals of both conditions.


Conclusion & Discussion

Misinformation Effects, and How to Fight Them

As the implications of misinformation effects are highlighted by the current pandemic, the urgency of studying them is evident. This study explored misinformation in a health communication context, testing the effects of (mis)information on people’s beliefs and attitudes, and how source expertise and Need For Cognition play into these effects.

The results showed that misinformation slightly, but not significantly, affected people’s beliefs negatively, whereas misinformation even had a positive (non-significant) effect on attitudes. The lack of evidence for misinformation effects goes against previous findings by Flynn et al. (2017), Kata (2010), and Lewandowsky et al. (2012). One reason could be that once our beliefs and attitudes concerning a theory are formed, they are very hard to correct (e.g. Del Vicario et al., 2016; Jolley & Douglas, 2017; Kuklinski et al., 2000; Lewandowsky et al., 2012). As vaccination is a much-discussed topic, participants had likely already formed an opinion on it, and following the literature, this study would not easily change that opinion. Moreover, when people are exposed to ‘true’ information contradicting their former beliefs, those corrections might actually worsen their misperceptions (Cobb, Nyhan, & Reifler, 2013). The continued-influence effect holds that, when retracting misinformation, repeating that misinformation in the retraction can result in stronger convictions in favor of the misinformation (Foster, Huthwaite, Yesberg, Garry, & Loftus, 2012). On the contrary, the salience accounts of mental-model updating and knowledge revision reveal an opposite perspective: repeating misinformation (in a retraction) can actually be helpful, as it makes the falsity of the misinformation clear and shows the direct conflict between the misinformation and the updated information (Ecker, Hogan, & Lewandowsky, 2017).


These findings on fighting misinformation call for interesting follow-up research. The vaccination debate is well-known, and to a certain extent the consensus is that vaccines are an important achievement of medical science (Kata, 2012). However, topics that are new in the public debate could show other effects, since people do not yet have prior knowledge or opinions. Therefore, longitudinal research is necessary, as it can both test initial effects regarding ‘new’ topics and show whether these effects change over time as the topic is discussed more elaborately, like the topic of vaccination. This would also add value because participants would face increasing exposure to (mis)information over time, rather than the ‘one shot’ exposure of the current experiment.

As fighting misinformation effects after exposure is not always effective, scholars should also study combating misinformation effects in the pre-exposure period. An example of this is the Discrepancy Detection principle (Loftus, 2005): warning people beforehand that the information they will be exposed to might be misleading. Such a warning would make them better able to resist the influence of misinformation, as people become more attentive to discrepancies (taking the central ELM route) when exposed to the message. Future studies might benefit from dedicating research to both prior-warning effects and post-exposure battling of misinformation.

Elaboration Likelihood: Not a Strong Predictor

Although the discrepancy detection principle aligns well with the fundamentals of the ELM, in this study no clear effects of ELM-related internal and external factors were found. Source expertise, for instance, was not a significant moderator of misinformation effects. One explanation is that people are more inclined to base their opinions on source credibility cues when a message is quantitative. With quantitative information, subjects are more likely to take a peripheral route of elaboration and lean more on expertise cues (Pornpitakpan, 2004). The message in this study was not quantitative, which could be why source type showed no


effect. Moreover, as the average NFC score in this sample was high (see Limitations for further discussion), it is likely that many participants took a central processing route, paying little attention to cues like source expertise (Petty & Cacioppo, 1986).

Coming back to research into battling misinformation, studies on source effects show diverse results. Some argue that explicit refutation of misinformation post-exposure does work and strengthens the argument of highly credible sources (Pornpitakpan, 2004); others state it is better to counter-argue without explicit refutation of earlier arguments (Jolley & Douglas, 2017). Flynn et al. (2017) find that the effectiveness of corrective information depends on how ideologically sympathetic the source comes across to the public, and on whether it is presented in graphical or textual form, in line with suggestions by Pornpitakpan (2004). Contrary to the results of the current study, the above-mentioned findings correspond with earlier conclusions that expert sources can be of significant value in misinformation effects. However, what role source type plays in battling these effects still lacks an unambiguous answer, and needs to be studied further.

The suggestion to perform longitudinal research also applies to source expertise and credibility effects. Hovland and Weiss (1951) proposed the Sleeper Effect: in the long run, the effects of high-credibility sources on people’s opinions seem to weaken, whereas the effects of low-credibility sources grow. Over time, people remember what information they acquired, but not who communicated it (Hovland & Weiss, 1951). In the current context, this could mean source type becomes irrelevant altogether over time, even to those taking the peripheral route of the ELM.

Like source type, NFC did not prove to be a significant moderating factor in misinformation effects. This is in line with findings by Kessler and Zillich (2019), who found similar results in a comparable study: online search behavior concerning vaccination was “relatively stable and not significantly influenced by cognitive factors or by reading a single


news article” (p. 1156), matching earlier findings by Kessler and Guenther (2017). Further explanations for the lack of NFC effects are discussed in the study’s limitations.

Choice of Platform

This study chose Facebook as the platform on which the experimental information was shared, mainly because the Internet, and social media like Facebook in particular, are known to be full of misinformation on vaccination and many other themes (Allcott et al., 2019; Chen, Sin, Theng, & Lee, 2015; Mitra et al., 2016; Vraga & Bode, 2018). However, online media naturally do not stand on their own. Clear effects on beliefs can also be found for misinformation on TV (Hall Jamieson & Albarracín, 2020; Maurer & Reinemann, 2006). Misinformation effects of (offline) print media seem underexposed in science, calling for more research. The Internet has most certainly found its way to us and might partially replace traditional media, yet mainstream media are still widely used (Commissariaat voor de Media, 2019). The correlation between offline media use and (susceptibility to) misinformation effects makes for a relevant follow-up study, as trust in the offline and online worlds is very much intertwined (Wagner & Boczkowski, 2019).

Limitations

A more general reason for the non-significant findings could be the lack of diversity in the sample. Even though results might be similar to those of population-based samples (Mullinix, Leeper, Druckman, & Freese, 2015), convenience samples have external validity limitations. For instance, looking at the demographics, the average age was just below 30. Research shows that, generally, misinformation has the strongest effects on young children and older adults (Loftus, 2005; Roediger & Geraci, 2007). Since the average age represented relatively young adults, this sample contained the people least susceptible to misinformation, resulting in weaker effects. The mean age of the Dutch population was 42.0 years in 2019 (Centraal Bureau voor de Statistiek, 2019), so based on previous findings, the


effects of misinformation could have been stronger if the sample were more representative of the Dutch population in terms of age.

The results show that the mean scores on Vaccination Beliefs and Attitudes were remarkably high (both 5.6 on a 1-7 scale), as was the average level of Need For Cognition: 5 on a 1-7 scale, with a small standard deviation. A reason for these high scores could be social desirability, as people assume that the general consensus is pro-vaccination (Krumpal, 2013). The fact that the sample was positive towards vaccination, shared most scientific beliefs, and scored high on NFC implied a distribution with little variation, potentially skewing the results. Moreover, these high scores point to ceiling effects (Cramer & Howitt, 2004): when many participants already held the true beliefs and positive attitudes tested in the experiment, those people could not ‘increase’ in their true beliefs or attitudes when exposed to a stimulus. In that light, the scale might have been too limited, with results concentrated in its upper part. Had a broader scale left room for more variance, this could have resulted in stronger effects of information type, as beliefs, attitudes, and NFC levels could then have been compared across larger differences between high and low scores.

Conclusion

Even though this study found no significant results, it is another step towards a better understanding of the effects of the ‘alternative facts’ people are confronted with nowadays. Studies on misinformation are widespread, yet there was still a gap to fill from an ELM perspective in the Netherlands. The results show that we do not have to fear unconditional brainwashing effects from online misinformation; however, it is important to take into account the subtle but potentially dangerous, far-reaching consequences in the longer term. Fortunately, attention to online media literacy in schools is growing, which could help fight these consequences (My Child Online Foundation, 2013; Zucker, Noyce, & McCullough,


2020). Future research is necessary to learn more about dormant long-term effects and about ways to fight misinformation and its effects. While experts are still uncertain if and when the COVID-19 pandemic will be over, one thing is clear: the accompanying infodemic is a long-term battle.

References

Allcott, H., Gentzkow, M., & Yu, C. (2019). Trends in the diffusion of misinformation on social media. Research & Politics, 6(2). https://doi.org/10.1177/2053168019848554

Bogel-Burroughs, N. (2020, May 2). Anti-Vaccination Activists Are Growing Force at Virus Protests. The New York Times. Retrieved from https://www.nytimes.com/2020/05/02/us/anti-vaxxers-coronavirus-protests.html

Cacioppo, J. T., & Petty, R. E. (1982). The Need For Cognition. Journal of Personality and Social Psychology, 42(1), 116–131. doi:10.1037/0022-3514.42.1.116

Cacioppo, J. T., Petty, R. E., & Kao, C. F. (1984). The Efficient Assessment of Need for Cognition. Journal of Personality Assessment, 48(3), 306–307. doi:10.1207/s15327752jpa4803_13

Centraal Bureau voor de Statistiek (2019, December 12). Bevolking; kerncijfers. Retrieved from https://opendata.cbs.nl/statline/#/CBS/nl/dataset/37296NED/table?fromstatweb

Chen, X., Sin, S.-C. J., Theng, Y.-L., & Lee, C. S. (2015). Why Students Share Misinformation on Social Media: Motivation, Gender, and Study-level Differences. The Journal of Academic Librarianship, 41(5), 583–592. doi:10.1016/j.acalib.2015.07.003

Cobb, M., Nyhan, B., & Reifler, J. (2013). Beliefs Don’t Always Persevere: How Political Figures Are Punished When Positive Information about Them Is Discredited. Political


Cohen, A. R., Stotland, E., & Wolfe, D. M. (1955). An experimental investigation of need for cognition. The Journal of Abnormal and Social Psychology, 51(2), 291–294. doi:10.1037/h0042761

Commissariaat voor de Media (2019, June 12). Mediamonitor 2019. Retrieved from https://www.mediamonitor.nl/nieuws/mediamonitor-2019-nederlanders-blijven-zich-breed-orienteren-voor-nieuws/

Cramer, D., & Howitt, D. L. (2004). The Sage dictionary of statistics: A practical resource for students in the social sciences. London: SAGE Publications.

Del Vicario, M., Bessi, A., Zollo, F., Petroni, F., Scala, A., Caldarelli, G., … Quattrociocchi, W. (2016). The spreading of misinformation online. Proceedings of the National Academy of Sciences, 113(3), 554–559. doi:10.1073/pnas.1517441113

Ecker, U. K. H., Hogan, J. L., & Lewandowsky, S. (2017). Reminders and Repetition of Misinformation: Helping or Hindering Its Retraction? Journal of Applied Research in Memory and Cognition, 6(2), 185–192. doi:10.1016/j.jarmac.2017.01.014

Flynn, D. J., Nyhan, B., & Reifler, J. (2017). The Nature and Origins of Misperceptions: Understanding False and Unsupported Beliefs About Politics. Political Psychology, 38, 127–150. doi:10.1111/pops.12394

Foster, J., Huthwaite, T., Yesberg, J., Garry, M., & Loftus, E. F. (2012). Repetition, Not Number of Sources, Increases Both Susceptibility to Misinformation and Confidence in the Accuracy of Eyewitnesses. SSRN Electronic Journal. doi:10.2139/ssrn.1998302

Frenkel, S., Alba, D., & Zhong, R. (2020, March 8). Surge of Virus Misinformation Stumps Facebook and Twitter. The New York Times. Retrieved from https://www.nytimes.com/2020/03/08/technology/coronavirus-misinformation-social-media.html


Haase, N., Betsch, C., & Renkewitz, F. (2015). Source Credibility and the Biasing Effect of Narrative Information on the Perception of Vaccination Risks. Journal of Health Communication, 20(8), 920–929. doi:10.1080/10810730.2015.1018605

Hall Jamieson, K., & Albarracín, D. (2020). The Relation between Media Consumption and Misinformation at the Outset of the SARS-CoV-2 Pandemic in the US. Harvard Kennedy School Misinformation Review. doi:10.37016/mr-2020-012

Hayes, A. F. (2012). PROCESS: A versatile computational tool for observed variable mediation, moderation, and conditional process modeling [White paper]. Retrieved from http://www.afhayes.com/public/process2012.pdf

Hopfer, S. (2012). Effects of a Narrative HPV Vaccination Intervention Aimed at Reaching College Women: A Randomized Controlled Trial. Prevention Science, 13(2), 173–182. doi:10.1007/s11121-011-0254-1

Hovland, C. I., & Weiss, W. (1951). The Influence of Source Credibility on Communication Effectiveness. Public Opinion Quarterly, 15(4), 635-650. doi:10.1086/266350

Ismagilova, E., Slade, E., Rana, N. P., & Dwivedi, Y. K. (2020). The effect of characteristics of source credibility on consumer behaviour: A meta-analysis. Journal of Retailing and Consumer Services, 53. doi:10.1016/j.jretconser.2019.01.005

Jolley, D., & Douglas, K. M. (2014). The Effects of Anti-Vaccine Conspiracy Theories on Vaccination Intentions. PLoS ONE, 9(2), e89177. doi:10.1371/journal.pone.0089177

Jolley, D., & Douglas, K. M. (2017). Prevention is better than cure: Addressing anti-vaccine conspiracy theories. Journal of Applied Social Psychology, 47(8), 459–469. doi:10.1111/jasp.12453

Kata, A. (2010). A postmodern Pandora's box: anti-vaccination misinformation on the Internet. Vaccine, 28(7), 1709–1716. doi:10.1016/j.vaccine.2009.12.022


Kata, A. (2012). Anti-vaccine activists, Web 2.0, and the postmodern paradigm – An overview of tactics and tropes used online by the anti-vaccination movement. Vaccine, 30(25), 3778–3789. doi:10.1016/j.vaccine.2011.11.112

Kessler, S. H., & Guenther, L. (2017). Eyes on the frame: Explaining people’s online searching behavior in response to TV consumption. Internet Research, 27, 303–320. doi:10.1108/IntR-01-2016-0015

Kessler, S. H., & Zillich, A. F. (2018). Searching Online for Information About Vaccination: Assessing the Influence of User-Specific Cognitive Factors Using Eye-Tracking. Health Communication, 34(10), 1150–1158. doi:10.1080/10410236.2018.1465793

Kruger, J., & Dunning, D. (1999). Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments. Journal of Personality and Social Psychology, 77(6), 1121–1134. doi:10.1037/0022-3514.77.6.1121

Krumpal, I. (2013). Determinants of social desirability bias in sensitive surveys: A literature review. Quality & Quantity, 47(4), 2025–2047. doi:10.1007/s11135-011-9640-9

Kuklinski, J., Quirk, P., Jerit, J., Schwieder, D., & Rich, R. (2000). Misinformation and the Currency of Democratic Citizenship. The Journal of Politics, 62(3), 790–816. doi:10.1111/0022-3816.00033

Lachapelle, E., Montpetit, É., & Gauvin, J.-P. (2014). Public Perceptions of Expert Credibility on Policy Issues: The Role of Expert Framing and Political Worldviews. Policy Studies Journal, 42(4), 674–697. doi:10.1111/psj.12073

Leding, J., & Antonio, L. (2019). Need For Cognition and discrepancy detection in the misinformation effect. Journal of Cognitive Psychology, 31(4), 409–415. doi:10.1080/20445911.2019.1626400
