
Too Good to Be True?

An Experiment Investigating the Influence of Exposure to Misinformation on

Candidate Characteristics Moderated by Prior Beliefs

Marlon Braakman (10784799)
Master's Thesis

Graduate School of Communication
Political Communication and Journalism

Supervised by Mr. Dr. M. Hameleers
28-06-2019


ACKNOWLEDGEMENT

During the last six months I have written this thesis about misinformation. It is the final product of five years of studying at the University of Amsterdam. This would not have been so enjoyable without the numerous wonderful people whom I have met during my time at the university and my closest friends. I am grateful to share some of my most precious memories with them. Second, this thesis would not exist without the help of many people. Therefore, I would like to thank all the respondents who participated in this experiment. Next, there is one person whom I have to mention in particular. I would like to express my very great appreciation to my supervisor Michael Hameleers, who was willing to spend a generous amount of time advising me and providing constructive feedback and suggestions to make a success of this study. His support over the last six months was sincerely valuable. Lastly, I wish to thank the most important people in my life: my family. They have always been extremely supportive and helped me wherever they could. Above all, they have always encouraged me never to give up on my dreams. A big thanks to you all. I will cherish this journey for the rest of my life.


ABSTRACT

Social media provide a favorable environment for the dissemination of misinformation. Misinformation is commonly known as fake news; however, that label captures neither its complexity nor its consequences. Investigating the effects of misinformation is especially important during election campaigns: campaign makers can try to limit the damage misinformation does, and social media platforms have to be more assertive in stopping its spread. Several researchers have conducted such studies; however, insights into the influence of misinformation during election campaigns in the Netherlands are scarce. By means of an experiment (N = 110), this study investigates whether misinformation influences citizens' evaluations of candidate characteristics, and whether this effect is moderated by prior beliefs about a politician and an issue. The key results indicate that misinformation leads to a more positive evaluation of candidate characteristics than real news. This effect is moderated only by prior beliefs about the issue, not about the politician. For campaign makers this implies that they should provide as little counter-information as possible during elections and try to reinforce prior beliefs with supporting information.

Keywords: Misinformation, candidate characteristics, motivated reasoning, prior beliefs, politics, evaluation


“If you don’t read the newspapers, you’re uninformed. If you do read it, you’re misinformed”.


Table of Contents

Introduction

Theoretical Framework
    Disinformation in Online Information Settings
    The Effects of Misinformation on the Evaluation of Candidate Characteristics
    How Confirmation Biases Drive the Effects of Misinformation

Methods
    Data Collection
    Research Design
    Observed Variables
    Analysis Strategy
    Manipulation Check

Results

Conclusion and Discussion

References

Appendix I: Survey


Too Good to Be True?

An Experiment Investigating the Influence of Exposure to Misinformation on Candidate Characteristics Moderated by Prior Beliefs

On the 6th of February 2017, Geert Wilders tweeted a photo of Alexander Pechtold. The photo showed protesting Muslims demonstrating for sharia law in the Netherlands. Geert Wilders had edited this photo to make it look like Alexander Pechtold walked along in that demonstration holding a banner with the text: 'Islam will dominate the world. Freedom can go to hell'. As you can imagine, this photo was fake, and Geert Wilders contributed to the spread of misinformation online. The photo did not win him much support, however: many politicians and news media condemned Geert Wilders's action. Alexander Pechtold responded to the tweet declaring: "In times of fake news and alternative facts it is unforeseeable what consequences such a fake picture can have".

He is right. De Volkskrant (2018) wrote an article about fake news stating that fake news spreads faster on Twitter than correct news. It points to an American study that investigated 126,000 news stories distributed on Twitter over the past eleven years. The results are clear. According to this article, "lies diffuse significantly further, faster, deeper and broader than the truth. Because humans, and not robots, spread the stories more often", with all the consequences that entails.

The example above demonstrates that social media facilitate the spread of misinformation. Nowadays, social media form our most important source of information and news, especially among younger generations (Dutton, Blank & Gorseli, 2013), and have outgrown television as the major news source (Allcott & Gentzkow, 2017). This rapid development has benefits and affords new opportunities: it is quick, easy to use and very accessible. However, it also brings some issues along. Misinformation is unfortunately one of them.

The relevance of misinformation became evident during the 2016 presidential elections in the United States. During these elections, stories containing misinformation - so-called fake news stories - were shared more often than correct and accurate stories. More specifically, most of the time these stories favoured Trump over Clinton (Silverman, 2016). Misinformation is not new; our society has, for example, long known misinformation in the form of propaganda. However, the rapid development of social media puts misinformation in a different and new perspective. Previous research showed that citizens lack factual knowledge about political issues (Delli Carpini & Keeter, 1996). When misinformation is distributed on the internet, everyone can access it, and people base their opinions on false, misleading or inaccurate information that they consider correct. This lack of truthfulness thus influences citizens' opinions about political matters (Althaus, 1998; Kuklinski et al., 2000). Although the government has started campaigns to alert people to fake news, the public is not entirely aware of the online distribution of misinformation. According to Ramsay, Kull, Lewis and Subias (2010), citizens are often not capable of distinguishing correct information from misinformation. In the end, people who are misinformed could make choices for their families and themselves that are not in their best interest and can have genuine effects (Lewandowsky et al., 2012). Since the public does base its opinions on this false information, the influence of misinformation on the formation of opinions needs to be investigated further. This thesis takes the next step forward: with an experiment, it tries to advance existing theory by exposing the impact of misinformation.

Knowing the effects of misinformation could be especially interesting during election campaigns. Several researchers have conducted such studies; however, insights into the influence of misinformation during election campaigns in the Netherlands are scarce. Taken together, the main question of this thesis is: To what extent and how does exposure to misinformation affect people's evaluations of a political candidate? To investigate this question, a number of hypotheses are introduced. These are explicated in the theoretical framework below.

Theoretical Framework

Disinformation in Online Information Settings

These days, it is impossible to imagine a society without social media. The new media have replaced the traditional media as the most important platforms to spread a message, and among younger generations new media have also replaced traditional media in the search for news and information (Dutton, Blank & Gorseli, 2013). They have even exceeded television as the leading news source (Allcott & Gentzkow, 2017). Studies showed that in 2016, 62% of adults in the United States received news through social media (Afroz, Brennan & Greenstadt, 2012; Gottfried & Shearer, 2016). Obviously, the increasing use of the Internet and its unlimited possibilities made it easier for people to find news sources. This can be considered an advantage. However, social media also provide a setting where mis- and disinformation can flourish, for three main reasons. Firstly, there is a lack of quality control on social media: there are some ways to authenticate information online, but so far these have not been widely developed (Allcott & Gentzkow, 2017). Second, the barriers to communicating on social media are very low. Every citizen with online access can share whatever they want, whenever they want; professional journalism is no longer required (Ross & Rivers, 2018). Lastly, the costs of sharing something online are low, and sometimes there are no costs at all. If your only focus is to acquire money in the short term, online is the place to be (Allcott & Gentzkow, 2017). Taken together, this indicates that social media form a favorable ecological system for the dissemination of misinformation. But how is misinformation defined in the first place?


Mis- and disinformation is commonly known as fake news. This term is generally associated with mis- and disinformation but does not begin to describe its complexity (Wardle, 2017). Therefore, misinformation and disinformation have to be defined clearly first. According to Wardle (2017), misinformation is "the inadvertent sharing of false information" and disinformation is "the deliberate creation and sharing of information known to be false". Karlova and Fisher (2013) describe misinformation as "information which has been proven as incorrect and inaccurate". Some researchers argue that the definition should go a step further: misinformation is "malicious information intended to cause undesirable effects in the general public, such as panic and misunderstanding; or to supplant valuable information" (Ferrara, 2015; Wang et al., 2017). However, according to Wardle (2017) this definition sounds more like disinformation: where there is intent, there is disinformation.

Although researchers do not agree completely on the definition, they do agree that misinformation is false information proven incorrect and inaccurate. According to Wardle (2017) there are multiple types of mis- and disinformation. She describes seven types, ranging from satire – no intention to cause harm but with the potential to fool – to fabricated content – news content that is 100% false, designed to deceive and do harm.

There are several examples that show the threat of the dissemination of misinformation. A recent one occurred during the 2016 elections in the United States. Silverman (2016) showed that stories containing misinformation were shared more often than stories with correct information. The majority of the stories containing misinformation favored Donald Trump over Hillary Clinton, and most of them were accepted by the audience (Silverman & Singer-Vine, 2016).

An important question in this setting is why we should consider the spread of misinformation a threat. Does it really influence citizens? The answer is 'yes'. Take for example the story about Obama's birthplace, which suggested that Obama was not born in Hawaii but actually in Kenya. Although the media met this story with skepticism, it had an undesired effect. According to Travis (2010), polls at the time showed that a considerable part of the public believed the story, including a large share of Republican voters (Barr, 2011).

As this example indicates, factual knowledge about political matters is often lacking among citizens (Delli Carpini & Keeter, 1996). This shortcoming influences the opinions citizens hold about political issues (Althaus, 1998; Kuklinski et al., 2000). When misinformation circulates on the internet, people base their opinions on misleading, false or inaccurate information they consider true. These days the public seems aware that misinformation circulates on the internet, especially during election campaigns. Unfortunately, when asked to identify specific pieces of misinformation, people are often not capable of separating correct information from inaccurate and false information (Ramsay, Kull, Lewis & Subias, 2010).

Citizens are thus not sufficiently capable of distinguishing the two types of information and are likely to base their opinions and political preferences on inaccurate information. Therefore, the influence of misinformation on the formation of opinions needs to be investigated further, especially bearing in mind that citizens' beliefs can conflict with established facts. Taken together, people who are misinformed could make choices for their families and themselves that are not in their best interest and can have genuine effects (Lewandowsky et al., 2012). So, misinformation can affect society, since it has pervaded our online world and citizens cannot distinguish it from accurate information. Therefore, this thesis focuses on the effects of misinformation on the evaluation of the candidate characteristics of a politician.


The Effects of Misinformation on the Evaluation of Candidate Characteristics

Political leaders play a major role in persuading voters to vote for their party, for two main reasons: the focus on parties has shifted to party leaders, and the personalization (non-political characteristics and private lives) of politicians has grown in importance (Bittner, 2011; Karvonen, 2010; Kriesi, 2012). Since voters mostly do not meet political leaders in real life, opinions are mainly based on information in the news, on television and online. Misinformation in the United States, for example, could therefore have influenced people's opinions about politicians. Factors on which we evaluate politicians include our perception of their capabilities and reliability. These factors – also defined as candidate characteristics – are accepted as important influences on vote choice and therefore play an important role in the election process (Laustsen & Bor, 2017).

However, the effects of misinformation on these evaluations have received little attention in previous research. For this reason, this thesis takes special interest in five candidate characteristics that could be influenced by misinformation. These characteristics are based on the four-dimensional framework of Kinder (1986); the fifth characteristic is chosen based on the study of Laustsen and Bor (2017). The five candidate characteristics can be placed into two categories. First, there are hard characteristics, which say more about someone's capabilities as a politician: integrity, competence and leadership. The second category contains soft characteristics, which say more about the person him- or herself: empathy and warmth. The hard characteristics can be summarized as follows:

1. Integrity: this is the quality of being honest and having strong moral principles. It is a personal choice to have a consistent and determined commitment to respect and honor spiritual, moral and ethical principles and values (Killinger, 2010). Respondents can be asked whether they perceive a politician as honest, reliable and moral;


2. Competence: this is the ability to do something successfully or efficiently. Election studies showed that competence can be seen as the most relevant trait for political leadership (Kinder, 1986; Funk, 1999). According to Kinder et al. (1980), competence is more likely than other characteristics to influence the vote choice of highly educated voters. Items to measure competence are intelligence, knowledge and skill;

3. Leadership: this is the ability to lead a group of people or an organization. Summerfield (2014) defined leadership in just three words: 'make things better'. According to him, the core of a leader's job is to make things better. This implies that an ordinary employee can be a leader, while his assigned leader may not be one. Respondents can be asked whether they perceive a politician as inspiring, weak, decisive and strong.

The soft characteristics are:

4. Empathy: this is the ability to understand and share the feelings of another. Empathy involves affective as well as cognitive aspects. The affective aspect contains "the capacity to enter into or join the experiences and feelings of another person" (Aring, 1958; Hojat et al., 2001). More important, however, is the cognitive aspect, which concerns an individual's capability to understand someone else's feelings and perspective (Hojat et al., 2002). Items to measure empathy include being compassionate and caring about other people;

5. Warmth: this is the quality of being intimate and being attached. An individual who has a warm character is probably friendly and enthusiastic in their behavior towards people. Items to measure warmth are trustworthiness, friendliness, helpfulness and sincerity.


Based on the alleged impact of misinformation on the evaluation of soft and hard characteristics, the following hypotheses can be introduced:

H1A: People who are exposed to misinformation are more likely to evaluate hard candidate characteristics of politicians in more negative terms than people who are not exposed to misinformation.

H1B: People who are exposed to misinformation are more likely to evaluate soft candidate characteristics of politicians in more negative terms than people who are not exposed to misinformation.

How Confirmation Biases Drive the Effects of Misinformation

Correcting misinformation does not always help. Corrections do not undo the damage done; they may even strengthen misperceptions among ideological groups (Nyhan & Reifler, 2010). Nyhan and Reifler showed that responses to corrections of controversial political issues vary systematically by ideology. When people receive unwelcome information that goes against their own beliefs, they not only resist this information but are likely to support their own beliefs and opinions even more strongly. This is called a backfire effect. Lebo and Cassino (2007) define a backfire effect as "viewing unfavorable information as actually being in agreement with existing beliefs". The backfire effect is part of a broader theory commonly known as motivated reasoning. Motivated reasoning is a mechanism that explains how information is judged while being processed in someone's brain. This mechanism starts with the presumption that all reasoning is motivated (Kunda, 1987, 1990). While reasoning about perceived information, a human being always struggles between being accurate and being perseverant about one's own beliefs. This struggle underlies all human reasoning. Two types of goals explain the motives of reasoning: accuracy goals and partisan goals. The accuracy goal holds that humans are motivated to thoughtfully examine and seek out relevant evidence to reach the most accurate conclusion (Baumeister & Newman, 1994). The partisan goal holds that humans are motivated to apply their reasoning powers in favor of a prior, specific conclusion (Kruglanski & Webster, 1996).

The partisan goal is driven by "automatic affective processes that establish the direction and strength of biases" (Taber & Lodge, 2006). Individuals arguing about political matters are always goal-oriented processors: they appraise information with a bias towards their prior beliefs and views (Kunda, 1990; Chaiken & Trope, 1999). This means that, when evaluating political arguments, individuals are likely to favor evidence and information that reinforces their prior beliefs and to reject information that goes against them (Edwards & Smith, 1996; Taber & Lodge, 2006). The research by Nyhan and Reifler (2010) gives more support for the assumption that individuals frequently engage in motivated reasoning. Where most previous research on motivated reasoning focused on the way opinions are constructed and arguments are evaluated by means of factual evidence, Nyhan and Reifler (2010) focused on the affective and cognitive processes at play when facing false information.

This thesis focuses on the first view: investigating how opinions are constructed and influenced by misinformation. To do so, motivated reasoning is investigated at two levels: the level of a politician and the level of an issue. To investigate motivated reasoning, it is wise to choose a politician and an issue that are known to the public and that have clear proponents and opponents, i.e. people with strong prior beliefs. For these reasons, Jesse Klaver is chosen as the central politician and, related to him, climate change is chosen as the issue.

Thorson (2016) did something similar when focusing on how corrected information affects attitudes in a partisan context. Her study shows that attitudinal effects can be created in two ways: through an affective or a deliberative process. When retrieval of information happens automatically, attitudinal effects arise because misinformation has a larger impact than its correction; here, party identification plays a smaller role. On the contrary, when the correction is actively recalled, party identification affects the inferences people draw, which shape their attitudes.

It is important to note that not everyone agrees on the backfire effect described by, among others, Nyhan and Reifler (2010). In a major study, Wood and Porter (2016) found no evidence for a backfire effect: across four studies covering more than 8,100 respondents and 36 potential backfire issues, almost no evidence of backfiring was found. Their explanation is that citizens have the capability to participate in the democratic process: citizens are able to take notice of the facts in factual information, even if they have to leave their ideological preferences behind. This finding connects to an important line written by Lupia and McCubbins (1998): "The capabilities of the people and the requirements of democracy are not mismatched as many critics would have us believe".

The processes described by Thorson (2016) and Nyhan and Reifler (2010) to explain motivated reasoning have important implications for public opinion. Overall, motivated reasoning suggests that individuals with strong prior beliefs are probably more extreme in defending arguments in favor of their beliefs and dismiss counterarguments uncritically. This could have far-reaching implications for the effects of misinformation on public opinion, and this knowledge could be of special interest during election campaigns in particular. If misinformation can disrupt election processes, it can be a game changer. However, we cannot be sure how strong this so-called backfire effect is, since not all studies agree with each other. Although Wood and Porter (2016) suggest otherwise, I do believe that citizens have a tendency to be motivated by prior beliefs. Even if misinformation does not backfire, citizens with congruent views are probably more likely to believe misinformation than those without strong congruent views. Therefore, it is relevant to know whether misinformation can influence perceptions of a politician and issues, even when it is corrected. To test this, the next three hypotheses are formed.

H2A: People who have strong prior beliefs about a politician are more likely to reject contradicting information than people who do not have such a strong prior belief about a politician.

H2B: People who have strong prior beliefs about climate change are more likely to reject contradicting information than people who do not have such a strong prior belief about climate change.

H3: People who are exposed to news articles containing misinformation which are corrected later will not differ significantly in the evaluation of politicians in comparison to people who are exposed to a news article containing correct information.

Methods

Data Collection

Since misinformation can reach everyone these days, especially through social media, the sample has to be a representation of the online population. To create such a sample of online media users, respondents were recruited through a convenience and a snowball sample. The experiment was spread through several channels, among them Facebook, Instagram and WhatsApp. In selecting respondents, age mattered in particular: respondents needed to be 18 years or older, and everyone under 18 was excluded from the analysis. Gender did not play a big role and therefore only needed to be approximately balanced. Lastly, for a representative sample it would be ideal if respondents were on average neutral on the political spectrum.


While 174 people started the experiment, only 113 completed it, and 3 of those were younger than 18. The final sample therefore consists of 110 respondents (63.2%), of whom 55 are male (50%) and 55 are female (50%). The youngest respondent in this sample is 19 years old and the oldest is 79. The average age is 36.06 with a standard deviation of 15.70; the most common age is 23 (11 respondents, 9.1%). 91 respondents finished an education after high school: 17 at MBO level, 31 at HBO level, 29 obtained a bachelor's degree and 14 a Master's degree. These results are also shown in Table 1. 25 respondents were randomized into condition 1, 29 into condition 2, 25 into condition 3 and 31 into condition 4. So, out of 110 respondents, 79 saw a news article containing misinformation; the other 31 read a real news article and form the control group.

The respondents are on average quite centrist in their political attitude. On a political scale from 0 to 100 (from left to right) they scored 46.45 on average. Considering scores from 0 to 20 and from 80 to 100 as extreme, 23 respondents are considered extremely left or right on the political spectrum.

Table 1. Level of education among respondents.

Level of Education     Frequency    Percentage
Mavo                   1            0.9
Havo                   4            3.7
Vwo                    13           11.9
MBO                    17           15.6
HBO                    31           28.4
University Bachelor    29           26.6
University Master      14           12.8


Research Design

To investigate the relation between receiving misinformation and its correction, prior beliefs, and the evaluation of candidate characteristics, respondents had to read a news article about Jesse Klaver and evaluate candidate characteristics. To do so, they were placed into one of four conditions in an experiment with a 1x3+1 between-subjects factorial design. Factor A concerns exposure to misinformation or correct information. Respondents in three of the four conditions were exposed to a news article containing misinformation about climate change and Jesse Klaver. This news article was a manipulated version of a real news article: numbers were changed, some claims were made without being supported by any evidence, and in the 'fake' story Jesse Klaver was portrayed in a negative way. The real news article was shown in the fourth condition. This article contained reliable information and had been published on Nu.nl earlier that year. An article from Nu.nl was chosen because this news platform is generally neutral in its reporting (in contrast to De Telegraaf or De Volkskrant, for example). The manipulated and the real news articles are included in Appendix 1.

In condition 1 the respondents were informed that the article they read contained misinformation; in the second condition they did not get any text after reading the news article; and in the last two conditions (among them the real news article) respondents were informed that the article they read was reliable and published during the provincial election campaigns. Whether respondents were informed about the exposure to misinformation forms factor B. In Table 2 the conditions and the number of respondents are schematically presented.


Table 2. Conditions in the experiment.

                                             Misinformation article    Real news article
Informed: article contains misinformation    1 (25 respondents)        -
No additional information                    2 (29 respondents)        -
Informed: article was published              3 (25 respondents)        4 (31 respondents)

Observed Variables

Independent variables and moderators. For the first two hypotheses there is one independent variable: the condition respondents were placed into. A distinction is made between respondents who were exposed to misinformation and those who were not. Hypotheses 2A and 2B include the independent variable condition and prior beliefs as a moderator at two levels: a politician and an issue. Both were measured with four items, out of which two latent scales for prior beliefs on the politician and on climate change were created. The factor analysis for the politician scale showed one component explaining 77.19% of the variance, with a Cronbach's alpha of .901. The factor analysis for the issue scale showed one component explaining 50.34% of the variance, with a Cronbach's alpha of .783. Respondents with strong prior beliefs are those who score higher than M + 1 SD on the latent scales: above 78.71 for prior beliefs on the politician and above 6.46 for prior beliefs on the issue.
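The thesis reports these reliability statistics from SPSS. As a rough illustration of the two steps described above, the sketch below computes Cronbach's alpha for a multi-item scale and applies the M + 1 SD cutoff for "strong prior beliefs" in plain Python. The four-item response data are invented for illustration only; they are not the thesis data, and the actual scales were built in SPSS.

```python
from statistics import mean, pstdev, variance

def cronbach_alpha(items):
    """Cronbach's alpha for a scale.
    items: one list per scale item, each holding all respondents' scores."""
    k = len(items)
    # Per-respondent total score across the k items.
    totals = [sum(scores) for scores in zip(*items)]
    # Sum of the individual item variances (sample variances throughout).
    item_var = sum(variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Hypothetical 7-point responses to a four-item prior-belief scale (8 respondents).
responses = [
    [5, 6, 4, 7, 3, 6, 5, 4],
    [5, 5, 4, 6, 3, 7, 5, 4],
    [4, 6, 5, 7, 2, 6, 4, 5],
    [5, 6, 4, 6, 3, 6, 5, 4],
]
alpha = cronbach_alpha(responses)

# Latent scale score per respondent = mean of the four items;
# "strong prior beliefs" = score above M + 1 SD, as defined in the thesis.
scale = [mean(col) for col in zip(*responses)]
cutoff = mean(scale) + pstdev(scale)
strong = [s for s in scale if s > cutoff]
```

With this invented data, alpha lands well above the conventional .7 threshold, in the same range as the reliabilities the thesis reports; the cutoff then flags only the respondents more than one standard deviation above the scale mean.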

Dependent variables. The dependent variable is the evaluation of candidate characteristics. This variable consists of five characteristics, which can be divided into two categories: hard and soft. The hard characteristics are integrity, competence and leadership; the soft ones are empathy and warmth. Each characteristic was measured with three statements, which can be found in Table 3 along with their means and standard deviations. Respondents had to indicate on a 7-point scale whether they agreed or disagreed with each statement. Out of the fifteen items measuring the five characteristics, three latent scales were created: one with the hard characteristics, one with the soft characteristics and one with all characteristics. For all three latent scales a factor analysis and a reliability analysis were performed. The hard characteristics scale consists of one component explaining 64.61% of the variance; Cronbach's alpha is .931 and cannot be increased by deleting an item. The soft characteristics scale consists of one component explaining 76.56% of the variance; Cronbach's alpha is .938 and cannot be increased by deleting an item. The factor analysis for the scale containing all characteristics showed that the fifteen items form one component explaining 64.65% of the variance; Cronbach's alpha is .960 and cannot be increased by deleting an item.

Analysis Strategy

The collected data were analyzed with SPSS. First of all, every respondent under 18 was excluded from the analysis, as were all respondents who did not complete the experiment. For Hypotheses 1A and 1B the four conditions were recoded into a new variable: respondents exposed to misinformation were placed into one category and those who were not exposed to misinformation into another. These hypotheses can be tested with an independent-samples t-test, since the means of the fake news conditions and the real news condition have to be compared.
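Although the analyses were run in SPSS, the independent-samples t-test can be sketched in Python with scipy. The group sizes and score distributions below are hypothetical stand-ins, not the thesis data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# hypothetical evaluation scores on a 1-7 scale for the two pooled groups
misinfo = rng.normal(4.45, 1.18, 83)   # three misinformation conditions pooled
control = rng.normal(3.91, 1.25, 27)   # real news condition

t, p_two = stats.ttest_ind(misinfo, control, equal_var=True)
p_one = p_two / 2                      # directional (one-tailed) p-value
df = len(misinfo) + len(control) - 2
print(f"t({df}) = {t:.2f}, one-tailed p = {p_one:.3f}")
```

Halving the two-tailed p-value is only valid when the observed difference lies in the hypothesized direction.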

To test Hypotheses 2A and 2B a two-way ANOVA was used. This test does not only measure two main effects but also whether an interaction effect occurs between the two independent variables: exposure to misinformation (yes/no) and prior beliefs. Finally, for the last hypothesis only respondents in the first and fourth condition are taken into account, using the latent scale created for all the candidate characteristics. Hypothesis 3 can also be tested with an independent-samples t-test, since two means are compared.
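As a sketch of what the two-way ANOVA tests, the interaction F-ratio for a balanced 2 x 2 design (exposure x prior belief) can be computed by hand with numpy. The data below are simulated with a built-in interaction and are not the thesis data:

```python
import numpy as np

def two_way_anova_interaction(y, a, b):
    """Interaction F-test for a balanced 2x2 design.
    y: outcome; a, b: 0/1 factor codes with equal cell sizes."""
    y, a, b = map(np.asarray, (y, a, b))
    grand = y.mean()
    cells = {(i, j): y[(a == i) & (b == j)] for i in (0, 1) for j in (0, 1)}
    n = len(next(iter(cells.values())))
    a_means = {i: y[a == i].mean() for i in (0, 1)}
    b_means = {j: y[b == j].mean() for j in (0, 1)}
    # interaction sum of squares: cell mean minus both marginal means plus grand mean
    ss_ab = n * sum((c.mean() - a_means[i] - b_means[j] + grand) ** 2
                    for (i, j), c in cells.items())
    ss_w = sum(((c - c.mean()) ** 2).sum() for c in cells.values())
    df_ab, df_w = 1, y.size - 4
    f = (ss_ab / df_ab) / (ss_w / df_w)
    return f, df_ab, df_w

rng = np.random.default_rng(2)
a = np.repeat([0, 0, 1, 1], 25)          # exposure: 0 = real news, 1 = misinformation
b = np.tile(np.repeat([0, 1], 25), 2)    # prior belief: 0 = weak, 1 = strong
y = 4 + 0.5 * a + 0.3 * b + 1.0 * a * b + rng.normal(0, 1, 100)
f, df1, df2 = two_way_anova_interaction(y, a, b)
print(f"F({df1}, {df2}) = {f:.2f}")
```

In practice an unbalanced design, as in this study, would be handled with a general linear model (as SPSS does); the balanced formula above just makes the logic of the interaction term explicit.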

Table 3. Statements measuring candidate characteristics.

Characteristic      Statement                                                       Mean   SD
Empathy (Soft)      Jesse Klaver cares about people                                 4.46   1.38
                    Jesse Klaver shows compassion with people                       4.39   1.26
                    Jesse Klaver is good in empathizing with other people           4.08   1.36
Integrity (Hard)    Jesse Klaver is honest and sincere                              4.03   1.37
                    Jesse Klaver does not have a hidden agenda and does
                    what he promises                                                3.74   1.25
                    Jesse Klaver is reliable                                        3.97   1.38
Competence (Hard)   Jesse Klaver has enough skills and knowledge                    4.28   1.40
                    Jesse Klaver is intelligent                                     4.92   1.29
                    Jesse Klaver works very hard to accomplish his goals            4.95   1.23
Leadership (Hard)   Jesse Klaver is an inspiring politician                         4.06   1.59
                    Jesse Klaver is able to make good decisions                     3.94   1.34
                    Jesse Klaver is very good in leading a group of people
                    or an organization                                              4.08   1.35
Warmth (Soft)       Jesse Klaver is very friendly                                   4.54   1.49
                    Jesse Klaver is helpful and thinks along with people            4.04   1.31

Manipulation Check

To measure whether the manipulation worked correctly, a manipulation check was performed, using an independent and a dependent variable. The independent variable is the condition respondents were placed into; the dependent variable is the manipulation check itself. Respondents were asked to tick statements they remembered from reading the news article. Six different statements were given. Two of those were not mentioned in any article, two were mentioned in the fake article and the last two were mentioned in the real article. Four Fisher exact tests were used to test the differences between the groups for these four statements.
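Each Fisher exact test compares, per statement, how often it was ticked in the misinformation conditions versus the real news condition. A minimal sketch in Python with scipy, using hypothetical cell counts rather than the thesis data:

```python
from scipy import stats

# hypothetical 2x2 count table for one manipulation-check statement:
# rows = condition (misinformation / real news),
# columns = statement ticked (yes / no)
table = [[60, 23],   # misinformation conditions
         [7, 20]]    # real news condition

odds_ratio, p = stats.fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, Fisher exact p = {p:.4f}")
```

The Fisher exact test is preferred over a chi-square test here because some cells can have small expected counts.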

The first statement is ‘the desired CO2-reduction of 25% in 2035 will not be reached’ and was shown in the fake article. The test shows that there is a significant effect between respondents who were exposed to misinformation and those who were not (Fisher exact p = .001). Significantly more respondents exposed to misinformation noticed this statement than those who were not exposed to misinformation. This is a weak relation (Lambda = .179).

The second statement is ‘the costs for citizens will be 3 billion higher than those for companies’ and was shown in the real article. The test shows that there is a significant effect between respondents who were exposed to misinformation and those who were not (Fisher exact p = .012). Relatively more respondents who were not exposed to misinformation noticed this statement than those who were exposed to misinformation. This is a weak relation (Lambda = .027).

The third statement is ‘there will not be a CO2-charge for companies’ and was shown in the fake article. The test shows that there is a significant effect between respondents who were exposed to misinformation and those who were not (Fisher exact p < .001). Significantly more respondents who were exposed to misinformation noticed this statement than those who were not exposed to misinformation. This is a weak relation (Lambda = .196).

The fourth statement is 'the Labour Party and GreenLeft suggested a tax on CO2' and was shown in the real article. The test shows that there is a significant effect between respondents who were exposed to misinformation and those who were not (Fisher exact p = .001). Relatively more respondents who were not exposed to misinformation noticed this statement than those who were exposed to misinformation. This is a weak relation (Lambda = .032).


As shown above, all four statements show a significant effect in the desired direction. This implies that the differences between the two conditions are large enough: respondents recognized the right statements in the right condition. The fake statements were noticed significantly more often in the fake conditions than in the condition with the correct information, and the correct statements were noticed significantly more often by respondents in the condition with the correct information than by those in the misinformation condition. Therefore, we can conclude that the manipulation worked and that the respondents understood what they read.

Results

Hypothesis 1A stated that people who are exposed to misinformation are more likely to evaluate hard candidate characteristics of politicians in more negative terms than people who are not exposed to misinformation. Since the means of the fake news conditions and the real news condition have to be compared, an independent-samples t-test is used. Respondents who were exposed to misinformation evaluated the hard candidate characteristics more positively (M = 4.45, SD = 1.18) than respondents who were not exposed to misinformation (M = 3.91, SD = 1.25). This is a significant difference, t(108) = -2.14, p = .018, 95% CI [-1.05, -.04] (one-tailed). This does not only show that Hypothesis 1A was not supported; it even indicates a significant effect in the opposite direction: respondents who were exposed to misinformation evaluated the hard characteristics more positively than respondents who were not exposed to misinformation.

Hypothesis 1B stated that people who are exposed to misinformation are more likely to evaluate soft candidate characteristics of politicians in more negative terms than people who are not exposed to misinformation. Since the means of the fake news conditions and the real news condition have to be compared, an independent-samples t-test is used again. Respondents who were exposed to misinformation evaluated the soft candidate characteristics more positively (M = 4.37, SD = 1.06) than respondents who were not exposed to misinformation (M = 3.82, SD = 1.08). This is a significant difference, t(108) = 2.45, p = .008, 95% CI [-1.00, -.11] (one-tailed). This does not only show that Hypothesis 1B was not supported; it even indicates a significant effect in the opposite direction: respondents who were exposed to misinformation evaluated the soft characteristics more positively than respondents who were not.

Hypothesis 2A stated that people who have strong prior beliefs about a politician are more likely to reject contradicting information than people who do not have such strong prior beliefs about a politician. A two-way ANOVA is used, since this test can show whether an interaction effect between the two independent variables occurs on the dependent variable. Although the groups are not of equal size, Levene's test is not significant, so the requirements to perform this test are still met. Respondents with strong prior beliefs evaluated candidate characteristics more positively (M = 5.21, SD = 0.76) than respondents with weaker prior beliefs (M = 4.09, SD = 1.07). This is a moderately strong significant effect, F(1, 106) = 18.62, p < .001, η² = .14. However, no interaction effect was found for condition and prior beliefs on candidate characteristics, F(1, 106) = 2.48, p = .119. The data therefore do not support Hypothesis 2A. As Figure 1 shows, people with strong prior beliefs score higher on evaluation in both conditions than people without strong prior beliefs. Among people with strong prior beliefs, candidate characteristics are evaluated more positively after exposure to the correct news article than after exposure to misinformation. This pattern is different for people with weaker prior beliefs, who score on average higher in the misinformation condition than in the real news condition. So, exposure to misinformation affects the evaluation of candidate characteristics, but this effect is not moderated by differences in prior beliefs about the politician.


Hypothesis 2B stated that people who have strong prior beliefs about climate change are more likely to reject contradicting information than people who do not have such strong prior beliefs about climate change. Again, a two-way ANOVA is used to test whether an interaction effect occurs. Although the groups are not of equal size, Levene's test is not significant, so the requirements to perform this test are still met. Although respondents with strong prior beliefs score slightly more positively on evaluation (M = 4.66, SD = 1.32) than those without strong prior beliefs (M = 4.16, SD = 1.04), this main effect is not significant, F(1, 106) = .25, p = .617. However, an interaction effect was found for prior beliefs about climate change and condition on the evaluation of candidate characteristics, F(1, 106) = 6.98, p = .009, η² = .06. This effect is moderately small. This means that the data support Hypothesis 2B. As shown in Figure 2, people with strong prior beliefs score significantly higher in the misinformation condition than people without strong prior beliefs, whereas in the real news condition people with strong prior beliefs score significantly lower than people without those beliefs. So, exposure to misinformation has an effect on evaluation, but this effect is moderated by prior beliefs about climate change.

Hypothesis 3 stated that people who are exposed to news articles containing misinformation that is corrected later will not differ significantly in their evaluation of politicians from people who are exposed to a news article containing correct information. Since the means of the respondents who received a correction and the respondents who read a real news story have to be compared, an independent-samples t-test is used. Respondents who received a correction of their fake news story evaluated candidate characteristics more positively (M = 4.50, SD = 1.37) than respondents who read real news (M = 3.80, SD = 1.08). This is a significant difference, t(53) = 2.10, p = .041, 95% CI [.03, 1.35] (two-tailed). This shows that Hypothesis 3 does not find support in the data. Respondents who received a correction evaluated candidate characteristics more positively than respondents who read the real news story. This difference counters the hypothesis, which did not expect a difference between the groups. It even shows a significant effect supporting the backfire effect: unwelcome information is not only resisted but even strengthens the evaluation of candidate characteristics in a positive direction.

Figure 1. Relation between prior belief on politician, condition and candidate characteristics.


Conclusion and Discussion

The main question of this study was to what extent and how exposure to misinformation affects people's evaluations of candidates. This has been investigated by means of an experiment (N = 110). The main question was structured by five hypotheses. The first two hypotheses investigated the direct effect of exposure to misinformation on the evaluation of hard and soft candidate characteristics. Both found that respondents who were exposed to misinformation evaluated the hard and soft characteristics more positively than respondents who were not exposed to misinformation. Although this was in contrast to the hypotheses, both findings are significant. Most existing studies have written about the lack of factual knowledge among citizens regarding political issues, which would influence the beliefs and opinions citizens have about political matters (Delli Carpini & Keeter, 1996; Althaus, 1998; Kuklinski et al., 2000). Besides, citizens are not capable of separating misinformation from correct information (Ramsay, Kull, Lewis & Subias, 2010). Therefore, the expectation for this study was that a negative fake news story about Jesse Klaver would lead to a more negative evaluation, just as happened during the 2016 American election (Silverman & Singer-Vine, 2016). Nonetheless, the opposite happened, and these findings therefore contrast with the existing literature.

The next two hypotheses also took prior beliefs into consideration as a moderator and led to interesting outcomes. First of all, prior beliefs about a politician have a significant main effect on the evaluation of candidate characteristics: respondents with strong prior beliefs about a politician evaluated candidate characteristics more positively than those without strong prior beliefs. However, no interaction effect was found between this type of prior belief, condition and candidate characteristics. This was different for prior beliefs about the issue. The data support the hypothesis and confirm that being exposed to misinformation has an effect on the evaluation of candidate characteristics, but that this effect is moderated by prior beliefs about climate change. Hypothesis 3 showed a significant effect, however not in support of the hypothesis but the other way around: people who received a correction after being exposed to misinformation evaluated candidate characteristics more positively than people who were not exposed to misinformation.

This latter finding supports the backfire effect. The principle of motivated reasoning also occurred at the issue level of prior beliefs: respondents with strong prior beliefs about climate change evaluated candidate characteristics significantly more positively than those without these prior beliefs. Respondents rejected contradicting information and reinforced their prior beliefs, which is in line with existing literature (Edwards & Smith, 1996; Chaiken & Trope, 1999; Taber & Lodge, 2006; Nyhan & Reifler, 2010). However, a backfire effect at the politician level was not found: people with strong positive prior beliefs about Jesse Klaver did not evaluate him more positively than people without these strong beliefs. This is in line with a study by Wood and Porter (2018), which did not find a backfire effect either. Their explanation is that people have the competence to participate in the democratic process. So, on the one hand this study supports the backfire effect and the process of motivated reasoning; on the other hand it shows that citizens are capable of judging every piece of information without bearing their prior beliefs in mind.

These contradictory findings teach us one thing in general: more research on misinformation, prior beliefs and the evaluation of candidate characteristics is needed. This study does not only show that findings in the existing literature contradict one another, but also that different results on the same topic can be found within one study. Nevertheless, it contributes to current practice and literature for several reasons. First of all, political parties can use these results during election campaigns. Although it seems tougher, reinforcing and responding to citizens' prior beliefs about an issue is more important than reinforcing prior beliefs about a politician, because information that goes against someone's prior beliefs about an issue is rejected faster. Campaign makers should therefore try to give as little counter-information as possible during elections and try to reinforce prior beliefs with supporting information. Second, this study is one of the first of its kind to be performed in the Netherlands using a real-life example of an existing politician; future research can build on this. Next, it also provides some new insights and, along with that, some suggestions on which to focus future research.

In terms of suggestions, first of all, misinformation is still a complex problem that needs more research to be understood completely; many researchers still cannot agree on how to solve it. This study only maps the consequences misinformation has for the evaluation of candidate characteristics; it does not provide a solution to the problem itself. Future research should focus on solving this problem. A second key take-away is that future research should distinguish between source-level and content-level motivated reasoning. Third, this study did not find support for prior beliefs about a politician as a moderator. This research focused on a left-wing politician; the outcome may be different when a right-wing politician or someone in the political center is investigated, such as Geert Wilders. Clearly, Jesse Klaver and Geert Wilders are very different politicians and persons, and the outcome would probably not be the same. The same applies to the investigated issue: climate change is a typically left-wing subject, and issues like taxes, integration and immigration could yield other results. Future research has to take several politicians and issues into account.

This latter point can also be considered a limitation of this study. Since only a left-wing politician and a left-wing issue are investigated, the results of this study are not generalizable to all politicians in the Dutch political landscape. A study investigating both sides of the political spectrum would solve this problem. A second limitation is the completion percentage of the experiment, and with that the small sample: only 63.2% (110 people) of the respondents completed the experiment. Either the subject was too boring for people, or the experiment was too difficult and too long. The first point is hard to change, but a future experiment on this topic can be made shorter and easier. This could lead to more respondents completing the experiment and thus to a bigger sample; the larger groups would lead to higher external validity. A third limitation is the forced-exposure stimulus. Normally, people choose which articles to read based on their preferences; this was not the case in this study, where they were forced to read a certain stimulus.

This study started with the suggestion of Alexander Pechtold participating in a demonstration of radical Muslims. A photo edited and shared by Geert Wilders led to the question whether the spread of misinformation has consequences for the evaluation citizens have of candidate characteristics. In addition, prior beliefs about a politician and about an issue were taken into account as moderators. After investigating this with an experiment, it seems that misinformation even leads to a more positive evaluation of candidate characteristics than correct news, something which was not expected beforehand. This effect is only moderated by prior beliefs about the issue, not about the politician. In particular, this is something which calls the backfire effect into question again. Are citizens indeed not capable of participating in political activities because they cannot distinguish correct information from misinformation, and is there really a lack of factual political knowledge among citizens? Or do we underestimate our citizens, and should we agree with Lupia and McCubbins (1998), who doubted the backfire effect long before researchers even investigated it? They were the first to write that "the capabilities of the people and the requirements of democracy are not mismatched as many critics would have us believe". There is one thing we know for sure: responding to these questions with simple answers is just like misinformation: it is too good to be true.


References

Afroz, S., Brennan, M., & Greenstadt, R. (2012). Detecting hoaxes, frauds, and deception in writing style online. Security and Privacy (SP), 2012 IEEE Symposium on (pp. 461-475). IEEE.

Althaus, S. L. (1998). Information effects in collective preferences. American Political Science

Review, 92(3), 545–558.

Allcott, H., & Gentzkow, M. (2017). Social media and fake news in the 2016 election. Journal

of Economic Perspectives, 31(2), 211-36.

Aring, C. D. (1958). Sympathy and empathy. Journal of the American Medical Association, 167(4), 448-452.

Barr, A. (2011). 51% of GOP voters: Obama foreign. Politico. Retrieved May 22, 2019, from https://www.politico.com/story/2011/02/51-of-gop-voters-obama-foreign-049554.

Baumeister, R. F., & Newman, L. S. (1994). Self-regulation of cognitive inference and decision processes. Personality and Social Psychology Bulletin, 20(1), 3–19.

Bittner, A. (2011). Platform or Personality? the Role of Party Leaders in Elections. Oxford: Oxford University Press.

Chaiken, S., & Trope, Y. (1999). Dual-process theories in social psychology. New York: Guilford Press.

Delli Carpini, M. X., & Keeter, S. (1996). What Americans know about politics and why it matters. New Haven, CT: Yale University Press.


Dutton, W.H., Blank, G. & Gorseli, D. (2013). Cultures of the Internet: The Internet in Britain.

Oxford Internet Survey 2013 Report, University of Oxford.

Edwards, K., & Smith, E. E. (1996). A disconfirmation bias in the evaluation of arguments.

Journal of Personality and Social Psychology, 71(1), 5–24.

Ferrara, E. (2015). Manipulation and abuse on social media. ACM SIGWEB Newsletter, (Spring), 4.

Funk, C.L. (1999). Bringing the candidate into models of candidate evaluation. J. Polit. 61, 700-720.

Gottfried, J., & Shearer, E. (2016). News Use Across Social Medial Platforms 2016. Pew Research Center.

Hojat, M., Mangione, S., Nasca, T. J., Cohen, M. J., Gonnella, J. S., Erdmann, J. B., & Magee, M. (2001). The Jefferson Scale of Physician Empathy: development and preliminary psychometric data. Educational and psychological measurement, 61(2), 349-365.

Hojat, M., Gonnella, J. S., Nasca, T. J., Mangione, S., Vergare, M., & Magee, M. (2002). Physician empathy: definition, components, measurement, and relationship to gender and specialty. American Journal of Psychiatry, 159(9), 1563-1569.

Karlova, N. A., & Fisher, K. E. (2013). "Plz RT": A social diffusion model of misinformation and disinformation for understanding human information behaviour. Information Research, 18(1), 1–17.

Karvonen, L. (2010). The Personalization of Politics: A Study of Parliamentary Democracies. Wivenhoe Park: ECPR Press.


Keulemans, M. (2018, March 8). Nepnieuws op Twitter sneller en vaker verspreid dan nieuwtjes die wel kloppen. Retrieved June 16, 2019, from https://www.volkskrant.nl/cultuur-media/nepnieuws-op-twitter-sneller-en-vaker-verspreid-dan-nieuwtjes-die-wel-kloppen~b884c1e0/.

Killinger, B. (2010). Integrity: Doing the right thing for the right reason. McGill-Queen's Press-MQUP. p. 12. ISBN 9780773582804.

Kinder, D. R. (1986). Presidential character revisited. In R. R. Lau & D. O. Sears (Eds.), Political Cognition (pp. 233–255). Hillsdale, NJ: Lawrence Erlbaum.

Kinder, D. R., Peters, M. D., Abelson, R. P., & Fiske, S. T. (1980). Presidential prototypes. Political Behavior, 2(4), 315–337.

Kriesi, H. (2012). Personalization of national election campaigns. Party Polit. 18(6), 825–844

Kruglanski, A. W., & Webster, D. M. (1996). Motivated closing of the mind: "Seizing" and "freezing". Psychological Review, 103(2), 263–283.

Kuklinski, J. H., Quirk, P. J., Schweider, D., & Rich, R. F. (2000). Misinformation and the currency of democratic citizenship. Journal of Politics, 62(3), 790–816.

Kunda, Z. (1987). Motivated inference: Self-serving generation and evaluation of causal theories. Journal of Personality and Social Psychology, 53(4), 636–647.


Laustsen, L., & Bor, A. (2017). The relative weight of character traits in political candidate evaluations: Warmth is more important than competence, leadership and integrity. Electoral Studies, 49, 96-107.

Lebo, M. J., & Cassino, D. (2007). The aggregated consequences of motivated ignorance and the dynamics of partisan presidential approval. Political Psychology, 28(6), 719–746.

Lewandowsky, S., Ecker, U. K., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106–131.

Lupia, A., & McCubbins, M. D. (1998). The democratic dilemma: Can citizens learn what they need to know? Cambridge, UK: Cambridge University Press.

Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32, 302–330.

Ramsay, C., Kull, S., Lewis, E., & Subias, S. (2010). Misinformation and the 2010 election: A study of the US electorate. Retrieved from http://drum.lib.umd.edu/bitstream/1903/11375/3/misinformation_dec10_quaire.pdf

Ross, A. S., & Rivers, D. J. (2018). Discursive deflection: Accusation of "fake news" and the spread of mis- and disinformation in the tweets of President Trump. Social Media + Society, 4(2), 2056305118776010.

Silverman, C. (2016, November 16). This analysis shows how fake election news stories outperformed real news on Facebook. BuzzFeed News.


Silverman, C. & Singer-Vine, J. (2016). “Most Americans Who See Fake News Believe It, News Survey Says.” BuzzFeed News, December 6.

Summerfield, M. R. (2014). Leadership: A simple definition. American Journal of Health-System Pharmacy, 71(3), 251-253.

Taber, C. S., & Lodge, M. (2006). Motivated skepticism in the evaluation of political beliefs. American Journal of Political Science, 50, 755–769. doi:10.1111/j.1540-5907.2006.00214.x

Thorson, E. (2016). Belief echoes: The persistent effects of corrected misinformation. Political Communication, 33, 460–480. doi:10.1080/10584609.2015.1102187

Travis, S. (2010). CNN poll: Quarter doubt Obama was born in U.S. Retrieved from http://politicalticker.blogs.cnn.com/2010/08/04/cnn-poll-quarter-doubt-president-was-born-in-u-s/

Wardle, C. (2017, February 16). Fake news. It's complicated. Retrieved May 15, 2019, from https://medium.com/1st-draft/fake-news-its-complicated-d0f773766c79

Wang, X., Lin, Y., Zhao, Y., Zhang, L., Liang, J., & Cai, Z. (2017). A novel approach for inhibiting misinformation propagation in human mobile opportunistic networks. Peer- to-Peer

Networking and Applications, 10(2), 377-394.

Wood, T., & Porter, E. (2018). The elusive backfire effect: Mass attitudes' steadfast factual adherence. Political Behavior. Advance online publication. doi:10.1007/s11109-018-9443-y


Appendix I: Survey

Start of Block: Opening text

Q4 Dear participant,

You are invited to take part in a study carried out under the responsibility of the research institute Graduate School of Communication, part of the University of Amsterdam.

The purpose of this study is to map how certain information is processed and what your opinion is on a number of topics.

The full study takes no longer than 10 minutes and participation is entirely voluntary. Since this study is carried out under the responsibility of the University of Amsterdam, I can guarantee you the following:

1. Your answers are confidential and we will not share your personal data with third parties, unless you explicitly give permission for this first. To protect your data, the stored data will not contain any information that can identify you, and all data will be analyzed collectively.

2. Participation in this study is voluntary. You can decide to withdraw from participation at any moment. Within 7 days after the study you can request that your research data be deleted.

3. Participation in this study involves no noteworthy risks or discomforts, and you will not be confronted with explicitly offensive material. For more information about this study and the invitation to participate, you can contact the researcher at any time: Marlon Braakman (marlon.braakman@student.uva.nl). If you have any complaints or remarks following your participation in this study, you can contact the same address; confidential handling of your complaint or remark is guaranteed.

I hope to have informed you sufficiently and thank you very much in advance for your participation in this study, which is of great value to me.

Kind regards, Marlon Braakman

End of Block: Opening text / Start of Block: Consent


Q2 I hereby declare that I have been informed, in a way that is clear to me, about the nature and method of the study.

I agree, entirely voluntarily, to participate in this study. I retain the right to withdraw this consent without having to give a reason for doing so.

I realize that I may stop the study at any moment.

My personal data will not be viewed by third parties without my explicit permission.

If I want more information, now or in the future, I can contact the researcher: Marlon Braakman (marlon.braakman@student.uva.nl).

I understand the above and agree to participate in the study:

o Yes (1)
o No (2)

Skip To: End of Survey If "I hereby declare that I have been informed about the nature and method of the study..." = No

End of Block: Consent


Q9 You will now first be shown a number of statements. For each, indicate to what extent you agree or disagree.

How big are the climate problems at this moment? (1: Extremely small ... 7: Extremely big)

How strongly are you convinced of that opinion? (1: Not at all strongly ... 7: Extremely strongly)

How important do you find it that the climate problems are solved? (1: Not at all important ... 7: Extremely important)


Q40 To what extent do you agree with the following statements? (1: Completely disagree ... 7: Completely agree)

Climate problems must be tackled before it is too late (1)
Humans are responsible for the climate problems (2)
Something can be done about global warming, contrary to what many critics claim (3)
The Netherlands must build more nuclear power plants (4)
Within 10 years we must have switched entirely to sustainable energy (5)

End of Block: Prior Belief Topic / Start of Block: Prior Belief Politici

Q8 Hieronder krijg je een viertal vragen te zien over 4 verschillende politici. Met de slider kan je aangeven hoe sterk je iets vindt. Een score van 0 geeft aan dat je de politicus op dat vlak erg slecht vindt, en een score van 100 geeft aan dat je een politicus op dat vlak erg goed vindt.

(40)

Q4 In hoeverre beschikken de volgende personen over de juiste vaardigheden om politicus te zijn?

Helemaal niet Heel erg

0 10 20 30 40 50 60 70 80 90 100 Mark Rutte ()

Jesse Klaver () Gertjan Segers () Lilian Marijnissen ()

Q5 Hoe betrouwbaar vind je de volgende politici?

Niet betrouwbaar Erg betrouwbaar 0 10 20 30 40 50 60 70 80 90 100 Mark Rutte ()

Jesse Klaver () Gertjan Segers () Lilian Marijnissen ()

Q6 Hoe warm is de uitstraling van deze politici?

Niet warm Erg warm

0 10 20 30 40 50 60 70 80 90 100 Mark Rutte ()

Jesse Klaver () Gertjan Segers () Lilian Marijnissen ()

(41)

Q7 How much integrity do you think these politicians have?

No integrity – High integrity
0 10 20 30 40 50 60 70 80 90 100

Mark Rutte ()
Jesse Klaver ()
Gertjan Segers ()
Lilian Marijnissen ()

End of Block: Prior Belief Politicians

Start of Block: Introduction News Article

Q12 You will now read a news article. Please read this article carefully. Afterwards, you will receive a number of questions. The news article will remain visible for at least 30 seconds; only then can you continue.

End of Block: Introduction News Article

Start of Block: Condition 1 - Fake / Do know

Q31 Timing First Click (1) Last Click (2) Page Submit (3) Click Count (4)


Q10

Page Break

Q11 The news article you have just read is fake news. It circulated during the provincial elections earlier this year.

End of Block: Condition 1 - Fake / Do know

Start of Block: Condition 2 - Fake / Don't know

Q33 Timing First Click (1) Last Click (2) Page Submit (3) Click Count (4)


Q12

End of Block: Condition 2 - Fake / Don't know

Start of Block: Condition 3 - Fake / True

Q34 Timing First Click (1) Last Click (2) Page Submit (3) Click Count (4)


Q14


Q13 The news article you have just read was published during the provincial elections earlier this year.

End of Block: Condition 3 - Fake / True

Start of Block: Condition 4 - Real

Q35 Timing First Click (1) Last Click (2) Page Submit (3) Click Count (4)

Q16

Page Break

Q15 The news article you have just read was published during the provincial elections earlier this year.


End of Block: Condition 4 - Real

Start of Block: Candidate Characteristics

Q17 To what extent do you think that Jesse Klaver... (1: Not at all (1) – 7: Very much (7))

Cares about people (1)

Shows compassion for people (2)

Can empathize well with other people (3)

Is honest and sincere (4)

Is trustworthy (5)

Has no hidden agenda and does what he says (6)

Has sufficient skills and knowledge (7)

Is intelligent (8)

Works hard to achieve his goals (9)

Is an inspiring politician (10)


Q36 If national elections were held now, would you vote for Jesse Klaver?

o Yes (1)
o No (2)
o Maybe (3)
o I don't know yet (4)

Q17 (continued)

Can make good decisions (11)

Can lead a group of people or an organization well (12)

Is very friendly (13)

Is very helpful and considerate (14)

Has a warm appearance (15)

End of Block: Candidate Characteristics

Start of Block: Demographics

Q18 Finally, we would like to ask you a few more things.
