
“The influence of Facebook News Feed on the political attitude of

users in two different cultural environments”

by Maryam Pakmanesh 11375671 Date: 23rd June 2017 Final version

MSc. in Business Administration – Digital Business University of Amsterdam - Economics and Business Supervisor: Dr. Gábor Kismihók


Statement of Originality

This document is written by Maryam Pakmanesh who declares to take full responsibility for the contents of this document.

I declare that the text and the work presented in this document is original and that no sources other than those mentioned in the text and its references have been used in creating it. The Faculty of Economics and Business is responsible solely for the supervision of completion of the work, not for the contents.


Table of contents

Abstract 1

1. Introduction 2

2. Literature review and theoretical background 4

2.1 Social influence on SNS 4

2.2 Social media, SNS and their impact on users 10

2.3 Social networking sites and politics 12

2.4 Facebook News Feed 15

2.5 Filter bubble 20

3. Method and research design 23

3.1 Observation study 23

3.2 Creating the profiles 23

3.2.1 A typical Trump supporter 24

3.2.2 A typical Trump opposer 25

3.3 Diversity check 26

3.4 Democratic or Republican 27

3.5 Evaluation strategy 29

4. Data 30

4.1 Data collection 30

4.2 Pre-Test 32

4.3 Interruptions with Facebook 32

5. Results 33

6. Discussion 41

6.1 Consequences and implications for the future 48

6.2 Limitations and future research 50


References 54

Appendix I

A1: Survey results I

A2: Overview of pages followed in Facebook XI

A3: List of political events XIII

A4: Pre-test result 27.04.17 XIV

A5: Complete observation results XV

A6: Comparison of data XVII

A7: Facebook Activity Log screenshots XVIII

List of figures

Figure 1: Processes of social influence 6

Figure 2: Asch experiment 8

Figure 3: Facebook News Feed 19

Figure 4: Conceptual framework: Political attitude bubble 22

Figure 5: Facebook account of Susan Thomas 26

Figure 6: Facebook of James Smith 26

Figure 7: Screenshot: Facebook denies access 33

Figure 8: Comparison of data during a regular day 40

Figure 9: Comparison of data during the withdrawal from the Paris agreement 40

List of tables

Table 1: Social influence 7

Table 2: Diversity check of the News Feed 27


Table 4: Overview of pages followed in the final settings 30

Table 5: Overview of all observation phases and breaks 31

Table 6: Total post distribution in % 33

Table 7: Contrast of blue and red Facebook Feed 44


Abstract

More and more people are getting their news from social networking sites (SNS) like Facebook. However, recent research shows that algorithmic filters and personalization put people in ‘filter bubbles’, in which they only receive information they already agree with.

This study focuses on the influence of Facebook News Feed on the political attitude (Republican & Democratic) of users in two different cultural environments (USA & Germany). Using a qualitative approach, four self-created Facebook profiles were observed over a period of six weeks. In addition to the research question, this study looked at cultural differences and the diversity of the News Feeds. The following insights could be drawn from the observed results: 1. Time and the occurrence of major political events influence the News Feed; 2. Democratic posts are pushed to the News Feed more than Republican ones; 3. The News Feeds of the users living in Germany are more diverse at the beginning but become narrower as time passes. Eventually, the users create their own filter bubble, in which each activity influences their News Feed. Consequently, for individuals as well as for governments and organizations, the implication is to create awareness and find ways to overcome polarization on SNS. For the sake of democracy and a global mindset, it is important to recognize the power that algorithms have over us humans and how they impact the way we think.

Keywords: Social networking sites, Facebook, News Feed, filter bubble, algorithm, political attitude, Republican, Democrat, Trump, social influence


“By giving people the power to share, we're making the world more transparent.”

–Mark Zuckerberg

1. Introduction

Today, the Internet is connecting people all over the world. Around 40% of the world population is connected to the Internet, while over 20 years ago it was less than 1% (Statista 2017). Every day, 500 million tweets are sent, 4 million hours of content are uploaded to YouTube, 4.4 billion Facebook messages are posted, and 5.75 billion Facebook likes and 3.6 billion Instagram likes are made (Schultz 2016). Looking at these numbers, one might wonder how the average user manages to handle the daily information flow they have to face.

Recommendation systems and personalization are one way to cope with this huge amount of data (Pariser 2011). Based on an algorithm, the user receives a filtered display of the content posted on social media, or of search engine results such as Google’s. Facebook, the most used social networking site (SNS) with currently 1.71 billion users (Smith 2016), also uses this kind of algorithm for its News Feed. All kinds of updates and posts are displayed on the News Feed. It is the first thing users see when they open Facebook and therefore plays an important role in a user’s daily social media life. The algorithm is responsible for what the user will eventually see and what will remain invisible.

However, how the News Feed will look depends on multiple factors, such as the user’s behavior on Facebook and other external factors. Consequently, each user gets a personalized view, leaving them in a filter bubble which they struggle to escape from. Users are constantly influenced by various inputs such as advertisements or content posted by their Facebook friends. The question then is how strong this influence can be when each user is ‘trapped’ in their own filter bubble. This becomes even more important in the context of, for instance, political attitudes, where the different views can be very extreme. Personalization and filters can cause polarization and hence affect democracy (Helbing et al. 2017). In a filter bubble, users only see posts that they most likely conform to and posts that are shared by the friends they interact with the most. This makes it harder for posts that do not accord with their view to appear. Facebook, however, is a platform that wants people to connect and share their stories with others, leaving room for discussion. The U.S. presidential election in 2016 was one of the most discussed issues on Facebook (Dicker 2016). Users as well as the candidates posted politically driven content. For the candidates this was a way to reach their

voters, emphasizing the importance of SNS for political engagement (Alashri et al. 2016).

Currently, the U.S. seems to be divided in its opinion about U.S. President Trump (Trump). Due to the current importance of the topic, this study will look particularly at this political situation in the U.S. and at whether Americans lean rather Republican (supporting Trump) or rather Democratic (not supporting Trump). Considering the fact that a lot of information will be filtered out, it is interesting to see how political content, which can be seen as a rather sensitive topic, reaches users on their Facebook News Feed. In fact, 44% of American Facebook users get their news from the site, which shows that Facebook plays an important role in influencing users’ political attitudes (Gottfried & Shearer 2016). If a user has a rather strong political attitude, will they ever come into contact with content away from their view? Or will the algorithm only show them content they conform to? How fast can algorithms adapt to users’ preferences and adjust their News Feed?

This study will examine the influence of SNS, in this case Facebook, on the political attitude (pro or anti Trump) of users in two different cultural environments (USA and Germany). Not much research has been conducted to see whether the News Feed display differs from country to country while users follow the same kind of patterns. In particular, this study will show how intensive use of Facebook for a specific interest or activity generates a selective (filtered) visualization in the user’s News Feed.

In the context of an observation study, the News Feeds of four created Facebook profiles will be compared and evaluated. Consequently, this study will deal with two extremes: on the one side, users on the Internet are continually being influenced by


various inputs, and on the other side, each one of them is stuck in their own filter bubble. The overall research question will therefore be: What is the influence of Facebook News Feed on the political attitude of users in two different cultural environments? The sub-research questions will be: What is the difference between the News Feeds of users in two different cultural environments? And how diverse are the News Feeds?

The contribution of this study includes access to the Facebook profiles, since studies that have examined the News Feed so far were limited to the answers of participants in interviews or smaller experiments (Bucher 2012; Rader & Gray 2015; DeVito 2016).

First, an overview of the literature and more insight into the theoretical background is given. Then the method and the data collection are explained. Afterwards, the results of the data analysis are presented, and finally the discussion and the conclusion follow.

2. Literature review and theoretical background

2.1 Social influence on SNS

When talking about SNS and the influence of SNS like Facebook on the attitude of users, one has to understand the main theoretical concepts behind it. Even though SNS are computer-mediated communication tools, they are still based on social networks and the interactions of humans online. Crandall et al. (2008) examined the effect of social influence and selection (the tendency to interact with people who are already similar to us) in social networks. They found feedback effects between these two factors and were also able to predict the activities of individuals based on their social networks by looking at the current activities of their friends (Crandall et al. 2008). “People decide to adopt activities based on the activities of the people they are currently interacting with; and people simultaneously form new interactions as a result of their existing activities” (Crandall et al. 2008, p. 160). So in order to analyze the influential power of SNS one has to understand how social influence in social networks


works. Social influence can be described as a process in which one person’s (or a group’s) behavior influences the attitude of another person; this can happen with or without intention (Zimbardo & Leippe 1991). Zimbardo and Leippe (1991) define attitude as “an evaluative disposition toward some object. It’s an evaluation of something or someone along a continuum of like-to-dislike or favorable-to-unfavorable. Attitudes are what we like and dislike, our affinities and aversions, the way we evaluate our relationship to our

environment. An attitude is a disposition in the sense that it is a learned tendency to think about some object, person, or issue in a particular way” (Zimbardo & Leippe 1991, p. 31). Social influence is therefore persuasive. Advertisements, political campaigns, speeches or even interactions with friends or family can be influential. Furthermore, Zimbardo & Leippe (1991) distinguish between three settings of social influence: interpersonal social influence, persuasion and mass media. Interpersonal social influence occurs when one person is influencing one or a few people. This type is very individualized and involves one-to-one communication. Persuasion is a type of social influence where one person is influencing a big group of people, for instance through a speech. Persuasion is seen as less personal. The last setting is mass media, which occurs when influence happens through a medium like TV or radio (Zimbardo & Leippe 1991). The influence through an SNS like Facebook is harder to categorize, since it can fit all three settings: it can be interpersonal when users send each other messages, it can be persuasion if a politician writes a long post to his followers, and it can be mass media when sponsored ads show up. Hence, one can say that interactions between Facebook “friends” are also a kind of social influence, since users are constantly exposed to posts from others on their News Feed. Another categorization was made by Kelman (1958). He identified three processes of social influence: compliance, identification and internalization (Figure 1).


Figure 1: Processes of social influence

Compliance occurs when someone accepts influence in order to be rewarded or not to be rejected. The person may have a different opinion on a subject but does not share it with the others in order to be accepted. In compliance “we say what we do not believe and what we believe we do not say” (McIlveen & Gross 1999, p.14). Identification occurs when someone accepts influence because he wants to be part of a group or be like someone that he admires or respects. The content is irrelevant; it is more about identifying oneself with the person and believing whatever he says. Internalization occurs when someone accepts influence and also believes in the content himself. In internalization “we say what we believe and we believe what we say” (McIlveen & Gross 1999, p.14). Kelman (1958) explains the change of attitudes with social conformity. Social conformity is therefore a type of social influence, which


want to be confirmed by others. NSI describes the need to be liked and occurs when people agree with others because those others have the power to reward, punish, accept or reject them (McIlveen & Gross, 1999; Deutsch & Gerard, 1955). Therefore ISI usually involves

internalization, because it also includes the desire to be right even when there is no obvious correct answer (McIlveen & Gross, 1999). For instance, if a particularly smart student in class gives an answer, other students might follow and agree with it. NSI usually involves compliance, since the motivation is to be accepted by the group regardless of one’s own opinion.

Burnkrant and Cousineau (1975) combined the two theories of social influence into one table (Table 1).

Table 1: Social influence (Burnkrant & Cousineau 1975)

In an experiment they observed participants evaluating a product after seeing other people’s evaluations. The participants perceived the product as more favorable than they would have without knowing the evaluations of the others. In this case it was rather ISI: simply a basis to infer whether a product is good or not, since no one had access to direct observation themselves (Burnkrant & Cousineau 1975). This happens all the time; people buy products that others in their social environment have bought, not because they want to be liked or get rewarded but because they want to acquire something that they think is a good product (Burnkrant & Cousineau 1975). Thus, this experiment supports the theory that other people’s judgment influences our actions. Applying this to the idea of political attitude, it is interesting to see whether critical posts on the News Feed will influence a user’s opinion, considering the fact that they have been shared or liked by a lot of ‘friends’. Does it make the post more credible? Huang and Chen (2006) say that the influence on the Internet is informational rather than normative, since users do


not need to conform to others in order to purchase something. This study, however, will not focus on purchase decisions, since this field has been researched quite a lot (Lee et al. 2017; Wagner et al. 2017; Lee & Hong 2016). Studies on social conformity connected with SNS already exist, but they concern adoption and diffusion rather than the social networks between Facebook friends (Park et al. 2013). This study will look at social influence on SNS regarding political attitude. Since this study includes access to Facebook profiles (although fake ones), which is usually restricted due to privacy concerns, the results will deliver insights into a little-examined field.

An important study that contributed to the idea of social conformity is the one by Solomon Asch (1956). In his experiments he showed participants pairs of cards, where one card showed three lines of different lengths and the other card showed one line that had the same length as one of the three lines (see Figure 2).

Figure 2: Asch experiment (Asch 1956)

The participants had to say which of the three lines matched the line on the other card. All participants except one were confederates hired for the experiment, and they would deliberately give wrong answers. The real participant was seated last in the row to answer, making the pressure of the others even bigger. Influenced by the answer of the majority, the real participant would often also name the wrong answer. Asch (1956) interviewed the real participants after the experiment, asking why they had given the wrong answer. Some said they did not want to embarrass themselves in front of the others; some thought the others


must have had better eyesight; “in other words, they used the judgments of the others as a source of information” (Argyle 1998, p.13; Asch 1956).

In the context of SNS there have been similar observations (Waheed et al. 2016); social influence is a factor that can have an impact on user behavior on SNS. The influential drivers are family, friends, significant partners, peers, culture, and social norms (Waheed et al. 2016). Most of the studies looked at network effect theory and social conformity (Huang and Chen 2006; Lim et al. 2003; Yoo et al. 2012). Yoo et al. (2012) distinguish between social influence that can ‘push’ us away from or ‘pull’ us closer towards a system. For them, social conformity is a kind of social pressure, which is uncontrollable and can affect a user’s decision directly and indirectly; they define it as ‘influence as a social push’ (Yoo et al. 2012, p. 267). One of their results was that the perceived value of Twitter increased with the number of users (network effect), which implies that social conformity depends on the number of participants in the network and therefore has a quantitative rather than a qualitative outcome (Yoo et al. 2012, p. 278f). However, not only the number of users is important but also the identity of these users, which emphasizes the idea of social ties in social networks. The stronger the social ties and the agreement on common values and norms, the higher the social conformity (Li 2007; Yoo et al. 2012). In SNS, the relationship with ‘friends’ is therefore crucial in determining whether a user will conform or not. Hence, it is interesting to see whether the social influence theories discussed in this study occur on SNS in the same way: are offline conformity principles applicable to an online context like Facebook, and can social conformity be observed between Facebook users who are exposed to political content in their News Feed?


2.2 Social media, SNS and their impact on users

SNS like Facebook or Twitter make it easier for people to socialize and communicate with each other, even if they are locally separated (Correa et al. 2010; Hampton and Wellman 2002; LaRose et al. 2001). Boyd and Ellison (2007) define SNS as “web-based services that allow individuals to construct a public or semi-public profile within a bounded system, articulate a list of other users with whom they share a connection, and view and traverse their list of connections and those made by others within the system” (Boyd and Ellison 2007, p.211). SNS enable people to meet strangers and present their social networks to others (Boyd and Ellison 2007). According to recent rankings, Facebook is the most popular social network worldwide (Chaffey 2016). The main purpose is to connect with people and interact with ‘friends’, while many SNSs also offer different features and mechanisms like sending private messages, picture or video sharing, or commenting on pictures (leaving messages on friends’ profiles) (Boyd and Ellison 2007; Had and Garijih 2016). Yazdanifard and Yee (2014) say that SNS are a type of social media and allow Internet users to connect with people and create information. SNS were originally used for entertainment purposes but have gained more and more importance (Yazdanifard and Yee 2014). Lin and Lu (2011) conducted an empirical study on why people use SNS and found that enjoyment is the most important factor affecting the behavior of SNS users (Lin and Lu 2011). Zyl (2009) describes SNS as “an application or websites that support the maintenance of personal relationships and the discovery of potential relationships” (Zyl 2009, p.915). The growing popularity of social media has also been a research topic for marketers, who examine the changes of consumer behavior on SNS (Wang 2017). Many studies exist about the different impacts and uses of SNSs. Risks of sharing information and building networks, self-expression on SNS and legal issues about privacy settings have been an important part of SNS studies (Dwyer et al. 2007; Livingstone 2008; Rosenblum 2007; Kuczerawy and Coudert 2011; Yazdanifard and Yee 2014). The fact that attitudes and values have an impact on social media behavior has also


been acknowledged by Brännback et al. (2016). They conducted research on how value systems impact digital natives’ behavior with social media and found that “living a life in social media as digital natives might lead to cultivation of alternative value system as well” (Brännback et al. 2016, p.14). Other studies, like the one by Centola (2010), looked at how behavior can change through social networks, and especially at the network itself, which users build by connecting with others (Centola 2010). Centola’s (2010) results show that “network structure has a significant effect on the dynamics of behavioral diffusion” (Centola 2010, p.1196). Sorensen et al. (2014) give a good overview of social media networks and their usage, of differences in the use of SNS by individuals and by businesses, and of challenges for public administration.

However, some studies also examine the side effects of SNS, like addiction and losing connection to reality (Can and Kaya 2016; Verma & Kumari 2016). Can and Kaya (2016) found that since things are visible online and users are more exposed to them, they have more time to think about them and the message behind them. This is a reason why SNS are becoming very crucial for marketers, who can benefit from the huge accessibility. More marketers are using social media for their marketing strategies and to communicate their brands. Companies can build strong ties with their customers by posting content and also getting feedback from followers (Chu and Kim 2011; Hoffman and Fodor 2010). Electronic word of mouth (eWOM) is therefore becoming an important tool. Kaplan and Haenlein (2010) say: “Social media allows firms to engage in timely and direct end-consumer contact at relatively low cost and higher levels of efficiency than can be achieved with more traditional communication tools” (Kaplan and Haenlein 2010, p.67). Yazdanifard and Yee (2014) say: “For instance, social networking sites like Facebook allow registered users to interact globally and freely with people who share the same interest by uploading photos, sharing posts, chatting, commenting on the posts and also getting replies, such features allow companies to get quick feedback from consumers whereas consumers are able to receive


responses immediately” (Yazdanifard and Yee 2014, p.1). Lee and Hong (2016) also recognized the persuasive characteristics of SNS. They examined online users’ behavioral responses to SNS advertising and found, among other things, that herd behavior (the perception of informational social pressure) had a positive effect on expressing empathy. “Users who recognize that many people appear to like a given SNS ad will be likely to perceive that important others expect them to like the ad as well, ultimately leading them to intend to like the ad.” (Lee & Hong 2016, p. 370). Political campaigns can be seen as

marketing campaigns; both aim at influencing their target group. This makes SNS a very powerful tool. Another study, by Zhang et al. (2010), focuses on the engagement of people using SNS by looking at the “extent to which social networking sites influence political attitudes and democratic participation after controlling for demographic variables and the role of interpersonal political discussion in stimulating citizen participation” (Zhang et al. 2010, p.75). Again, influence seems to be an important factor in this field.

2.3 Social networking sites and politics

Since people can not only build networks on social networking sites, but also share texts, images, or articles about topics they are interested in, SNS provide a huge platform for

discussion, including discussion of political topics (Lin & Lu 2011; Zhang et al. 2010). Usually an SNS user follows or connects with friends or people they know who also share the same interests, but sometimes these networks become very big and represent a mixture of conflicting political opinions (Duggan & Smith, 2016a).

In a survey by the Pew Research Center, 9% of American social media users say that they often discuss, comment or post about politics or government, and 23% say they do so sometimes (Duggan & Smith 2016a). When it comes to politics, disagreements and arguments can follow very easily, and many users perceive social media as an especially negative venue for political discussions (Duggan & Smith 2016b). Nevertheless, users still use SNS tools to


share content related to politics, share their own political opinion, or encourage others to take action on political issues (Raine & Smith 2012). Nowadays one has the chance to create one’s own customized media environment and thus receive only the information one is already interested in. However, on SNS people sometimes cannot avoid contrary perspectives, since a Facebook News Feed, for instance, shows you more than just your own posts (Greenwood et al. 2015).

The use of SNS for political purposes became very popular when former U.S. President Obama started his campaign site mybarackobama.com and used SNS as a “competitive weapon” (Alashri et al. 2016, p.795; Zhang et al. 2010), and during the election of 2016 all candidates used SNS as an important element in their campaigns (Alashri et al. 2016). President Donald Trump is clearly aware of the power that SNS had during the election (McCormick 2016), and he is still using Facebook and especially his private Twitter account to engage with his followers. Even though politics is not the favorite topic to be discussed on social media (Hampton et al. 2014), in 2016 the U.S. presidential election was the most talked-about topic on Facebook worldwide. According to Vitak et al. (2011), the popularity of social media among users gives candidates a new way of interacting with voters, but also enables users to interact with each other “about political issues and to share and discuss their opinions through a variety of formats” (Vitak et al. 2011, p.107). They say: “Peer-to-peer interaction drives social media and may provide a more powerful incentive to engage in political activity” (Vitak et al. 2011, p.109). The U.S. election was not the only political topic on SNS. Marzouki et al. (2011) recognized the importance of social media, and in particular Facebook, for the Tunisian revolution of 2011. They call Facebook a “catalyst that accelerated the Tunisian revolution” (Marzouki et al. 2011, p.241) and say that without it the revolution would have happened more slowly. The advantage of Facebook is that one can “develop civic engagement skills with little to no additional time costs” (Vitak et al. 2011, p.108). As a user, one can join political groups, share and post content, follow candidates or news channels to


keep being informed, and can view friends’ activities and comment on their posts, keeping discussions alive. There are different views on politics on SNS, but it is clear that social media will continue to play an important role in politics. According to Shirky (2011), “social media have become coordinating tools for nearly all the world’s political movements” (Shirky 2011, p.30).

However, political social media use can vary, and the motivations for using it can differ (Macafee 2013). Since so many people are constantly ‘online’ and accessible, SNS stimulate the political engagement of people. Especially young adults (who always seemed to be less politically active than the older population) get more engaged this way. However, most of them use SNS to get information about their already existing political views rather than to see new information (Baumgartner & Morris 2010). Wihbey (2015) notes that the “probability of being civically and politically engaged” increases when people consume more news media (Wihbey 2015, n.p.). It is very easy to get information through SNS, and therefore sites like Facebook should not be underestimated in their political power. Baker (2009) says: “People who are interested in issues are going to get involved, and the Internet is an easy way for both the voter and the candidate to communicate with others and potentially engage in the political process” (Baker 2009, p.74). Engaging people with political matters via social media can be seen as a kind of political marketing. Cook (2010) even compares it with propaganda. He defines propaganda as a form of purposeful persuasion that aims at “influencing the emotions, attitudes, opinions, and actions of specified target audiences for ideological, political or commercial purpose through the controlled transmission of one-sided messages”, in this case via SNS or social media in general (Cook 2010, p.158). On SNS it is not only politicians influencing their followers but also users influencing users. Lee and Lim (2014) examine the effect of other-generated information on the user’s perception of political candidates. They look particularly at comments that “unknown others” leave on posts (Lee & Lim 2014, p. 554). Social network


cues with unknown others generate an even wider range of information that can influence users. Politicians thus have the potential to reach people beyond their existing followers, because in the best case their followers will share their content for them. Hence, SNS will continue to play an important role in political engagement and in the enhancement of political attitudes.

2.4 Facebook News Feed

When politicians like Trump use SNS such as Facebook to inform and share content with their followers, it is crucial that their posts are visible in the News Feed in order to reach a large number of people. Bucher says: "Becoming visible, or being granted visibility is a highly contested game of power in which the media play a crucial role" (Bucher 2012, p.1165). Facebook was designed to connect users and help them share and engage with content, with the News Feed clearly being the focal point of the site since it shows users the greatest amount of content (DeVito 2016). Rader and Gray (2015) call the Facebook News Feed a "socio-technical system composed of users, algorithms and content" (Rader & Gray 2015, p.1). Due to the algorithm, a post may not appear in some users' News Feeds at all, which is often called the "threat of invisibility" (Bucher 2012; Rader & Gray 2015; DeVito 2016). The News Feed algorithm does not give users a set of alternatives to choose from; it curates, meaning it selects, organizes and presents a personalized stream of content (Rader & Gray 2015, p.3). There are different setting options for the News Feed. For example, the News Feed usually shows 'Top Stories' unless users change the setting to 'Most Recent' (Facebook, n.d.). The News Feed then selects the posts that are most interesting for a user, considering his or her "likes", activity, the number of comments and the friend network. Sometimes the News Feed shows friends' activities, for example a "like" on a page by someone who is not in one's friends list (Facebook, n.d.). Users can change their News Feed preferences in their Facebook settings, follow friends from whom they want to see more posts, or unfollow pages (Facebook, n.d.).

Facebook used to use EdgeRank as its algorithmic editorial voice; it determined what was shown in someone's Top News based on different factors. Each post was ranked on three major factors, namely affinity (between the user and the creator), weight (number of likes, tags etc.) and time decay (how long ago the post was created) (Bucher 2012; EdgeRank). How exactly the algorithm works remains known only to Facebook. Moreover, Facebook announced that EdgeRank is "dead" and that it now uses a more complex ranking algorithm based on machine learning that takes around 100,000 factors into account. Affinity, weight and time are still used, but other factors have become equally important (McGee 2016).
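The three EdgeRank factors can be illustrated with a toy score. This is only a sketch of the publicly summarized formula (the sum, over a post's edges, of affinity × weight × time decay); the concrete numbers, the 1/age decay curve and the field names are illustrative assumptions, not Facebook's actual implementation:

```python
import time

def edgerank_score(edges, now=None):
    """Toy EdgeRank sketch: sum affinity * weight * time_decay over
    all edges (likes, comments, tags) attached to a post."""
    now = now if now is not None else time.time()
    score = 0.0
    for edge in edges:
        # Older interactions count less; clamp age to at least one hour.
        age_hours = max((now - edge["created"]) / 3600.0, 1.0)
        time_decay = 1.0 / age_hours
        score += edge["affinity"] * edge["weight"] * time_decay
    return score

# Hypothetical comparison: a fresh like from a close friend outranks
# an older, heavier interaction from a distant acquaintance.
now = time.time()
fresh_like = [{"affinity": 0.9, "weight": 1.0, "created": now - 3600}]
old_comment = [{"affinity": 0.2, "weight": 2.0, "created": now - 48 * 3600}]
print(edgerank_score(fresh_like, now) > edgerank_score(old_comment, now))
```

Under this sketch, affinity dominates for recent interactions, which matches the intuition that posts from close friends surface first.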

DeVito (2016) defines algorithmic values as "a system of criteria which are used to make decisions about the inclusion and exclusion of material and which aspects of said material to present in an algorithmically-driven news feed" (DeVito 2016, p.3). Facebook's algorithms are more complex than others because human values, for example defining what a "close friend" is, have to be embedded. The engineers who create these algorithms do not only write code; they also have to decide on human constructs and translate them into operational definitions, and this happens before any code is written (DeVito 2016). A 'relevant' post can be determined either by popularity (the majority's opinion) or by individual preference (based on a user's past actions) (DeVito 2016). DeVito (2016) found nine algorithmic values that influence the News Feed, in descending order of importance: friend relationships, explicitly expressed user interests, prior user engagement, implicitly expressed user preferences, post age, platform priorities, page relationships, negatively expressed preferences, and content quality (DeVito 2016). Friend relationships and user interests are therefore the most important factors. However, sometimes a friend's post is not considered relevant by the algorithm and is therefore invisible in the News Feed, or it is placed so far down that the user will only find it by scrolling a long way. This can be annoying, and one could question whether Facebook is really showing us the things we want to see or are interested in. The British journalist Bell (2015) discusses the News Feed in an article about the shift from classic to digital news consumption:

"If it only features news which is recommended by friends and family, then Facebook's users might miss an important event. Does the Facebook news algorithm take into account other factors, like how recently the news happened? Does it worry about whether the stories that its users are spreading are true? Does it get rid of stories, which might be deliberately biased or misleading? Does it want to show us stories, which are videos before it shows us stories, which are text? Each decision means reprogramming the algorithm, which selects types of news stories. Facebook might see this as an engineering task, but these simple decisions are also editorial."

When talking about visibility of content in the News Feed, two things are important: feedback loops and filter bubbles. Rader and Gray (2015) define feedback loops as "situations where the output of a process becomes an input to that same process" (Rader & Gray 2015, p.1). Since on social media information consumers are also producers, the content created is constantly interacting and evolving, making it even harder to understand the complexity of the algorithms (Rader & Gray 2015). "What users learn about the system when they consume posts affects what they choose to post as producers" (Rader & Gray 2015, p.3). A filter bubble is defined as a bubble "in which content is selected by algorithms according to a viewer's previous behavior" (Bakshy et al. 2015, p.1130). Feedback loops and filter bubbles are both important elements of the News Feed, emphasizing the unintended impact of a user's actions on his News Feed display. However, one could distrust the objectivity of the News Feed. DeVito (2016) says: "Algorithms are by no means perfect; they have biases just as surely as human editors do" (DeVito 2016, p.4). According to Yazdanifard and Yee (2014), many people see social networking sites "as the most trusted sources of information" because the information is shared by people they know or "at least have a passing acquaintance with" (Yazdanifard & Yee 2014, p.2). This can, however, be very risky given that there can be fake news and misleading posts; especially with political content, veracity plays a crucial role (Baym & Jones 2012). According to Marchi (2012), fake news "provide the type of political communication that promotes public debate" (Marchi 2012, p.254). Hence the perception of the News Feed can also differ. Users form beliefs and hold a specific perception of the News Feed, which guides their behavior and consequently affects the News Feed outcomes (Rader & Gray 2015). Rader and Gray (2015) conducted a qualitative analysis of user beliefs about the algorithm Facebook uses and found that opinions are very diverse, ranging from the belief that Facebook shows all posts from someone's friends to the belief in fully automated filtering (Rader & Gray 2015). Furthermore, they assume that the News Feed algorithm has two goals: not missing out on important posts (consumption) and generating more interaction or engagement (production) (Rader & Gray 2015). Another study, by Eslami et al. (2015), examined users' perception of the News Feed and concluded that the majority is not aware of the algorithmic filtering mechanism. Most people thought their own actions affected the News Feed rather than Facebook, for example missing a friend's post because they were scrolling too fast or not visiting Facebook often enough (Eslami et al. 2015, p.156). "Users incorrectly concluded that friends had dropped them due to political disagreements or their unappealing behavior" (Eslami et al. 2015, p.161). After becoming aware of the algorithm, users were more satisfied and engaged considerably more with Facebook.


Figure 3 – Facebook News Feed (own figure based on Stray 2012)

Figure 3 shows the concept of the Facebook News Feed. The left side shows the observable part, in which the user is exposed to his News Feed. He sees a friend's post, a cat video he is tagged in, a photo from a brand he follows, or a post about the watch he was looking at on Amazon the other day, which will be marked as "sponsored". The unobservable part includes all external factors that the algorithm takes into account to create the user's display. On the one hand there is the user's Facebook behavior, meaning his interaction with his "friends", the number of website visits and his own activities (posting, sharing, following, liking, tagging and commenting). On the other hand, the user's engagement with pages outside of Facebook is also important and will appear as sponsored stories in the user's Feed. Engagement can be a visit to an external page or sharing content about the page on Facebook. There are four different types of sponsored posts: sponsored stories, page post ads, promoted posts, and marketplace ads (Darwell 2013). Sponsored stories are based on a user's former actions on Facebook or a Facebook-connected app and aim to make friends take the same actions as the user; for instance, if a user shares a link by Amazon, Amazon can pay to show this link to the user's friends. Page post ads are advertisements that come from fan pages but can be shown to anyone, even non-fans and people who are not connected to the page through friends; they are used heavily by marketers for promotional purposes. Sponsored stories and page post ads are paid per impression or per click, while promoted posts are paid at a flat rate depending on how many people they are meant to reach. Promoted posts are page posts created via the page's "promote" button and can reach fans and friends of fans. Marketplace ads show up in the sidebar on the desktop and lead to an external page when clicked (Darwell 2013). Every action a user takes within Facebook or on pages connected to Facebook affects how the News Feed will look. Eventually, the display of the News Feed will in turn affect the user's next action: a never-ending feedback loop.
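This consume-like-re-rank cycle can be made concrete with a small simulation. Everything here (the four categories, the starting weights, the +0.2 boost per like) is invented purely for illustration and has nothing to do with Facebook's real ranking: a feed samples 25 posts in proportion to per-category weights, the simulated user likes only one category, and each like raises that category's weight for the next round, so the liked category gradually crowds out the rest.

```python
import random

def simulate_feedback_loop(rounds=30, feed_size=25, boost=0.2, seed=1):
    """Toy consume -> like -> re-rank loop: the share of the liked
    category in the sampled feed grows round by round."""
    random.seed(seed)
    weights = {"democratic": 1.0, "republican": 1.0, "news": 1.0, "other": 1.0}
    shares = []
    for _ in range(rounds):
        # Sample a feed in proportion to the current category weights.
        feed = random.choices(list(weights), weights=list(weights.values()), k=feed_size)
        shares.append(feed.count("republican") / feed_size)
        # The simulated 'Trump supporter' likes only Republican posts,
        # which feeds back into the next round's sampling weights.
        weights["republican"] += boost * feed.count("republican")
    return shares

shares = simulate_feedback_loop()
print(f"Republican share: round 1 = {shares[0]:.2f}, round 30 = {shares[-1]:.2f}")
```

Even with this crude update rule, the liked category's share rises over the rounds, which is the narrowing dynamic the filter-bubble literature describes.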

2.5 Filter bubble

Personalization and recommender systems can be beneficial if we want to obtain information faster and avoid information overload and high data usage. In computer-mediated environments like SNS there are no face-to-face interactions; consumers cannot physically experience a product and need to get their information elsewhere, so following the opinion of others seems to be the safer choice (Huang & Chen 2006).

However, too much filtering leaves us in a bubble that we can hardly escape. Pariser (2011) describes three major dynamics of the filter bubble: first, a user is alone in his bubble; second, the filter is invisible; and third, the user does not choose to be in the bubble (Pariser 2011). Together, "a unique universe of information" is created for each individual (Pariser 2011, p.8). A filter bubble can expose users to updates that confirm their opinions, making it harder to see new things and other views on specific topics. This can have personal and societal consequences: in a bubble one cannot learn about new insights, exchange knowledge or widen one's horizon. "Creativity is often sparked by the collision of ideas from different disciplines and cultures" (Pariser 2011, p.15). Moreover, the filter bubble also has a commercial side. Big companies use vast amounts of data about our daily lives, and in exchange we get the service of filtering. The resulting personalized advertisements are based on information that some people would not even share with their friends, so one could question how much privacy is being sacrificed (Pariser 2011). Based on Pariser's book, Nagulendra and Vassileva (2010) identified three problems that the filter bubble could cause. First, the distortion of the content posted: the risk that we will not get information that might be important but is not seen as "likeable". Second, the informational equivalent of obesity: "The users is surrounded by the ideas with which he is already familiar and agrees, while being protected from surprising information, or information that challenges their views, the filter bubble threats people's open-mindedness and prevent learning." (Nagulendra & Vassileva 2010, p.2). And finally, the matter of control: since users' knowledge is influenced by algorithms and systems, power shifts to the computer scientists (Nagulendra & Vassileva 2010). These problems become more important as the number of users who rely on SNS as their source of information and news increases.

This is especially important when it comes to politics. Filter bubbles can be a risk for democracy, since people only see what they already agree with, leaving no room for diversity (Bozdag et al. 2014). "Group deliberation among like-minded people can create polarization; individuals may lead each other in the direction of error and falsehood, simply because of the limited argument pool and the operation of social influences" (Bozdag et al. 2014, p.2). Based on Napoli (1999), Bozdag et al. (2014) name three dimensions of information diversity: source diversity (outlets or program producers), content diversity (format, demographics and idea-viewpoint) and exposure diversity (audience reach and consumption) (Bozdag et al. 2014). In order to assess diversity, they propose two normative frameworks introduced by McQuail and van Cuilenburg (1983). The first, the norm of reflection, asks "whether media content proportionally reflects differences in politics, religion, culture and social conditions in a more or less proportional way". The second, the norm of openness, asks "whether media provide perfectly equal access to their channels for all people and all ideas in society" (Bozdag et al. 2014, p.4; McQuail & van Cuilenburg 1983). It is hard to realize both reflection and openness, but they can balance each other out. Too much reflection can mean too much mainstream and conventional information, since the population's preferences are never truly uniformly distributed (Bozdag et al. 2014; van Cuilenburg 1999). More uniform access to the channels, however, helps minorities to drive "social discussion and cultural dynamics" (van Cuilenburg 1999, p.192).

Figure 4: Conceptual Framework: Political attitude bubble (own figure 2017)

Figure 4 shows the idea behind the filter bubble and the impact of the Facebook News Feed. On the left, Facebook as a whole is depicted with all of its content; the arrows symbolize stories being posted. These arrows all flow through a 'filter box', which represents the algorithm that filters posts. On the right side, the 'filter bubble' of a specific user is shown, and within it, their political attitude. In conclusion, only a few stories make it to their News Feed. Based on these findings and the diversity framework introduced by Bozdag et al. (2014) and McQuail and van Cuilenburg (1983), the following propositions are made:

P1: The ratio of diverse topics in the News Feed will decrease once the News Feed adapts to the user's behavior.


P2: The political diversity of the News Feed will decrease after a period of time, if the user

likes content with the same political direction.

Additionally, this study will compare the News Feeds in two different countries, giving new insights into this field, as little research has been conducted on differences between countries. Facebook has, however, revealed which countries censor their citizens' News Feeds and has published all government request reports (Stampler 2014). Despite Facebook's mission to give people the chance to share and connect, the laws of a country sometimes interfere and limit what users can share (Stampler 2014). So there may be slight differences in the display of the News Feed, which leads to the final proposition:

P3: The News Feed will differ in the two different cultural environments.

3. Method and research design

3.1 Observation study

This study used a qualitative approach. Four Facebook profiles were created (an American Trump supporter living in the US and one in Germany, and an American Trump opposer living in the US and one in Germany) and their News Feed displays were observed over time. Each profile would 'like' multiple posts per day that fit its 'political view' in order to examine if and how the algorithm would adapt.

3.2 Creating the profiles

In order to create the four Facebook profiles, a small survey was conducted to get an overview of the typical characteristics of American Trump supporters and opposers. Thirty-three Americans living in the U.S., aged between 19 and 60 and with a well-mixed political attitude (21% Republican, 55% Democrat, 24% other), participated in the survey, so both political sides (pro- and anti-Trump) were considerably represented. To receive more specific answers, and for simplification, the assumption was made beforehand that Trump supporters were Republicans and Trump opposers were Democrats. Questions were then asked about which states indicate a more Republican or Democratic view, what educational background a typical Republican or Democrat has, which ethnicities are mainly represented, and which characteristics are typical. The results can be found in the Appendix (see A1). Note that the following characteristics are largely based on the survey answers and can therefore not be taken as entirely accurate.

3.2.1 A typical Trump supporter

According to the data, most Republicans live in Texas, Alabama, and other bordering states in the South East; Texas was mentioned by 73% of the respondents. The majority of Republicans seem to be white (94%), followed by African Americans (30%) (see figures 5 and 6). The highest educational level attained was a college degree (61%), followed by a high school degree (30%). The last question in the survey was an open question asking for further information about typical characteristics (employment, interests etc.). According to the answers, the typical Trump supporter reads as extremely patriotic, closed-minded and traditional, with relatively racist viewpoints. Republicans consist primarily of the working class, coming from small towns and older generations. They support the National Rifle Association (NRA), like to go hunting or fishing, and are less concerned about general social welfare than about their own financial benefit. On the other hand, there are also extremely rich and wealthy Trump supporters who work in higher positions and are highly conservative; the majority, however, still belong to the blue-collar workers.

Though the typical stereotype of a Trump supporter is mainly white, less educated and poorer, other studies suggest that this is not always true (Economist, 2016). Trump also won a large share of votes from people with higher education, as opposed to only those with a high school degree, and a majority of his voters earn more than $100,000 a year; thus not all Trump supporters come from a blue-collar background (Economist, 2016). Other studies agree that most supporters are white, usually hard-working, suffered economically and were raised in non-interracial small towns (Ehrenfreund & Guo, 2016). The USA Today Network interviewed several Trump voters to find out more about them, and the results are quite diverse in terms of character, ethnicity and state. Typically, however, these Republicans are white, own (small) businesses, are older than 40 and are extremely patriotic (USA Today Network, n.d.).

3.2.2 A typical Trump opposer

Most Democrats are from the East or West Coast. 85% of the respondents said that California is the state where most Trump opposers come from, followed by New York (70%). Democrats seem to be mostly African American (88%), white (58%) or Hispanic (36%). They are better educated; some even hold a degree higher than a Bachelor's (see figures 7 and 8). According to the open question, Trump opposers are open-minded and stand for equality and human rights. They are against the NRA, feel social responsibility and care for the environment. Most of them have been to countries other than the U.S. Furthermore, they belong to the younger generation, mostly college graduates living in bigger cities. Other groups next to the millennials are white-collar workers and former 'hippies' from the 70s, who are also rather liberal.

Studies support that most Democrats are from the East and West Coast, especially from populated areas with more minorities and big universities (Fay, n.d.). College and university students tend to be Trump opposers, and the most loyal Democrat supporters tend to be African-American and/or women (Fay, n.d.).


Based on these findings following user profiles were created:

Trump opposer:
Michelle Thomas (USA) / Susan Thomas (Berlin, Germany)
26 years old, from Los Angeles, California, graduated from UCLA, loves to travel, feminist and vegetarian

Trump supporter:
James Smith (USA) / John Smith (Berlin, Germany)
48 years old, from Longview, Texas, likes guns and hunting, Christian, small business owner

Figure 5: Facebook account of Susan Thomas
Figure 6: Facebook account of James Smith

Figures 5 and 6 show two of the four user profiles created. These profiles clearly stereotype the typical Trump supporter and opposer, but the goal was to make them look realistic. The aim of this study is to find out how the News Feed adapts to the users' actions, not how well these two opposing sides are represented.

3.3 Diversity check

Bozdag et al. (2014) used microblogging data from Twitter and chose different types of users, also including mainstream news media and politicians. They examined the data for Turkey and the Netherlands and assigned them to different political groups. In order to examine the diversity of the feed, they used the normative framework of reflection introduced by McQuail and van Cuilenburg (1983), which looks at the amount of diverse posts in the feed.


This study uses a similar method, but in this case for Facebook. The dimension of normative reflection checks how diverse the News Feed is in terms of offering a variety of different topics and posts from pages that are not political. The next dimension is content diversity, based on the information diversity framework by Bozdag et al. (2014); it looks at political content and examines how much the posts differ in political attitude and whether one political direction leads (more liberal/conservative). A third dimension, cultural differences, was added, which only looks at whether and how much the posted content differs between the two countries. Bozdag et al. (2014) compared their results for Turkey and the Netherlands but used a different approach; their research question was: does offline political segregation affect information diversity in Twitter? (Bozdag et al. 2014, p.6). This study, in contrast, examines the effects of users' activities on the News Feed display. The theory used by Bozdag et al. (2014) and McQuail and van Cuilenburg (1983) therefore serves as a guideline, and the following table gives an overview of the diversity check framework:

Diversity check of the News Feed (with the related proposition):

Norm of reflection: Ratio between diverse posts (mix of different topics) (McQuail and van Cuilenburg 1983) (Proposition 1)
Content diversity: Ratio between Democratic and Republican posts (Bozdag et al. 2014) (Proposition 2)
Cultural differences: Differences between the German and American News Feed (Proposition 3)

Table 2: Diversity check of the News Feed

3.4 Democratic or Republican

Since the U.S. presidential election took place quite recently, there is research on what Democrats and Republicans want; moreover, the parties publish their political platforms on their own websites. For this study only the major differences were taken into account, to show the contrast. Democrats are rather liberal. They want to raise the minimum wage, provide economic security for the middle class (including social services), create jobs for young people and support small businesses. They call for economic fairness, including progressive taxes and the wealthy paying their share of taxes. Moreover, they want to close the racial wealth gap, foster a clean-energy economy and combat climate change, provide affordable education and health care for everyone, introduce more gun controls and support minorities such as the LGBTQ community (Democrats n.d.; Diffen n.d.). Republicans, on the other side, are rather conservative, which means they also hold traditional Christian values when it comes to family, for instance; they do not support gay marriage or abortion. They want a free financial market, including the reduction of corporate taxes in order to support competition and private investment. In their official platform they criticize Obamacare for increasing costs for patients and drugs and instead present a new health system based on Medicaid that will provide a major tax cut. They support the NRA and the idea that every American should be able to protect themselves with arms. Furthermore, they want to establish a pro-growth tax code and tax reforms for everyone in general (GOP n.d.; Diffen, n.d.).

The list is never-ending, but the main difference for this study is that Republicans support President Trump and Democrats do not. Facebook posts are therefore very extreme in terms of agreeing or disagreeing with what Trump does and says. Besides checking whether posts are Democratic or Republican, two more categories were identified, namely 'general news' and 'other'. General news is any post that has nothing to do with Trump, Democrats or Republicans but can be seen as (political) news. Finally, 'other' unites all posts that are neither news nor politics related but belong to entertainment, art, sports, leisure and so on. Table 3 shows an overview of the coding for the four categories:


Democratic: Anti-Trump (anything directed against Trump, Trump followers and Republicans), liberal, open-minded towards LGBTQ, gay marriage, social services, a health care system affordable for anyone, progressive taxes, gun controls

Republican: Pro-Trump (anything directed for Trump, against Trump opposers and Democrats), conservative, tax reforms (cuts) for everyone, pro-NRA, increase of military spending, Christian values, free financial market

General News: Anything that is news related but not about Trump or American politics (with no clear political direction)

Other: Anything that is not news or politics related (entertainment, leisure etc.)

Table 3: Coding categories
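Assuming each of the 25 daily posts has been hand-coded into one of these four categories, the daily tally and two of the diversity ratios (content diversity as the Democratic share of political posts, and a reflection share as the proportion of non-political posts) can be sketched as follows. The label strings and the example day below are invented for illustration:

```python
from collections import Counter

CATEGORIES = {"democratic", "republican", "general_news", "other"}

def diversity_check(labels):
    """Tally hand-coded posts and derive two ratios: content diversity
    (Democratic share among political posts) and a reflection share
    (non-political 'other' posts in the whole sample)."""
    assert set(labels) <= CATEGORIES, "unknown coding label"
    counts = Counter(labels)
    political = counts["democratic"] + counts["republican"]
    content_ratio = counts["democratic"] / political if political else None
    reflection_share = counts["other"] / len(labels)
    return counts, content_ratio, reflection_share

# Hypothetical coding of one day's first 25 posts.
day = ["democratic"] * 10 + ["republican"] * 5 + ["general_news"] * 4 + ["other"] * 6
counts, content_ratio, reflection_share = diversity_check(day)
print(dict(counts), round(content_ratio, 2), reflection_share)
```

Tracking these two ratios per day and per profile would correspond to propositions P1 and P2: a shrinking reflection share or a content ratio drifting toward 0 or 1 indicates a narrowing feed.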

3.5 Evaluation strategy

Deciding whether a post is largely Republican or Democratic was quite a challenge, since the political direction of a post was not always clear. To simplify the evaluation process, the four users therefore followed pages that are already known for having a dominantly left or right viewpoint. In order to observe diversity, the pages selected needed to be diverse beforehand; therefore, in addition to political and news pages, other pages were followed that fit the interests of the four users. Table 4 lists the pages that were followed in the final setting. Logically, the two Trump supporter profiles followed the same pages, and likewise the two Trump opposer profiles.

A second, independent reader helped to evaluate the posts. After each post, it was discussed how to categorize it, and in cases where no clear direction was recognizable the post was labeled as 'general news'. Where there was no agreement, the articles were read more carefully and further (external) information was taken into account; in most cases, however, the direction was clear. The pages listed under 'Mainstream' are some of the biggest mainstream news channels. Nevertheless, media in the U.S. can be biased: FOX News, for instance, is known for being rather conservative, which is highlighted in the red 'Republican' color, while the others are seen as liberal (Blake 2014; Engel 2014).

Pages followed during the second observation phase:

Michelle/Susan
Entertainment/Others: Funny Vines, Travel Leisure, Tasty, Emma Watson, Michelle Obama, Beyoncé, Vegetarian Times, UCLA, Insider Food, Hello Giggles
Democratic/Liberal pages: Huffington Post, Mother Jones, Think Progress, Salon, BuzzFeed, NY Times, Liberal America
Republican/Conservative pages: Drudge Report, Conservative Post, Western Journalism, The Daily Caller, Breitbart, The Blaze, Donald J. Trump, Rush Limbaugh
Mainstream: CNN, FOX, USA Today, Wall Street Journal

James/John
Entertainment/Others: Funny Vines, Travel Leisure, Tasty, Jesus Daily, Gun Owners of America, Country Music Nation, I love America, Texas Rangers, I love hunting, NBA
Democratic/Liberal pages: Huffington Post, Mother Jones, Think Progress, Salon, BuzzFeed, NY Times, Liberal America
Republican/Conservative pages: Drudge Report, Conservative Post, Western Journalism, The Daily Caller, Breitbart, The Blaze, Donald J. Trump, Rush Limbaugh
Mainstream: CNN, FOX, USA Today, Wall Street Journal

Table 4: Overview of pages followed in the final setting

4. Data

4.1 Data collection

The data collection went on for 5 weeks (Table 5). Each day3

the first 25 News Feed posts were evaluated in being more Republican or more Democratic, general news or other. Afterwards, the posts were compared and differences were noted. In regard to verifying the propositions, a diversity check was done for each user.


24.04.17 – 03.05.17 Pre-Test

01.05.17 Adjustments (adding/deleting pages)

04.05.17 – 19.05.17 Observation phase 1

20.05.17 – 24.05.17 Interruptions and adjustments

25.05.17-07.06.17 Observation phase 2

Table 5: Overview of all observation phases and breaks

To keep the setting as real as possible, the four users had to 'act' realistically, following the patterns of an average Facebook user: visiting the site at the busiest times (11 am, 3 pm or 8 pm), staying on the site for about 30-45 minutes and giving up to 13 likes per day (Hampton et al. 2012; Leonard 2013; Warren 2010). The Trump supporter, for instance, liked posts that were rather Republican and also fit well with his 'profile'. It was important that the Trump supporter in the U.S. and the one in Germany liked the same posts. However, sometimes a post would not appear in both News Feeds; in that case, likes were given to posts with the same kind of message or direction, so that the same political attitude was communicated. To make sure that the News Feeds of both Michelle and Susan, or James and John, refreshed at the same time, both users logged in simultaneously in two tabs. Initially there was supposed to be just one observation phase. However, some fluctuations were too obvious, and it was assumed that the settings might not be realistic enough. To begin with, all users followed the same number of pages, but Michelle, for instance, followed more liberal pages than conservative ones (see A2 in the Appendix). Furthermore, some pages posted more than others (e.g. Occupy Democrats), which led to an unclean News Feed display: the News Feed would be filled with posts from only one page, making comparison harder. In order to get less biased results, in a second observation phase the four profiles followed exactly the same pages in exactly the same number.


Additionally, they kept following the same 'non-news' pages, and the activity pattern stayed the same. The second observation phase should therefore have yielded clearer results.
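The activity pattern described above can be summarised in a short sketch. This is a hypothetical illustration only; the study applied the protocol manually, and all identifiers in the code are invented, while the numeric values come from the averages cited above (Hampton et al. 2012; Leonard 2013; Warren 2010).

```python
# Hypothetical sketch of the activity protocol each test profile followed.
# The structure and names are illustrative; the values reflect the cited averages.

ACTIVITY_PROTOCOL = {
    "login_times": ["11:00", "15:00", "20:00"],  # busiest Facebook hours
    "session_minutes": (30, 45),                 # time spent per visit
    "max_likes_per_day": 13,                     # upper bound on daily likes
    "like_policy": "posts matching the profile's political attitude",
}

def likes_allowed(likes_given_today: int) -> bool:
    """Return True while the profile may still give likes today."""
    return likes_given_today < ACTIVITY_PROTOCOL["max_likes_per_day"]

print(likes_allowed(12))  # True
print(likes_allowed(13))  # False
```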

4.2 Pre-Test

The pre-test yielded first insights and also showed that some adjustments needed to be made. During the pre-test, 30 posts per day were examined; however, the amount of data became too large, so it was reduced to 25 posts per day. To keep track of all activities, the Facebook activity protocol was copied after each observation. Data was not collected every day but every second or third day, so that the News Feed had time to gather new content instead of repeating the same posts. Some adjustments were also made to the pages followed, since some pages could not clearly be identified as leaning left or right (e.g. ABC News). These pages were removed and replaced by others with a clear direction (A2 in the Appendix). Throughout the observation phase, pages that seemed to 'clutter' the News Feed were removed and replaced with others (the changes can be followed in the activity protocols). All changes were made consistently in the same way for all four users.

Besides recording whether posts were left-leaning, right-leaning or not political, suggested pages were also noted. Furthermore, political events that influenced how the News Feed looked afterwards were taken into account.
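The daily coding step can be illustrated as follows: each of the 25 sampled posts receives one of the four labels, and the label counts are converted into the percentage shares reported in the results. The code below is a hypothetical sketch of this manual procedure; all names are invented.

```python
from collections import Counter

CATEGORIES = ("Democratic", "Republican", "General", "Others")

def category_shares(labels):
    """Convert a list of manually coded post labels into percentage shares."""
    counts = Counter(labels)
    total = len(labels)
    return {cat: round(100 * counts[cat] / total) for cat in CATEGORIES}

# Example: a hypothetical day with 25 coded posts
sample = ["Democratic"] * 8 + ["Republican"] * 4 + ["General"] * 9 + ["Others"] * 4
print(category_shares(sample))
# {'Democratic': 32, 'Republican': 16, 'General': 36, 'Others': 16}
```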

4.3 Interruptions with Facebook

On 21.05.17, Facebook denied access to one of the accounts (John Smith) for the first time, with the statement 'Facebook detected suspicious activity'. Access was blocked for about 2.5 days. The Facebook algorithm evidently suspected that the accounts were not real and asked for further identification. Figure 7 shows the notification after a photo was sent. The observation phase therefore had to be interrupted. After this incident, Facebook blocked access three more times. It is important to mention that this only


happened to the accounts of Susan and John, who are the two Americans living in Germany. This incident is therefore worth keeping in mind for the later analysis.

Figure 7: Screenshot: Facebook denies access

5. Results

This section presents the results of the data analysis. Following the diversity-check framework, each profile was examined along the three dimensions, after which overall conclusions were drawn. Table 6 shows an overview of the total post distribution (in percent) on the start and end day of each phase, including the first day of the pre-test. A complete overview of the results can be found in A5 in the Appendix.

                          Michelle   Susan   James   John
Pre-test      Democratic        13       7      13     10
27.04         Republican         7       7      17     17
              General           47      37      33     47
              Others            33      50      37     27
Phase 1       Democratic        32      28      16     20
Start 04.05   Republican        16      16      28     36
              General           44      44      28     32
              Others             8      12      28     12
Phase 1       Democratic        72      80       8     12
End 18.05     Republican         8       8      64     40
              General           12       8      20     32
              Others             8       4       8     16
Phase 2       Democratic        20      28       4      0
Start 24.05   Republican        12      12      48     40
              General           36      44      48     60
              Others            32      16       0      0
Phase 2       Democratic        52      52       0      0
End 07.06     Republican         0       4      60     56
              General           28      16      40     36
              Others            20      28       0      8
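As a plausibility check on Table 6, each day's four shares should sum to roughly 100% (allowing for rounding), and the difference between the Democratic and Republican shares gives a simple per-day skew measure. The helper below is a hypothetical illustration and was not part of the study; only the two example rows (Michelle, Phase 1 start and end) are taken from Table 6.

```python
def check_and_skew(dem, rep, general, others):
    """Verify that a day's shares sum to ~100% and return the partisan skew.

    A positive skew means the News Feed leaned Democratic, a negative one Republican.
    """
    total = dem + rep + general + others
    assert abs(total - 100) <= 2, f"shares sum to {total}, expected ~100"
    return dem - rep

# Michelle, Phase 1 start (04.05) and end (18.05), values from Table 6
print(check_and_skew(32, 16, 44, 8))  # 16
print(check_and_skew(72, 8, 12, 8))   # 64
```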


Norm of reflection / Michelle

During the pre-test the News Feed was well mixed with different topics, such as food, travel, entertainment, news and some political issues. 33% of the posts on 27.04⁵ were non-political/news related, compared to 8% on 18.05. From the first observation phase onwards, the share of political posts was always above 50%; the diversity of topics decreased drastically over time. After the settings were changed for the second observation phase, the share of non-political/news posts went up to 32%. On June 1st and 3rd there were only political posts. This may be an effect of the withdrawal from the Paris Agreement⁶, since it was a severe change of circumstances for American politics and more important than other topics at that time; it probably pushed non-political posts out of the News Feed. During the second observation phase more non-political pages were followed than either liberal or conservative pages (e.g. 10 non-political compared to 8 conservative pages), so a ratio of at least 1:3 could be expected. However, the posts from news pages still outnumbered the non-political posts, which might explain these numbers. Notably, more non-political posts showed up during the second observation phase than during the first. In general, the diversity of topics was low and decreased further as bigger political events happened. Furthermore, Michelle got three pages suggested on her News Feed: 'CollegeHumor', an entertainment website; 'Christiane Amanpour', CNN's Chief International Correspondent and rather liberal; and 'Thinking Humanity', a humanitarian and Democratic-leaning website. These pages fit well with Michelle's political attitude and interests. Hence the data supports Proposition 1.

Content diversity / Michelle

The amount of Democratic posts was higher than that of Republican posts from the start of the first observation phase, since Michelle had already liked posts during the pre-test and the algorithm

⁵ Pre-test results for 27.04 can be found in A4 in the Appendix.

⁶


most likely adapted to it. During the first phase the amount of Democratic posts increased almost steadily and stayed high. On 04.05, 32% of the posts were Democratic and 16% Republican; by 18.05 the Democratic share had risen to 72% while the Republican share had decreased to 8%. Phase 2 showed much more fluctuation. The share of Democratic posts was always higher, and on June 3rd it reached 92%. Towards the end of the second phase there were no Republican posts left. Furthermore, it was observed that even when the content of the posts could not be classified as clearly Democratic, the pages shown in the News Feed were mostly liberal pages; the conservative pages were almost all pushed aside. P2 is clearly confirmed by these data.

Norm of reflection / Susan

During the pre-test, Susan's News Feed was also much more diverse: 50% of her posts were non-political/news related. After the first observation phase the diversity decreased, stayed low for the rest of the time and never exceeded 30% of the entire News Feed. After the change of settings the share of 'general news' went up to 44%, but the share of 'other' posts remained quite low at 16%. On June 7th the share of 'other' posts went up to 28%, the highest it reached during the entire observation. As in Michelle's case, a ratio of at least 1:3 between non-political and political posts could have been expected, but the diversity was low and stayed low. Towards the end of the observation the News Feed's diversity was more balanced than before the second phase. Susan got one page suggested, namely 'Congressman Adam Schiff', who is a Democrat, so the findings do support the first proposition.

Content diversity / Susan

The amounts of Democratic and Republican posts were the same at the start. Once the first observation phase started, the amount of Democratic posts went up. During the first phase it
