
YouTube’s Own Bird Box Challenge: Captured

Between Governance and the Battle for Attention

Author: Puck Stallinga

Supervisor: Stijn Peeters

Second reader: Tim Highfield

Programme: MA New Media & Digital Culture

Institute: University of Amsterdam

Document: MA Thesis

Date: 28th of June 2019


TABLE OF CONTENTS

1. Introduction--- 4

1.1. Everything for attention---4

1.2. Extreme behaviour---5

1.3. Governance--- 6

1.4. Research and relevance---7

1.5. Chapter structure---7

2. Theoretical Framework---9

2.1. Understanding the attention economy through pillars of the information and networked society---9

2.1.1. The Information society---9

2.1.2. From information to networks---10

2.1.3. Paying attention as immaterial labour---12

2.1.3.1. Attention is scarce---13

2.1.3.2. Attention is poor---13

2.1.3.3. Attention economy and imitation---14

2.1.3.4. Extremism economy on platforms---14

2.2. Platforms--- 15

2.2.1. Defining the term---15

2.2.2. Neutrality of platforms---17

2.2.3. Platform governance---18

2.2.3.1. Governance OF platforms---19

2.2.3.2. Governance BY platforms---19

2.2.3.3. Safe harbour and platform responsibility---20

2.3. YouTube and participatory culture---22

2.3.1. From website to platform---22

2.3.2. YouTube as a SNS---23

2.3.3. YouTube and participatory culture---23

2.4. Memes, virality and VCMs---24

2.4.1. Memetics---25

2.4.2. Virality---26

2.4.3. VCMs---27

3. Methods---28

3.1. Platform studies---28

3.2. Platform selection---28


3.3. Analysis governance mechanisms---29

3.3.1. Terms of Service---30

3.3.2. Policies--- 30

3.4. Video analysis---31

3.4.1. Videos still on YouTube---31

3.4.2. Videos removed from YouTube---32

4. Results--- 33

4.1. Analysis of Terms of Service---33

4.1.1. User conduct and submissions---33

4.1.2. Disclaimer of Warranties---34

4.1.3. Limitation of Liability---35

4.1.4. General---35

4.1.5. Conclusion sub-question two---36

4.2. Policy analysis---37

4.2.1. Community Guidelines---37

4.2.2. Safety---37

4.2.3. Harmful or dangerous content---38

4.2.3.1. Age-restricted videos---39

4.2.4. Conclusion sub-question three---40

4.3. Video data analysis---41

4.3.1. YouTube Data Tools---41

4.3.2. Analysis videos still on YouTube---42

4.3.3. Analysis of removed videos---44

4.3.4. Conclusion sub-question four---46

5. Discussion and conclusion---48

5.1. “What is YouTube’s role within the attention economy?”---48

5.2. “How do YouTube’s Terms of Service address platform responsibility?”---49

5.3. “How have YouTube’s policies developed over the years and what do they imply?”---49

5.4. “How are YouTube’s policies enforced on its users and their content?”---50

5.5. “How does YouTube balance platform governance and their role within the current attention economy?”---50


1. Introduction

On January 2nd 2019, streaming platform Netflix issued a statement via Twitter, trying to discourage people from “hurting themselves with this Bird Box Challenge” (see Fig. 1). The tweet was posted after several videos of people engaging in possibly dangerous activities while blindfolded appeared on social platforms. The videos were made as a memetic response to Netflix’s new production Bird Box, starring Sandra Bullock. The movie revolves around Malorie Hayes, played by Bullock, and her two children, Boy and Girl. In the movie, a large part of the world’s population is dying because of an unknown evil force, which attacks anyone who sees it. To avoid coming into visual contact with it, the characters cover their windows and cracks and wear blindfolds when they go outside. Netflix’s initial goal in introducing the Bird Box Challenge, namely promoting the movie, was amply achieved: Netflix claims that the movie was watched by 46 million accounts within the first week after its release (Alexander). The widely known movie turned out to be a fitting target for imitation and parody, as a large number of memes and challenge videos based on Bird Box emerged on different platforms.

Fig. 1: Netflix warning about the Bird Box Challenge (@Netflix)

1.1. Everything for attention

One of the participants in the challenge was famous YouTuber Jake Paul, who is known for his controversial videos. His videos are viewed millions of times and he makes a respectable living from them (Robehmed). People earning their salary by tasering dead rats and pranking other individuals may sound idiotic and dystopian, but it is very much the reality of today. In the twenty-first century, and especially in the past decade, many new, bizarre ways to earn money have emerged, all funded by the current commodity: attention. Whereas before one could mostly only dream of stardom, today the chances of becoming (internet) famous are very real. But to become more than just a voice in the crowd, one has to stand out. That standing out is what fuels the current societal and economic situation: the attention economy. The attention economy presents a situation in which attention is scarce and decreasing, while information (i.e. stimuli and things asking for our attention) is bountiful. Because attention is a limited resource, it becomes valuable and worth fighting for. Surrounding us is an ongoing battle between individuals, companies and organizations for our eyeball-time, our thumb-contact, our attention (Terranova 13; Grimmelmann 226). Because people can now literally pay with their attention, content is adjusted to what attracts this attention: things that stand out. Many players are searching for ethical or acceptable lines to tap or even cross. Nudity, clickbait and dangerous behaviour are just some examples of the extreme behaviour this constant craving for attention can cause (Grimmelmann 226). The environments where a large part of this battle takes place are platforms and social networking sites, such as YouTube, Facebook and Instagram. With intelligently created algorithms that show you preferred content, and red, pulsating notifications on your screen, these platforms keep you in their grip for hours to grab as much of your attention as possible.

1.2. Extreme behaviour

In December 2018 and January 2019, a large number of videos showing blindfolded people engaging in various activities appeared on video streaming platform YouTube. A part of the world’s population had not suddenly gone blind, nor was this some form of punishment for bad behaviour. The individuals in the videos did all this voluntarily, to be part of the internet’s newest hype: the Bird Box Challenge. The challenge was originally conceived and promoted by streaming platform Netflix to attract attention to the new Netflix Original movie Bird Box. The streaming service approached well-known gamers with a Bird Box package, asking if they would take on the challenge of playing their favourite game with a blindfold on and streaming the event on gaming platform Twitch (Stevenson). And the Internet would not be the Internet if the seemingly innocent challenge had not resulted in disaster. Once the new hype reached other platforms, such as video streaming site YouTube, the challenge was picked up by non-gamers, who adapted it to their non-gamer lives and engaged in non-gaming activities while blindfolded: driving, running, applying make-up and climbing. On January 11th, a 17-year-old girl caused a dangerous traffic accident on a highway in Layton, Utah (Horton). Apparently Netflix’s warning about the possible risks of the challenge did not have the effect the company was hoping for.

The Bird Box Challenge was not the first such challenge to reach the public. In the past, similar video challenges circulated on Facebook and YouTube, such as the Tide Pod Challenge, the ALS Ice Bucket Challenge, and the NekNominate Challenge. These wide-spread challenges can be defined as Viral Challenge Memes (VCMs) and present a new form of participatory digital culture (Burgess et al. 1036). What differentiated the ALS Ice Bucket Challenge from other VCMs active during that period was that friends and other viewers were “invited” or encouraged to participate in the challenge as well. This aspect of VCMs can be seen in later challenges, such as the NekNominate Challenge. NekNominate was one of the first VCMs that promoted extreme behaviour among its participants: it challenged users to drink an alcoholic beverage as quickly as possible, preferably in one go, and to nominate a number of friends or family members to do the same within 24 hours (Burgess et al. 1037). As the VCM spread, the videos became more creative and the beverages became stronger and larger. In the aftermath, five young men died of alcohol poisoning after participating in the NekNominate Challenge, trying to outdo or impress their friends (Withnall).

1.3. Governance

Platforms have turned into chaotic conglomerates of information of all sources and natures, and so there is demand, and need, for some form of structure. Grimmelmann states that responsible content moderation is necessary “in a world where outrageous and offensive content goes viral and spreads and jumps from platform to platform” (217). Hence, in an online environment where people act like children on a sugar rush and would do (almost) anything for likes or recognition, it is important for a platform to take on the role of ‘responsible parent’ and regulate the content users upload. This may be to protect its own image, its users’ image, or the public. Although platforms claim to be open environments where freedom of speech is highly respected, there is a certain need for this governance (Grimmelmann 218), especially when there are children watching.

YouTube answered this call for governance in January 2019. Sharing Netflix’s concerns about the risky challenge, YouTube announced that it was going to remove videos that showed people participating in the Bird Box Challenge in dangerous or irresponsible ways. YouTube gave channels a two-month grace period to remove or alter videos that might not be allowed on the platform under the new regulations. During this grace period, channels would not be punished for still having this content on their channel. If, after the two months, the content still did not comply with the guidelines, the channel would be flagged (multiple flags can lead to a ban from the platform) and the videos would be removed.

This is not the first time YouTube has taken active control over uploaded content. In early 2018, videos of the infamous and dangerous Tide Pod Challenge, which dares participants to eat pods, capsules filled with laundry detergent (a well-known laundry detergent brand is called Tide), were deleted from the platform (Grimmelmann 217). YouTube has received criticism in the past for not acting appropriately or responsibly in response to disturbing content such as self-harm, extremism, or copyright infringement (Tufekci; Nicas). While it did take some action, YouTube was never absolutely clear about its guidelines and what content fell under which section. To (hopefully) end the discussion once and for all, the platform adjusted its guidelines on January 15th 2019, in particular the section on harmful or dangerous content. A sentence referring to considerably dangerous challenges, and to encouraging people to participate in such events, now justifies YouTube in removing the videos uploaded by numerous vloggers, influencers and regular users. In January and February, the section of the website where visitors can find the guidelines showed a yellow warning field in which YouTube highlighted the recent changes:

Fig. 2: Changes in policies on harmful or dangerous content (YouTube)

1.4. Research and relevance

Because attention remains scarce and is still decreasing, while information and stimuli remain abundant, the battle for attention is very much alive, and a lot of it is happening on platforms. It is therefore important to gain an in-depth understanding of the ways platforms engage in shaping and regulating the chaos. While there has been previous research on both the attention economy and platform governance, none has combined the two as synergetic concepts or examined how platforms balance them. Furthermore, as the Bird Box Challenge is not the first, and will presumably not be the last, dangerous VCM appearing on the platforms, it is important to build on previous work and expand scholarly knowledge of this contemporary phenomenon. Because of YouTube’s prominent role as an environment where these VCMs emerge and spread, I want to study YouTube’s enforcement of policy, and see how the platform is governed and how it governs its users. YouTube has seen an increase in channels with more than one million subscribers over the years (Business Insider), which means that more people are watching and admiring these YouTubers. YouTube also houses much content that is considered stupid or ridiculous, but that is its charm. The aim of this thesis is to analyse the features of the attention economy, platform governance and VCMs through theoretical research, and to connect these features with YouTube’s practices of platform governance through a case study: the Bird Box Challenge. Qualitative methods will be used to gain in-depth information on YouTube’s governing documents and enforcement of policies, and to identify how this is brought into practice. This data will be contextualized with a review of classic and recent literature on the information society, the network society, the attention economy, participatory culture, memetics, and platforms. Subsequently I propose the following research question:

“How does YouTube balance platform governance and its role in the attention economy?”

To answer this question, I have to research and understand the different compartments of this field of study. Therefore, I have divided my research question into the following sub-questions:

“What is YouTube’s role within the attention economy?”

“How do YouTube’s Terms of Service address responsibility?”

“How have YouTube’s policies changed and what do they imply?”

“How are YouTube’s policies enforced on its users?”

1.5. Chapter structure

The second chapter of this thesis functions as a theoretical framework in which I will explore and explain several concepts: platforms, YouTube and participatory culture, and memes, viral content and VCMs. All these concepts are viewed within a broader context: the attention economy. This chapter will also function as the answer to my first sub-question, and will determine the role and practices of YouTube within the attention economy. The answer to this question gives the context needed to interpret the rest of this thesis.

The third chapter is the Methodology. In this chapter, I will explain the goal of my research and which methods and tools I will use to achieve this goal. First, I elaborate on my field of study: platform studies. I explain what one can expect of a study in this field and what it will contribute to. An explanation of my analysis will follow. Then, I provide the reader with the methods I used for the different steps in my research. I start with the selection of a platform, and then discuss the methods used for the analysis of my data: the Wayback Machine, the YouTube Data Tools, and running queries through Google.


The fourth chapter of this thesis contains the results. The first part of the results is retrieved from an analysis of governance mechanisms: the close reading of YouTube’s Terms of Service documents and policy documents. For this part of the analysis, I use the Wayback Machine, which provides me with older versions of the documents. I will analyse a number of changes in the writing, categorisation, and website positions of the documents. I end this part with a short conclusion and a list of characteristics that I will use to analyse the videos in the second part of my analysis. Then, the second part of my analysis, the analysis of the data I retrieved from the YouTube Data Tools, provides the additional results. I used this tool to retrieve a list of YouTube videos that contained the query “bird box challenge” in either the title or the description of the video. From this list, I selected three videos to analyse. For comparison, I also selected three videos that were removed from YouTube after the warning and policy change. I retrieved those videos from websites and social media channels. To analyse the videos, I used points that emerged from the first part of the analysis: the characteristics of inappropriate content that were revealed in the different policy documents and other texts function as a starting point for these criteria. At the end of this section I will provide a short conclusion.
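The selection step described above can be sketched as follows; this is an illustrative sketch only, the sample records are hypothetical (not actual thesis data), and the matching is performed locally here, whereas the YouTube Data Tools performs the equivalent query via the YouTube API:

```python
# Illustrative sketch of the video-selection step: keep only videos whose
# title or description contains the query "bird box challenge".
# The sample records below are hypothetical placeholders.

QUERY = "bird box challenge"

sample_videos = [
    {"title": "BIRD BOX CHALLENGE gone wrong!", "description": "blindfolded for 24 hours"},
    {"title": "My morning routine", "description": "no challenge here"},
    {"title": "Driving blindfolded", "description": "doing the Bird Box Challenge"},
]

def matches_query(video: dict, query: str = QUERY) -> bool:
    """Case-insensitive check for the query in a video's title or description."""
    q = query.lower()
    return q in video["title"].lower() or q in video["description"].lower()

# Filter the hypothetical list down to matching videos.
selected = [v for v in sample_videos if matches_query(v)]
for v in selected:
    print(v["title"])
```

From a resulting list like `selected`, a small number of videos can then be sampled by hand for close analysis, as done in this chapter.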

The last chapter of this thesis contains the discussion and conclusion. Here I will briefly describe my main findings, discuss and interpret the results in relation to the critical review of the key concepts, and answer the research questions. I will also indicate some limitations of this research and suggest directions for further research on the subject.

2. Theoretical Framework

2.1. Understanding the attention economy through pillars of the information and networked society

In this section, I will elaborate on the attention economy and the significant effects it has (and has had) on society and technology. But before defining the attention economy, I will provide background information on the situations that existed before its rise: the information society and the network society. To understand the characteristics and practices of the attention economy, one first has to understand how it was established and how it emanated from previous paradigms. This critical analysis will lead to an introduction to the attention economy, where I will disclose (critical) views and draw helpful conclusions to build an image of current society and economy.


2.1.1. The Information society

In former times, economies were built on industry or trade. In the twenty-first century the economic focus shifted to information, a shift that was (partially) made possible, but mostly enhanced and sped up, by the rise of the Internet (Webster 385). In 1995, Frank Webster discussed this matter in his book Theories of the Information Society, analysing the term “information society” and its implications and practices thoroughly. According to Webster, information has been prioritized as one of the distinctive features of modern-day society for over three decades, and he suggests that this will not change in the near future (Webster 442). We are constantly able to receive and send out information, making it an important factor in private, business and academic fields; therefore, we can truly call ourselves “inhabitants of an information society” (Webster 444). He suggests six phenomena that help to define the information society (444):

 Technological innovation and diffusion

Technological innovations have increased in volume in the past three decades. Often, new technologies are seen as the accelerators of new, more modern times. Therefore, the suggestion that this must influence the social aspect of society in some way cannot easily be swept aside. The rise of the internet in the 1990s and early 2000s, referred to as an “information superhighway” by Webster, is used as the starting point of a new, informational era;

 Occupational change

Changes in the occupational area are also characteristic of the shift to an information society. Whereas previously jobs were largely focussed on industry and production, new service-oriented jobs concerning information and accompanying technology became predominant (Webster 2002, 24). Information has become a resource for work; hence, many analysts have labelled it “informational labour” (Webster 2002, 24);

 Economic value

Information turned out to be a profitable resource that led to an increase of the gross national product (GNP). A substantial part of economic activity was taken up by informational labour rather than industrial or agricultural labour. Webster mentions Porat calling the USA an “information-based economy” (25). “As such it is an information society [where] the major arenas of economic activity are the information goods and service producers, and the public and private (secondary information sector) bureaucracies” (Porat in Webster 25);

 Space

When talking about “space”, Webster alludes to the “information networks which connect locations and in consequence can have profound effects on the organization of time and space” (25). What he means by this is essentially that we are becoming more and more connected to each other, to companies, governments and other actors, and that the previously mentioned information superhighway leads to a necessary revision of time and space (25);


 Culture

Webster states that this abundance of information and its connected technological developments unmistakably influence culture. He presents a few logical, hardly surprising examples, such as the increase in TV channels, ways to listen to the radio, and ways to receive news updates (26). What makes his argument worth mentioning is that Webster engages with the information society on a much more intimate, personal level. He suggests that the prioritization of information in our society contributes to humanity being hyper-focussed on, and sensitive to, the images that might be projected by wearing a certain type of clothing or talking about certain topics (26). “Reflection on the complexities of fashion, the intricacy of the ways in which we design ourselves for everyday presentation, makes one well aware that social intercourse nowadays involves a greater degree of informational content than previously” (26).

Webster argues that there are ways to quantitatively measure an information society, but that this comes with difficulties and controversies. Determining which professions are informational and which are not is hard and could fall victim to subjectivity and interpretation. The same goes for determining relevant technologies and how much impact they have on society (27-28). One can conclude from the above that the information society is defined not just by the abundance of information, but by the effect this has on occupational, political and social norms and standards. In his later work, Webster argues that the term information society should be abandoned and replaced with a term reflecting the form of co-existence we practice nowadays: more networked and connected through the internet and other technologies than ever before. Webster’s argument about “space” brings us to a newer, further developed concept of society: the networked society.

2.1.2. From information to networks

Although he agrees with Webster’s concept of an information society, Castells argues that we should abandon the term and shift our focus to what he calls “a new technological paradigm, centred around micro-electronics-based, information and communication technologies, and genetic engineering” (9-10). He points out the characteristic that distinguishes the two the most: the critical role knowledge and information play within society. According to Castells, these aspects are not distinctive of a particular form of society, because they were present in all societies (10). Rather, we should focus on the new set of information technologies this era has brought us, such as the Internet being the “universal tool for interactive communication” (10). We find ourselves in a new economy, which Castells characterizes by three fundamental aspects:

 Informational

Within this economy, the success and competitiveness of all kinds of economic units, such as companies and governments, is determined by their capacity for generating knowledge and processing information. Growth in this area can be spotted in various sectors;

 Global

Practices and activities, such as science, trade, media, and other forms of service, can be performed at all times, be that in real time or chosen time;


 Networked

This characteristic regards the new infrastructures that Castells calls information networks, which “link up suppliers and customers through one firm, with this firm being essentially an intermediary of supply and demand, collecting a fee for its ability to process information” (11). This characteristic is highly connected to, and maybe even caused by, the former two aspects described above.

Castells classifies the network economy as capitalistic. Similar to Webster, Castells ascribes several changes in society to the new economy. Work and employment take on a new shape as labour becomes more individualized; flexible and part-time work are growing in popularity and demand (11-12). Shifts are also taking place in the cultural field. Media are becoming more diverse and tailored to specific audience segments, and this diversification results in a rise of interactivity between media broadcasters and audiences (12-13). Another field in which change occurs is politics. A significant part of the population receives its information through media (note: now even more so than at the time of writing), and media therefore become a space of politics. Because interactivity and specificity in media are becoming more important, this leads to politics becoming more personally oriented. Aiming political messages directly at different segments of citizens, and presenting politicians as people rather than vague organizational bodies, have made politicians “very expensive businesses” that need maintenance, polishing and even scandals (13).

Time and space have also been redefined by the network economy, resulting in two new concepts. The first is ‘timeless time’, which implies that the past, the present, and the future can all occur in random order, making established patterns diffuse or even disappear (14). The second is ‘space of flows’, which implies “the possibility of organizing the simultaneity of social practices without geographical contiguity” (14). In other words: social practices can be performed without being tied down to geographical factors. The last actor that Castells mentions as experiencing change is the state. Castells argues that almost every state has transformed into a network state, consisting of “power-sharing and negotiated decision-making between inter- and multinational, national, regional, local and non-governmental political institutions” (14).

When comparing Webster’s characteristics of an information society and Castells’ characteristics of a network economy, one can conclude that they share several similarities: information and accompanying technologies restructure occupational and social areas, making time and place less important. These theories form essential foundations for understanding our past and current society and economy. The next social structure I will discuss is a direct offspring of the former, and is an environment in which the changes in society have resulted in a new commodity opening up for marketisation: attention.

2.1.3. Paying attention as immaterial labour

The intensive growth of the Internet and SNSs has proven to be fertile ground for paradigm and societal shifts. Castells’ argument on what caused the transition to a networked society mentions that not necessarily the abundance of information, but the social change it brings about, is characteristic of this shift (10). The informational and networked society form productive ground for more social change. One of the fruits born from this ground is the attention economy. As Michael Goldhaber phrased it in The value of openness in an attention economy:


“By the Attention Economy, then, I mean a system that revolves primarily around paying, receiving, and seeking what is most intrinsically limited and not replaceable by anything else, namely the attention of other human beings.” (Goldhaber)

In this article, he describes the increase in the value of human attention, and how it has been capitalized on and made profitable for others. Another name for this situation is the “like-economy”, where a like, view or share functions as a measurement of user engagement and is, as mentioned above, marketable and profitable. The term like-economy was introduced in 2013 by Gerlitz and Helmond, who described the concept as “not only a social web, but also a recentralized, data-intensive infrastructure” (1349). By looking at or interacting with interfaces, the user pays the party responsible for the content with his or her attention. Crogan and Kingsley argue that this is a form of “immaterial labour”, meaning that we spend most of our leisure time “working” for organizations by paying attention to the content on their platforms (3-4). In this context, ‘paying attention’ has never made more sense, and it also works vice versa: by receiving attention, profits can be made. This makes it attractive and easier for “common individuals” to enter and participate in the attention economy, hoping for a life of luxury and (micro)celebrity status (Marwick 138-139).

Tiziana Terranova ascribes three attributes to the attention economy: scarcity, poverty, and imitation. I will follow and explore those three characteristics in this section of the theoretical framework. In his paper on dangerous challenges, Grimmelmann ties another characteristic to the attention economy: extremism. By extremism he does not mean terrorism, but nudity, danger and other exceptional behaviour of users. He argues that this extremism is carried and amplified by the role and practices of platforms within the attention economy (Grimmelmann 227). Because my case study is similar to Grimmelmann’s, I will explore this concept in addition to the three Terranova proposes.

2.1.3.1. Attention is scarce

In theories written on the attention economy, attention is a scarce resource, which makes it valuable. Scarcity of a resource is what, following traditional market economics, allows new economies to rise. Because “the sum total of human attention is necessarily limited and therefore scarce” (Goldhaber), the battle for our attention is a ‘survival of the fittest’ situation where only the most attractive content earns a piece of our desperately wanted attention (Terranova 2). Both Webster and Castells seemed to struggle with the question of how to measure information and networks. Where information is a commodity that is difficult to research, Terranova states that “the abstract quality of attention and at the same time the fact that the ‘attentional assemblages’ of digital media enable automated forms of measurement (as in ‘clicks’, ‘downloads’, ‘likes’, ‘views’, ‘followers’, and ‘sharings’ of digital objects) open it up to marketization and financialization” (2-3) and create measurable data. These affordances of digital media offer us the quantitative research methods Webster and Castells were so thoroughly searching for.

In the attention economy or like-economy, the social is of particular economic value, as user interactions are instantly transformed into comparable forms of data and presented to other users in a way that generates more traffic and engagement. Theorists go even further and call attention not only a scarce resource, but a kind of capital, due to its scarcity and measurable nature (Terranova 2). Each platform has its own affordances and symbols that express the popularity and overall sentiment of the uploaded content. On YouTube, (dis)likes and comments are not as important for the popularity and attention a video attracts as the view count is (Morreale 116). Even though the content of a video may not be perceived as positive, if a video has a high number of views its popularity will grow, and the video will acquire a prominent position within YouTube’s algorithm (Morreale 118).

2.1.3.2. Attention is poor

In contrast to other economic situations, the attention economy portrays a situation where the scarce resource (attention) is consumed by the resource in abundance (information): “hence a wealth in information creates a poverty of attention” (Terranova 4). Terranova argues: “By consuming attention and making it scarce, the wealth of information creates poverty that in its turn produces the conditions for a new market to emerge” (4), meaning that as the amount of information increases, attention correspondingly decreases. This makes the battle for our attention even more intense, and results in actors trying to catch our attention with extreme content, such as nudity, bizarre behaviour, or violence (Grimmelmann 226). This ongoing battle between actors does not only occur on the Internet, but also in other media fields, such as radio, television and literature (Goldhaber). Theorists and other experts speak of a ‘crisis of attention’, which portrays a “degradation of attention provoked by digital technologies and its economic effects” (Terranova 4).

The degradation of attention then comes from attention that is lost in the process of paying attention. Carr writes: “Every time we shift our attention, the brain has to reorient itself, further taxing our mental resources. Many studies have shown that switching between just two tasks can add substantially to our cognitive load, impeding our thinking and increasing the likelihood that we’ll overlook or misinterpret important information” (Carr 1). And, as Jonathan Crary argues, the current digital era pushes the boundaries of attention and distraction with numerous new technologies and other sources of stimulation, which results in paying attention to ever more different stimuli. Crary states that the latter causes the ‘ongoing crisis of attentiveness’ (13). We humans, who possess the valuable commodity of attention, face a dilemma when participating in the attention economy: on the one hand, we must enter technological assemblages of attention; on the other hand, once we enter these attentional assemblages, we experience a significant loss of attention, which results in a decrease in the attentiveness of the brain (Terranova 6).

2.1.3.3. Attention economy and imitation

Tiziana Terranova proposes a third attribute of the attention economy: the process of imitation. Quoting Latour, Terranova argues that paying attention to others’ daily activities and practices triggers potential forms of imitation. She claims this is caused by “the hyper-sociality of the connected brain”, and not just by mere exposure to new media technologies, as was the case with the scarcity and poverty of attention (7). The triggered imitation can take on various forms, from a passing impression to actual practices such as writing content, reading content, uploading content, sharing content, and watching, liking and following (Terranova 7). These practices suggest that the attention economy is also a social economy, where participation and imitation are triggered and encouraged (8). A similar view on the attention economy comes from Gabriel Tarde, who states that “the labour of attention enables social cooperation and is thus the real source of the production of value – a social kind of production steeped in reality” (Tarde in Terranova 10). Terranova writes that the value produced by invention is socialised through imitation (10). This can be recognised in the Bird Box Challenge: the challenge was ‘invented’ by Netflix, but became socialised (and memetic) through imitation by others. Terranova uses research on mirror neurons and imitation to further explain how this process develops.

2.1.3.4. Extremism economy on platforms

The attention economy pushes toward extremes. Where information is in abundance and the attention to distribute is scarce and limited, there is high pressure to go to extremes, as ‘normal’ and ‘decent’ performances are left out of recommendation algorithms and must-watch lists, and are thus unlikely to catch some of that attention (Grimmelmann 226). Grimmelmann quotes Evan Williams, saying that the internet rewards extremes:

“Say you’re driving down the road and see a car crash. Of course you look. Everyone looks. The internet interprets behavior like this to mean everyone is asking for car crashes, so it tries to supply them” (Williams in Grimmelmann 229).

The car crash metaphor suggests that the internet simply supplies what it is asked for, and because of the profitable incentives that platforms often reward, users’ urge to engage in their own car crashes grows (Grimmelmann 230). Grimmelmann blames big platforms for encouraging their users to gain as many likes, views, shares and followers as possible (226) by offering them money. He writes that platforms often give users the opportunity to make money off their content, either by paying them directly out of advertising revenue (225), or by letting users sell advertisements on their own websites. The YouTube Partner Programme is a prime example of the first payment construction, which I will explore further in the next part of this theoretical framework. Grimmelmann concludes that “therefore, there is a nearly linear relationship between the spread of content and its monetary value to the creator” (226). As a result, creators of videos will aim for what they know scores best – and thus has the biggest chance of providing them fame and wealth.

But, since platforms are primarily driven by advertising, the individual users that get a chance to profit from their uploads are not the big winners here. If a specific channel or video attracts a significant amount of attention, the biggest profit ends up in the hands of the platform. This reflects the second trend Grimmelmann establishes in his work on the attention economy, extremism and platforms: “they [platforms] optimize their designs to maximize advertising revenue” (227). This concerns the maximization of user engagement, which ideally results in users staying on the website for as long as possible, absorbing content and, most importantly, advertisements (227). To achieve maximum engagement, platforms have tools in place designed to keep users on the platform by suggesting and auto-playing content most likely to attract their attention. I will provide a more detailed exploration of platforms and of these tools in the next paragraph.

2.2. Platforms

As briefly discussed in the previous paragraph, platforms function as a pathway for the distribution of attention. Because of their (mostly) social nature, they are highly connected and possess affordances that allow information and interaction to travel fast and diffuse widely. Through platforms, the trend of the attention economy amplifies and extremism finds its way to the public. This is partially due to the increasing importance and number of platforms on the web, as they are products of Web 2.0. This development is what Helmond calls the platformization of the web. She writes: “I use the term “platformization” to refer to the rise of the platform as the dominant infrastructural and economic model of the social web and the consequences of the expansion of social media platforms into other spaces online” (5). The term ‘Web 2.0’ was coined by Tim O’Reilly, and stands for the era of “internet interactivity, social networking, customization, and collaboration” (Shifman, “Memes in Digital Culture” 2). O’Reilly states that Web 2.0 emerged after the dot-com bubble burst in 2001. While many concluded that the web was overhyped and therefore finished, O’Reilly found that the web was still important and that this importance would only increase over time. He defines Web 2.0 as follows: “A set of principles and practices that tie together a veritable solar system of sites that demonstrate some or all of those principles, at a varying distance from that core” (O’Reilly). Using O’Reilly’s concept of Web 2.0 as context, I will go deeper into the functions and affordances of platforms and look at the other side of the coin: the regulation of the extreme content that intermediaries are fuelling and ‘giving a platform’ to. For this, I will explore the platform metaphor and its suggested neutrality, and look at platform governance, its implications, opportunities and challenges.

2.2.1. Defining the term

To understand the ways platforms govern and influence our daily information consumption and relationships, one has to understand what a platform is and what the term implies. To explain the term and its implications, I follow Tarleton Gillespie in his examination of the definition. Gillespie considers the current use of the term platform a metaphor, something “specific enough to mean something, and vague enough to work across multiple venues for multiple audiences”. Aiming to explain the metaphor, he looks at four different definitions of the term:

• Computational

The term platform can have a computational meaning when it concerns “an infrastructure that supports the design and use of particular applications, be they computer hardware, operating systems, gaming devices, mobile devices or digital disc formats” (Gillespie, The Politics of Platforms 349). This definition is most applicable within platform studies.

• Architectural

“A raised level surface on which people or things can stand, usually a discrete structure intended for a particular activity or operation” (Gillespie, The Politics of Platforms 349-350). This definition is the most intuitive one: when we think about a platform, we often think of train platforms or drilling platforms that are raised above normal ground.

• Figurative

In this sense, the term is mostly used to describe a start, a beginning, a foundation for something in the future. Nowadays it can also imply the starting point for a career (Gillespie, The Politics of Platforms 350).

• Political

In the political sense, a platform refers to the set of positions and beliefs a political party or politician proclaims to its colleagues, the media or followers (Gillespie, The Politics of Platforms 350).

One of the principles O’Reilly aims at is ‘the web as platform’, mainly focusing on the computational meaning of the term (see above). He saw the web as “an infrastructure to build applications on, a distributed operating system that could deliver software services” (Helmond 3). But, as Gillespie argued, one should not focus on one aspect of the definition, but take all four into account: political, figurative, architectural and computational (349-350). They all share one common connotation: that a platform is a raised level surface, something that stands at the basis of “activity that will subsequently take place”, and “suggests a progressive and egalitarian arrangement, promising to support those who stand upon it” (350). He states that the term platform has already loosened from its strict computational meaning, because it currently covers many more fields (Gillespie, The Politics of Platforms 351). Bogost and Montfort partially agree with Gillespie, and argue that the term platform has indeed gained a much broader meaning, but they insist on emphasizing the importance of the computational aspect of the definition. They write: “Gillespie’s view of the term “platform” is perhaps reasonable when we consider only the public rhetoric of “online content providers,” but if he reads the computational sense of “platform” as outdated, this view is not at all tenable” (F.A.Q. 4-5). Thus, following Bogost and Montfort, one needs to keep the technological aspect of platforms in mind when aiming to understand their actions. Because YouTube content is, for the most part, sorted, reviewed and deleted by algorithmic technologies, I consider the computational aspect of platforms highly present in and relevant for my research. Governance by and of platforms, for example, shapes those algorithms and eventually the information that becomes available to individuals. This governance works its way into regulations in the shape of the policies and user agreements platforms enforce on their users. Gillespie adds that a platform also implies “openness”, which he explains as follows:

“It implies a neutrality with regards to the activity, though less so as the term gets specifically matched to specific functions (like a subway platform), and even less so in the political variation. A computing platform can be agnostic about what you might want to do with it, but either neutral (cross-platform) or very much not neutral (platform-dependent), according to which provider’s application you would like to use” (Gillespie 351).

This semantic examination offers insight into the motivation of online social intermediaries to be called, or to call themselves, platforms. Most of these platforms were developed with the idea of offering users an environment where they could openly speak, share and discuss with peers or politicians. The overall sentiment of openness and neutrality carried by the term platform gives the impression that this is indeed what these platforms offer. However, according to Gillespie, “the term suggests a lot, but really says a little” (351). Gillespie states that the term “platform” suggests that a platform bears little to no responsibility for the people that find themselves on one. In a somewhat accusatory way, he describes platforms as “opportunities for people to communicate, interact or sell” (Gillespie, The Politics of Platforms 351). He claims (and blames) these organizations for titling themselves with a term like “platform” to downplay their role and accompanying responsibility, instead of amplifying it. He uses the implications of the platform metaphor as an explanation. The term platform suggests that there is “an impartial space where actors communicate and exchange information on an equal basis, but actually, the sphere consists of many layers and flows of information, coordinated by algorithms, categories and sponsored content” (Gillespie, The Politics of Platforms 348).

2.2.2. Neutrality of platforms

With the platformization of the web came also the platformization of cultural production, according to Nieborg and Poell. They define platformization as “the penetration of economic and infrastructural extensions of online platforms into the web” and state that it affects “the production, distribution, and circulation of cultural content” (4275). Looking at the effects of the increasingly dominant position and role of platforms is important, because they have significant influence on the relationships between users and institutional actors (4276). As Gillespie pillories in his papers on the platform metaphor and the ‘politics of platforms’, the term falsely suggests that content intermediaries function merely as infrastructural tools to pass along the speech of their users to their networks, which would imply that they are open and neutral environments where users can speak freely. Agreeing with Gillespie, Anupam Chander and Vivek Krishnamurthy provide eight reasons why one should be careful and reserved about blindly following the claim of platform neutrality. First, they argue that platforms “shift the landscape of speech” by curating access for traditional editors, and consequently “change the nature of what is available in the market place of political ideas” (Chander and Krishnamurthy 403). As a second reason, they argue that platforms are profit-oriented and will therefore provide users with content, managed by algorithms, that makes them stay on the platform for as long as possible (404-405). The third reason Chander and Krishnamurthy present is algorithmic bias. As algorithms are designed to recommend content based on societal bias and user preference, an increasing number of cases has been reported where algorithms recommend or auto-play discriminatory content (405).
The fourth reason to be sceptical of platform neutrality is the non-neutrality of the governing documents these organisations enforce on their users, such as policies or community guidelines. These guidelines often follow common Western law, but also take stands on issues where no legislation has been initiated. Therefore, some actors or content are not welcome on the platform, which means it is not open to everyone (405-407). As a fifth reason, they discuss the content moderators big platforms hire to regulate content. Because most platforms rely on advertising revenue for their profit, these moderators are hired to filter inappropriate content that may scare off (potential) advertisers (407). The sixth reason Chander and Krishnamurthy provide concerns the political standpoints some platforms take, for example by withdrawing their services from countries like China, for fear of becoming embedded in censorship or other practices that contradict Western values (407-408). The seventh reason regards the redesigning of technologies to better match the implicit politics discussed in the previous reason (409). The last reason Chander and Krishnamurthy provide states that “platforms are actively promoting substantive visions” and take “many myriad decisions that have normative effects” (409-410).

But, if there are numerous reasons to contradict the claim of neutrality, why would platforms still consistently label themselves as such? According to Gillespie, they do so “to position themselves both to pursue current and future profits, to strike a regulatory sweet spot between legislative protections that benefit them and obligations that do not, and to lay out a cultural imaginary within which their service makes sense” (Wyatt in Gillespie, The Politics of Platforms 348). Exploring platform neutrality affirms the core of Gillespie’s critique of the term ‘platform’: that it suggests more freedom and openness than there actually is. In the next sub-paragraph, I will explore and examine the ways this freedom is restricted.

2.2.3. Platform governance

When theorists and other voices speak about ‘platform governance’, they usually speak about how platforms are structured, organized and regulated (Gorwa 856). Platform governance has two sides: on the one side, the platform governs its users with ‘governance mechanisms’ that ‘facilitate cooperation and prevent abuse’ (Grimmelmann 2015 47), often in the form of policies, terms of service, and algorithms (Gorwa 856). On the other side, platforms themselves are being governed by “local, national and supranational mechanisms of governance” (Gorwa 857; Gillespie, Governance of+by platforms 2). Nieborg and Poell write:

“Examining platform governance, we can observe that transnational platform companies tend to set global, rather than local, standards regarding content. Informed by political economic research on corporate concentration, we note that the dominance of the US-owned and operated GAFAM platforms effectively entails a globalization of US cultural standards concerning what is and what is not permitted (Jin, 2015). Such standards are operationalized through platform policies, codified in Terms of Service, Terms of Use, and developer guidelines, such as Apple’s App Store Review Guidelines. On the basis of such policies, platforms filter content, block users, and remove content from platforms and app stores. Typically, these regulations prohibit violence, nudity, and discrimination, which can scare away advertisers and end-users or become a source of legal issues (Van Dijck, 2013). How such rules are interpreted and acted upon is, however, opaque and it frequently causes controversy (Frenkel et al., 2018), as platforms intervene deeply in the curation of culture and the organization of public communication” (4285).

2.2.3.1. Governance OF platforms

Advertisers and stakeholders are just two of the long list of actors that (can) have a say in what should and should not be visible on a platform. Despite the fact that most widely-used platforms such as Facebook, Instagram and YouTube are built around United States law, platforms also have to follow and obey local law (e.g. government requests for the takedown of content) (Gorwa 861). Governments increasingly demand that platforms reveal information about their users, and demand that content be blocked or its access limited (Commissioner for Human Rights 2014). This is a technique known as the “invisible handshake”, a cooperation between law enforcement agencies and organizations in the private sector. This allows governments to practice forms of surveillance and censorship without the risk of accountability, because the platforms do the dirty work for them (Birnhack & Elkin-Koren 2). There are growing concerns about this cooperation and the responsibility of platforms within these matters. These concerns are sometimes privacy-directed and criticize the extent to which users are required to give platforms, advertisers and governments insight into their personal data in order to make use of the services provided by the platform. Other concerns focus on the visibility and ordering of information, aiming at the algorithms and other ranking tools many platforms have in place to enhance user attention and engagement. The algorithms are accused of bias, and the criteria used for the categorization of content are often vague and leave much room for interpretation (Suzor 2); thus, platforms are criticized for the way they censor and regulate speech. The term “censor” is sensitive, because platforms rely on their own claim of being impartial spheres. Due to growing critique on the matter and the motivation for platforms to maintain some of that open and neutral image, platforms have been releasing ‘transparency reports’ that show details on government demands for user data or requests for content or profiles to be removed from the platform (Suzor 5).

Aside from the external forces governing platforms discussed above, there is also potential to govern platforms from the inside. As platforms are often companies that employ hundreds of people, there are possibilities for employees to unionize and push for changes in business, the restructuring of processes and the shaping of practices (Gorwa 861).

2.2.3.2. Governance BY platforms

As discussed in the previous paragraph, platforms are not as neutral as they claim to be, as they possess numerous tools, or governance mechanisms, with which they shape and arrange the information users receive and therefore our reality. These mechanisms often come in the form of terms of service, policy documents, user agreements and algorithms (Gorwa 856). With tools like policies in place, platforms can justify acts such as filtering and deleting content, and blocking users. These policies are developed over time, adjusted, restructured or sometimes removed. Every platform has its own way of articulating its policies, and some subjects are more prominently present on some platforms than on others, but the essence is basically the same: no pornographic content, no violent content, no content that shows harassment of users, no hate speech, no promotion of self-harm, and no promotion of illegal activity (i.e. drug use) (Gillespie, Governance of+by platforms 14).

As mentioned before, YouTube recently changed its policy on dangerous and harmful content after videos of the Bird Box Challenge spread quickly on the platform. It is not uncommon for platforms to develop and adjust their policies in response to inappropriate content. According to Gillespie (14), this can be an internal process, in which the curators of the platform see an unwanted category emerging, or an external process, in which the regulation of content follows from public controversies or mass critique. The rules set in these policies must, of course, be enforced by someone or something. At first, this was mainly done voluntarily by web administrators on forums and small social networks. But, since platforms have grown to immense proportions, the volume of content can no longer be moderated by people alone. Today, bots and algorithms are in place to do the dirty job for us humans. But even then, a lot of inappropriate content still slips through the cracks and stays on the platform for hours, days, or even years. There is simply too much content to review. Therefore, widely-used platforms can never assure users of a 100% success rate in the regulation of inappropriate content (Gillespie, Governance of+by platforms 15-16). These examples show that users are increasingly realising the impact and importance of governance by platforms.

Would it not be easier to just stop the governance and let users upload whatever they want? Yes, of course. But one has to keep in mind the true nature of platforms: they are companies that depend on stakeholders and advertisers, and are looking to make profit. As Nieborg and Poell argued, allowing everything (violence, nudity, crime) to be uploaded on the platform could potentially scare away investors, advertisers and users, or could become the subject of legal issues (4258). The latter risk is carefully anticipated and dodged as much as possible, mostly by drawing up a Terms of Service document. The next paragraph will elaborate on these Terms of Service and on platform responsibility and accountability.

2.2.3.3. Safe harbour and platform responsibility

The different forms of governance discussed in the previous paragraphs demonstrate that platforms do indeed have much control over what information users see and how they perceive it. One would expect that actors with so much power would also carry considerable responsibility towards users and other stakeholders. But they do not. The question of the responsibility and accountability of platforms for the content uploaded by their users has been around for decades. In an attempt to establish clear legislative rules on the responsibility of “interactive computer service providers”, the Congress of the United States adopted the Communications Decency Act (CDA) in 1996. The act concerned legislation on telecommunication and contained a section on “safe harbours against any liability for harmful materials users of those services may provide” (Gillespie, Governance of+by platforms 5): the Section 230 safe harbour. Section 230 stated two important rules: intermediaries cannot be held accountable for the content their users upload and do not need to regulate that content; and if they do decide to engage in regulation and policing, they do not lose the protection Section 230 provides them (Mueller 2).

Outside of the U.S., content intermediaries carry partial responsibility for the content of their users. In Europe, platforms are only to be held responsible if they were aware of the presence of the prohibited content and/or if they (helped) produce or initiate it (Suzor 5). Around the world there are different views on the liability of platforms and on the consequences attached to disobeying the law. Sometimes, platforms also have to obey media laws. For example, Russia adopted a law under which internet pages that receive over 3000 views a day automatically fall under traditional media law and have to play by those rules (Gillespie, Governance of+by platforms 11).

Nicholas Suzor argues that there is a growing consensus to review Section 230 to make platforms more accountable for the rules they create and the decisions they make (2). Research has shown that YouTube’s algorithm often recommends extreme content and content that is created for radicalization (Tufekci). In her article on extremism on YouTube, Zeynep Tufekci gives the following example: “At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an article about his appeal to his voter base and wanted to confirm a few quotations. Soon I noticed something peculiar. YouTube started to recommend and “autoplay” videos for me that featured white supremacist rants, Holocaust denials and other disturbing content” (Tufekci). With the current Section 230 in place, YouTube is not responsible for leading its users to that type of content. Hence, activist groups and other interest groups call for an increase in the accountability and responsibility of platforms.

When talking about platform responsibility, one is often referred to the Terms of Service document. This document is the first form of governance presented by a platform, and in order to use the service a platform provides, a user has to agree to those terms. Suzor defines Terms of Service as “contractual documents that set up a simple consumer transaction: in exchange for access to the platform, users agree to be bound by the terms and conditions set out” (Suzor 3). Because they are legal documents, users have to obey the rules in them. ToS documents often ascribe a great power position to the platforms, but tend to keep responsibility at a distance. The ToS often state that users are entirely responsible for their own content and that the platform is not accountable for any inappropriate messages in that content or potential harmful consequences (Suzor 2-5). Normally, when states create and implement laws, they have to take into account the limitation of the power they grant themselves, and be open about ‘procedural’ values. Suzor researched the Terms of Service of 15 big platforms worldwide, and identified these two principles as matters of concern. He writes: “Like constitutional documents, Terms of Service grant powers; but unlike constitutions, they rarely limit those powers or regulate the ways they are exercised” (Suzor 10). According to him, ToS should be created and implemented in a way similar to constitutional documents, because platforms have too much power and too little responsibility (Suzor 12).

Although platforms are protected by Section 230 of the CDA, users and other interest groups have argued that platforms have to take this responsibility as they are “architects of public places” (Gillespie in Suzor 2). Helberger, Pierson and Poell suggest that the responsibility for the content existing on a platform comes from three sides: the platform itself (as architect of online environments (Gillespie in Suzor 2)), the users (as individuals making decisions on their online behaviour), and the government (which provides laws and rules for platforms to exist) (2). This situation is often referred to as ‘the problem of many hands’ (Helberger, Pierson and Poell 2-3). Only if all actors work together can the problem be solved. The task of users is to obey the policies and rules set by the platforms; platforms, in turn, need to be governed and to govern themselves.

2.3. YouTube and participatory culture

2.3.1. From website to platform

YouTube has grown into one of the largest platforms dominating our current screen time. With 1.9 billion visitors per month watching and generating content in 91 different countries, the video streaming site reaches a third of the population with access to the internet (YouTube for Press). YouTube was founded in February 2005 by three former PayPal employees: Chad Hurley, Steve Chen and Jawed Karim. The founders’ initial idea was to build a website that, by means of a relatively simple interface, allowed users to easily upload, share and view unlimited content. According to Henry Jenkins, the popularity of YouTube was not just a coincidence, but logical, because it was something that many groups had long been searching for (Jenkins, What Happened Before YouTube 110).

The first video uploaded to the website, “Me at the zoo”, showed Karim, one of the co-founders, standing in front of the elephants at the San Diego Zoo. The video appeared on YouTube on April 23rd, 2005. The first YouTube video to reach a million views arrived on the platform in September of that same year. The video was made in collaboration with Nike, and featured world-famous footballer Ronaldinho receiving a pair of golden Nike shoes. The rapid spread and popularity of the video showed that YouTube could not only function as a UGC environment, but also as a sphere where advertisers could reach large numbers of people in considerably little time. Henry Jenkins states that this is due to the distinctive technical affordances YouTube offers. He writes: “The YouTube business model creates value through circulation. Its distinctive technical affordances make posting YouTube content elsewhere a trivial matter of copying and pasting code, allowing videos to be inserted into diverse cultural economies and social ecologies” (What Happened Before YouTube 116). YouTube’s programmers made it easy to share and post YouTube videos on other websites, such as Facebook or Twitter.

In November 2006, YouTube received funding of approximately 3.5 million dollars and developed into the fastest-growing website on the internet that year, with nearly 20 million users every month (Snickars and Vondereau 27). This growth and success did not go unnoticed: at the end of 2006, Google purchased YouTube for roughly 1.65 billion dollars in stock. The acquisition by Google marks an important shift in phase for the company. Many speak of a “YouTube before Google” and a “YouTube after Google”, and argue that YouTube evolved from a website for sharing content into a far-reaching, commercial platform “with the largest video community on the internet” (Snickars and Vondereau 10).

The YouTube after Google started to live up to the expected commerciality in 2007, when the company introduced its YouTube Partner Programme (YPP). The YPP offered non-professional users of the website a chance to earn money from their content by showing advertisements over, before or during a video. The motivation to create, upload and share content was now rewarded with the possibility of making a profit. Channels that uploaded videos with many views could apply to partner with YouTube and would receive money for showing these advertisements to their viewers. There are a number of additional ways to monetize content: channel memberships, the merchandise shelf, Super Chat, and YouTube Premium revenue (How To Earn Money On YouTube). Today, the YouTube Partner Programme is still active. Owners of YouTube accounts can apply if they meet a number of requirements: the account has to have a minimum of 1,000 subscribers and a minimum of 4,000 watch hours in the past twelve months, the channel has to be connected to an AdSense account, and the YPP has to be available in the country where the holder of the account is based (YouTube Partner Programme overview, application checklist and FAQ). Currently, the YPP has teamed up with over 20,000 users originating from 22 different countries (Wattenhofer et al. 359).

The development from content intermediary to powerful platform lies in the practices of governance YouTube enforces on its users and their content, in the monetization of content and attention, and in the extent to which YouTube offers an environment for users to express themselves, interact and connect with others.

2.3.2. YouTube as a SNS

Over the years, YouTube has developed the characteristics of a social networking site. YouTube joins Facebook, Instagram and Twitter on this list and connects seamlessly with its ‘competition’ (Wattenhofer et al. 354). As expected from a SNS, the social aspect is one of YouTube’s priorities, realized by connecting users, allowing them to subscribe to channels, and providing content updates (354). What distinguishes YouTube as a social network is that users are not traditionally connected through undirected links, but often through the content they upload. According to Wattenhofer et al. (359), “social popularity on YouTube is tied not to typical performance in content popularity, but to maximum achieved content popularity”. This implies that a single hit video can make the channel behind it immensely popular, even if the rest of its content is weak and not worth watching. The social aspect then emerges through responses to other users’ videos in the form of reviews, ratings or reaction videos. Wattenhofer et al. call this a user-content-user relationship and state that it makes YouTube a content-driven social network (356). The users that utilize YouTube primarily as a social network are considered “drivers of the attention economy of YouTube” and
