
University of Amsterdam

Routes to Radicalization: An Exploration into a

YouTube Video Network of Controversial Content

and the Alt-Right

Media and Information BA

Aidan Fahle

Student number: 11788178

Supervisor: Maxigas

Word Count: 10,007


Abstract

Throughout recent years, many different groups in society have become aware of the fact that social media platforms such as YouTube have gained significant amounts of power and influence. It has therefore become increasingly important to seriously consider the various consequences and potential repercussions associated with these extremely popular social media outlets. One such consequence, which has been at the center of much debate and controversy in the last five years, is that extremist groups are beginning to utilize these outlets in the hope of spreading their ideologies to the masses. In this research, the extremist group of the alt-right is investigated through YouTube with the aim of examining how effective a tool YouTube can be for content dissemination and thus for the path towards radicalization. To do this, six videos were chosen from six different channels, which can be broadly separated into three categories: the intellectual dark web, the alt-lite, and the alt-right. By creating a network visualization with the help of YouTube Data Tools and Gephi, the related-video network between the three categories shows that while there is indeed a distinct correlation between videos of less extreme content and those of the alt-right, the extent to which this analysis provides enough evidence to support direct paths to radicalization on the platform remains unclear. Overall, this research is relevant insofar as no network analysis of the alt-right on YouTube has yet been conducted in academia. Thus, this study serves as an initial guide towards understanding the group’s effectiveness on the platform more generally.


Table of Contents

Abstract

Table of Contents

1. Introduction and Academic Relevance

2. Theoretical Framework

2.1 What exactly is the alt-right and where did it originate from?

2.2 Contrarian communities: Defining the intellectual dark web and the alt-lite

2.3 Personalization in the age of filter bubbles and echo chambers

3. Research Question

4. Methodology: Network Analysis

4.1 Rationale for channel and video decisions

4.2 Data extraction and visualization

4.3 Potential concerns and limitations

5. Results of the YouTube network analysis

5.1 Unpacking the network and its various clusters

5.2 Edges: Analyzing connectivity within the network

6. Discussion

7. Conclusion

Bibliography


1. Introduction and Academic Relevance

The advent of social media has created a bastion of free speech that has significantly changed the ways in which we interact with one another as well as with the world around us. These contemporary platforms provide their users with newfound capacities and state-of-the-art capabilities that have never been so easily obtained. Anyone with access to the internet and/or a computer can project their ideas, beliefs and opinions out into the world with the click of a button, and within seconds at that. There is no doubt that these tools, when used appropriately, can be immensely useful, fun, and entertaining; however, as these tools and their respective sites become more commonplace in society, various setbacks, repercussions, and adverse effects have begun to creep into the limelight. Among these problems are trolling and hate speech, cyber-bullying, cancel culture, and social media addiction, to name a few. There is already a large body of literature and research geared towards social media and its shortcomings and pitfalls, but since this topic is much too broad for any one paper to cover on its own, social media and its faults have been broken down into more specific topics and areas of discourse. One such example that has received little attention, not only in academic discourse but also in society more generally, is online radicalization efforts and experiences via social media. Forms and methods of radicalization can differ from platform to platform. Sometimes radicalization can be very obvious and explicit, while other times it can be very subtle and sophisticated. With this being said, it is interesting to examine the factors at play with regards to radicalization and how it works on a more distinct, micro level. This paper intends to do just that. More specifically, this study intends to examine radicalization processes via social media through the ideology of the alt-right or “alternative right” (ADL).

First, this study is relevant and useful for the study of the alt-right and white nationalism in general, as well as for understanding how this ideology is contained and exercised in various internet-related contexts. Many scholars (Adams and Roscigno 759; Johnson 100; Futrell and Simi 76) have examined white nationalism and its newfound home on the Internet; however, little academic research has focused on its radicalization efforts from the perspective of a network visualization of YouTube videos. Rather than examining white nationalism and its increased visual presence on the Internet, as much of the academic literature has done so far, this study dives into the specifics of how and why a platform such as YouTube creates such a viable and efficient space for extremist groups such as the alt-right.

Second, this study will benefit the state of research on the popular social and online movements of the intellectual dark web and the alt-lite. Both have proven to be rather ambiguous in nature, making them difficult for scholars to understand and research with any certainty. Both terms have quickly crept into the limelight of public discourse and debate and have been the subject of many controversies. Examining a network of YouTube videos relating to these two categories as well as the alt-right will help to better understand how the alt-right has quite rapidly established itself in many online spaces such as YouTube. Moreover, a network visualization will help to better recognize how viewing and engaging with the specific content found in these categories can make users especially susceptible to the radicalization efforts of the alt-right.

Lastly, this research is academically useful for work on the theory of filter bubbles and echo chambers. Similarly to white nationalism on the internet, the theory of the filter bubble has mostly been discussed from a broad media perspective. The term itself is relatively new in the history of the Internet: media scholar Eli Pariser coined it in 2011, and it has since altered many of the ways in which we view and use the Internet today. Much of the literature related to the filter bubble has focused on wide-ranging ideas and concepts as well as potential future threats, and only recently has the concept been examined academically in relation to extremist ideologies (O’Hara and Stevens 401; Tait 36). Even so, much of the more specific literature focuses on extremist beliefs related to Islam (Speri; Nurminen 14). A focused look into the alt-right via a YouTube video network is still lacking in academia. Considering YouTube is one of the largest and most popular social media sites in the world today, this “popularity [YouTube] has led to its usage by extreme right groups for the purpose of content dissemination” (O’Callaghan et al. 2). Therefore, this study is advantageous towards developing a better understanding of how these fun, useful sites can become dangerous and unhealthy spaces filled with hate and bullying and, at times, lead to violence.

2. Theoretical Framework

In order to structure the research and properly understand the phenomena at play, a theoretical overview has been deemed necessary. First, the alt-right is further defined and its origins are examined through an explanation of the Great Replacement Theory; this will help lay a foundation for understanding the alt-right ideology more generally. Second, the significance of the intellectual dark web and the alt-lite is discussed. These communities have established themselves as two very prominent communities on YouTube and are integral to this research, as they constitute over half of the data collected. Lastly, the meaning of filter bubbles and echo chambers is explained, along with how they are significantly altering the ways in which we associate and consume information. This will later be connected to the analysis in the hope of establishing a connection between alt-right radicalization and YouTube, as well as serving as an aid in providing a more specific understanding of the group’s advancements in the last few years.

2.1 What exactly is the alt-right and where did it originate from?

As it will be integral to understanding this research, it is important to have a clear-cut definition of the alt-right, as it is a term that has been thrown around seemingly without a care in political rhetoric and debate. According to Merriam-Webster, the term “alt-right” refers to “a right-wing, primarily online political movement or grouping based in the U.S. whose members reject mainstream conservative politics and espouse extremist beliefs and policies typically centered on the ideas of white nationalism” (Merriam-Webster). The alt-right has entered the public sphere in recent years and has been at the center of much controversy and debate, particularly stemming from the violence at its Unite the Right rally in Charlottesville, Virginia in the summer of 2017 (Tuters). According to the Anti-Defamation League, or ADL, the alt-right “originated with extremists but increasingly has found its way into the mainstream media” (ADL). Furthermore, though the alt-right is not a movement, per se, “the number of people who identify with it is growing. It includes a number of young people who espouse racist and anti-Semitic beliefs. It has a loud presence online. The intellectual racists who identify as part of the Alt Right also run a growing number of publications and publishing houses that promote white supremacist ideas. Their goal is to influence mainstream whites by exposing them to the concept of white identity and racial consciousness” (ADL). This concept alone is frightening, and it represents the first time we have seen such a radical group embrace technology with hopes of furthering pathways towards radicalization.

Traces of alt-right rhetoric and figures can be found within many social media sites, which is troubling in and of itself; however, YouTube has been chosen for this study due to its uniqueness and its inherent quality of being the platform of choice for alt-right figures and commentary. As the extremist alt-right ideology has made its way into the spotlight of mainstream media, it is of the utmost importance to become consciously aware of this fact. Perhaps more important, however, is the necessity to understand how they have effectively done so, specifically through social media platforms such as YouTube, and to investigate any potential concerns in order to properly address them at an early stage. This research serves as a valuable contribution to existing literature on various levels, mostly having to do with the theory and debates around white nationalism, YouTube’s recommender system and controversial subject matter, as well as filter bubbles and echo chambers.

Before exploring these concepts further, it is necessary to provide a brief history of the alt-right’s origins. In 2011, French philosopher and writer Renaud Camus coined the term ‘Great Replacement’ in his book titled Le Grand Remplacement (The Great Replacement). Prominent proponents of this theory belong to the Identitarian group Generation Identity, an “organization that wants to preserve ethnocultural identity globally” (Davey and Ebner 5). Furthermore, advocates of this theory believe that “white European populations are being deliberately replaced at an ethnic and cultural level through migration and the growth of minority communities” (Davey and Ebner 7). They go on to argue that this propagation often “relies on demographic projections to point to population changes in the West and the possibility that ethnically white populations are becoming minority groups” (Davey and Ebner 7). Due to this idea, proponents of the Great Replacement Theory have instilled a great sense of fear in themselves as well as in others, and because of this, “certain ethnic and religious groups, primarily Muslims, are typically singled out as being culturally incompatible with the lives of majority groups in Western countries and thus a particular threat” (Davey and Ebner 7). This notion of incompatibility is shared using a variety of methods, including “dehumanising racist memes, distorting and misrepresenting demographic data, and using debunked science. Great Replacement propagandists have found ways to co-opt the grievances of different fringe communities on the internet by connecting anti-migration, anti-lesbian, gay, bisexual and transgender (LGBT), anti-abortion and anti-establishment narratives” (Davey and Ebner 5). What is particularly frightening about this theory, paired with the previously mentioned methods, is the fact that it can inspire calls to action from its adherents “ranging from non-violent ethnic cleansing through ‘remigration’ to genocide. This is in part because the theory is able to inspire a sense of urgency by calling on crisis narratives” (Davey and Ebner 5).

As these narratives have gained significant traction over the last few years, it is important to note how the theory has developed into different sub-theories in different parts of the world. For instance, while the Great Replacement Theory originated in France, the “White Genocide Theory” (also known as the White Replacement Theory) originated in the U.S. and was influenced heavily by Camus (Schwartzburg). The theory was first popularized by white supremacist David Lane, who argued that “white populations are being replaced through immigration, integration, abortion and violence against white people” (Davey and Ebner 7). Unlike its counterpart, the White Genocide Theory is often explicitly associated with anti-Semitic conspiracy theories, “suggesting that Jewish people deliberately orchestrate population change” (Davey and Ebner 7). Even though the two theories have different origins and somewhat different names and definitions, together they are among the “most widespread ideologies in far-right spaces, and the primary catalysts of far-right mass violence” (Schwartzburg). With this being said, the two theories do not oppose one another to any great extent and are clearly linked. The minute differences between them are often cast aside in academic and political discourse, and it is therefore common for the theories to be used interchangeably.


2.2 Contrarian communities: Defining the intellectual dark web and the alt-lite

Before moving into the methodology and analysis, it is important to define and establish a clear, coherent understanding of the various categories involved in this research. As briefly mentioned earlier, the term ‘alt-right’ has a history of being thrown around in political rhetoric and debate; however, this usage of the word is often incorrect, as the alt-right is specifically framed around extremist ideals, particularly the idea of white nationalism, rather than merely overt conservative views. With this being said, much of the misuse of the term can be attributed to those associated with the intellectual dark web and the alt-lite.

It is no secret that YouTube serves as a haven for discussions related to society, politics and culture to flourish and thrive. One such community that has flourished on the site since its creation is the intellectual dark web. The term was coined by American mathematician and economist Eric Ross Weinstein to refer to a particular group of academics and podcast hosts. Many of these individuals have a large presence on YouTube and use the platform as their primary method of content dissemination. Some of the channels and videos related to the intellectual dark web belong to figures such as Joe Rogan, Jordan Peterson, Ben Shapiro and Sam Harris, to name a few. To put this popularity in perspective: since the creation of his account in January of 2013, Joe Rogan has garnered close to seven million subscribers and has surpassed well over one billion total views. With a reach this large, it is clear that members of the intellectual dark web have gained significant traction online as well as in society.

While membership to the intellectual dark web remains a bit vague and obscure to academics and scholars, as well as to those such as Joe Rogan who have been categorized within it, one thing remains clear. The intellectual dark web can be used to describe a “collection of iconoclastic thinkers, academic renegades, and media personalities who are having a rolling conversation — on podcasts, YouTube and Twitter, and in sold-out auditoriums — that sound unlike anything else happening, at least publicly, in the culture right now. Feeling largely locked out of legacy outlets, they are rapidly building their own mass media channels” (Weiss and Winter). What makes the intellectual dark web even more popular is the fact that its core members “have little in common politically” (Weiss and Winter). These political differences, paired with the members’ tendency to engage in a myriad of controversial subjects and issues such as abortion, biological differences between men and women, identity politics, religion, immigration, and so on, clearly provide viewers with interesting and entertaining debate and discussion.

Somewhere in the middle of both the intellectual dark web and the alt-right lies none other than the alt-lite. The alt-lite is similar to the two categories mentioned above insofar as it can also be broadly classified as contrarian, or rather, opposed to mainstream views and attitudes. More specifically, however, “the term alt-lite was created to differentiate right-wing activists who deny embracing white supremacist ideology” (Horta Ribeiro et al.). Terms such as this, especially when they revolve around the world of politics, are fuzzy and have been prone to misuse and therefore misrepresentation. There are many people whose views and opinions broadly correspond with those found in alt-right rhetoric, yet those same people are not necessarily racist and/or anti-Semitic for believing in broadly defined conservative values. Thus, the term ‘alt-lite’ appeared in the spotlight around mid-2016. To put it simply, according to alt-right writer and white supremacist Greg Johnson, the alt-right and alt-lite differ in their views of nationalism: “The alt-lite is defined by civic nationalism as opposed to racial nationalism, which is a defining characteristic of the alt-right” (ADL). To further put this in perspective, the alt-lite is an “appropriate description of people whose views on immigration and race relations partially overlap with those on the alt-right yet do not cross the line into open white nationalism” (Hawley 143-144). With this being said, the alt-lite does indeed perceive there to be considerable issues with contemporary American society; however, it is unwilling to accept and declare that these problems can only be resolved by means of white-identity politics. These definitions of the separate categories of content, namely the intellectual dark web, the alt-lite, and of course the alt-right, will prove beneficial in understanding this research as well as the current political climate and its slew of nuances.


2.3 Personalization in the age of filter bubbles and echo chambers

It is no secret that the advent of Web 2.0 has brought about a multitude of new, fascinating and efficient methods to access information and connect with the world and its inhabitants more generally. The most striking characteristics of this shift from Web 1.0 are the focus on user-generated content and participatory culture. Whereas Web 1.0, through no fault of its own, limited users to a largely passive experience, Web 2.0 turned this on its head and suddenly allowed for a dynamic and stimulating user experience.

While the term Web 2.0 is still understood as marking a monumental shift in Internet culture and history, the specific features and newfound capabilities that made it a special and appealing term at the time are no longer referred to as novelties. Many of Web 2.0’s features have become ubiquitous and standardized, and the popularity of the term itself has therefore dwindled. Even so, only in recent years have some of its effects come into the limelight and therefore become open to public discourse and debate. Two of these effects that have proven to be the most prominent among scholars, politicians and average internet users alike are filter bubbles and echo chambers. While the two terms are relatively different from one another, they share similarities that will aid in developing a more concrete understanding of the network visualization and thus of how the alt-right functions on YouTube.

The Internet as we know it has changed considerably over its relatively short lifespan. More specifically, in recent years there has been a shift from the Internet being a neutral tool simply aiding in the collection and retrieval of information to one specifically tailored to our own personal interests. Seemingly everywhere on the Internet, and most certainly on all social media sites and platforms, personalization algorithms serve as the gatekeepers and determine all of the information that you see, as well as all of the information that goes unseen. This is where the concept of the filter bubble comes into play. According to media scholar and author Eli Pariser, this filter bubble creates your “own personal, unique universe of information that you live in online” (Pariser). More specifically, when companies employ these algorithms, they look at items or information you seem to like, whether because you clicked on it or because of some other form of engagement such as liking a video on YouTube, following a group on Facebook, and so on. These algorithms are designed as prediction engines, “constantly creating and refining a theory of who you are and what you’ll do and want next” (Pariser). In a 2011 TED talk, Pariser explains that it is of the utmost importance to examine these new methods of personalization online, now more than ever, due to their quick standardization across the Internet. Last but not least, in his TED talk Pariser states that “If algorithms are going to curate the world for us, and they are going to decide what we get to see and what we do not get to see, then we need to make sure that they are not just keyed to relevance. We need to make sure that they also show us things that are uncomfortable, or challenging or important…other points of view” (Pariser). As will be touched upon later in this paper, this quote proves increasingly relevant to themes related to the alt-right, its presence on YouTube, and its radicalization techniques on the site.
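To make the idea of a prediction engine more concrete, the following Python sketch illustrates, in a purely hypothetical and highly simplified form, how an engagement-driven recommender can narrow what a user is shown. The catalogue, categories, and scoring are invented for illustration and do not describe YouTube’s actual system.

from collections import Counter
import random

# Toy catalogue: each item belongs to a single topical category.
# Items and categories are invented for illustration only.
CATALOGUE = {
    "fitness_tips":      "lifestyle",
    "travel_vlog":       "lifestyle",
    "debate_highlights": "politics",
    "immigration_rant":  "politics",
    "election_analysis": "politics",
    "cat_compilation":   "entertainment",
}

def recommend(profile, k=3):
    """Rank items by how strongly their category matches the user profile."""
    total = sum(profile.values()) or 1
    def score(item):
        # Affinity for the item's category, plus tiny noise to break ties.
        return profile[CATALOGUE[item]] / total + random.random() * 0.01
    return sorted(CATALOGUE, key=score, reverse=True)[:k]

def simulate(clicks=10):
    """A user who always clicks the top recommendation ends up in a bubble."""
    profile = Counter()
    for _ in range(clicks):
        top_pick = recommend(profile)[0]
        profile[CATALOGUE[top_pick]] += 1   # each engagement refines the profile
    return profile

if __name__ == "__main__":
    print(simulate())   # after a few clicks, one category dominates the profile

After only a handful of simulated clicks, whichever category the user happened to engage with first comes to dominate both the profile and the recommendations, which is precisely the self-reinforcing dynamic Pariser describes.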

Continuing along the same line of thought, the term ‘echo chambers’ has similarly become a popular buzzword in debate and in scholarly articles. Popularized by American legal scholar Cass Sunstein, echo chambers occur when “Information, ideas, or beliefs are repeatedly pushed in an enclosed system like your mind, your newsfeed, or your social circle, while other views are prohibited” (Minute Videos). While filter bubbles are indeed similar, they refer more to the content itself, whereas echo chambers contain a more inherently social quality. In his book #Republic, Sunstein explains that the Internet is driving political fragmentation, polarization, and even extremism. He wants society to rethink the relationship between democracy and the Internet and describes how the online world creates “cybercascades,” exploits “confirmation bias,” and assists “polarization entrepreneurs” (Sunstein). As a legal scholar, he aims to uncover and expose the flaws of the Internet today and proposes several practical and legal changes that could be made moving forward, as he believes echo chambers pose a serious threat to democracy.

Another subtle difference between filter bubbles and echo chambers, related to the fact that echo chambers are typically described in human rather than algorithmic terms, is that “We ourselves are perpetuating it [echo chamber]” (Greenwood). In his TED Talk, Adam Greenwood explains: “Every time that we post to social media, a lot of us don’t post for ourselves, we post for them. We post content that we want to see liked, or shared or commented on and we do that because we want validation…” (Greenwood). It is this sense of validation that is particularly striking with regards to the alt-right. As has been covered in the media, many people who were once closely associated with the alt-right and its ideology have shared their experiences, stating that a sense of belonging and acceptance that had previously been lacking from their lives was acquired after joining the alt-right. Clearly, these notions of filter bubbles and echo chambers can quickly change one’s own perspective, and they can drastically affect how we perceive the world, its people, and its multitude of beliefs, ideas and opinions.

3. Research Question

The aforementioned theories and concepts of the Great Replacement Theory, the intellectual dark web, the alt-lite, and filter bubbles and echo chambers, paired with preexisting knowledge of the alt-right and its relevance beyond academia, raise the following question:

Does engagement with controversial content of the intellectual dark web and alt-lite communities on YouTube lead towards engaging with extreme content of the alt-right, and thus towards the path of radicalization?

To answer this question, six of the most popular videos from six different YouTube channels have been chosen for further analysis by means of a network analysis. With this being said, not all channels are directly associated with the alt-right per se. This research aims to establish a connection between varying levels of content, particularly among iconoclastic thinkers and figures found on the intellectual dark web as well as those associated with the alt-lite and/or conservative ideals. By creating a visualization of a video network, this research examines whether or not engaging with this content through YouTube can lead towards engaging with more extreme content and, therefore, towards the road of radicalization. This will be further explained in the methodology to follow. In order to guide the research question, several sub-questions have been formulated as follows:


- How is the alt-lite portrayed in the network in relation to the intellectual dark web and the alt-right?

- Do channels associated with the alt-right have as large a presence as the other two categories?

4. Methodology: Network Analysis

With regards to this research, radicalization into the alt-right ideology has been examined through a network analysis on YouTube, specifically by examining the most popular video from each of six channels spread across three separate categories: the intellectual dark web, the alt-lite, and the alt-right. By conducting a network analysis, this research explores relationships and connections between the six videos and their respective channels in order to examine whether viewing content deemed controversial, such as that found among intellectual dark web channels, eventually leads towards the viewing of more extreme content, such as content related to the alt-right ideology. In the following sections I will first explain how the videos were chosen, then which tools were used to develop the analysis and how they function, and lastly a few concerns and potential limitations with regards to the research itself.

4.1 Rationale for channel and video decisions

As far as the choice of channels goes, the decisions were primarily based on an audit called Auditing Radicalization Pathways on YouTube, conducted by five researchers at the Federal University of Minas Gerais in Belo Horizonte, Brazil. More specifically, these researchers conducted an audit of YouTube’s recommendation algorithms to examine whether or not those algorithms contribute to online user radicalization. In order to analyze this topic further, they devised a meticulous methodology which can be described as follows: “(a) collect a large pool of relevant channels; (b) collect data and the recommendations given by YouTube for these channels; (c) manually labeling these channels according to the communities of interest” (Ribeiro et al.). Additionally, for deciding which channels were deemed relevant, they further specified their methodology. First, they chose a set of seed channels extracted from the Anti-Defamation League’s report on the alt-right as well as Data & Society’s report on YouTube radicalization. Then they chose a set of keywords related to each community, utilized YouTube’s search functionality, and considered the first 200 results. Lastly, they iteratively searched the related and featured channels collected in the previous two steps and added relevant channels. After following this methodology, they created a pool of channels for each community and included the top 16 channels for each category based on total channel views.

From here, I took the top two channels from each community (with the exception of the alt-right, as several of its top channels have since been deleted by YouTube) and selected the most viewed video from each channel. With this being said, the channels that were chosen are as follows: Intellectual Dark Web - PowerfulJRE and PragerUniversity; Alt-Lite - StevenCrowder and Rebel Media; Alt-Right - Black Pigeon Speaks and The Golden One (the #2 and #5 most viewed channels).
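For readers who wish to retrace this selection step, the sketch below shows one possible way of retrieving the most viewed video per channel through the YouTube Data API v3 using the google-api-python-client library. It is not the procedure used in this thesis but a hedged reconstruction: the API key and channel IDs are placeholders, and results may differ from the videos chosen here as view counts change over time.

from googleapiclient.discovery import build

API_KEY = "YOUR_API_KEY"       # placeholder API key
CHANNEL_IDS = ["UC..."]        # placeholder IDs for the six chosen channels

youtube = build("youtube", "v3", developerKey=API_KEY)

def most_viewed_video(channel_id):
    """Return (video_id, title) of a channel's currently most viewed video."""
    response = youtube.search().list(
        part="snippet",
        channelId=channel_id,
        type="video",
        order="viewCount",   # sort the channel's videos by view count, descending
        maxResults=1,
    ).execute()
    item = response["items"][0]
    return item["id"]["videoId"], item["snippet"]["title"]

for cid in CHANNEL_IDS:
    video_id, title = most_viewed_video(cid)
    print(cid, video_id, title)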

4.2 Data extraction and visualization

In order to conduct a proper network analysis to see if YouTube aids alt-right radicalization pathways, two separate tools were used to generate data from YouTube for the purpose of creating a visualization of a video network. The first tool used for this research was YouTube Data Tools. This tool is a set of scripts created by Bernhard Rieder, associate professor at the University of Amsterdam, and can be used to extract data from the YouTube platform via YouTube’s data API (Application Programming Interface). YouTube Data Tools contains five separate modules, each with its own specific function: Channel Info, Channel Network, Video List, Video Network, and Video Info and Comments. The module used in this research was Video Network, which, in short, “Creates a network of relations between videos via YouTube’s ‘related videos’ feature, starting from a search or a list of video ids” (Rieder). With this module, there are two different ways to begin: by entering a search query, or by providing seeds. The latter method was chosen for this research, as it leads to a more refined, accurate visualization of the six videos and their related-video network. After deciding on how to start, the six video IDs were entered with a crawl depth of 1. Crawl depth can be understood as the number of iterations for which the script follows the ‘related videos’ relation outwards from the seeds. The value of 1 was chosen based on Rieder’s own recommendation, due to the possibility of the script running out of memory. Once the module had finished running, I was presented with a graph file in GDF format and opened it in Gephi. Gephi was the second tool used in this research and can be defined as “an open source network exploration and manipulation software” with modules that can “import, visualize, spatialize, filter, manipulate and export all types of networks” (Bastian et al.). Moreover, the visualization model found within Gephi uses “a special 3D render engine to render graphs in real-time” (Bastian et al.). With tools and features such as these, I believe this method is quite viable for conducting a network analysis in order to attempt to visualize radicalization and to discern whether there is indeed a correlation on YouTube between controversial and extreme content, and whether users can in fact be pulled into the alt-right rabbit hole.
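As a complement to the point-and-click workflow described above, the following Python sketch shows how the exported network could also be inspected programmatically with networkx. It assumes the GDF file produced by YouTube Data Tools has been opened in Gephi and re-exported as GraphML (networkx has no native GDF reader), and the filename is a placeholder.

import networkx as nx

# Read the exported graph; the related-video network is directed.
G = nx.read_graphml("video_network.graphml")
if not G.is_directed():        # fall back gracefully if the export was saved
    G = G.to_directed()        # as an undirected graph

print(f"{G.number_of_nodes()} videos, {G.number_of_edges()} related-video links")

# In-degree corresponds to the node-sizing attribute used in the visualization:
# how often a video is recommended as a related video by others in the network.
indegree = dict(G.in_degree())
for node in sorted(indegree, key=indegree.get, reverse=True)[:10]:
    title = G.nodes[node].get("label", node)   # Gephi typically preserves labels
    print(indegree[node], title)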

4.3 Potential concerns and limitations

As this research shares similarities with various existing studies, namely the aforementioned study by researchers at the Federal University of Minas Gerais in Brazil, it is important to stress here that it differs in the following ways. First, this research has been conducted through the use of a network analysis rather than a complete audit of YouTube and its various algorithms. Along the same line of thought, this study attempts to build on previous research and shed new light on radicalization and the alt-right, two very relevant and controversial subjects. Moreover, to my knowledge there has not yet been an academic study that collects YouTube videos varying in their political extremity and subject matter with the aim of examining avenues of radicalization by means of a network analysis using YouTube Data Tools and Gephi.


As for roadblocks in this research, several potential limitations have arisen. For one, it is possible that more than six seeds are required in order to obtain a more descriptive, accurate representation of the video network. Additionally, due to Gephi’s extensive list of capabilities and features, it is possible that certain functions were missed and therefore not used or represented in the analysis. Understanding what software does is one thing, but utilizing it to its fullest capacity is a whole separate task and takes a lot of time, often with multiple researchers involved. Furthermore, the decision of which video to choose from each channel may have been misguided, as not every video on the different channels was related to political rhetoric of some sort; this could lead to unnecessary, inadequate data being accumulated in the network. Lastly, during the process of gathering video IDs I did not use my own personal YouTube account. Instead, I created and used a new account, as I believed using my personal account would clutter the network with many unnecessary nodes and edges. It is possible that this decision could alter the network in various ways, though this has yet to be determined. In any case, I suspect that conducting a network analysis using the two tools mentioned above will be sufficient to generate meaningful, relevant data in the hope of learning more about YouTube and its role within the alt-right community more specifically.

5. Results of the YouTube network analysis

The network of related videos among the three separate categories of the intellectual dark web, the alt-lite, and the alt-right has been investigated. By conducting this analysis of related videos, the extent to which users become engrossed in extreme content relating to the alt-right has become evident.

5.1 Unpacking the network and its various clusters

After working extensively with YouTube Data Tools and Gephi, a network visualization was created (Fig. 1). While the graph itself may look confusing, after categorizing each cluster and briefly defining the various attributes assigned to the graph, its significance should start to become clearer. First and foremost, the first attribute assigned to the graph was the indegree sizing attribute. In short, this attribute sizes the nodes, or rather each video in the network, according to how often they are recommended as a related video by the other videos in the network; the larger the node, the more often that video is recommended within the network. This will be important to keep in mind as the analysis continues. Furthermore, at first glance it becomes apparent that the graph is separated into six different clusters, each with its own assigned color. This corresponds to the graph’s modularity classes, which can be seen in Figure 2. Each color represents a different category of videos in the overall video network, which was generated from the six seeds mentioned earlier. The purple cluster represents the highest percentage of videos in the network, with a value of 24.45%. What is particularly interesting about this, as will be further discussed in the chapter to follow, is the fact that this cluster stems from the seed of the aforementioned alt-right channel, The Golden One. The next most abundant cluster is the light green cluster, with a percentage of 18.34%. This cluster can mostly be attributed to Joe Rogan’s channel and one of the seeds for the analysis, PowerfulJRE.

Looking back at Figure 1, it is interesting to note that this cluster is clearly separated from the others. This can be understood as a structural gap within the network and can be attributed to the fact that these videos do not share any similarities with the other clusters, specifically with regards to controversial titles. Nodes of this cluster mostly contain podcast episode titles with minimal information beyond the guest’s name. Moreover, there is minimal deviation into a small subcategory of YouTube videos from the YouTube original Mind Field, a series relating to science and the human psyche. Lastly, there was also a small category of political videos relating to American presidential candidate Bernie Sanders, most likely because he had recently been a popular guest on Rogan’s podcast. The next most abundant is the blue cluster, with an overall percentage of 16.47%. This cluster can generally be understood as videos relating to the seed of PragerU. Unlike the light green cluster, these video titles contain much more controversial, sensitive material, with titles such as “Why I Left the Left,” “Is Islam a Religion of Peace?” and “Make Men Masculine Again.”
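The cluster percentages reported here come from Gephi’s modularity classes, which are computed with the Louvain method. For readers who prefer a scripted workflow, the sketch below shows how a comparable partition and its cluster shares could be computed in networkx; because the algorithms differ, the resulting percentages would resemble rather than exactly reproduce the figures above, and the filename is again a placeholder.

import networkx as nx
from networkx.algorithms import community

G = nx.read_graphml("video_network.graphml")   # placeholder filename
U = G.to_undirected()                          # community detection on an undirected view

clusters = community.greedy_modularity_communities(U)
total = U.number_of_nodes()
for i, nodes in enumerate(sorted(clusters, key=len, reverse=True)):
    # Share of the whole network held by each detected cluster.
    print(f"cluster {i}: {len(nodes) / total:.2%} of the network")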

Moving on to the final three clusters, the next most abundant is the orange cluster, constituting 15.62% of the network. This cluster is largely associated with the alt-lite figure and channel StevenCrowder. The nodes in this cluster contain some of the most controversial titles in the entire network, such as “There Are Only 2 Genders | Change My Mind,” “Rape Culture is a Myth | Change My Mind,” and “Male Privilege is a Myth | Change My Mind.” All of these videos can be found on Crowder’s channel; however, what is especially interesting, as can be seen in Figure 1, is the fact that these nodes constitute several of the largest nodes in the entire network, meaning these videos are among the most recommended across the network at large. Next is the dark green cluster, which accounts for 14.09% of the graph. Most of the videos in this cluster are from the channel RebelMedia, another alt-lite channel. More specifically, this cluster contained a few notable subcategories. One of these consists of videos featuring far-right political commentator Gavin McInnes, noted for his promotion of violence against political opponents. This cluster represents a clear distinction between the controversial videos from the various political perspectives found in the previous clusters and far-right videos. Moreover, this cluster introduces a lot of content related to American President Donald Trump, all of which contains pro-Trump rhetoric, as well as a significant number of nodes related to Canadian Prime Minister Justin Trudeau, all with negative connotations. Last but not least is the pink cluster, constituting the final 11.04% of the graph. This group of nodes includes content mostly from the alt-right channel Black Pigeon Speaks. What is interesting with regards to this cluster is the fact that Black Pigeon Speaks is the only alt-right channel to use such explicit alt-right rhetoric, with nodes titled “Why the West is Lost,” “Japan | No Country For Islam,” and “Merkel’s Madness and the Unstoppable Rise of Nationalism.” Other than similar nodes relating to these ideas and themes, there is a small number of nodes relating to Japan in general, though nothing of particular significance. Now that the clusters themselves are categorized and more clearly defined, it is important to describe the findings regarding the actual connections, or edges, between the nodes and their significance.

5.2 Edges: Analyzing connectivity within the network

While inspecting the clusters is integral to a proper analysis of the network, it is just as important to understand the connections between the various clusters, which are defined as the graph’s edges. As can be seen in Figure 1, the purple cluster is by far the most spread out across the network and contains the highest number of videos overall. However, many of its nodes have a low degree and low betweenness centrality, which indicates that these nodes are not nearly as well connected within their own cluster or to nodes in other clusters. This means that the purple cluster is not nearly as coherent with regards to the types of content found within it. It seems this is due to the fact that the alt-right channel The Golden One tends to be very implicit in its video titles, which means traces of alt-right rhetoric can only be found by actually watching and listening to the videos. Moreover, the seed attributed to this cluster contained the word “Sweden” in the title, which caused the nodes to be sparsely connected to a wide array of different content, such as videos related to tourism in Sweden, anti-immigration, and President Trump, to name a few. With this being said, there are no clear connections to the intellectual dark web or the alt-lite as far as this cluster is concerned.


The orange cluster seen in Figure 1, however, while constituting only 15.62% of the network, is by far the most influential in the network at large. Contrary to the aforementioned purple cluster, this category of videos, specifically pertaining to the alt-lite channel StevenCrowder, has a very high degree as well as a high betweenness centrality. This means that videos found in this cluster are not only well connected with each other, but also with the overall network. For example, Figure 3 makes clear the extent to which the orange cluster (consisting mostly of alt-lite content) connects with the overall network. More specifically, Figure 3 shows that the largest node, or rather the video most often recommended as a related video in the network, is clearly associated with content in the other clusters, particularly videos in the blue, dark green, and pink clusters. It is interesting to note that the orange cluster is especially connected to alt-right content such as that found in the pink cluster. This is a clear indication that videos relating to less extreme, though still controversial, content are indeed connected to a degree to alt-right content such as that found on Black Pigeon Speaks.

As for the light green, blue, and dark green clusters, while they each contain nodes with high degrees, their betweenness centrality is rather low. Thus, there does not seem to be a distinct correlation between their respective content and the alt-right. More specifically, the light green cluster, as seen in Figure 1, is quite distant from the five other clusters, which would indicate that its content does not have any significant correlation with the more extreme content found in this network. Perhaps this is because the light green cluster tends to have very similar, matter-of-fact video titles without any controversial words or phrases. As far as the blue and dark green clusters are concerned, they were definitely more connected to the other clusters; however, only the dark green cluster containing alt-lite content from Rebel Media can be classified as connected with more extreme content relating to the alt-right. Even so, the extent to which this is significant enough to yield valuable, substantial insights with regards to radicalization into the alt-right cannot yet be concluded.
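The degree and betweenness centrality measures discussed above can also be computed outside Gephi. The following networkx sketch (with the filename as a placeholder) shows one way to surface the nodes that act as bridges between clusters, which is the role attributed here to the StevenCrowder videos of the orange cluster.

import networkx as nx

G = nx.read_graphml("video_network.graphml")   # placeholder filename
betweenness = nx.betweenness_centrality(G)     # fraction of shortest paths passing through each node
degree = dict(G.degree())                      # in- plus out-degree for a directed graph

# Videos that are both well connected and sit on many shortest paths are the
# likeliest gateways between less extreme and more extreme content.
bridges = sorted(G.nodes, key=lambda n: (betweenness[n], degree[n]), reverse=True)[:10]
for node in bridges:
    label = G.nodes[node].get("label", node)
    print(f"{label}: degree={degree[node]}, betweenness={betweenness[node]:.4f}")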

With this network visualization it is clear that there is some sort of correlation between less extreme content and that of the alt-right. However, the extent to which this is true seems to depend greatly upon the channel and its use of explicit words and phrases relating to the alt-right ideology, as was seen with the purple cluster and The Golden One’s assortment of videos and their rather covert titles. Furthermore, it is apparent that engaging with highly controversial content such as the alt-lite videos found on StevenCrowder can lead to engaging with extreme content such as that on Black Pigeon Speaks. It is evident, however, that as far as this network visualization is concerned, viewing content of the intellectual dark web does not pave a clear path towards the alt-right ideology and radicalization more generally. In the following section, the results as well as the analysis will be further discussed in the hope of shedding light on why this seems to be the case.

6. Discussion

Since the popularization of the Internet with the advent of Web 2.0, the internet has long been “celebrated for disseminating knowledge and supporting conversation, dialogue, and debate” (O’Hara and Stevens 401). The Internet’s inherent quality of being decentralized certainly comes with many benefits. It provides a voice for people to be heard by the masses, and it allows people to share and post opinions, beliefs, news articles, photos, videos, and so much more, all with the click of a button. However, in recent years this same characteristic of the Internet has been adopted by groups with more sinister motives and ideals. More specifically, the Internet’s complicity with regards to extremism and radicalization is not exactly a new concept in academia or the public sphere; however, it is clear that it is not yet fully understood among scholars and remains ambiguous in the eyes of society as a whole. Due to this fact, this research attempts to bridge the gap and provide a new perspective on this modern tactic of extremism and online radicalization, specifically through the lens of the alt-right on YouTube, as this is their “primary battlefield” (Tuters 39). Through YouTube, the alt-right are particularly interesting due to “their innovative infusion and deployment of high concept fan culture in the form of political tactics” (Tuters 39). With this in mind, this study aimed to critically assess YouTube as a platform as well as the group’s effectiveness by examining three different categories of content to discern if users engaging with less extreme content can eventually be pulled into viewing extreme content supporting the alt-right.

As far as this research goes, the first step was to create three different categories of YouTube videos, namely, the intellectual dark web, the alt-lite, and of course, the alt-right, and then analyze the different channels associated with each as well as their various content. To do this, an audit by researchers at the Federal University of Minas Gerais in Brazil was used to accurately categorize the different channels by popularity and choose channels with the most total views. Next, the most viewed videos from each channel were selected for data extraction in YouTube Data Tools in order to generate a graph file and visualize the network of related videos in Gephi. Finally, after working extensively with the data in Gephi by means of a network analysis, several key findings were made.

On the basis of the network analysis, the first interesting discovery was that the purple cluster, largely associated with the alt-right channel The Golden One, contained the most videos in the network overall. At first glance this was especially surprising; however, after further examination it became evident that because The Golden One’s videos have rather vague and sometimes implicit titles, the cluster expanded across the network rather than forming a tight-knit, easy-to-follow assemblage of videos. This is interesting for several reasons. For one, it suggests that videos relating to more extreme content can be somewhat hidden from YouTube’s moderation policies and practices. In the case of The Golden One, many videos’ true message of white nationalism was hidden mostly under the guise of bodybuilding content and general life advice. Moreover, it is clear that this works relatively well, as many channels which are downright explicit in their messages pertaining to white nationalism and the alt-right in general have since been deleted and/or suspended. This in and of itself raises the question of whether there are many other channels operating under the same premise. However, even with this being said, this cluster did not contain any nodes of especially large size, meaning that its videos were among the least recommended in the network even though they constituted the highest percentage of it. Furthermore, the light green cluster, which consisted of videos mostly from Joe Rogan’s channel PowerfulJRE (by far the most viewed channel of the six), did not have any clear links to more extreme content or even to the similarly categorized intellectual dark web channel PragerU. As with The Golden One, it can be deduced that this is because many video titles in the light green cluster were very straightforward in nature, containing only the name of the podcast and its respective guest. This is yet another example of how the titles of videos, rather than the content itself, heavily influence the visualization of the network.

Perhaps the most significant finding, however, came from examining the orange cluster, which is associated with StevenCrowder and the alt-lite more generally. The alt-lite, as previously mentioned, can be understood as more or less the midway point between content found among intellectual dark web channels and channels associated with the alt-right. Here there were obviously heavy connections between nodes within the cluster itself; however, this cluster was by far the most connected to other clusters in the network, namely the dark green and pink clusters, which consist of alt-lite and alt-right content respectively. Moreover, the orange cluster contained some of the largest nodes in the network. These two characteristics are especially relevant for a few reasons. For one, they suggest that the more controversial, conservatively framed content found in the orange cluster is the most recommended as related videos in the network, as well as the most connected to other videos of a similar or more extreme nature. Unlike the purple and light green clusters discussed above, this suggests that there is, to a degree, a distinct correlation between intellectual dark web, alt-lite, and alt-right content. In other words, through the orange cluster, it is evident that YouTube does indeed have some sort of effect with regards to engaging with alt-right content through less extreme videos. The extent to which this can be seen as concrete, absolute evidence, however, remains unclear. While it is true that there is a connection between the varying degrees of content found within the network, it seems to be highly dependent on the titles of the videos rather than the content, and more importantly, there does not seem to be enough evidence to support the claim that engaging with less extreme content will eventually lead towards the path of radicalization into the alt-right ideology.

Lastly, it is interesting to note that every cluster, with the exception of the purple cluster, forms a relatively cohesive module, with dense connections between its nodes. This means that even though not every cluster was clearly connected to the others, the five still contained nodes which were highly connected, or rather related, to other nodes in the same cluster. This is important because it signals the formation and presence of the aforementioned filter bubbles and echo chambers. By engaging with videos in one cluster, it is clear that this will lead towards the recommendation of other videos in that same category. This is what makes the orange cluster particularly problematic, as it has significant connections to alt-right content: if a user were to be recommended videos from the pink cluster, as can be seen in Figure 3, then it is possible that the same user could be recommended more videos from the pink cluster over time and thus become trapped in the filter bubble. This is not a process that simply happens organically, but rather one that can be attributed to the usage and creation of algorithms. The network visualization itself is particularly frightening because it not only supports the existence of filter bubbles and echo chambers in alt-right communities, but also shows that there are connections between the other filter bubbles, allowing users to seamlessly enter these various social worlds. While the concepts of filter bubbles and echo chambers do not on their own account for how users move between these social worlds, as their definitions would suggest this to be contradictory and paradoxical, it is clear that this is achievable through algorithms, specifically recommendation algorithms. With regards to the alt-right, it is clear that its members are “innovation opportunists” who are “finding openings in the latest technologies to spread their message” (Daniels 62). Moreover, according to Professor and author Jessie Daniels, these algorithms have many abilities and they greatly “speed up the spread of white supremacist ideology” as well as “amplify and systematically move white supremacist talking points into the mainstream of political discourse” (Daniels 62). With this being said, it is clear that this unseen consequence of YouTube needs to be seriously considered and further examined in academia as well as in the public sphere.
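To illustrate how such cross-bubble connections could translate into “pathways” for a hypothetical user, the sketch below simulates short random walks along the related-video edges and counts the communities in which they end, an approach loosely inspired by the random walker used by Ribeiro et al. rather than part of this thesis’s own method. It assumes each node carries a community label (for example, a modularity_class attribute exported from Gephi); the attribute name, labels, and filename are placeholders.

import random
from collections import Counter
import networkx as nx

def walk_destinations(G, start_label, attr="modularity_class", steps=5, trials=1000):
    """Count the communities reached after short random walks along related videos."""
    starts = [n for n, d in G.nodes(data=True) if d.get(attr) == start_label]
    reached = Counter()
    if not starts:
        return reached
    for _ in range(trials):
        node = random.choice(starts)
        for _ in range(steps):
            neighbours = list(G.successors(node))
            if not neighbours:             # dead end: no further related videos
                break
            node = random.choice(neighbours)
        reached[G.nodes[node].get(attr)] += 1
    return reached

# Hypothetical usage: how often do walks starting in the alt-lite community
# end up in the alt-right community?
# G = nx.read_graphml("video_network.graphml")
# print(walk_destinations(G, start_label="alt-lite"))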


7. Conclusion

This research has focused on the alt-right’s growing presence in online spaces, particularly on the platform of YouTube. Moreover, this study has aimed to critically analyze a network of YouTube videos to see whether viewing less radical content eventually leads to viewing more extreme content such as that found within the alt-right ideology, thus serving as a “radicalization pathway” (Ribeiro et al.). Because there is still much debate and controversy over the alt-right’s presence and effectiveness in online spaces, this research attempts to shed new light on the subject by means of a network analysis. With this method, the analysis is able to dive deep into the extent to which the three different communities of the intellectual dark web, the alt-lite, and the alt-right are connected, given their varying levels of political extremity and overall popularity on the platform.

Several conclusions can be drawn from the analysis. First, it is evident that video titles play a large role in how connected one node is to another, as was seen among the purple and light green clusters of The Golden One and PowerfulJRE. Because of this, as well as other potential factors, there was no concrete evidence supporting the idea that engaging with videos of the intellectual dark web leads to viewing the more extreme content found among alt-right channels. When it came to the alt-lite channels, however, there was a considerable correlation between their videos and videos of the alt-right. Additionally, the orange cluster, which is associated with alt-lite videos, contained the largest nodes in the network, meaning these videos were recommended as related videos most often. This supports the idea that engaging with alt-lite content can lead to viewing alt-right videos. Even though the alt-lite is more controversial, it remains a very popular subculture on YouTube and especially within the network. Finally, the network highlighted the concepts of filter bubbles and echo chambers within the various communities to a high degree, which can mostly be attributed to YouTube’s recommendation algorithm.
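The observation that the largest nodes are the most frequently recommended videos rests on in-degree, the attribute used to size nodes in Figure 1. As a rough sketch, assuming the same hypothetical GEXF export as above and a directed graph in which edges point from a seed video to the videos YouTube lists alongside it, the most-recommended videos could be listed as follows.

```python
# Rough sketch: rank videos by in-degree, i.e. by how often they appear as a
# related video elsewhere in the network. Filename is a hypothetical placeholder,
# and the graph is assumed to be directed.
import networkx as nx

G = nx.read_gexf("video_network.gexf")
top = sorted(G.in_degree(), key=lambda pair: pair[1], reverse=True)[:10]
for video_id, indegree in top:
    title = G.nodes[video_id].get("label", video_id)   # fall back to the node id if no label is stored
    print(f"{indegree:4d}  {title}")
```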

This study could serve as an initial guide to network analysis and the tools associated with it, such as YouTube Data Tools and Gephi. Even though this study was concerned with the alt-right on YouTube and the potential avenues towards radicalization, the same model can be applied to a wide range of topics requiring the visualization of a network.
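As an illustration of how the pipeline generalizes beyond this topic, the sketch below loads a GDF file, the simple text format exported by YouTube Data Tools and opened by Gephi, into networkx for programmatic analysis. It is a simplified reading of the format that assumes no quoted fields containing commas, and the filename "videonet_seeds.gdf" is a hypothetical placeholder.

```python
# Simplified sketch of a GDF loader: GDF files list a "nodedef>" header and node rows,
# then an "edgedef>" header and edge rows. Assumes plain comma-separated rows.
import networkx as nx

def load_gdf(path):
    G = nx.DiGraph()
    section, columns = None, []
    with open(path, encoding="utf-8") as handle:
        for line in handle:
            line = line.strip()
            if line.startswith("nodedef>") or line.startswith("edgedef>"):
                section = "nodes" if line.startswith("nodedef>") else "edges"
                # Column declarations look like "name VARCHAR"; keep only the column names.
                columns = [col.strip().split(" ")[0] for col in line.split(">", 1)[1].split(",")]
            elif line and section == "nodes":
                row = dict(zip(columns, line.split(",")))
                G.add_node(row.pop(columns[0]), **row)
            elif line and section == "edges":
                row = dict(zip(columns, line.split(",")))
                G.add_edge(row.pop(columns[0]), row.pop(columns[1]), **row)
    return G

G = load_gdf("videonet_seeds.gdf")   # hypothetical export from YouTube Data Tools
print(G.number_of_nodes(), "videos,", G.number_of_edges(), "related-video links")
```

The same seed-and-crawl approach could then be repeated for any other set of channels or topics, with only the seed videos changing.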

As far as this research is concerned, a few suggestions for future studies can be made. First, in order to create a more accurate representation of the network and to gain more precise insights, many more videos and channels would need to be included. Additionally, it would be valuable to designate a list of channels as a control group, in order to see how the various types of controversial, alt-lite, and alt-right content connect with popular channels that are much less political in nature.

In conclusion, it remains as important as ever to maintain a critical eye when it comes to powerful, popular platforms such as YouTube, due to the many unseen and often unknown consequences that lie within, as seen with the recommendation of alt-right content in this study. There is much literature on filter bubbles and echo chambers, as well as research dedicated to algorithms and their alluring capabilities, yet little to no research has been devoted to gathering data and visualizing a network of videos and/or channels. Moving forward, utilizing these tools to their fullest capacity will be integral to understanding not only the ever-growing number of social media platforms, but also, and just as importantly, the rise of extremist groups such as the alt-right and their growing presence in online spaces.


Bibliography

Bastian, M., S. Heymann, and M. Jacomy. “Gephi: An Open Source Software for Exploring and Manipulating Networks.” International AAAI Conference on Weblogs and Social Media, 2009.

Bucher, Taina, and Anne Helmond. “The Affordances of Social Media Platforms.” The SAGE Handbook of Social Media, by Jean Burgess et al., SAGE Publications Ltd, 2018, pp. 233–53. DOI.org (Crossref), doi:10.4135/9781473984066.n14.

Futrell, Robert, and Pete Simi. “The [Un]Surprising Alt-Right.” Contexts, vol. 16, no. 2, May 2017, p. 76. Crossref, doi:10.1177/1536504217714269.

Greenwood, Adam. “Challenge the Echo Chamber | Adam Greenwood.” YouTube, TEDx Talks, 8 Apr. 2019, www.youtube.com/watch?v=UKyFL389qe8&t=357s.

Davey, Jacob, and Julia Ebner. The ‘Great Replacement’: The Violent Consequences of Mainstreamed Extremism. p. 36.

Nurminen, Elina. “The Creation and Use of Efficient Social Media Strategy in Modern Terrorism and How to Tackle It: An ISIS Case Study.” Tallinn University of Technology.

O’Hara, Kieron, and David Stevens. “Echo Chambers and Online Radicalism: Assessing the Internet’s Complicity in Violent Extremism: The Internet’s Complicity in Violent Extremism.” Policy & Internet, vol. 7, no. 4, Dec. 2015, pp. 401–22. DOI.org (Crossref), doi:10.1002/poi3.88.

Pariser, Eli. “Beware Online ‘Filter Bubbles’ | Eli Pariser.” YouTube, TED, 2 May 2011, www.youtube.com/watch?v=B8ofWFx525s&t=256s.

Ribeiro, Manoel Horta, et al. “Auditing Radicalization Pathways on YouTube.” ArXiv:1908.08313 [Cs], Dec. 2019. arXiv.org, http://arxiv.org/abs/1908.08313.

Rieder, Bernhard. YouTube Data Tools (Version 1.12) [Software]. 2015. Available from https://tools.digitalmethods.net/netvizz/youtube/.

Schwartzburg, Rosa. “The ‘White Replacement Theory’ Motivates Alt-Right Killers the World Over.” The Guardian, Guardian News and Media, 5 Aug. 2019, www.theguardian.com/commentisfree/2019/aug/05/great-replacement-theory-alt-right-killers-el-paso.

Speri, Alice. “ISIS Fighters and Their Friends Are Total Social Media Pros.” Vice, 17 June 2014, www.vice.com/en_us/article/wjybjy/isis-fighters-and-their-friends-are-total-social-media-pros.

Sunstein, Cass R. #Republic: Divided Democracy in the Age of Social Media. Princeton: Princeton University Press. Print.

Tait, Amelia. “The Alt-Right Is Thriving Far Away from Twitter and Facebook, Creating an Echo Chamber of Hate.” New Statesman, 12 Jan. 2018.

Tuters, Marc. “LARPing & Liberal Tears. Irony, Belief and Idiocy in the Deep Vernacular Web.” Post-Digital Cultures of the Far Right, edited by Maik Fielitz and Nick Thurston, transcript Verlag, 2018, pp. 37–48. Crossref, doi:10.14361/9783839446706-003.

Weiss, Bari, and Damon Winter. “Meet the Renegades of the Intellectual Dark Web.” The New York Times, 8 May 2018, www.nytimes.com/2018/05/08/opinion/intellectual-dark-web.html.

“Alt Right: A Primer about the New White Supremacy.” Anti-Defamation League, www.adl.org/resources/backgrounders/alt-right-a-primer-about-the-new-white-supremacy.

“Alt-Right.” Merriam-Webster, Merriam-Webster, www.merriam-webster.com/dictionary/alt-right.

“From Alt Right to Alt Lite: Naming the Hate.” Anti-Defamation League, ADL, web.archive.org/web/20190422202936/https://www.adl.org/resources/backgrounders/from-alt-right-to-alt-lite-naming-the-hate.

List of Figures

Figure 1: YouTube video network with the indegree sizing attribute

Figure 2: Screenshot from Gephi of the assigned colors and their respective percentages

Figure 3: Visualization of the orange cluster’s connectivity
