Obama's Legacy: a Research in the History of Technological Change, Internet and Data Gathering from the Perspective of a Political Star.

By Rik van Eijk

10003025

Supervisor: dhr. dr. E.F. van de Bilt
University of Amsterdam


Introduction

“I hope we will use the Net to cross barriers and connect cultures...”

...was the answer Tim Berners-Lee, the principal inventor of the World Wide Web, gave when CNN asked him what he thought the internet's biggest impact would be (CNN 2005). In 1990, the general public was introduced to the World Wide Web, developed in the rooms of CERN in Geneva. Through the web, people could talk, discuss and refine their opinions on any subject they wanted; this World Wide Web quickly became known, in everyday speech, as the internet. Academics could easily send and peer-review their colleagues' publications. E-mail introduced a fast way to send text to a particular person; families all over the world were able to (re)connect quickly when instant messengers took over and webcams were introduced. Companies got involved and started their own websites. Sites became less static, introduced pictures and videos, and designers and programmers improved websites to optimize the user experience. Games focused more and more on multiplayer functions, because the internet allowed people all over the world to play with and against each other. SixDegrees launched personal profile pages in the 1990s, much like Facebook and Twitter offer today. By selling advertorials based on specific user data, internet companies such as Google and Yahoo built a billion-dollar industry. They started to enhance the experience of the internet with their own web browsers and operating systems and by introducing whole new technologies and standards to society, such as Google Glass or the Oculus Rift. Almost one third of the world population now has access to the internet: from cable and wireless modem connections to 3G connections on smartphones, the internet is available everywhere.

In the winter of 2011 I challenged my 120 Twitter followers to try getting from the online shopping mall Amazon to the website of the University of Amsterdam without typing the UvA address in the browser, without copying and pasting, and without the help of search engines. The only thing allowed was clicking: the links on the different sites would determine whether it was possible to travel from one site to the other. One of my followers actually tried it and needed fifteen steps to complete the journey, whereas it would have taken only one search to accomplish this with the help of a search engine. Now imagine that you do not want to go to the website of the University of Amsterdam, but want to know whether CERN published a paper on Pluto's orbit in relation to that of Mars and Venus in 1999 (the answer is no). The point I want to make is that the current experience of the internet has been changed by the online tools of commercial companies: they make the internet easier to use. These companies, as the short history of the internet above also suggests, use the information of people entering searches in order to better aim their advertorials. The user pays for the service with information: every piece of information that can be obtained will be obtained, analyzed and used for marketing purposes. It is used to read the user and to achieve capital gain. The internet pioneers started with a positive vision: the first users of the internet also became its main contributors. The position of these contributors has since been taken over by tech companies trying to maintain their dominance in internet services and data collection.

It is this data collection that recently created a controversy in the discourse of privacy vis-à-vis security. Edward Snowden, an NSA contractor employed by Booz Allen Hamilton, stored classified information which he passed on to journalists of The Guardian, The Washington Post and The New York Times. The discourse of data collection – the so-called big data – shifted from commercial gain towards a security and privacy issue. The data was not only used to enhance the experience of the internet, to make it easier for people to search for what they want, to communicate and to share. The information also formed the major database for the secret service agencies of the United States and other countries to analyze and correlate when and where security threats were most likely to occur. Snowden revealed that the NSA conducts mass surveillance by collecting internet and telephone data of its own and of foreign citizens. The agency even tapped leaders of allied countries, such as Germany's Chancellor Angela Merkel. The discourse of internet privacy became relevant overnight: everyone, from politicians to newspapers and bloggers, talked about Snowden's revelations, their effects on privacy and the position of the governments involved. This paper will not discuss the difficult privacy perceptions of the current state of affairs with regard to data collection by commercial companies. Neither will it discuss the history of the internet, how it should have been used and why it did not turn out that way. Instead, this paper will research the position of one particular man in the internet discourse, a man who has used data-collecting technology in such a way that he now resides in the most powerful house in the world. A man who is smart, charismatic and a good public speaker; a man known for his skin color and his position, but who is now in a difficult position regarding the discourse of the internet and its freedom, the path he chose and the population's awareness of the issue. This paper will discuss the current president of the United States, Barack Obama: his accomplishments and failures with regard to his ideology and the way he uses the internet and the digital sphere.

The United States plays a major role in the current data collection debate. As president of the United States, Obama needed to take a stand in the discourse. In his State of the Union address of 2014 he said: “That’s why, working with this Congress, I will reform our surveillance programs – because the vital work of our intelligence community depends on public confidence, here and abroad, that the privacy of ordinary people is not being violated”. With this statement he took a stance on internet privacy for US citizens, whilst not devaluing the needs of the intelligence community. At the same time, publications such as those of Sabato (2010) and Issenberg (2012) showed that social media, internet tools and big data played a major role in the campaign strategy for the elections of 2008 and 2012, both to reach out to voters and to collect donations. This contradiction in Obama’s use of data, surveillance, social media and his appeal to voters and citizens, combined with Obama's background, education, career and his Change and Hope campaign strategy, led to the subject of this master thesis.

One of the main subjects of Obama’s campaign was the introduction of a collective, nationwide health care system. Through his experiences as a community organizer on the South Side of Chicago, Obama realized that a collective health care system would solve major problems of the poor. The Patient Protection and Affordable Care Act (PPACA, also known as Obamacare) was signed into law by Obama in 2010. The website was launched in October 2013, but faced major technical problems. One part of Obama’s solution was to ask CEOs and other prominent employees of tech companies to advise him on the issue, inviting them into the White House. Among the people invited were Tim Cook, CEO of Apple, Reed Hastings, CEO of Netflix, and Dick Costolo, CEO of Twitter (Hall 2013).

The invitation of these CEOs raises the question of how Obama and his administration interpret the internet. As described in the first chapter of this thesis, there are two dominant categories in the current internet paradigm, as identified by Morozov: the cyber-utopians and the internet-centralists. According to Morozov, both categories have an unrealistic view of the internet and its possibilities. Cyber-utopians think about what the internet should be, in all kinds of ways – a set of political, social and technological beliefs:

Cyber-utopians ambitiously set out to build a new and improved United Nations, only to end up with a digital Cirque du Soleil. Even if true – and that’s a gigantic ‘if’-- their theories proved difficult to adapt to non-Western and particularly non democratic contexts (Morozov 2011: xvii).


Internet-centrism does not exactly oppose cyber-utopianism, but it certainly does not agree with it. Where cyber-utopians think about the what of the internet, internet-centralists think about how practices should be carried out:

Unlike cyber-utopianism, Internet-centrism is not a set of beliefs; rather, it’s a philosophy of action that informs how decisions, including those that deal with democracy promotion, are made and how long-term strategies are crafted. (...) Internet-centralists like to answer every question about democratic change by first reframing it in terms of the Internet rather than the context in which that change is to occur. They are often completely oblivious to the highly political nature of technology, especially the internet, and like to come up with strategies that assume that the logic of the internet, which in most cases they are the only ones to perceive, will shape every environment that it penetrates rather than vice versa (Morozov 2011: xvii).

Following the path paved by Morozov, I will examine the current state of affairs in the Obama administration from both the cyber-utopian and the internet-centralist perspective. I will examine the life of a young Barack Obama and the influence of the counterculture that eventually led to the utopian internet vision of Silicon Valley (which will be explained chronologically in the second chapter of this paper). Obama is known to have used digital technologies, such as social media platforms and big data, particularly in his (re)election as president. Moreover, as president he dealt with internet-related issues, such as the SOPA/PIPA debate, his Obamacare project and the revelations of global espionage by the NSA and affiliated American secret services through Wikileaks and Edward Snowden. What is his stance on these issues, does he influence them, and how aware of the technology is he actually? Why does he invite CEOs of tech companies to the White House, whilst they were not invited as advisors in the SOPA and PIPA hearings in Congress? What is the position and the function of the prominent tech people, their culture and their technology in the race for Obama’s election and re-election? Questions such as these will be answered in this paper.


Careful analysis of several biographies, such as Kloppenberg’s Reading Obama, in combination with Obama's speeches, his own writings, his campaigns and his legislative decisions on technological issues while in office, such as the SOPA and PIPA bills, will provide an answer to the questions mentioned above. The main objective of this paper is to place Obama in the society described above – one in which the internet is common and data gathering a must – and, most importantly, to show how Obama uses this technology to reflect his own ideology. This paper is divided into three parts. The first provides a theoretical background of the current digitized world, relating it to the theory of cybernetics. It is followed by an account of how Obama and his campaign team used digital technology to reach the presidential seat, a seat that is a necessity for carrying out Obama's ideology. The last chapter focuses on the use of digital technologies during Obama's terms. The result is a study of Obama in which his youth, education, career and presidency are all evaluated from a technological perspective: the internet, social media, big data and more. The outcome is Obama’s technological legacy.


Cybernetics and the Commercialization of the Internet

In the spring of 2014, the Associated Press reported the existence of a Twitter-like social medium created especially for the Cuban market, called ZunZuneo, launched in 2009. So far nothing seems odd about this story, until one discovers that the medium was created by the US Agency for International Development, a federal international development organization run under the aegis of the Department of State. The plan was to introduce the medium into the Cuban youth sphere so that young Cubans could post about the weather, sports and other subjects. After a while, ZunZuneo was supposed to become a medium comparable to Twitter in the Arab Spring, where several Arab governments fell, partially – but not exclusively – because of the public use of Twitter (and other social media). US officials hoped a Cuban Spring would occur with the help of ZunZuneo, but the medium failed in its purpose and was discontinued in 2012 when the grant money ran out (The Guardian 2014).

The example of ZunZuneo can be linked to 2009, when a controversy arose around the Iranian elections. “The Revolution Will Be Twittered” was the title of the first of a series of blog posts by Andrew Sullivan on The Atlantic's blog (2009); or, as Jon Stewart, the satirical news anchor of The Daily Show, put it: “Why did we have to send an army when we could have liberated them the same way we buy shoes?” (Siegel 2011). Though the number of Twitter users in Iran is only a small fraction of the total population, the almost-revolution in Iran changed the way the internet is used in these kinds of political contexts. The example of ZunZuneo portrays the naivety of the current American discourse on internet-related topics: “we should give them – the citizens of authoritarian countries – Twitter, so they will start a revolution”.


Cybernetics

In his book The Net Delusion, the Belarus-born Evgeny Morozov argues that the popular image of the internet – social media, blog posts and entertainment sites – does not match the realistic image one should hold when the internet is put into a political context. In his chapter The Google Doctrine, which is all about the naivety surrounding the Iranian revolution of 2009 and the democratic, Western response to it, the Iran example is elaborated further. The book critiques the current paradigm in the discourse of political cyberspace by offering another, according to Morozov more realistic, view. The history of internet analysis can be divided into two categories. First there are the cyber-utopians, especially prominent in the 1990s, who approach the internet in a positive, political way. Members of the other category are the internet-centralists: “While cyber-utopianism stipulates what has to be done, Internet-centrism stipulates how it should be done” (Morozov 2011: xvii). To understand Morozov’s point, we first need to examine the two paradigms that he is critiquing.

In 1940, Vannevar Bush, a former MIT professor and administrator, persuaded Roosevelt to create the National Defense Research Committee. Through this institution, government dollars for military research were funneled to civilian contractors, including MIT’s Radiation Laboratory, better known as the Rad Lab. The Rad Lab's main goal was to develop a more effective way to track and shoot down enemy planes, and, because of the bombings in Britain and the US’ own involvement in the war after Pearl Harbor, it grew exponentially during the war years. Though the lab was funded with the support of large bureaucracies, it was a site of flexible, collaborative work and a non-hierarchical management style in which researchers crossed previously uncrossable boundaries, constantly pursuing the goal of creating better gear for the troops abroad. It was this flexible, boundary-crossing working style that led to the creation of the computational metaphor and the new philosophy of technology in which it made its first public appearance: cybernetics. The father of this theory was the former mathematics prodigy Norbert Wiener, who joined the faculty of MIT in 1919 and worked closely with Vannevar Bush on creating analogue computers in the 1930s. Wiener was not only interested in mathematics, but also in biology, computers and electrical engineering. Together with a young engineer, Julian Bigelow, Wiener created the predictor, a machine that was to embody the philosophy of tracking the future path of airplanes:

Early in the process, Wiener and Bigelow recognized that the enemy bomber and the anti-aircraft system each depended on both mechanical and human components. From a theoretical point of view, this combination of the organic and the mechanical presented a problem (Turner 2006: 21).

They began to imagine the bomber pilots as mechanical devices, so that they could model their behavior:

… [B]y means of the same imaginative transformation of men into information processing devices, Wiener and Bigelow offered up a picture of humans and machines as dynamic collaborating elements in a single, highly fluid, socio-technical system. Within that system, control emerged not from the mind of a commanding officer, but from the complex, probabilistic interactions of humans, machines, and events around them. (...) In the predictor Wiener and Bigelow presented an example of a system in which men and machines collaborated, amplifying their respective capabilities, sharing control and ultimately serving the human good of stopping the Nazi onslaught (Turner 2006: 21).

Two years after the publication of Cybernetics, and after applying his philosophy to fields such as biology, Wiener published his book The Human Use of Human Beings: Cybernetics and Society. According to Wiener, society as a whole could be seen as a mechanism, receiving and sending messages; everything was simply a pattern of ordered information in a world otherwise tending toward entropy and noise. Contrary to Vannevar Bush’s vision of dismantling the intense collaborative research climate after the end of the war, that climate continued to exist within the military-industrial complex, becoming everyday practice and training a whole generation of computer engineers, scientists and technicians. Wiener’s cybernetics found its way into research projects and academic disciplines – management theory, clinical psychology, political science, biology and ecology – and ultimately into the urban renewal projects of Lyndon Johnson.

“By the late 1950’s, many Americans had begun to fear that the military, industrial and academic institutions that had brought the atomic bomb into being, were beginning to transform all of American life” (Turner 2005: 28). Cybernetics as a computational metaphor implied that the dominant minority – the minority controlling the military-industrial complex – would dehumanize man’s role; man would be fed into the machine or controlled for the benefit of collective organizations. This vision found a large and passionate audience on the college campuses of the 1960s, where students rose up against this dehumanization: they did not want to be cogs in a machine, using their intellect to feed the bureaucracies of the military-industrial complex, and they demanded the humanization of universities. At the same time the proportion of students rose exponentially, from fourteen percent in 1961 to fifty percent in 1970. These students did not have difficulties finding jobs, but they feared feeding the postwar industry – and, with it, the Vietnam War.

This anxiety about feeding the postwar industry led to the creation of two different kinds of youth movement, combined in the concept of the counterculture:

The first grew out of the struggles for civil rights in the Deep South and the Free Speech Movement and became known as the New Left. (...) The second bubbled up out of a wide variety of cold war-era cultural springs, including Beat Poetry and fiction, Zen Buddhism, action painting, and, by the mid 1960’s, encounters with psychedelic drugs. If the New Left turned outward, toward political action, this wing turned inward, towards questions of consciousness and interpersonal intimacy, and toward small-scale tools such as LSD or rock music as ways to enhance both (Turner 2006: 33).


The New Left was a movement that used existing tools – politics, networking – to create a new world, while the New Communalists, the second category mentioned above, did the exact opposite and saw the mind as the key to social change. They created communes in the forests and countrysides of California and New Mexico, pursuing Reich’s third consciousness of bureaucratically levelled communities – harmonious collaborations in which each citizen was honest and open with every other – and, in novels, the idealistic Ecotopia: a future, car-less, community-driven California (Barbrook & Cameron 1996). They openly declared their rejection of mainstream society through their clothing, sexual promiscuity, music and drugs. They were liberal in the social sense of the word. They turned away from the New Left, from parties and from politics in general, and at the same time opened the door to a new kind of mainstream culture, particularly high-tech research culture: “If the mind was the first site of social change, then information would have to become a key part of a countercultural politics” (Turner 2006: 38). As Turner claims, the New Communalists did not reject technology, as many historians have suggested. “Even as they set out for the rural frontier, the communards of the back-to-the-land movement often embraced the collaborative social practices, the celebration of technology and the cybernetic rhetoric of mainstream military-industrial-academic research” (Turner 2006: 33). Cybernetics offered an ideological alternative, in which a world was built around looping circuits and information: “These circuits presented the possibility of a stable social order based not on the psychologically distressing chains of command that characterized military and corporate life, but on the ebb and flow of communication” (Turner 2006: 38). Some even proposed an electronic agora, where the convergence of media, telecommunications and computers would lead to a virtual place in which everyone could freely express their opinions, following the thoughts of Marshall McLuhan:

Electronic media … abolish the spatial dimension … By electricity, we everywhere resume person-to-person relations as if on the smallest village scale. It is a relation in depth, and without delegation of function or powers … Dialogue supersedes the lecture (in Barbrook & Cameron 1996).

Encouraged by McLuhan's predictions, collaborative communities on the West Coast started developing new communication technologies for the alternative press and radio stations. These communities believed they were at the forefront of the fight to build a new America, resisting the then dominant military-industrial paradigm by embracing technology as a tool of communication. The creation of the electronic agora was the first step toward the goal of direct democracy; hypermedia would be the tool to reach it. What followed was a growing do-it-yourself culture in and around Silicon Valley, built upon the free-wheeling spirit of the hippie culture created by the New Communalists.

At the time of the emergence of the counterculture, Obama was a child, living either in Indonesia with his mother or attending Punahou School in Hawaii while living with his maternal grandparents. It was at Occidental College that Obama first came into contact with the counterculture, reading literature relating to the free speech movement, such as Du Bois and the autobiography of Malcolm X (Kloppenberg 2011: 16). Obama left free-wheeling California, at the time home to research laboratories of companies such as Xerox and IBM that would contribute to the creation of the personal computer, to attend Columbia University. At Columbia he wanted to experience racial segregation and its aftermath, eventually exploring the idea of becoming a writer. After his graduation and a short commercial job, he travelled to Chicago to become a community organizer before attending Harvard Law School.

The Californian Ideology

Where the hippies of the 1960s saw technology as a means by which the military-industrial complex dehumanized them, the idea was reversed in the 1990s, when the internet, and earlier communication networks such as ARPANET, became the way to express oneself freely in cyberspace. This bizarre fusion of the cultural bohemianism of San Francisco with the hi-tech industries of Silicon Valley is sometimes called the Californian Ideology, after an article published by Barbrook and Cameron in 1995. This utopian view of the internet in the early 1990s is best explained by Vedel:

Not only did the emergence of the Internet in the 1990s bring about an entirely new communication medium that became inexpensive, instantaneous and user-friendly (in the context of industrialized countries), but it was also accompanied by a new ideology [promoting] a new way of being together and a novel polity, which no longer takes place within the bounded territories of nation states, but in an open, deterritorialized, non-hierarchical space (Vedel 2006: 229).

The members of this so-called Californian Ideology – the virtual class, as Barbrook and Cameron call them – shared the freewheeling spirit of the hippies of the sixties, but were at the same time right-wing minded: the government should stay out of their lives. The virtual class enjoys the cultural freedoms, but is not actively involved in the creation of the ecotopia. These people form the core cluster from which the utopian views of the cyber-utopians and internet-centralists originated.

Where the New Communalists turned inward and set in motion the emergence of a hypermediated world, the Free Speech movement and the civil rights struggle in the South turned outward, toward political action. Though born too late to participate in this movement himself, Obama was influenced by its members. Confronted with his race and his own definition of it, which fueled and expanded his self-consciousness, the Free Speech movement provided plenty of thoughts and meaning to feed Obama’s mind. This indirectly influenced Obama to participate politically – aiming for Chicago’s mayoral seat, according to Kloppenberg – and thus not to participate in the New Communalist lifestyle and its antecedents in California (Kloppenberg 2011: 28-38). One must add that Obama is a gifted legal academic. He graduated magna cum laude from Harvard, where he also held the prestigious position of president of the Harvard Law Review. After graduating, Obama turned down several prestigious jobs, started to teach at the University of Chicago and took on part-time consulting work. According to Kloppenberg, Obama must be seen as a product of antifoundationalism, particularism, perspectivism and historicism, which can be seen in his books, his speeches and the issues of the Harvard Law Review for which he was responsible – and as opposing the idea of law and politics as an unchangeable foundation on which to build.

The internet was formed on the model of the academic world, in which scholars review and peer-review their colleagues' work, using citations to refer to other sources. Collectively, and by trial and error, knowledge is expanded for the greater good. In 1996, Larry Page and Sergey Brin, two PhD students at Stanford, started a project that became Google. With the idea of sources and reviews in mind, they began to value sites according to the links between them, using crawlers. By valuing these links (some carry more weight than others) they indexed the internet: websites with many incoming links, especially links from highly valued websites, received a higher score than websites with few or no links. Offering a search tool on its website, Google is nowadays one of the biggest multinationals, surpassing earlier technology companies such as Microsoft and IBM. Having started with small text-based advertorials, Google still earns the majority of its revenue from selling advertorials on its websites. By collecting data – (estimated) location, time, possible personal information if the user is linked to Google’s social network Plus, and previously collected data – Google is able to personalize advertorials according to the demands and interests of the user of the service. It is this data collection – the quantification of human thought – that Obama used in his campaigns of 2008 and 2012.
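
To make this link-valuation mechanism concrete, the sketch below implements a minimal PageRank-style scoring loop over a tiny, invented link graph in Python. It is an illustration of the general principle described above, not Google's actual implementation; the graph, the damping factor and the number of iterations are arbitrary choices made for this example.

    # Minimal PageRank-style scoring over a toy link graph (illustrative only).
    links = {                       # page -> pages it links to (invented data)
        "cern.ch": ["uva.nl", "wired.com"],
        "uva.nl": ["cern.ch"],
        "wired.com": ["uva.nl", "cern.ch", "blog.example"],
        "blog.example": ["wired.com"],
    }

    damping = 0.85                  # chance of following a link instead of jumping
    scores = {page: 1.0 / len(links) for page in links}

    for _ in range(50):             # iterate until the scores roughly stabilise
        new_scores = {}
        for page in links:
            # a page is fed by the pages linking to it, each passing on
            # a share of its own score
            incoming = sum(scores[src] / len(targets)
                           for src, targets in links.items() if page in targets)
            new_scores[page] = (1 - damping) / len(links) + damping * incoming
        scores = new_scores

    for page, score in sorted(scores.items(), key=lambda kv: -kv[1]):
        print(page, round(score, 3))

Pages that attract links from other well-linked pages end up at the top of the list, which is precisely the property a search engine can combine with the sale of advertorial space.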

Google was not the first start-up of the late nineties (think of Amazon), but Google exemplifies the possibility of making money off the internet. This meant that the utopian idea of the internet, the electronic agora, was gradually overtaken by the capitalist motives of companies such as Google and Amazon (and later Facebook, Twitter and others). Money was to be earned by selling personalized advertorials, thus creating a demand for personal information and for information about information: metadata. Instead of internet users creating their own websites and blogs, companies started to offer services, such as the Google search function mentioned above, but also social media such as Friendster, Twitter, Facebook and Youtube, making it easy to share and publish information, to connect with friends and distant relatives and to stumble upon new knowledge. Obama and his campaign team started a similar web-based service called MyBarackObama, making it possible for supporters to participate in the campaign. These sites all worked because of the collection of data and the selling of advertorials – though in the case of MyBarackObama, this meant knowing to whom to sell Obama, instead of offering the best product to a particular person. The campaign was able to collect the data because of users' own participation in the service: a status update mentioning certain brand names or politicians might be useful to certain parties. Such a status update is analyzed and the user is categorized into a certain group. The companies that collect, analyze and categorize their users then offer this service to parties wanting to promote their product or (political) message in exchange for money. This so-called participation internet is referred to as Web 2.0, a term introduced by Tim O’Reilly in 2004 (O'Reilly 2004). Web 2.0 is seen as having an end-user focus and as being built upon user-generated content, as described by Morison (2010). Users become an integral part of the experience of the web, adding new content and information to it daily. Social network sites such as Facebook and Twitter would not exist without this user-generated content. Data added by one source and altered by a user with new data to provide new information creates something beyond its official purpose: mashups are created, for commercial, open-source or personal use. From now on, information would be the fuel on which the internet runs.

The growth in processing power and data-storage capacity, together with the exponential increase in internet services and the information gathering related to them, has created an almost mythical idea of the possibilities. Recent discourse in the field of big data is dominated by those possibilities – self-driving cars, a changing landscape of academic research, the quantification of human information – together with increasing concerns about the lack of privacy, the question of who owns information, and (the prevention of) misinformation. Though the services offered by commercial companies can be used to envision the internet as a utopia, as happened in the Arab Spring, there is, according to Morozov, a downside to the idea of big data: capitalism. In his second book, To Save Everything, Click Here (2014), Morozov discusses the concept of technological solutionism: the tendency to address every problem with a technology-minded solution. People do not need health insurance in the United States; they should simply be made aware of their health, and technology should help them do so – devices such as the Fitbit, an electronic watch that collects data on the wearer's health and counts steps, providing information on how to live healthily. According to Morozov, technological solutionism is not the answer to everything, because poor health may stem from poor nutrition caused by someone having to work three jobs to provide for his family. In his two books, Morozov tries to create awareness of a dystopian future of the internet – oddly enough without providing any solution himself to the problem he describes.

With the help of cybernetics, the counterculture, the Californian Ideology and information gathering, the reader should now have a sufficient theoretical frame within which the topics that follow – campaigning, Government 2.0, Obamacare, ACTA/PIPA and Snowden – can be analyzed.


Campaign

Imagine a young woman. At twenty-two years of age, she is in the middle of life: she has a job at a communications company, works out at a local gym and parties on the weekends. One morning she receives a letter in the mail containing information about presidential nominee Obama, promising continued access to birth control and abortion and lower taxes for young starters on the labour market. All of these promises can be directly linked to the woman's current position in society and her life. By tailoring these points to this person in this particular way, Obama’s campaign team increases the chance of her voting for (or donating to) Obama. The question is whether it was just luck that the mail reached this young woman, or whether the campaign team had information about her that allowed them to target her precisely. It is the way the Obama campaign made use of targeting that made the difference between his campaign and, for example, Hillary Clinton’s or McCain's in 2008.

This chapter will go into the presidential campaigns of 2008 and 2012, both won by Obama. It will make a connection between Obama’s past – including his college years and his ideas about a grassroots campaign rooted in his work as a community organizer – and his practical approach of connecting his followers by means of technology such as Twitter and Youtube, creating a grassroots campaign in which people felt empowered to help bring about Obama’s Change. At the same time, this chapter will unravel the data-crunching practices of the Obama campaign, relating the grassroots campaign that authors such as Sabato (2010, 2013), Johnson (2009) and Clayton (2010) write about to the possibilities of crunching and analyzing data for the sole purpose of targeting possible voters for either their vote or a donation, as described by writers such as Issenberg (2012), Baker (2011) and Lee (2013). This results in an analysis of Obama's use of digital technology to reach the presidency.


Social Media Marketing

Obama's national career started when he, then a state senator from the South Side of Chicago, was selected by Mary Beth Cahill, Kerry’s campaign manager, to deliver the keynote address at the Democratic National Convention in 2004 (Clayton 2010: 22-3). Defeated by Bobby Rush in the 2000 race for a seat in the House of Representatives, Obama ran for the US Senate in 2004 and won, defeating Alan Keyes. In the keynote speech, Obama talked about his parents, his grandparents, his background as a kid of many places and the false division between ‘Red and Blue states’. The atmosphere of the convention, the speaker and the message were all aligned, people were crying, and overnight Obama became a political superstar. The (media) attention remained, and Obama announced his candidacy for the presidency on February 10, 2007, opposing senator and former first lady Hillary Clinton in the race for the Democratic nomination.

“Let’s talk, let’s chat, let’s start a dialogue about your ideas and mine” was one of the lines Hillary Clinton used when she announced that she was running for the Democratic presidential nomination in 2008. In a video clip published on January 20, 2007 on the video-sharing platform Youtube, Clinton sits on a sofa, talking or ‘chatting’ to imaginary people about the current state of affairs. As a second-term senator and former presidential wife, her name was already known, her fundraising possibilities were enormous, she had a better track record of accomplishments than Obama, and the polls gave her the lead (Johnson 2009: 57). But then Obama came along.

Obama and his campaign team rejected the traditional top-down model of campaigning. This started in 2003, when Obama ran for a Senate seat. David Plouffe, one of Obama’s close consultants during all of his campaigns, says the following about Obama’s campaign philosophy in that Senate race: “... he was determined to win not with thirty-second ads and clever sound bites, but by building a grassroots campaign throughout Illinois” (Plouffe 2009). This philosophy continued and was perfected for his presidential campaign. Instead of a top-down model of campaigning, Obama created a grassroots campaign – some even call it a movement (Clayton 2010: 136). This rejection of the top-down model can be traced back to Obama’s time as a community organizer. On the South Side of Chicago, Obama learned the tricks of organizing political and social action. He was taught the community organizing philosophy of Saul Alinsky, but rejected it and started experimenting with organizing change together with his colleagues on the South Side (Kloppenberg 2011: 31-8). Being a gifted speaker, Obama quickly learned the ins and outs of organizing, which is reflected in his campaign strategy. As his national field director Temo Figueroa said: “We decided we didn't want to train volunteers, we wanted to train organizers … folks who can fend for themselves” (Clayton 2010: 139). According to Obama, real change comes from the bottom up. The objective was to empower volunteers to make decisions at the local level, skipping the micromanagement of the how-to-campaign guide. His mission to use the internet as the basis for a grassroots campaign fitted well with the appeal to young people – the digital natives who know their way around the internet – and the demand for Change. These people, young, excited and usually with a low voter turnout, were given the tools to organize, to participate and to spread the Change.

By 2008, the internet was a fully institutionalized medium. It offered a wide variety of free services, such as games, sharing platforms and news. The internet is a space in which news and breaking stories follow each other rapidly, where you can befriend someone living 5,000 kilometers away or, as Castells puts it, the internet offers a new way of looking at time and space, captured in his concepts of the Space of Flows and Timeless Time (Castells 1998).

In 2004, presidential candidate Howard Dean paved the way for using the internet as a medium to raise money. The Dean campaign employed blogger outreach, online mobilization, online fundraising, social networking and niche outreach in its 2004 primary campaign (Johnson 2009: 153), but because of poorly instructed volunteers, the results were not as expected, and Dean finished third (Clayton 2010: 143). As Joe Trippi, Howard Dean’s campaign manager, said: “I like to say that we at the Dean campaign were the Wright Brothers. We put this rickety thing together and got it off the ground” (Johnson 2009: 153). The Obama campaign perfected the online strategy with the help of tools such as Facebook, MyBO, MySpace, Twitter and Youtube.

Communication over the social networking sites Facebook and MySpace became an important part of the campaign. Unlike in the McCain campaign, Obama’s new media department was not part of the campaign’s tech team but a separate team whose leader, Joe Rospars, reported directly to campaign manager David Plouffe (Hendricks & Denton 2010: 56). In 2004, the active online volunteer Joe Anthony set up the MySpace page barackobama. Over the next two years, Anthony published, improved and maintained information relating to Barack Obama. When Obama announced his candidacy in 2007, the page already had 30,000 friends. The campaign team and Anthony worked on the page in close collaboration and used it as the official Barack Obama MySpace page, but the idea of a volunteer having control over such an important part of the campaign displeased a few members of the team. When Anthony rejected an offer of $39,000, the campaign team went directly to the MySpace administrators and was given control of the page. Although this was not the campaign's finest moment, it illustrates the importance of social tools, and of controlling them, in the context of campaigning. At the end of the 2008 campaign, the Obama MySpace page had more than 800,000 friends, against 200,000 for McCain (Hendricks & Denton 2010: 56-8).

When Barack Obama announced his candidacy on February 10, 2007, Farouk Olu Aregbe, affiliated with the University of Missouri, started the Facebook group “One Million Strong For Barack” (Hendricks & Denton 2010: 57). The number of members rose exponentially to almost two hundred thousand within one week. Hundreds of groups supporting Obama's candidacy were created at this time, but none reached the number of members that Aregbe's group did. In the end there were almost five hundred unofficial Facebook groups, and close to 2.4 million people ‘friended’ Obama, compared to six hundred thousand friends for McCain.

Another major advantage of the Obama campaign was the recruitment of Chris Hughes, one of the founders of Facebook. He joined the campaign team in 2007 and became responsible for the creation of a separate social networking site, My.barackobama.com, launched in 2007. Members of MyBO, as the site was called by campaign members, could participate in events and fundraisers for Obama. At the same time they were encouraged to talk to undecided voters. Hughes called MyBO “the connective tissue” (Johnson 2009: 156). In contrast to the opponent’s McCainSpace, MyBO was set up to be a fundamental part of the campaign, used both to target voters and to organize get-out-the-vote activities. Members had conversations with each other instead of receiving information through a top-down system: “The system was designed to harness all of the excitement and energy ... of a charismatic online candidate and channel that energy into real activities that met the goals of the campaign” (Johnson 2009: 156). The idea of a grassroots campaign was reflected in the design of MyBO. In some areas tens of thousands of volunteers had already organized fundraisers or campaign activities before a paid campaign staffer even showed up. Almost 75 thousand offline events were organized by more than a million members with the help of MyBO. The Obama campaign raised nearly 650 million dollars, compared to McCain's 360 million. More important, these donations were relatively small (almost always under 50 dollars), but by reaching a large base, the pool of donors was large.

Youtube, which is part of the online multinational Google, became a vital channel for the campaign. In 2006 Youtube launched YouChoose, a section of the site devoted to videos of candidates running for various offices. Candidates were encouraged to start a channel and contribute videos to reach people, and viewers could donate to a candidate via Google Checkout, a payment service. At the same time, Youtube started to host presidential debates (together with CNN). Obama ended with over 1800 videos posted on his Youtube channel, while McCain had 330. Sites such as BarelyPolitical.com started to post videos relating to Obama, such as the famous “I Got a Crush on Obama” song, which was viewed more than sixty million times. Artists also participated by posting songs on Youtube expressing their support for Obama, among them will.i.am of the Black Eyed Peas and Jay-Z.

MySpace, Facebook, MyBO and Youtube played a major role in reaching the youth vote. By popularizing the elections, and especially the suggestion to vote for Obama, the campaign was able to win this crucial part of the electorate from its opponent. However, as Johnson suggests, this major role of new media is not the only reason for the high proportion of youth votes for Obama. Other factors were the momentum from 2004-2006, when youngsters ‘doubled, tripled, and even quadrupled their turnout’; the outreach of youth-vote organizations such as Rock the Vote, which allowed voters to register online, and the New Voters Project, which actively promoted the participation of students in politics; the outreach from the political campaigns themselves, for example by hiring youth outreach directors (for Obama this was Leigh Arsenault); the organization of Camp Obama to let supporters actively participate in the grassroots campaign; and the above-mentioned use of social media to reach the youth (Johnson 2009: 110-122). Obama received 15 million votes in the 18-29 age range, the same number of votes this range produced in the entire 2000 elections. This support put him over the top in crucial states such as Indiana, North Carolina and Iowa, helping him reach 53 percent of the vote nationwide and 66 percent of the youth vote against 31 percent for McCain (Pew Research 2008).

Though all these social media campaigning strategies turned out to be profitable (and dramatically changed the way campaigns are set up), there is a hidden layer in Obama's campaign strategy. The keyword of this hidden layer is data. Although a grassroots campaign appeared to be going on, with the help of data people could be traced and placed in certain categories. As mentioned in the introduction of this chapter, the campaign was able to target voters precisely. The hidden layer of the campaign consisted of big data and algorithmic analysis. The next part will go into the data collected by the campaign – for example, through its own social networking site MyBO or bought from Google and other commercial companies – and how this data was used to target specific people for specific purposes.

Metamarketing

As a two-party democracy, the US has a history of fierce campaign battles between candidates. The need to know the voters is increased by the winner-takes-all principle: when the popular vote in a state tips to one side, all the electors of that state vote for that candidate. The pressure on candidates to publicize themselves is huge, but they find outlets in the different media platforms available. Media have always influenced voters: initially through the printed press (aided by improvements in transportation), then through radio, television and the internet. Television became the most important medium for reaching voters in the 1950s. It was a medium of entertainment and information, standing in every household. The expansion of television made presidential campaign workers aware of its possibilities and led to the first televised debate between two candidates: the famous Kennedy vs. Nixon debate of 1960. This was also the time when (negative) commercials were introduced into US households through television. The continuous interplay between campaigns and the media lasted until the introduction of the internet. By sending particular messages into the households of certain people, keeping age and geographical location in mind, targeting became possible by means of commercial placement. Information about which households needed to be targeted, and where, became essential for campaign workers trying to reach voters (Maarek 2011). It is this information about voters, in combination with big data, that changed the way Obama campaigned.

There are different ideas about big data. Looking at the concept, one might ask: if there is big data, what counts as small data, or medium data? One could say that big data is a hype, a buzzword. Seen from a business perspective, big data can be described according to the three V’s (Zikopoulos & Eaton 2011): Volume (kilobytes, megabytes, gigabytes, terabytes etc.), Variety (movies, blog posts, interlinks, metadata etc.) and Velocity (realtime, near realtime etc.). Commercial companies such as Google, Facebook and Microsoft, but also lesser-known companies such as LexisNexis and Acxiom, collect and sell data (or services relating to collected data). Google is able to collect and label what you search for on its website, on Youtube or in Gmail, because these are all Google services. By analyzing what is searched, at what time and from where, and by cross-linking this information, Google is able to create a profile of the person searching. This profile can be used to target consumers with advertorials. With the help of share buttons, such as Facebook’s Like button, and cookies (small temporary files stored on your computer, which for instance pre-fill the login box when you want to log in to Facebook), these companies can track and trace you all over the web. By analyzing this information, they know which sites you visit and what you do on the web. A great example of the possibilities of big data is the Google Flu project, in which searches for flu symptoms are linked to geographical location; Google can predict weeks in advance when and where a flu epidemic will occur in a country. Considering that Facebook now owns WhatsApp and that Google’s browser Chrome is used by more than half of all internet users (W3Schools), these companies are able to track the preferences of, and target their services to, almost everyone.
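
As a rough illustration of what such cross-linking of search, time and location data can look like in practice, the sketch below aggregates a handful of hypothetical log events into a per-user interest profile that an advertising system could match against. The event format, the keyword-to-interest mapping and the matching rule are all invented for this example and are far simpler than what the companies mentioned above actually run.

    from collections import defaultdict

    # Hypothetical log events: (user, service, text, hour, city)
    events = [
        ("u1", "search", "running shoes", 7, "Amsterdam"),
        ("u1", "youtube", "marathon training schedule", 21, "Amsterdam"),
        ("u1", "search", "flight to Chicago", 22, "Amsterdam"),
        ("u2", "gmail", "newsletter: photography basics", 10, "Utrecht"),
    ]

    # Crude keyword-to-interest mapping, purely for illustration
    interests = {"running": "sports", "marathon": "sports",
                 "flight": "travel", "photography": "hobbies"}

    profiles = defaultdict(lambda: defaultdict(int))
    for user, service, text, hour, city in events:
        profiles[user]["city:" + city] += 1                      # location signal
        profiles[user]["evening" if hour >= 18 else "daytime"] += 1
        for keyword, topic in interests.items():
            if keyword in text.lower():
                profiles[user]["interest:" + topic] += 1         # content signal

    # An advertiser buying the 'sports' audience would be matched like this:
    for user, profile in profiles.items():
        if profile["interest:sports"]:
            print(user, "-> show sports advertorial", dict(profile))

The point of the sketch is only the mechanism: individually trivial traces, once cross-linked, add up to a profile that can be sold to whoever wants to reach that person.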

The idea of big data has provoked a diverse set of reactions in the academic and public world. The idea of having a big pile of data that you can search according to your needs has influenced the way science is done. Chris Anderson, a writer for WIRED, published an article in 2008 called The End of Theory (Anderson 2008), in which he predicts a future of non-hypothesis science, where data will give the answer to every question: hypotheses made up front no longer count, because the data itself will reveal whatever is interesting. Lev Manovich (2011: 5) argues that this is not the case, but that big data will create a digital divide between those who have access to the data, for example researchers within Facebook and Google, and those who do not. According to Manovich, you still have to find a way to analyze the data; a big pile is not enough. Though there are more definitions of big data, and the academic discussion of big data and related subjects continues with authors like Shirky (2009), Morozov (2013) and Mayer-Schönberger and Cukier (2013), for the sake of this paper the definition of big data – from now on simply data – will be Manovich’s: “Big Data is a term applied to data sets whose size is beyond the ability of commonly used software tools to capture, manage, and process the data within a tolerable elapsed time. Big data sizes are a constantly moving target currently ranging from a few dozen terabytes to many petabytes of data in a single data set”. This definition will help the reader understand the journey of Obama’s data crunching.

Seventy-two hours before the last polling stations closed, George W. Bush was ahead in the polls of the 2000 presidential elections, which makes it all the more remarkable that the election required a special recount procedure and that Bush ultimately won with the help of the Supreme Court. A special GOP delegation started a task force to examine this odd outcome, and eventually advised intensifying the practice of micro-targeting. This meant that people needed to be approached in a clear, non-political way that could differ per person or per group (Baker 2011: 100-128). In the 2004 elections, a substantial part of Bush's budget went to voter surveys: what was going on among the people, where was more targeting needed, which persons were eligible targets according to the campaign's principles, and which persons were likely to be converted into Bush voters? With the help of focus groups, the Bush campaign started to survey segments of society.

In 2005, inspired by the micro-targeting activities of the GOP and the re-election of Bush, Joshua Gotbaum, a former member of the Carter and Clinton campaign staffs, started Spotlight Analysis and began cooperating with Yankelovich, a consumer-survey company. By buying commercial data, such as that of LexisNexis and Acxiom, they created a database in which they could search for certain characteristics of citizens, such as financial records, magazine subscriptions and the value of the house they live in. The keyword of targeting became proxies. With the help of the purchased database, they could categorize people according to superficial facts, commonly known in society as prejudices (‘people who buy organic products tend to favour a more social state’), which they turned into proxies (or simply ‘searchable variables’). By cross-analyzing these proxies with each other, in combination with previous voting records and focus groups, they were able to divide the whole database into five categories, ranked by the likelihood of voting for the Democratic Party. The middle groups, in comparison with the two outer groups, were the most interesting, because these are the people open to political change; in total they formed 47 percent of all the people in the database. By cross-analyzing the proxies they were able to assign each voter to a category with a 25 percent error rate. They were now able to target certain groups for certain purposes, as the GOP had done with surveys. The difference was that Gotbaum let algorithms do the work previously done by surveyors and focus groups.
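
To make the proxy mechanism tangible, the following sketch turns a few invented commercial attributes into weighted proxies and sums them into a crude five-category segmentation of likelihood to vote Democratic. The attributes, weights and cut-offs are hypothetical and far cruder than Spotlight's actual models; what matters here is only the mechanism of cross-analyzing proxies into categories.

    # Invented consumer records of the kind bought from commercial data brokers.
    voters = [
        {"id": 1, "buys_organic": True,  "magazine": "outdoor", "home_value": 180_000},
        {"id": 2, "buys_organic": False, "magazine": "hunting", "home_value": 420_000},
        {"id": 3, "buys_organic": True,  "magazine": "finance", "home_value": 650_000},
    ]

    def proxy_score(v):
        """Sum weighted proxies; every weight below is an invented prejudice."""
        score = 0.0
        if v["buys_organic"]:
            score += 1.5        # 'organic buyers tend to favour a more social state'
        if v["magazine"] in ("outdoor", "culture"):
            score += 0.5
        if v["magazine"] in ("hunting", "finance"):
            score -= 1.0
        if v["home_value"] > 500_000:
            score -= 0.5
        return score

    # Five categories, from 'almost certainly Republican' to 'almost certainly Democratic'.
    labels = ["solid R", "lean R", "persuadable", "lean D", "solid D"]
    cutoffs = [-1.5, -0.5, 0.5, 1.5]    # invented category boundaries

    for v in voters:
        s = proxy_score(v)
        category = labels[sum(s > c for c in cutoffs)]
        print(v["id"], category, round(s, 2))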

By the time Obama was the official Democratic candidate for the presidency, 10,000 people were called and surveyed every week, followed by 1,000 people who took a longer and more intensive survey (Issenberg 2012). These people formed a sample of society. With the help of this sample, every piece of data that could be found about a person was compared with the sample group and categorized: every potential voter received a number between 1 and 100. The groups 1-10 and 90-100 were not interesting, because the campaign already knew they would vote for either McCain or Obama. The people with a score between 55 and 75 received the most attention from the campaign (Issenberg 2012 and Beckett 2012). Obama’s focus was on the group of voters who had voted for Bush in 2004 but had started to dislike the government after hurricane Katrina and the ongoing wars in the Middle East. Making up nine percent of all voters, these people placed a higher than average value on family values. By focusing on this group, Obama was able to win three important swing states, though it is impossible to prove that this was achieved by data alone.
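
A heavily simplified sketch of this scoring logic is given below: a model is fitted on the surveyed sample, where stated support is known, and then applied to the rest of the voter file to produce a score between 1 and 100, after which the campaign pulls out the 55-75 band for persuasion contact. The features, numbers and library choice (scikit-learn) are mine, not the campaign's; the real models combined hundreds of commercial and political variables.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Surveyed sample: invented proxy features per respondent
    # [age, owns_home, voted_2004, magazine_subscriptions]
    X_sample = np.array([[22, 0, 0, 1], [35, 1, 1, 0], [58, 1, 1, 3],
                         [29, 0, 1, 2], [44, 1, 0, 0], [63, 1, 1, 1]])
    y_sample = np.array([1, 0, 0, 1, 1, 0])   # 1 = expressed support in the survey

    model = LogisticRegression().fit(X_sample, y_sample)

    # Remaining voter file (also invented); score each voter from 1 to 100
    voter_file = np.array([[24, 0, 0, 2], [51, 1, 1, 1], [37, 1, 0, 0], [68, 1, 1, 4]])
    scores = (model.predict_proba(voter_file)[:, 1] * 100).round().astype(int)

    # The campaign concentrated its attention on the persuadable 55-75 band
    for features, score in zip(voter_file, scores):
        priority = "contact for persuasion" if 55 <= score <= 75 else "low priority"
        print(features.tolist(), "score", score, "->", priority)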

The collection of data became a goal in itself. The reason MyBO received so much attention from the campaign may have been that it represented the grassroots idea of the campaign, but it can also be seen from the perspective of data collection. Everyone who participated on the MyBO platform generated data: at what time they were online, how active they were, to whom they talked, where they lived, and so on. By analyzing this information and combining it with, for example, donation records or Facebook profiles, the campaign had a real-time picture of every voter, allowing it to categorize them and target them – to get their vote, their participation in the campaign or their donation. Former Google employee Dan Siroker introduced A/B testing: showing different versions of the website to different visitors of the Obama site. By measuring how long visitors stayed, which pages they viewed and how many donations each layout generated, the team could optimize the website to bring in as many voters and donations as possible (Murphy 2012). In doing so, the campaign database grew by a factor of ten: “accruing 223 million new pieces of info in the last two months of the campaign alone, the issue was integrating and accessing it” (Murphy 2012).
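
The A/B testing that Siroker introduced boils down to randomly assigning visitors to one of several page variants and comparing their conversion rates. The sketch below performs that comparison on invented numbers, including a simple two-proportion z-test to judge whether the observed difference is more than noise; the variant names and figures are hypothetical and do not come from the campaign's records.

    from math import sqrt

    # Invented results of an A/B test on two splash-page layouts
    variants = {
        "layout_A": {"visitors": 10000, "donations": 420},
        "layout_B": {"visitors": 10000, "donations": 505},
    }

    for name, v in variants.items():
        print(name, "conversion rate:", round(v["donations"] / v["visitors"], 4))

    # Two-proportion z-test: is layout B's lift likely to be real?
    a, b = variants["layout_A"], variants["layout_B"]
    p1 = a["donations"] / a["visitors"]
    p2 = b["donations"] / b["visitors"]
    p_pool = (a["donations"] + b["donations"]) / (a["visitors"] + b["visitors"])
    se = sqrt(p_pool * (1 - p_pool) * (1 / a["visitors"] + 1 / b["visitors"]))
    z = (p2 - p1) / se
    print("z =", round(z, 2), "(|z| > 1.96 suggests the difference is significant)")

In a real deployment the winning layout would then be served to all visitors, and the cycle would start again with a new candidate variant.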

Obama won the election, becoming the first African-American president in history. In his victory speech that night, Obama spoke to more than 100,000 people, emphasizing the collaborative spirit and the grassroots philosophy of the campaign, the organizers, the volunteers and the campaign team that made the victory possible, emphasizing the 'we':

I was never the likeliest candidate for this office. We didn't start with much money or many endorsements. Our campaign was not hatched in the halls of Washington. It began in the backyards of Des Moines and the living rooms of Concord and the front porches of Charleston. It was built by working men and women who dug into what little savings they had to give $5 and $10 and $20 to the cause. It drew strength from the young people who rejected the myth of their generation's apathy, who left their homes and their families for jobs that offered little pay and less sleep. It drew strength from the not-so-young people who braved the bitter cold and scorching heat to knock on doors of perfect strangers, and from the millions of Americans who volunteered and organized and proved that more than two centuries later a government of the people, by the people, and for the people has not perished from the Earth. This is your victory (CNN 2008).


The 2012 campaign and improved micro targeting

In 2010 the Democratic Party absorbed the worst midterm loss since 1938: a six-seat loss in the Senate, 63 seats lost in the House, while the Republicans captured a majority in 26 states and in 21 states also held the governor's seat. "I'm not recommending for every future president that they take a shellacking like I did last night. I'm sure there are easier ways to learn these lessons", said the president the day after the election. Obama's response was to talk to outside advisors, people who had a track record but were not involved in the political landscape at that time. This group included Tom Daschle, former Senate majority leader; Leon Panetta, Obama's first CIA director; and David Gergen, who had advised presidents of both parties and was now at Harvard. The advice included criticism of his relationship with Capitol Hill, in particular his habit of only contacting senators and representatives when he needed something done, and of his failure to change the culture of politics in Washington. These conversations were, according to Balz, only part of his presidential reorientation. Obama needed to negotiate over the Bush-era tax cuts, an arms treaty with the Russians and the Don't Ask, Don't Tell policy of the Pentagon. Obama got them through, with the help of his Vice President, and when he departed for Hawaii for Christmas, one member of the traveling party said he was "as happy as I've ever seen him" (Balz 2013: 35-9). Mitch Stewart said about the midterm loss:

Failure is always a better teacher than success, and 2010 was tough. We learned tactically some lessons, but ultimately I think what probably helped us more than anything else is a lot of our volunteers and staff had only been involved in the '08 campaign, which was a lot of highs, and unless you were very early on in the process like I was, there weren't a lot of lows. So 2010 was a good learning experience just in that [it showed us] this isn't all rainbows and bubblegum. I think it actually helped harden some of our volunteers and staff to prepare for 2012 (Balz 2013: 80).


Jim Messina was one of the people who joined Obama on his trip to Hawaii. In the summer of 2008, when David Plouffe was on the verge of a burnout, Messina had joined the Obama campaign team. In 2011 he was sent by Obama to Chicago to lead the 2012 re-election campaign. Messina proposed not to rerun the 2008 campaign, pointing to Carter and H.W. Bush, who failed to win re-election, in contrast with the campaigns of Clinton and W. Bush, who did. But Obama did not want to lose his connection to the people and wanted to run a grassroots campaign for the second time (Balz 2013: 40). Messina presented a list of five objectives that needed to be accomplished before the elections. The third objective was to improve the use of technology in the campaign, a statement that surprised the president, as he was widely credited, nationally and internationally, for the use of technology in his 2008 campaign. But Messina was obsessed with technology and how it had changed over the previous years. Facebook had become the major platform, MySpace was gone. Smartphones were the major symbol of change, and this needed to be incorporated into the campaign. Eric Schmidt, former Google CEO, became a key advisor on everything: "how to manage a start-up, to the kind of computer platforms to set up, to the most efficient placement of online advertising" (Balz 2013: 41). As Schmidt said to Balz: "Jim is extremely analytical, so what he wanted was a technology base and analytical base to make decisions. For advertising, he would like to have a scientific basis for where you put the money, and so we worked at some length to try to understand what kind of marketing made sense" (Balz 2013: 41).

After Obama's 2008 victory, a five-hundred-page document filled with recommendations was circulated among prominent figures of the campaign, proposing changes and improvements to an already state-of-the-art campaign. Mitch Stewart, who won Virginia in 2008 as a campaign organizer, said:

We did very detailed postmortem where we looked at all kinds of numbers, looking at the general stuff like the number of door knocks we made, phone calls we made, number of voters that were registered. But then we broke it down by field organizer, we broke it down then by volunteer. We looked at the best way or the best examples in states of what their volunteer organization looked like (Balz 2013: 75-6).
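
As an illustration only, the kind of breakdown Stewart describes amounts to aggregating raw contact records at increasingly fine levels; the records and field names below are invented:

```python
# Hypothetical sketch of a postmortem breakdown: raw contact records are
# counted per state, then per field organizer, then per volunteer.

from collections import Counter

contacts = [
    {"state": "VA", "organizer": "organizer-1", "volunteer": "vol-17", "type": "door_knock"},
    {"state": "VA", "organizer": "organizer-1", "volunteer": "vol-17", "type": "phone_call"},
    {"state": "VA", "organizer": "organizer-2", "volunteer": "vol-30", "type": "door_knock"},
]

def breakdown(records: list, level: str) -> Counter:
    """Count contact types per state, per organizer or per volunteer."""
    return Counter((record[level], record["type"]) for record in records)

print(breakdown(contacts, "state"))      # overall numbers per state
print(breakdown(contacts, "organizer"))  # broken down by field organizer
print(breakdown(contacts, "volunteer"))  # broken down by volunteer
```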

The first step was to connect all the data. By 2012 three different databases of voter information existed, each used by a different part of the campaign. Operation Narwhal was initiated to link the database of the fieldworkers, the campaign database and the party database: "we realized there was a problem with how our data and infrastructure interacted with the rest of the campaign, and we ought to be able to offer it to all parts of the campaign" (Issenberg 2012). An external party had developed and stored the MyBo information, while the information of the fieldworkers was stored in the Build the Hope database. Together the databases contained 170 million potential voters and three million volunteers and donors, but the campaign and the party had no idea which voters, volunteers or donors overlapped. Operation Narwhal unified the databases and provided information per voter, which was used to micro target. Messina:

What I wanted was, I didn’t care where you organized, what time you organized, how you organized, as long as I could track it, I can measure it, and I can encourage you to do more of it. So what I said to them was I want all of our data together, I want for the first time to treat [a voter] like a voter and not like a number, because right now you’re just a voter number, your voter ID number in your state, your FEC number for how much you contribute, your census data, whatever we know about you from commercial vendors. But we don’t treat [a voter] like a person (Balz 2013: 77).

Operation Narwhal led to a massive database that formed the foundation on which the 2012 campaign could be built, and to the second step in improving the campaign's use of technology: Dashboard.
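
The unification problem Narwhal solved is essentially record linkage: recognizing that the same person appears in the voter file, the volunteer database and the donor database under different identifiers. A deliberately simplified sketch, with invented records and a crude name-plus-zip matching key standing in for the real matching techniques:

```python
# Simplified record-linkage sketch: merge voter, volunteer and donor records
# that refer to the same person into one unified profile.

from collections import defaultdict

voter_file = [{"name": "Jane Q. Public", "zip": "43215", "voter_id": "OH-123"}]
volunteers = [{"name": "jane q public",  "zip": "43215", "shifts": 12}]
donors     = [{"name": "Jane Public",    "zip": "43215", "total_usd": 45}]

def key(record: dict) -> tuple:
    """Crude matching key: lowercased surname plus zip code."""
    surname = record["name"].lower().replace(".", "").split()[-1]
    return (surname, record["zip"])

profiles = defaultdict(dict)
for source, records in [("voter", voter_file), ("volunteer", volunteers), ("donor", donors)]:
    for record in records:
        profiles[key(record)][source] = record

for person, sources in profiles.items():
    print(person, "->", sorted(sources))  # one unified profile per person
```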

"Dashboard is what we needed to communicate", Jennifer O'Malley Dillon said. Dillon had taken over the Obama field organization and renamed it Organizing for America after the 2008 elections. She and Messina wanted a simple-to-use program that would allow everyone to communicate easily and seamlessly. Dashboard needed to be the main hub of communication, both among volunteers and between volunteer and campaign manager. It was hard to develop, partly because the engineers building the system lacked the experience of being a field organizer. Some were even sent into the field to better understand the needs of field organizers. Dashboard needed to be the field organizer's office, but online. Harper Reed, one of the lead programmers, said:

When you walk into a field office, you have many opportunities. We’ll hand you a call sheet. You can make calls. You can knock on doors, and they’ll have these stacks there for you. They’ll say, ‘Harper, you’ve knocked on fifty doors. That’s great. Here’s how you compare to the rest of them.’ But it’s all very offline. It’s all very ad hoc and it’s not very modern. And so what we set out to do was create that offline field experience online (Balz 2013: 78).
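
Purely as an illustration of the "offline field experience online" idea in Reed's description, a minimal sketch of Dashboard-style activity tracking, with invented names and structures:

```python
# Sketch of tracking a volunteer's door knocks online and comparing the
# total with the rest of the office, like the call-sheet tallies Reed describes.

from collections import defaultdict
from statistics import mean

door_knocks = defaultdict(int)

def log_door_knock(volunteer: str) -> None:
    """Record one completed door knock for a volunteer."""
    door_knocks[volunteer] += 1

def progress_report(volunteer: str) -> str:
    """Compare a volunteer's total with the office average."""
    office_average = mean(door_knocks.values())
    return (f"{volunteer}: {door_knocks[volunteer]} doors knocked "
            f"(office average {office_average:.1f})")

for _ in range(50):
    log_door_knock("Harper")
for _ in range(30):
    log_door_knock("Alex")
print(progress_report("Harper"))  # -> "Harper: 50 doors knocked (office average 40.0)"
```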

Next to Narwhal and Dashboard, there was a third project, involving Facebook. The campaign had learned from the re-election of Bush in 2004 that people who are not actively involved in or interested in politics become more interested when approached by a relative or someone they know personally than when approached by an unknown volunteer. When Facebook invited the campaign in early 2011, hoping to encourage it to spend some of its millions in campaign funds on Facebook ads, Messina offered not to buy ads but to work together:

We started saying, 'Okay, that's nice if we just advertise'. But what if we could build a piece of software that track all this and allowed you to match your friends on Facebook with our lists and we said to you, 'Okay, [so-and-so] is a friend of yours, we think he's unregistered, why don't you get him to register? Or [so-and-so] is a friend of yours, we think he's undecided. Why don't you get him to be decided?' And we only gave you a discrete number of friends. That turned out to be millions of dollars and a year of our lives. It was incredibly complex to do (Balz 2013: 78).

The result was called Targeted Sharing. With permission, Dashboard could see one's Facebook friends and compare their data with the campaign's lists to produce a list of the three or four people who would benefit most from being persuaded by someone they knew to vote for Obama. The campaign knew who was and who was not registered to vote, it knew which of these friends had a low propensity to vote, and it knew who was solidly behind Obama. They proposed a gentle nudge in the right direction. Eric Schmidt: "If you don't know anything about campaigns, you would assume it's national, but a successful campaign is highly, highly local, down to the zip code. The revolution in technology is to understand where the undecideds are in this district and how to reach them" (Balz 2013: 79).
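
A hypothetical sketch of the matching step behind Targeted Sharing: a supporter's friend list is compared against the campaign's voter file, and only a handful of the most valuable personal asks are suggested. The data structures and the ranking heuristic are invented for illustration.

```python
# Sketch of friend-to-voter-file matching and ranking for personal outreach.

from dataclasses import dataclass

@dataclass
class VoterRecord:
    name: str
    registered: bool
    turnout_propensity: float   # 0.0 = unlikely to vote, 1.0 = certain
    support_score: int          # 1-100, higher = more likely Obama

def suggest_friends(friend_names, voter_file, limit=4):
    """Return the handful of friends a supporter should personally contact."""
    candidates = []
    for name in friend_names:
        record = voter_file.get(name)
        if record is None:
            continue
        if not record.registered:
            candidates.append((0.9, name))            # registration ask first
        elif record.support_score >= 55 and record.turnout_propensity < 0.5:
            candidates.append((record.support_score / 100, name))  # turnout nudge
    # Highest-value asks first, capped at a discrete number of suggestions.
    return [name for _, name in sorted(candidates, reverse=True)[:limit]]

voter_file = {
    "Alice": VoterRecord("Alice", registered=False, turnout_propensity=0.8, support_score=70),
    "Bob":   VoterRecord("Bob",   registered=True,  turnout_propensity=0.3, support_score=65),
    "Cara":  VoterRecord("Cara",  registered=True,  turnout_propensity=0.9, support_score=95),
}
print(suggest_friends(["Alice", "Bob", "Cara"], voter_file))
# -> ['Alice', 'Bob']; Cara is a reliable supporter and needs no nudge
```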

Not only online services supplied information to the databases. The majority of the campaign budget was used to buy time on television. In 2008, Obama spent almost 300 million dollars to broadcast a total of almost 550,000 commercials over the whole campaign period, with a focus on the swing states. In 2012, the campaign developed its own television audience survey system, called the Optimizer. A day was divided into 96 blocks of 15 minutes, and the system analyzed what was shown to which audience at what time on which channel: "The revolution of media buying in this campaign was to turn what was a broadcast medium into something that looks a lot more like a narrowcast medium" (Issenberg 2012). The audience information provided by commercial companies included geographical information and the age of the audience, but the Obama campaign needed more information to place viewers into separate categories (Issenberg 2012). Rentrak, a television-measurement company working with cable providers, supplied the campaign with specific information about viewers. To get around privacy laws, the campaign had to provide information on possible voters, which Rentrak compared with its own database; when records matched, Rentrak reported what was watched at which moment of the day, with personal information censored. In September a second data source was added (Buckeye CableSystem, a cable provider), which provided per-second information per household (Farnam 2012 and Issenberg 2012). This way the campaign found out that the voters it wanted to target, the ones who might still vote for Obama, did not watch national television but were watching niche channels such as The Food Channel or Hallmark.
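
A minimal sketch of the narrowcasting logic behind such a system, under the assumption that each fifteen-minute block per channel has already been scored against the campaign's target list; slot names, costs and audience numbers are invented.

```python
# Sketch of narrowcast media buying: rank fifteen-minute ad slots by cost per
# persuadable viewer and spend the budget on the cheapest reach first.

from dataclasses import dataclass

@dataclass
class AdSlot:
    channel: str
    block: int                 # 0..95, fifteen-minute block of the day
    cost_usd: float
    persuadable_viewers: int   # matched against the campaign's target list

def cheapest_reach(slots, budget):
    """Greedily buy the slots with the lowest cost per persuadable viewer."""
    ranked = sorted(slots, key=lambda s: s.cost_usd / max(s.persuadable_viewers, 1))
    bought, spent = [], 0.0
    for slot in ranked:
        if spent + slot.cost_usd <= budget:
            bought.append(slot)
            spent += slot.cost_usd
    return bought

slots = [
    AdSlot("national network", 80, 250_000, 40_000),  # prime time, broad audience
    AdSlot("niche cable",      88, 4_000,   6_000),   # late night, targeted audience
    AdSlot("niche cable",      36, 2_500,   3_500),   # daytime rerun block
]
for slot in cheapest_reach(slots, budget=10_000):
    print(slot.channel, slot.block, slot.cost_usd)
```

In the actual campaign, the scoring of each block against the target list came from the matched set-top box data described above, which is what made the niche channels stand out against expensive national airtime.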
