The Facebook Governance and EU Privacy Laws Battle: Mapping Privacy Rhetorics

MA Thesis New Media & Digital Culture

Thesis Supervisor: Esther J. T. Weltevrede
Second Reader: Lonneke van der Velden

Author: Charlotte S. L. Leclercq
Student Number: 11313544

Abstract

In 2015, the European Commission launched the Digital Single Market strategy, which entailed a revision of the legal framework for privacy and data protection. This has provoked many reactions among digital businesses and digital rights activists. However, the definition of “privacy” in the digital age remains uncertain and complex. Taking Facebook as a case study, this thesis aims to investigate the discrepancies between the privacy rhetorics of the European Union and Facebook, and to research how the public discourse around privacy is shaped. To this end, an actor-network theory approach is taken, complemented with insights from platform studies. This is operationalized through digital methods tools and an interface analysis. Significant differences between the two rhetorics were found. On the one hand, the EU has used throughout the years a very homogeneous definition of privacy in which data protection is central. Facebook, on the other hand, positioned its definition of privacy differently depending on the audience it addressed.

Key words

Privacy – discourse – policy – Facebook – data economy – issue mapping – platform studies

Table of Contents

1. INTRODUCTION ... 5

2. CASE STUDY ... 9

2.1. On the way to a better privacy protection within the European Digital Single Market ... 9

2.2. Facebook, the social network ... 11

3. THEORETICAL FRAMEWORK ...14

3.1. Privacy at stake ... 14

3.1.1. A history of definitions and technology ... 14

3.1.2. Exposure and publicity ... 16

3.1.3. Surveillance and power ... 18

3.1.4. Privacy as data protection ... 20

3.2. Cultural and policy approaches ... 21

3.3. The disruptive infrastructures governing the digital society: platforms and Big Data ... 23

4. RESEARCH DESIGN ...26

4.1. Methodological approach: Actor-Network Theory and Platform Studies ... 26

4.2. Data collection and analysis ... 28

4.2.1. Scraping and crawling through data ... 29

4.2.2. Tracing privacy on Facebook through an interface analysis ... 31

5. RESEARCH FINDINGS ...33

5.1. The European Union staging their privacy issue ... 33

5.2. Understanding Facebook’s privacy from a user’s perspective ... 35

5.3. What makes the news talk about privacy on Facebook/ Facebook’s problem with Europe ... 44

6. DISCUSSION ...53

7. CONCLUSION ...57

8. BIBLIOGRAPHY ...59

APPENDIX 1 – Documents of the European Union ...63

APPENDIX 3 – Analysed Newspaper Articles from Google Results ...64


1. INTRODUCTION

At the very beginning of 2000, Simson Garfinkel published a book he called ‘Database Nation: The Death of Privacy in the 21st Century.’ Fifteen years later, the European Commission set (online) privacy and data protection at the top of its priorities. At the time that Garfinkel wrote his book, he and his contemporaries were thinking about the possible realisation of Orwell’s Big Brother state and probably did not even imagine that the most powerful companies of the century would base their revenue on tracking and collecting citizens’ information. In the digital age, privacy issues have raised concerns among scholars, human rights activists and governments because of their new dimension: giving away personal data in exchange for services (Kleiner 84).

In May 2015, within the context of the launch of the Digital Single Market strategy and the evaluation of the legal data protection framework, the European Commission announced its wish to revise Directive 2002/58/EC – better known as the ePrivacy Directive or “cookie law” – which provides, on the one hand, the current legal framework for electronic communications and, on the other, protects privacy and regulates data processing within the sector (SMART 2013/0071 p.7). This Directive was originally adopted in 2002 and is based on the data protection principles from 1995. It is therefore said to be complementary to the Data Protection Directive, whose revision took place during the past year. These revisions were undertaken because the current legal framework no longer copes with the latest technological developments enabling mass surveillance, or rather – as argued further on – the capturing and sorting of the social. These terms refer to contemporary conceptualisations of privacy. When talking about surveillance models, one often thinks of Orwell’s Big Brother or Bentham’s panopticon. However, in the last decades, scholars have been rethinking these models and proposing alternatives (Agre 101; Lyon 26), which bring new dimensions – clarified in the theoretical framework – to the privacy “problem.” It is therefore argued that these new dimensions are crucial for understanding when and how one’s privacy might be violated, and what is at stake for him or her.

The leak of a draft version of the new e-privacy proposal in December 2016 provoked a lot of reactions on social media, especially on Twitter, where the hashtag “#eprivacy” was trending for a few days. The topic also got a lot of attention from the European as well as the American press. “A new set of rules, expected to be unveiled by European Union officials on Wednesday, is likely to put new pressure on American tech companies. Europe calls it consumer protection. Silicon Valley calls it protectionism. In some ways, they are both right,” the New York Times wrote when the EC opened the public consultation for the revision of the directive (NYT 13 Sept. 2016).

The revision of the e-privacy law is taking place within a larger framework – as mentioned previously – namely the creation of a “Digital Single Market” for the European Union. To this end, the EU is working on building a strong and homogenized legal framework on the one hand and stimulating research and financing on the other. The ePrivacy Directive complements the General Data Protection Regulation (GDPR), whose revision was published in May 2016. The revision of both the GDPR and the ePrivacy Directive is part of this project. By doing so, the Commission aims to “strengthen trust and give a boost to the data economy” (ePrivacy Factsheet 1). This means that the Commission’s wish to homogenize and build a strong legal framework for online privacy fits within another agenda of the Digital Single Market, namely to strengthen the EU’s position on the digital market. The data economy relies – as its name says – on consumers’ data, and thus consumers’ trust is crucial: if consumers do not feel safe online, fewer online transactions will take place.

Tech and telecom companies operating on the European market are terrified of the outcome and of the consequences the new legal framework might have for their business model. For privacy activists, the revision of privacy laws brings a new wind of hope. Furthermore, visions of online privacy are divergent. For some, if you have something to hide, you might be doing something wrong, while for others online privacy is the precondition of freedom of speech (The Guardian 03/08/14). But the average EU citizen does not really know what is at stake or how to deal with it.

The data economy has emerged through the expansion of connected devices and is used for new technologies, such as Big Data, cloud computing and the Internet of Things (Zech 460). This economy is based on the multi-sided market principle, which is characterized by the presence of two or more sides interacting through a platform (Rochet & Tirole 180), and is therefore often described as a “platform economy.” Common examples of businesses relying on the multi-sided market principle are credit card and video game businesses (Rochet & Tirole 180). Traditional media (TV, newspapers) also have a two-sided market: the first side is the audience, which pays no or only a small fee to access content, and the second side is the advertisers, which pay the media company to place advertising in between the content. This creates a win-win situation in which media consumers can access content for a small fee, and advertisers make use of the media company’s network to reach their audience (Rochet & Tirole 180). Companies providing online services, often referred to as Over-The-Top companies (OTTs), have been absorbing the advertisers from the old media – also based on the two-sided market model – and bringing them to their platforms, the new media, by offering a new technical possibility: personalised advertising. Platforms invite their end-users to share as much as possible in order to connect advertisers to potential customers, with the aim of maximizing the probability of selling products or services. Moreover, they follow every one of the user’s virtual steps in order to build a behavioural profile of him or her. All of this remains invisible on the end-user’s interface (Evans 191; Hashemi 141).

Throughout the years, OTTs have been able to collect tremendous amounts of data. Firms such as Facebook and Google, which count billions of users all over the world – and especially in the US and Europe – have seen their business endangered because of worried governments or whistle-blowers such as Edward Snowden, and have therefore started actively working on keeping users’ trust. By using specific discourses within their policy and lobbying, Internet companies try to participate in shaping policies by “fostering a regulatory paradigm” (Gillespie, “Politics of ‘platforms’” 355-6).

Within this thesis, the case of Facebook is taken in order to understand which meaningful role it has played in the public discourse around online privacy. The following research question is therefore posed: How does the possible discrepancy between the different types of privacy Facebook and the EC refer to influence the public perception of privacy issues?

To answer this question, a mixed-methods approach is adopted, leaning on Latourian issue mapping. By making use of digital methods tools, discursive interface analysis and content analysis, quantitative and qualitative data is collected and analysed. The issue mapping approach was chosen because the present study treats a current and controversial topic, as it has received attention in the media. To understand these controversies, Latour suggests following the traces actors leave on the Internet and observing how an issue is staged by the various actors involved in the controversy. Actors leave traces on different media, in different formats, which is why different research techniques need to be used. Close attention will be paid to two specific and central actors: the European Union and the social networking site Facebook. Digital methods tools are needed to collect the “traces” they leave on the Internet, and a content analysis with a discursive approach is adopted to understand the “rhetorics” that are used around privacy. After this introduction, the case study will be presented, with a first part on the European Commission and a second part introducing Facebook. Afterwards, a literature review of the evolution of the privacy debates within both academia and politics, as well as of the different policy approaches, will be presented. After setting out the theoretical framework, the methodological part is explained. The latter contains a first section on the methodological approach of the research. A second section presents the research design, outlining the different data collection tools and analyses needed to perform the research. The following chapter presents the significant findings resulting from the analysis, which are then discussed and linked to the theory in the discussion chapter. Finally, the conclusion summarises the essential points of the study and reflects on them.


2. CASE STUDY

2.1. On the way to a better privacy protection within the European Digital Single Market

After the creation of a single market, the European Commission is aiming to go a step further and reach a single market for the digital economy. To this end, the Commission designed a plan called the “Digital Single Market Strategy” – or “DSMS” – in which three main pillars are addressed (Zech 460). The first pillar aims at better “access” to digital goods and services across Europe. The second pillar intends to create an “environment” which stimulates the flourishing of digital networks and innovations.

The current complementary e-Privacy Directive from 2002 and the Data Protection Directive are said to be becoming outdated and unable to protect citizens adequately, especially online, because of the latest technological developments (Zuiderveen Borgesius & van Hoboken 198; Eliantonio et al. 398). The Data Protection Directive from 1995 provided basic guidelines for the European Member States. However, new “legal and societal” aspects are challenging this old regulatory framework (Eliantonio et al. 392): the terrorist threat in Europe has made authorities reconsider privacy rights in favour of enhanced security. Another development is that of business and personal data transfers across different Member States and outside of the EU. Also, in the digital society and economy, communication infrastructures have been increasingly digitised and involve the processing of data (Eliantonio et al. 392). This means that privacy is a right that needs to be balanced with other rights or interests, such as security, convenience and the economy, as demonstrated earlier.

The economic dimension of privacy is crucial for the new e-privacy and data protection regulations. The reason for this is that these revisions break down the national barriers created by fragmented policies for privacy and data protection and thus enable the creation of a single market for the EU (Kramer 398). As a result, they will facilitate the flow of personal data across the Union. Kleiner points out the difficulty governments face in finding the balance between privacy and growth: “The European approach, however, is one where both are preserved” (90). This controversial aspect is clearly present in the EC’s proposal for an e-Privacy Regulation.

On the 10th of January 2017, the European Commission released its proposal for a new Regulation on Privacy and Electronic Communications, which revises the ePrivacy Directive (2002/58/EC). These reforms take place within the European Digital Single Market agenda, and they aim to both unify and simplify the legal framework in order to realise a European digital single market. In other words, these revisions constitute an attempt to protect citizens’ rights, by increasing their level of trust and security online, to facilitate and expand digital tools and services, and to support local companies in developing their businesses (Factsheet).

As already mentioned, before the revision of the ePrivacy Directive the European Commission first reviewed the data protection law. This revision resulted in the General Data Protection Regulation (2016/679), which was proposed in December 2015 and adopted in May 2016. Similarly to the new ePrivacy law proposal, the new data protection law entails the strengthening of EU law by replacing the – flexible – directive with a regulation. Thereby, one can wonder whether it is desirable to transfer power in the data protection field from national authorities to European ones, since data protection has often been a national matter (Blume, “The myths pertaining to the GDPR proposal” 269). As part of the creation of the Digital Single Market, a central new element within the new Regulation is the attempt to harmonise the EU legal framework for data protection. This facilitates data transfer within the EU. However, two crucial aspects have been maintained from the Data Protection Directive 95/46. First, the GDPR is still based on a regulatory model in which rules establish the conditions for data processing. Within this model, data subjects do not control their data. Instead, the rules are addressed to the data controllers, the parties that “own” the data and thus need to respect the conditions for processing these data (Blume, “The myths pertaining to the GDPR proposal” 270). Nonetheless, the data protection law provides a powerful tool for data subjects: consent, which is the second aspect that is maintained in the new ruling (Blume, “The myths pertaining to the GDPR proposal” 270). But this tool remains weak because it cannot be used under certain conditions.

The current ePrivacy Directive “ensures the protection of fundamental rights and freedoms, in particular, the respect for private life, confidentiality of communications and the protection of personal data in the electronic communications sector. It also guarantees the free movement of electronic communications data, equipment and services in the Union” (ePrivacy Regulation proposal p. 2). The proposal starts from the Charter of Fundamental Rights, with the fundamental right to respect for one’s private life, home and communications, and intends to apply it to the confidentiality of electronic communications (Art. 1, eP proposal, 11). The proposal for an ePrivacy Regulation includes six key points, which constitute the main changes the Commission wants to introduce:

• new players, the so-called OTTs, which provide electronic communications services like traditional telecom operators, are also covered;

• the directly applicable regulation provides stronger rules for the protection of people’s and businesses’ electronic communications;

• the privacy of communications content and metadata is guaranteed;

• when consent for the use of communications data is given, businesses may develop their range of services thanks to this data;

• the cookie rule has been simplified by centralising consent in the browser settings, which is more user-friendly, and non-intrusive cookies no longer need consent;

• the protection of users against spam – unwanted electronic communications by email, SMS and automated calls – is enhanced.

This set of rules, the Commission promises, will on the one hand ensure citizens’ online privacy and, on the other, simplify digital business, since companies will only have to comply with one regulation instead of 27 (Factsheet).

2.2. Facebook, the social network

Journalists and scholars have seen the EU’s revisions of the privacy and data protection laws as a reaction to disputes – often resulting in lawsuits – involving big American Internet companies. Examples are the case against Google Spain and Schrems’ case against Facebook (Eliantonio et al. 392). Facebook, as a social networking site, has often been central to these cases: changes in its privacy policy and the introduction of new products on the website (not always with explicit notification) have led to a number of controversies around users’ control over their information and their role on the site.

Facebook is a social network site created by Mark Zuckerberg in 2004 when he was a student at Harvard University. It was first meant as an online student yearbook but grew into one of the most lucrative Internet companies, with over 1 billion users (Grimmelmann 1144). Zuckerberg wanted to create a site enabling people to keep in contact with other people from their university, to “network” (1145). At the beginning, “TheFacebook.com” was only accessible with a college email address, but in 2006 it was opened to all users, to connect with their “friends” (Hashemi 142). That same year, Facebook introduced a new feature to its website: the “News Feed,” on which news about one’s network, such as profile changes or posts, was shared (1146). This new feature provoked controversy surrounding “the panoptic privacy implications,” which led Facebook to provide a “publicity” function for users’ profile changes (Grimmelmann 1146). Another interesting novelty launched in 2006 was the Facebook API (Facebook newsroom), which allowed developers to create applications that were directly plugged into the website. Therefore, Facebook is said to be a “platform” (Grimmelmann 1146). The technical and theoretical implications of these types of platforms will be explained in the following chapter.

Facebook’s popularity has been ascribed to its specific features, such as its origins and development around the “college institution,” its restrained user interface and its free services (Grimmelmann 1148; Hashemi 141). Facebook users hand over huge amounts of personal – and often sensitive – data, such as name, age, sex and religion, but also information such as interests and event participation (Grimmelmann 1150-51). The tracking and capturing of users’ data can make them vulnerable to privacy breaches, which makes one wonder why users take such risks. Social network sites such as Facebook create a social environment that is valuable to the user because of three factors Grimmelmann has identified: identity, relationship and community (1151-60). Facebook gives users “control” over how they are perceived by others by allowing them to construct their identity online, for instance by uploading a profile picture, liking specific Facebook pages or using a pseudonym as a name. Users have also realized that online relationships can strengthen offline ones, and Facebook groups and pages can foster community building and offer a “social position” to the user (1157).

Hashemi listed in 2009 three “controversial partnerships” that Facebook has with third parties (142). The first controversy she refers to is the “public search” that enables non-members to access Facebook’s listings through search engines. This opened Facebook up to criticism for not having notified its users via email, and for making the default privacy settings public. In 2007, Facebook introduced an advertising service on its website, showing advertisements according to users’ personal information and page likes. Again, Facebook did not announce these changes by email. The third controversy Hashemi points to is the introduction of “Beacon,” which tracks Facebook users’ activity on “affiliated websites” and then shares the user’s actions on a third-party website with his or her Facebook friends through the News Feed (Hashemi 143-4). These controversies led to protests from users, who saw the “private” college community opened up to the public, like other social network sites (145). Together, these incidents illustrate how contested privacy on Facebook has been, especially where users’ control over their information is concerned.

Moreover, the term “privacy” entails many distinct complexities and dimensions; it thus seems necessary to go deeper into what social networks are, what is at stake for them, and how this impacts policies and citizens. These matters are explained in the following chapter, which sets out the theoretical framework.


3. THEORETICAL FRAMEWORK

This chapter engages with academic works from political and policy studies, sociology, (new) media studies, philosophy and platform studies. The framework is divided into two main sections. One section engages with the different dimensions of the concept of “privacy” and the difficulties of implementing an adequate policy. In the other section, I intend to clarify the importance of the data economy, which involves the platformization of the web, and how this relates to online privacy.

3.1. Privacy at stake

3.1.1. A history of definitions and technology

The concept of privacy is not new and has been discussed extensively across various fields of study. However, no consensus has been reached about its definition, nor has a practical solution to the problem been found. Different reasons can be given for this lack of consensus. First, privacy is said to be a cultural concept (Jutand 7); discussions therefore often contrast the US and European approaches to privacy, which will be illustrated in the following part of the section. A second explanation for these diverging views on privacy can be traced through the history of the concept, which is closely related to technological developments that keep creating new conditions for privacy. The private sphere has not always been considered as important as it is today. Housing conditions of the past century, for example, did not allow the same degree of privacy as today, when children often no longer need to share a room with their siblings. With the development of technology, privacy has increased. It can therefore be said that technology protects, but also invades, the private sphere (Blume, “Data protection and privacy” 154-5). Other authors ascribe the definitional difficulty to the different dimensions the concept entails. Blume, for instance, wrote that “[m]aybe privacy is not one single concept but a cluster word covering many concepts” (152).

At the end of the 19th century, the US government introduced the Bill of Rights, which provided “the right of the people to be secure in their persons, house, papers, and effects against unreasonable searches and seizures” (Tuerkheimer 69). This was a consequence of abuses committed by newspaper companies that started using photography technology and published “scandal photos” of people’s private lives (ibidem). In the academic literature, scholars often refer to privacy’s first definition as the one coined by Louis Brandeis and Samuel Warren in the Harvard Law Review: “the right to be let alone” (Tuerkheimer 69; Blume, “Data protection and privacy” 156). Within the literature on privacy, two dimensions of privacy have been identified. The first dimension is described as relational and is related to the idea of territorial or bodily privacy, while the second dimension is informational and relates to the storing and processing of data (Holvast 16). This thesis will address the latter dimension.

Privacy has gained more and more interest in recent years, more specifically with the advent of the Internet and digital technologies, which challenge governments in terms of data protection and control. Kleiner describes how this challenge is progressing: “In the Internet age, privacy is likely to become a paramount issue in terms of control by individuals and in terms of a trade-off between releasing personal data and obtaining services and products that use personal data” (Kleiner 84). Francis Jutand, Director of the Department of Scientific Affairs at the Institute Mines-Télécom, sees this “global evolution” toward a digitised world as pushed by two factors (8). The first factor Jutand identifies is economic: Big Data technologies enable value creation by producing information and knowledge from data tracking (8), which is also referred to as the “data economy.” The second factor he mentions is the psycho-sociological pressure that allows for the creation of open data flows, in which public access to personal data through social networks is enabled. He describes this as a digital society in which transparency rules rather than “a Big Brother society” (8).

Nowadays, privacy problems involve many different aspects, and Wolfgang Schulz, a law and journalism scholar, gives a good overview of the problem. He sees three types of conflict relating to communication on the Internet, which he calls “structural risks for privacy” (48). Firstly, Schulz points to a conflict between individual citizens’ private sphere, their privacy, and the state’s interest in their security (48). Secondly, he denounces privacy issues with new types of communication platforms, the so-called Over-The-Top players (OTTs), such as Google and Facebook, acting as intermediaries. On the one hand, the OTTs’ business model is problematic, because it is based on personal data, which emanates from the private sphere; on the other hand, users rely on those platforms to build and shape their “digital self” (48-9). The last conflict Schulz acknowledges arises when third parties are involved, as is the case with copyright transgressors. Each of these issues then requires an adapted policy that tackles the corresponding risk (49).

3.1.2. Exposure and publicity

New media such as blogs or social network sites have become central to the mediation of information. But their infrastructures are very different from the offline world, and they are said to be disruptive for social dynamics (boyd, “Facebook’s privacy trainwreck” 14; Grimmelmann 1151). In her work, boyd illustrates this with teenagers who fight for a private space on the Internet, and with the misconception that sharing personal information in the public sphere means one has no desire for personal privacy (boyd, “why do youth share so publicly?” 56). Parents often argue that everything on social network sites is public, which means that they have the right to look at what their children are posting online. Tech companies see teens’ enthusiasm for sharing on social network sites as an indicator that the new generation does not care about privacy anymore, which justifies their business (boyd, “why do youth share so publicly?” 56).

Unlike privacy advocates and more politically conscious adults, teens aren’t typically concerned with governments and corporations. Instead, they’re [teens] trying to avoid surveillance from parents, teachers, and other immediate authority figures in their lives. [...] There’s a big difference between being in public and being public. (boyd, “why do youth share so publicly?” 56-7, emphasis in original)

To demonstrate this, boyd uses Goffman’s concept of “civil inattention,” which refers to the social norm of not paying attention to conversations that are not intended for us. These norms should also apply to the Internet; it becomes an ethical matter whether parents should look up available information about their children on Facebook or similar social network sites (boyd, “why do youth share so publicly?” 58). This privacy problem is what boyd describes as a problem of “publicity,” which cannot simply be coined as the antonym of privacy, given the complex interplay of those terms in the context of social media use (“why do youth share so publicly?” 57). In another work, boyd takes the introduction of Facebook’s News Feed to illustrate a similar problem (“Facebook’s privacy trainwreck” 14). Through this new feature of Facebook’s interface, all interactions within the network were made visible to other “friends” by “exposing” them on the feed appearing on the start page (14). This led to a lot of criticism from Facebook users, who were not used to sharing all their profile changes and actions with their network. Therefore, Facebook is said to be disruptive for social dynamics: users have difficulties with online social norms and boundaries, which have a different architecture than those of the offline world. After the introduction of the News Feed, users felt that they had lost control over the situation, because every awkward comment or relationship status was revealed to the network. Facebook, boyd argues, has made visible – “exposed” – aspects of social life that were previously obscure (“Facebook’s privacy trainwreck” 15).

In both cases – the curious parents and the News Feed – tech firms would argue that privacy has not been breached, and that we are evolving towards a more open and transparent world (boyd, “Facebook’s privacy trainwreck” 18). However, boyd says:

The tech world has a tendency to view the concept of ‘private’ as a single bit that is either 0 or 1. Data are either exposed or not. (“Facebook’s privacy trainwreck” 14)

Privacy is more complex than that: it is rather about how users “experience” it, about how they control their data and give access to their network (boyd, “Facebook’s privacy trainwreck” 18). The shift towards removed boundaries in the online world causes what the German digital activist Michael Seemann characterizes as “Kontrollverlust” – loss of control – “over the data on the Internet” (9). He goes further, saying that in the era of the free flow of information it becomes impossible to control knowledge, as it disseminates so fast, and that people need to abandon the wish to control the information (15). Accordingly, people become much more vulnerable, as the information could fall into the hands of ill-intended people such as stalkers or burglars (Seemann 16). Finally, Seemann concludes that individuals are better off accepting that they are being exposed than trying to hide their personal information. Therefore, he advises – similarly to Christian Heller – abandoning privacy, since “strategies that rely on openness, transparency, and networked structures are antifragile in the information tailspin” (17 & 23).


3.1.3. Surveillance and power

The Internet and digital technologies are somewhat intertwined with surveillance (Blume, “The myth pertaining the GDPR proposal” 271). In the academic literature, scholars frequently use surveillance models to conceptualize privacy, often referring specifically to Foucault’s panopticon, Deleuze’s societies of control and Orwell’s Big Brother narrative (Lyon 18). New technologies such as connected devices – think, for instance, of the Internet of Things – have opened up new ways to think about privacy in the digital age. More importantly, they have brought up fundamental questions, such as whether privacy is even possible in the digital society (Rifkin qtd. in Halpern 3). Michel Foucault is one of the most prominent philosophers of the 20th century and is known for his critical theories on modern society, which he characterizes by a switch from centralized and visible power to invisible power, in which the subjects of power are made visible (Foucault 215). To illustrate this, Foucault uses Bentham’s famous panopticon metaphor: a ring-shaped building with a tower in its centre, which remains dark to those outside it but from which every single movement in the building can be spied on and tracked (200). Foucault attributes two principal affordances to it: firstly, it allows observing, registering and classifying behaviours; secondly, it influences the subject’s behaviour (202-3). This is why power is said to be invisible within this surveillance system: if one is aware that he or she might be ‘watched,’ that awareness can act as a ‘disciplining power,’ or, in Foucault’s words, one ‘which assures the automatic functioning of power’ (201). “[I]t [the panopticon] arranges things in such a way that the exercise of power is not added on from the outside, like a rigid, heavy constraint, to the functions it invests, but is so subtly present in them as to increase their efficiency by itself increasing its own points of contact” (Foucault 207).

Recently, a new generation of social scientists, such as David Lyon and Philip Agre, reexamined these conceptualisations of privacy and introduced alternative models of surveillance, which consider the new ‘practices of information technologies’ (Agre 101; Lyon). These alternative models do not exclude the old ones, according to Agre: “[p]rivacy issues take different forms in different institutional settings and historical periods” (101). One of these forms is “ubiquitous computing,” which increasingly enables tracking practices (Agre 101). For example, most Europeans today have a smartphone which collects information about their location, the apps they use, and whom they are interacting with. Moreover, the prevalence of connected wearables such as fitbits or smartwatches allows for the recording of even more data about one’s activities. However, tracking is nothing new; building badges and GPS technology are older examples of tracking devices (Agre 102-3). Agre distinguishes various ways of tracking: tracking through a physical location and tracking as a metaphor; tracking objects is different from tracking humans (Agre 103). Nonetheless, in each case information about representative or real changes of the tracked object or subject is collected and saved in a computer, which leads Agre to ask “what becomes of the data once it is collected” (105); these deeper implications are the core of his paper. To construct his argument, he compares five components of the surveillance model with those of the capture model. While surveillance uses “visual” metaphors (i.e. ‘Big Brother is watching you,’ Bentham’s panopticon), capture uses linguistic metaphors. Watching – surveiller – is discreet and does not disrupt ongoing activities; capturing, on the contrary, is intended to intervene in and reorganize them. The third diverging component is that the first model entails a territorial metaphor – “invasion of the private sphere” – while the latter involves structural metaphors. Furthermore, surveillance is centralized, whereas capturing involves decentralized and multiple management. Finally, the surveillance model is political, as it is identified with the state, while the capture model has a philosophical aim, “as activity is reconstructed through assimilation to a transcendent (‘virtual’) order of mathematical formalism” (Agre 105-6). Through the process of capturing, information is collected and saved, and this information is often assumed to be ‘true.’ This assumption derives from another concept Agre coined within the capture model, “grammars of action,” which consist of systematically mapping activities and tracking organizational processes in order to build ‘knowledge representations’ and so automate activities (Agre 745).

David Lyon, sociologist and director of the Surveillance Center, sees a shift within surveillance models: Orwell’s all-seeing Big Brother metaphor no longer suits contemporary surveillance, as it is no longer about surveilling one particular person. Instead, Lyon argues, “social sorting” provides a better model; this approach emphasizes “the classifying drive of contemporary surveillance,” where privacy is not an individual matter anymore, but a social one: “Human life would be unthinkable without social and personal categorization, yet today surveillance not only rationalizes but also automates the process” (Lyon 13). Codes have become a sort of “invisible doors” that give people access to certain experiences or processes according to their activities and transactions, which have been monitored and sorted. These classifications, Lyon says, nudge an entire population by influencing their chances and choices (13). Lyon illustrates this with “searchable databases” that collect and process data. These data can be used in different ways; they are often used to prevent risky engagements (14). A concrete example of such a searchable database is the World-Check database, which is used by companies and banks for “risk management.” Another example is the use of such databases for marketing purposes, where targeted consumers are sorted according to postal code or ethnic group, for instance (Lyon 14-5).

Both models – capture and social sorting – emphasize the crucial role of modern information technologies that allow tracking and organizing data through computational codes. Looking back at the risks Schulz identifies in the first part of this section, we might conclude that for the first type of conflict, the one between state and citizens, the traditional surveillance model still applies, while the second type can be conceptualised through alternative models such as capture or social sorting.

3.1.4. Privacy as data protection

When talking about privacy, scholars and policy makers often refer to data protection. An illustration of this can be found in European law, where the ePrivacy Directive and the Data Protection Directive are complementary. Although these two concepts are strongly related, it should be noted that they do not refer to the same thing (Blume, “Data protection and privacy” 152). Blume, one of the leading scholars in data protection matters, wrote that to understand data protection, privacy first needs to be properly defined. As already stated, this is not easy, since the term “privacy” entails many different aspects (Blume, “Data protection and privacy” 152). Data protection can be understood as a sort of “information privacy” (153). Data protection is closely and always related to privacy, but has its own regulatory framework:

Data protection became something in itself with its own laws and even, its own legal institutions. Data protection is specifically related to the legal rules that regulate to which extent and under which conditions information related to individual physical persons may be used. (“Data protection and privacy” 153)

A regulatory data protection framework has two aims, Blume says: it should protect private information, and it should provide an acceptable way for society to use personal data (“Data protection and privacy” 154). Furthermore, it should be noted that the concept of privacy makes a distinction between the public and the private sphere, where the private one should be protected. Data protection, on the contrary, makes no such distinction and aims to protect data emanating from both spheres (Blume, “Data protection and privacy” 154).

3.2. Cultural and policy approaches

As privacy is a cultural concept, it is also closely intermingled with political ideology, which shapes the relationship between the state and its citizens (Blume, “Data protection and privacy” 155). Privacy and data protection – closely related concepts – are founded on liberal political thinking (Blume, “Data protection and privacy” 155). Many scholars have pointed to the fundamentally different approaches of the US and the EU in terms of privacy and data protection policies (Kramer 388; Baumer et al. 402). While in European culture privacy is seen as a fundamental human right that needs to be protected, in the United States privacy is perceived as a commodity governed by the laws of the free market (Kramer 389). Although the US did not have any comprehensive privacy regulation until 2004, most commercial websites provided a privacy policy based on the Federal Trade Commission’s principles, which gave a trust signal to visitors (Baumer et al. 402). Additionally, different self-regulating methods have been tested by private firms and privacy activists. An example of such an initiative was the Platform for Privacy Preferences Project (P3P) (Baumer et al. 402).

The European privacy model starts from the UN Universal Declaration of Human Rights (1948), which protects territorial and communication privacy (Kramer 389). Additionally, EU citizens are protected against government invasions of their privacy by the European Convention on Human Rights (Kramer 389-90). The privacy-as-fundamental-right model entails that personal information firmly relates to the integrity and autonomy of the individual (Kramer 390). The US privacy model considers privacy a ‘good,’ which the consumer can “choose” either to protect, or to trade for access to discounts or free services (Kramer 390).

As mentioned earlier, privacy needs to be balanced with other rights:

“Freedom of speech is sacrosanct. We as a culture hold [freedom of speech] much higher than other legal systems.” We cannot protect the individual privacy without giving up some control over the freedom of speech. Informational privacy is the antithesis of the goal of the First Amendment – to protect the free flow of information. (Kramer 402)

In the US, Obama’s administration showed clear interest in privacy protection; in February 2012, the government introduced a “Privacy Bill of Rights,” which included mostly traditional principles of fair information practices, but also a new principle of “Respect for Context,” which means that “companies will collect, use, and disclose personal data in ways that are consistent with the context in which consumers provide the data” (Privacy Bill of Rights cited in Nissenbaum 19).

Since the concept of context remains vague and ambiguous, Helen Nissenbaum, a leading scholar on privacy topics at New York University, suggests four interpretations of the meaning of context, namely context as technology system or platform, context as business model, context as sector or industry, and context as social domain (21-3). Nissenbaum advocates the latter interpretation, which sees society as constructed of various social spheres and is grounded in the theory of contextual integrity (23).

My claim is that context understood as social domain offers a better chance than the other three for the Principle of Respect for Context to generate positive momentum for meaningful progress in privacy policy and law. The account of social domain assumed by the theory of contextual integrity constitutes a platform for connecting context with privacy through context-specific informational norms, and offers contextual integrity as a model for privacy itself. (Nissenbaum 25)

Another dimension of contextual integrity she describes is the set of three independent key parameters characterizing “informational norms”: actors, information types and transmission principles (24).

While the EU introduced a general data protection directive in 1995, which covers all areas, the US does not mention the right to privacy in an explicit way and adopts a sectoral approach to privacy protection (Kramer 400). This is what Winston Maxwell, a leading media, communication and data protection lawyer in France, describes as the key difference between the EU and the US: in specific areas the US government intervenes as a strict safeguard, whereas for the EU any exploitation of privacy is seen as a violation of rights (67). We will expand on today’s EU policy approach to privacy and data protection in the following section.

However, Maxwell argues that the privacy policies of the EU and the US are not that different from each other, since they are both originally based on the OECD Guidelines from 1980 (63-4). He argues further that it is time for the EU and the US to agree on a global approach to privacy policy, since the Internet has no frontiers.

3.3. The disruptive infrastructures governing the digital society: platforms and Big Data

As argued earlier, the era of the Internet has added a new dimension to the privacy problem through the huge amounts of data that are collected on a daily basis by computers and connected devices (Kleiner 84). This data, subsequently analysed, enables a lot of business, makes society smarter and facilitates citizens’ daily lives, but it also constitutes an enormous threat to the data subjects’ privacy. Big Data is the result of this process (Kleiner 84).

Online intermediaries such as Facebook and YouTube have become central within our societies, for their cultural components on the one hand and for their economic value on the other (Gillespie, “Politics of Platforms” 348). These intermediaries have been described as “platforms” by scholars but also by their creators; Facebook has profiled itself as a computational platform rather than a social network (Helmond 2-3). But what do “platforms” really entail, and what impact do they have on data governance? These are issues that Gillespie has engaged with. Gillespie defines platforms as “sites and services that host public expression, store it on and serve it up from the cloud, organize access to it through search and recommendation, or install it onto mobile devices” (“Governance by and of Platforms” 1). Moreover, to illustrate the concept, Gillespie identifies its four original meanings. The first meaning, the computational platform, refers to the technical environment in which third parties can offer their applications through an API. Secondly, the architectural platform refers to the physical one, for instance a train platform. The third, figurative meaning refers to its “conceptual usage”: a situation that enables a further accomplishment. Finally, the political platform constitutes a “stage” from which politicians can address a public (Gillespie, “Politics of Platforms” 349-50). Aspects of all four meanings can be found within online intermediaries (351).

The point is not so much the word itself; 'platform' merely helps reveal the position that these intermediaries are trying to establish and the difficulty of doing so. YouTube must present its service not only to its users, but to advertisers, to major media producers it hopes to have as partners and to policymakers. The term 'platform' helps reveal how YouTube and others stage themselves for these constituencies, allowing them to make a broadly progressive sales pitch while also eliding the tensions inherent in their service: between user-generated and commercially-produced content, between cultivating community and serving up advertising, between intervening in the delivery of content and remaining neutral. (Gillespie 348)

The unique thing about platforms is that they do not produce any content; they merely reassemble and organize it (“Governance by and of Platforms” 1). This “platformization of the web” creates the architecture of the data economy (Helmond 1). Accordingly, platforms have been described as becoming the new “dominant institution” of society, because they constitute an infrastructure that enables access to knowledge and interaction between individuals and organisations (Seemann 38). They make this possible through the homogenisation and standardisation of the technical infrastructure (Seemann 38). An important element here is the network effect platforms have: the more users a platform has, the more attractive and useful it is, and the more powerful it becomes (ibidem).

Another characteristic of platforms is that they have no national boundaries, which means that they have more “business freedom” than more traditional services (Seemann 36). They are therefore described as “over-the-top” companies (often abbreviated as “OTT”). An example can be taken from the telecom sector: traditional telecom companies, which provide text and call services, also have a “national duty” to build a technical infrastructure for mobile communication, whereas OTT firms such as Whatsapp and Skype, which provide similar services, do not have this duty. However, the state has made attempts to control platforms and data flows with privacy and data protection as the main pretext (ref.). The European Commission, for instance, developed a European Digital Single Market Strategy, in which “the free movement of goods, persons, services and capital is ensured and where individuals and businesses can seamlessly access and exercise online activities under conditions of fair competition, and a high level of consumer and personal data protection, irrespective of their nationality or place of residence” (“A DSM Strategy for Europe”, Communication 3). The strategy is an attempt to influence the development of the digital economy, in which platforms constitute the infrastructures.

One of the main motivations for users to use Facebook is that the service is free (Grimmelmann 1155), which is made possible by generating revenue through advertising (Roberts 24). However, this is only one way in which Facebook exploits its huge datasets, which are carefully kept in various datacentres all over the US and in some European countries (Stats, Facebook Newsroom). Facebook also has a research centre where it uses the collected data to improve its services, but also to experiment with new data mining technologies, such as applied machine learning and data science (Research Areas, Facebook Research). The data economy is the upcoming economy; it has already been extensively developed in the US, while in the EU it has encountered more difficulty taking off.


4. RESEARCH DESIGN

In the previous section, the theoretical framework presented the complexity of the privacy issue and the various definitions of the concept. To investigate the differences between the EU’s and Facebook’s rhetorics, the following sub-questions need to be posed:

- Which aspects mentioned in the theoretical framework appear in the EU’s privacy rhetoric? And which in Facebook’s?

- What is different about these discourses and why?

- Which tensions existed between these discourses? And why were they controversial?

- Which approach to privacy was dominant in the public discourse – more specifically, in the European press?

To address these questions, a mixed-methods approach is adopted. When studying controversies from a social perspective, actor-network theory (ANT) is suitable: it provides a general framework for looking at the social, its dynamics, and how its actors are linked. The other method, platform studies, allows studying Facebook, the object of study. Platform studies pay more attention to devices themselves, how they work, and which messages and forms of governance they create. Furthermore, both methods can be said to be complementary: ANT operates more on the macro level of the study, while platform studies operate on the micro level.

4.1. Methodological approach: Actor-Network Theory and Platform Studies

“If earnest scholars do not find it dignifying to compare an introduction of a science to a travel guide, be they kindly reminded that ‘where to travel’ and ‘what is worth seeing there’ is nothing but a way of saying in plain English what is usually said under (...) method, or even worse, methodology.”

- Bruno Latour, actor-network theorist (p. 17)

Controversies are what sociologist Bruno Latour takes as the starting point when studying the social – or rather, ‘describing social assemblages’ – and he advises seeking traces of relations between the different controversies rather than structuring or ordering them (33). In his book Reassembling the Social, Latour challenges traditional sociology and advocates an Actor-Network Theory (ANT) approach: “What I want to do is to redefine the notion of social by going back to its original meaning and making it able to trace connections again” (p.1). To do so, we need to distance ourselves from pre-existing assumptions, or ‘fixed frames of reference’, as he calls them, and instead ‘follow the actors’ and let them speak (Latour 24; 161). Instead of taking an explanatory approach, Latour invites us to adopt a descriptive one, in which the aim is to make accounts and observe the traces actors leave behind them (Latour 23). ANT does not seek to intervene in controversies or to impose order or put the social into frames, but to follow the actors and observe how they create order themselves (Latour 23). The role of the sociologist, Latour says, is not to assemble or re-assemble, but to describe. Throughout his book, Latour uses the travel metaphor to describe methodology. For a qualitative account, three conditions have to be met: first, the entire issue must have been covered; secondly, the links between the different issues must have been made; and lastly, all pre-existing assumptions about the social must have been ignored (25).

Describing and accounting for social assemblages is operationalized through issue mapping practices, which Rogers, Sanchez-Querubín and Kil illustrate as follows in their book Issue Mapping for an Aging Europe (9-10):

Issue mapping takes as its object of study current affairs and offers a series of techniques to describe, deploy, and visualize the actors, objects, and substance of a social issue. It is concerned with the social and unstable life of the matters on which we do not agree and with how the actors involved are connected to each other, or otherwise associated with each other. Ultimately, the aim is to produce mappings that will aid in identifying and tracing the associations between actor involved with an issue, and to render them both in narrative and visual form so that they are meaningful to one’s fellow issue analysts and their audiences.

In the present case, the current affair is the EU’s attempt to create a legal framework protecting citizens’ right to online privacy, which clashes with some business models and practices of the so-called digital economy. As stated by Latour, both human and non-human actors are considered. This means that Facebook as a platform is also an actor that plays a role within the controversy. In their book, Rogers et al. ‘map ageing’ through the use of three ‘mapping strategies.’ Maps make it possible to represent social and power relations (Crampton 125).

The ‘methods of the medium’ or ‘methods embedded in online devices’, as Rogers describes them, seek to use and combine ‘digital objects’ in order to study a social or cultural phenomenon (Rogers 1). In an increasingly digitized society, actors leave a huge amount of ‘traces’ on the web, which makes masses of data – often already ordered – available for research and analysis. The Digital Methods Initiative has developed tools and instruments that allow capturing, scraping, analysing and visualizing these data1. Through this, abstract concepts such as associations and traces can be articulated in more concrete ones, such as links or mentions (Rogers et al 29). Although digital methods can be used for various research purposes, they particularly suit issue mapping because they enable tracing debates across the various sources and platforms of the different parties to the dispute (Marres 659).

A second approach considered for the methodology of the present study is that of platform studies. Game theorists Ian Bogost and Nick Montfort, among the first scholars to use the term, describe platform studies as an “attempt, investigating the relationships between the hardware and software design of standardized computing systems — platforms — and the creative works produced on those platforms” (176). A platform-centric approach makes it possible to study the cultural elements of a platform’s software (Bogost & Montfort, “Platform Studies: Frequently Asked Questions” 1). The aim here is to understand the user’s experience when in contact with the software. This approach was chosen because it allows researchers to link technical aspects with social ones (Weltevrede & Borra 1). By taking a device perspective, we look at the different affordances a platform allows, but also at other aspects of its interface, such as its aesthetics.

4.2. Data collection and analysis

This section explains how the data was collected and analysed. First, digital methods tools were used to retrieve data from Google results and from Facebook’s API; these data were analysed through a content analysis to identify the controversies between the EU and Facebook. Second, an interface analysis of Facebook’s privacy settings and policies is performed in order to understand how Facebook frames the privacy issue for end-users.

4.2.1. Scraping and crawling through data

By using the Google Scraper of the Digital Methods Initiative – also sometimes referred to as the ‘Lippmannian device’ – the top results of chosen queries for specific sites can be “scraped.” This method is used in the present research as a starting point to find controversies around the topic. The query “Facebook privacy” was scraped for four selected European news sites: EurActiv, Politico.eu, The Guardian and EUobserver. These sites were chosen according to a number of criteria: they cover European (political) news, they aim at a European audience, and they are written in English. The top 50 results of the query were collected and further organized and analysed in Excel. Tools such as RawGraphs2 and Coggle.it3 were used to visualize the data.

2 http://rawgraphs.io/
3 https://www.coggle.it/
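To illustrate the set-up of this scraping step, the sketch below shows how such site-restricted queries could be composed before being fed into the scraper. It is a minimal Python sketch, not the DMI Google Scraper itself, and the listed domain names are assumptions.

```python
# Minimal sketch (not the DMI Google Scraper itself) of how the site-restricted
# queries can be composed. The domain names below are assumptions.
QUERY = '"Facebook privacy"'
NEWS_SITES = [
    "euractiv.com",     # EurActiv
    "politico.eu",      # Politico Europe
    "theguardian.com",  # The Guardian
    "euobserver.com",   # EUobserver
]

def build_queries(query, sites):
    """Combine the query with a site: operator, producing one query per news site."""
    return ["site:{} {}".format(site, query) for site in sites]

if __name__ == "__main__":
    for q in build_queries(QUERY, NEWS_SITES):
        print(q)
```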

Google results are taken as a starting point because they allow one to “transform search into research” (Rogers 113). Scraping as a social research technique enables studying society through the analysis of online news (Marres & Weltevrede 313). Online data can be collected by scraping them from the Web (Marres & Weltevrede 313). This tendency towards digital research, often labelled ‘the computational turn’, gives new opportunities for social research (idem). Furthermore, scraping provides structured data (Marres & Weltevrede 316): the Google Scraper, for instance, returns its data in different formats, .csv or tag cloud. The .csv format, which can be imported into Excel, provides precise information about the returned results, such as the date, link and description of the returned pages, as well as their ranking in the results. These results are the starting point of this study to trace controversies between Facebook and the European Commission, and to situate them in time. Query results returned by Google are ranked by relevance. The way Google determines relevance is difficult to understand because of its lack of transparency. However, Google states that 200 factors influence the ranking of a page and defines PageRank as follows4:

“PageRank is the measure of the importance of a page based on the incoming links from other pages. In simple terms, each link to a page on your site from another site adds to your site's PageRank. Not all links are equal: Google works hard to improve the user experience by identifying spam links and other practices that negatively impact search results. The best types of links are those that are given based on the quality of your content.” (Google; How Google Search Works)

While search engines used to rank sources alphabetically, and later according to popularity, nowadays sources are ranked algorithmically. The displacement of human directories by algorithmic ones has been researched within search engine critique studies. Because of this lack of transparency about how search engines’ algorithms work, engines are said to have ‘darkened the web’ (Rogers 95-6).

Despite these limitations of Google’s search engine, scraping its results delivers a good starting point to trace relevant pages around a specific query, as it structures the abundance of information available on the Web.
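As an illustration of this step, the following minimal Python sketch shows how such a .csv export could be loaded and ordered by date and ranking before the manual analysis in Excel. The column names used (“date”, “rank”, “url”) are assumptions and may differ from the scraper’s actual export.

```python
# Minimal sketch of organising the Google Scraper's .csv export before the
# manual coding in Excel. The column names ("date", "rank", "url") are
# assumptions; the actual export may label its columns differently.
import csv

def load_results(path):
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))

def sort_by_date_and_rank(rows):
    """Order results chronologically, then by their position in the results page."""
    return sorted(rows, key=lambda r: (r.get("date", ""), int(r.get("rank") or 0)))

if __name__ == "__main__":
    rows = sort_by_date_and_rank(load_results("facebook_privacy_results.csv"))
    for r in rows[:5]:
        print(r.get("date"), r.get("rank"), r.get("url"))
```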

Once the data was collected and cleaned, a content analysis was performed on the newspaper articles and on the Facebook posts retrieved with Netvizz. All controversies were sorted out from the articles in chronological order. Afterwards, every news article was analysed by looking at which topics from the proposal for an ePrivacy Regulation and from the GDPR were mentioned in it. The results of the analysis were visualized through a matrix representing the controversies over time and the recurrence of the topics within them. This allowed me to see which aspects of the privacy laws were at stake for each controversy and how this evolved over time.
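The coding step behind this matrix can be sketched as follows. The snippet below is an illustrative Python sketch, not the coding scheme actually applied in the thesis: the topic keywords are assumed examples of GDPR/ePrivacy themes, and article dates are assumed to start with the year.

```python
# Illustrative sketch of the coding step: checking which privacy-law topics are
# mentioned in each article and pivoting the counts into a topic-by-year matrix.
# The topic keywords are assumed examples, not the coding scheme used in the
# thesis, and dates are assumed to start with the year (e.g. "2016-04-14").
from collections import defaultdict

TOPICS = {
    "consent": ["consent"],
    "right to be forgotten": ["right to be forgotten", "erasure"],
    "data transfers": ["safe harbour", "privacy shield", "data transfer"],
    "cookies and tracking": ["cookie", "tracking"],
}

def code_article(text):
    """Return the set of topics whose keywords appear in an article."""
    text = text.lower()
    return {topic for topic, keywords in TOPICS.items()
            if any(keyword in text for keyword in keywords)}

def build_matrix(articles):
    """articles: list of {'date': ..., 'text': ...} -> {(year, topic): count}."""
    matrix = defaultdict(int)
    for article in articles:
        year = article["date"][:4]
        for topic in code_article(article["text"]):
            matrix[(year, topic)] += 1
    return dict(matrix)
```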

The software tool Netvizz and Facebook’s Newsroom are used to collect information about how Facebook frames and stages privacy issues. The media – both social media and news media – create a platform in the political sense that acts as a stage for actors (Gillespie 350). The Netvizz tool allows researchers to extract and collect data from Facebook groups and pages through the API that Facebook makes available for developers, a practice characterized as ‘data crawling’ by contemporary authors (Rieder 346). In the present research, posts from the page ‘Facebook and Privacy’5 – owned by the Facebook firm itself – are collected through Netvizz. The same is done with the ‘Digital Single Market’ Facebook page, owned by the European Commission. These two pages are relevant for this research because they reflect the official position of both actors regarding privacy, and these positions can be traced through time. Through the first page, official announcements and positions from Facebook regarding privacy issues can be analysed. For the latter, posts not mentioning ‘privacy’ were filtered out and, similarly to the ‘Facebook and Privacy’ page, every post relating to privacy was analysed by noting the privacy issues it mentions.
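A minimal sketch of this filtering step is given below. It assumes a tab-separated Netvizz export with a “post_message” column; the exact file layout may differ between Netvizz versions, so both the file name and the column name are assumptions.

```python
# Minimal sketch of filtering a Netvizz page export for posts mentioning
# "privacy". It assumes a tab-separated export with a "post_message" column;
# the exact file layout may differ between Netvizz versions.
import csv

def privacy_posts(path, keyword="privacy"):
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f, delimiter="\t")
        return [row for row in reader
                if keyword in (row.get("post_message") or "").lower()]

if __name__ == "__main__":
    posts = privacy_posts("digital_single_market_page_export.tab")
    print("{} posts mention 'privacy'".format(len(posts)))
```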

Afterwards, to deepen the understanding of the European Commission’s – and more specifically the Digital Single Market’s (DSM) – rhetoric around privacy and its ways of using the notion, the Commission’s websites, its Facebook page and documents such as “factsheets” were filtered for content relating to privacy.

4.2.2. Tracing privacy on Facebook through an interface analysis

Facebook provides extensive information to its users about how it deals with privacy issues and how it secures users’ data. Facebook’s creators want to give their users control over their privacy (Facebook’s privacy basics6), or rather give them the feeling that they have such control. But how do they do that? And to what “type” of privacy are they referring? This is what will be investigated in the first part of the present study. Facebook users can find information about privacy all over Facebook’s site. Therefore, it seems relevant to perform an interface analysis of the sections where this information can be traced in order to understand how Facebook frames privacy-related issues, how it explains them to users, and which tools it gives users for enhancing their privacy and security.

The interface of a website – in this case, Facebook’s site – is not neutral: it ‘makes a normative claim’ (Stanfill 1059-61). The interface embodies choices previously made by the designer and shapes users’ actions when navigating through it. This is why Mel Stanfill describes the affordances of an interface as ‘productive power’: “A productive power framework operates from the premise that making something more possible, normative, or ‘common sense’ is a form of constraint encouraging that outcome” (1060). Drawing on Gibson’s initial definition of affordances and placing it in the context of human-computer interaction, an affordance can be defined as follows: “an attribute of an interaction design feature is what that feature offers the user, what it provides or furnishes” (Hartson 316). Furthermore, Hartson identifies four types of affordance: cognitive, physical, sensory and functional affordances (315). Following Stanfill’s discursive interface analysis, the present study will pay attention to the functional affordances, which means looking at the functionalities the site allows; the cognitive ones, which reveal the labelling and descriptions of the site; and the sensory affordances, which relate to the visibility of elements of the interface (1063-4). This sets a framework for tracing “privacy” and related topics on Facebook’s website interface. It should be noted that only the desktop version of Facebook is analysed.

Therefore, different aspects of the site will be considered. First, the way in which the different privacy-related pages are linked will be traced and visualized. Second, screenshots will be taken of significant features of the interface and commented on. For instance, Facebook’s default settings – a cognitive affordance – are of particular interest within the present study. The aim of this interface analysis is twofold. First, I want to understand what Facebook labels as being related to “privacy” and what not. Second, the aim is to investigate how much control the user has over his or her data, which I consider within this research to be crucial for privacy.
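The tracing of links between privacy-related pages can be recorded as a small directed graph, as in the minimal sketch below (using the networkx library). The page names and links shown are illustrative assumptions noted during the interface walk-through, not Facebook’s actual link structure.

```python
# Illustrative sketch of recording how Facebook's privacy-related pages link to
# one another, so the structure can be visualised later. The page names and
# edges are assumptions noted by hand during the interface walk-through.
import networkx as nx

links = [
    ("Privacy Shortcuts", "Privacy Basics"),
    ("Privacy Shortcuts", "Privacy Settings"),
    ("Privacy Basics", "Data Policy"),
    ("Privacy Settings", "Data Policy"),
]

graph = nx.DiGraph()
graph.add_edges_from(links)

# Print each observed link as input for the visualisation.
for source, target in graph.edges():
    print("{} -> {}".format(source, target))
```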
