
From my privacy to (y)our privacy

A legal analysis of ‘interdependent privacy’ under the General Data Protection

Regulation

Information Law Research Master Thesis
Eugénie Coche
Institute for Information Law, University of Amsterdam
Supervisor: Prof. Dr. J. Van Hoboken
Words: 31 825 (excl. footnotes and references)
Final version: 28 February 2019


Source of cover page image: https://www.telegraph.co.uk/technology/2018/09/27/facebook-advertisers-target-using-shadow-phone-numbers/


FOREWORD

I was warned it would be hard, long and stressful. What they forgot to tell me is how exciting and interesting it would be. What a journey! From the very first outline sketch to the very last footnote, I have been learning. Learning about ‘interdependent privacy’ in the first place. Learning about conducting research, writing, re-writing, cutting parts and summarising others. Learning that, at the end of the day, it is all about perseverance. Not giving up and going the extra mile. I am so grateful to certain people who made it so much easier for me to persevere. First of all, I would like to thank my supervisor Prof. Dr. Joris Van Hoboken, who challenged me many times with his critical insights, remarks and comments. By making me want to give up at certain times, he taught me the very essence of perseverance: to not give up.

I would also like to thank the Institute for Information Law (IViR) for the incredible learning environment it provides its students with. I was fortunate enough to be a research master student and to have the chance to work in a privileged research environment that encourages, in many ways, perseverance. More particularly, I would like to thank Dr. K. Irion and Prof. B. Hugenholtz for giving me the chance to become a research master student. A particular thank you goes also to Dr. J.P. Quintais and Dr. T. McGonagle, who encouraged me to write on certain topics or to work on different interesting projects together.

The warmest thank you goes to ‘Mon roi’, who made perseverance not even feel like it. When writing your thesis ends up with an amazing meal and a good glass of wine, I am sure everyone could persevere. Thanks for always being interested. You are worth all the footnotes in the world.

But learning to persevere started a long time ago, with my incredible parents who always taught me that everything is possible as long as you work for it. Thanks for your support in every step I take. I would have never taken this wild ride, from Brussels to Maastricht and from Maastricht to Amsterdam, without your support and trust.

Finally, thank you to my Australian host mum Joy, who keeps supporting me from the other side of the world. You’ve taught me English and you still teach me how to improve it. Because, at the end, this is what perseverance is all about: to continuously improve yourself.


ABSTRACT

This thesis aims to analyse interdependent privacy, more particularly ‘by association’, in light of the General Data Protection Regulation (GDPR). The purpose is to find out whether and, if so, to what extent this Regulation’s legal framework provides individuals whose data are being shared by others with sufficient safeguards to protect their right to data privacy.

In order to do so, the social phenomenon of ‘interdependent privacy’ is first discussed, as well as the reasons for its increased and inevitable occurrence. This is paired with a discussion of our (interdependent) understanding of privacy and its (lack of) reflection in the GDPR. Following that, a legal analysis is conducted, in which the phenomenon is assessed under the GDPR. This is done by means of three case studies, which consider the situation of individuals transferring data of others when using social networking services and/or third party applications.

The findings of the analysis can be summarised as follows: the GDPR’s legal framework is not well designed to address interdependent privacy situations ‘by association’, which in turn leads to legally problematic situations. The main reason for this is that core provisions of the GDPR, which are crucial for lawful data processing, are hard to apply to interdependent privacy situations. Moreover, it is uncertain which controller should comply with which obligations when a case of joint control exists. This becomes especially problematic where one (joint) controller is (financially and technically) unable to comply with certain obligations. This should be seen against the background of the CJEU’s broad and expanding interpretation of the ‘controller’ notion, paired with its narrow interpretation of the household exemption, which is likely to make users of social networking services or applications, when sharing other people’s data for certain purposes, ‘joint controllers’ with the service provider at stake.

In order to address the legal ambiguities stemming from the GDPR in an interdependent privacy context, this thesis proposes different measures to prevent, as far as possible, data privacy violations from arising. These include technical, educational and regulatory measures. Indeed, as follows from this study, mere enforcement of the GDPR does not seem to provide data subjects, whose data may have been unlawfully disclosed by others, with sufficient remedies in respect of their rights.


TABLE OF CONTENTS

FOREWORD
ABSTRACT
1. INTRODUCTION
1.1. Research question
1.2. Sub-questions
1.3. Relevance, aims and methodology of this thesis
1.4. Conceptual framework
1.4.1. Personal data
1.4.3. “Others”
1.4.4. Shadow profiles
PART I: Social and legal understanding of ‘interdependent privacy’
2. Interdependent privacy
2.1. Different types of ‘interdependent privacy’
2.1.1. Inferences
2.1.2. Dependencies by differentiation, refraction or chill
2.1.3. Dependencies by association
2.2. Different reasons behind interdependent privacy
2.3.1. Peer-to-peer surveillance
2.3.2. Business models of companies: data instead of money
2.4. Chapter conclusion
3. Our right to privacy: an interdependent meaning?
3.2. Interdependent privacy under the GDPR: acknowledgement of this reality?
3.3. Chapter conclusion
PART II: Legal analysis under the GDPR
4. Legal arrangement under the GDPR to face interdependent privacy situations
4.1. Three case studies: explanation of the cases
4.1.1. Data transfer with third party applications
4.1.2. Contact upload on SNSes
4.1.3. Information sharing on SNSes
4.2. Legal framework under the GDPR
4.2.1.1. Joint controllers
4.2.2. The household exemption
4.2.2.1. Evolution of the household exemption under the GDPR
4.3. Application of the legal framework to the three case scenarios
4.3.1. Allocation of responsibilities
4.3.1.1. Legitimacy obligation
4.3.1.2. Transparency obligation
4.3.1.3. Data accuracy obligation
4.3.1.4. Data protection by default
4.3.1.5. Obligation to comply with data subjects’ rights
4.4. Chapter conclusion
PART III: Different means to address interdependent privacy issues under the GDPR
5. Possibilities to address interdependent privacy violations under the GDPR
5.1. Correctional measures: enforcement of the law
5.1.1. Compensation of damages
5.1.2. Imposition of fines
5.2. Preventive measures
5.2.1. Technical measures
5.2.2. Educational measures
5.3. Regulatory measures
5.3.1. Excluding non-users’ data from the scope of data protection laws
5.3.2. Preventing users from qualifying as ‘(joint) controller’
5.4. Chapter conclusion
6. CONCLUSION


ABBREVIATIONS

AG – Advocate General
API – Application Programming Interface
App(s) – Application(s)
Art – Article
Art 29 WP – Article 29 Working Party
Charter – Charter of Fundamental Rights of the European Union
CJEU – Court of Justice of the European Union
CNIL – Commission Nationale de l’Informatique et des Libertés
DPA – Data Protection Authority
ECHR – European Convention on Human Rights
ECtHR – European Court of Human Rights
EDPB – European Data Protection Board
EDPS – European Data Protection Supervisor
et al. – and others
GDPR – General Data Protection Regulation
ICO – Information Commissioner’s Office
i.e. – id est
ISS – Information Society Service
Para. – paragraph
SNS – Social Networking Site


1. INTRODUCTION

“CAMBRIDGE ANALYTICA”, “PRIVACY”, “FACEBOOK”.

These were the words that internet users, more specifically Facebook users, repeatedly had on their minds in April 2018. At that time, revelations had shown that Cambridge Analytica’s parent company had been able to collect personal data from millions of Facebook users via back-door methods, namely through a third party App,1 which allowed it to send these people targeted information and, ultimately, influence political elections.

Following these revelations, many Facebook users investigated whether Cambridge Analytica had had access to their Facebook profile. Carolyn Springman, a US citizen, was one of them. When she inquired whether this had been the case – whether her Facebook profile had been scrutinized by the company and, thereby, had served electoral ends – she received the following answer: ‘based on our investigation you don’t appear to have logged into “this is your digital life” with Facebook’.2 However, her personal information had still been shared with the App. The next part of the answer she received revealed how this had been possible. It read: ‘a friend of yours did log in. Consequently, your public profile, page likes, birthday and current city were shared’.3

This response exposes a shady aspect of our right to privacy, deserving far more academic attention than it has, until now, received: the interdependent nature of the enjoyment of our (right to) privacy. In a world governed by Facebook “friends”, “tweets” or LinkedIn connections, our right to privacy can no longer be considered in isolation from its inter-relational dimension. Irrespective of our own will, the limits of our privacy are incontestably shaped by other people’s actions. Our effective enjoyment of privacy is continuously influenced by the people we ‘connect’ with and vice versa.

This aspect of our privacy rights can be illustrated by Facebook’s way of calculating approximately how many of its users had been affected by the Cambridge Analytica scandal. In order to find this out, it multiplied the number of people that had installed the App ‘This is Your Digital Life quiz’ by the average number of Facebook friends those persons were

1 In this case it was the ‘This is Your Digital Life quiz’ application.

2 Woodrow M., ‘Facebook users react to Cambridge Analytica Scandal’ Abc7news (San Francisco, 10 April 2018) <https://abc7news.com/technology/facebook-users-react-to-cambridge-analytica-scandal/3325121/> accessed 22 February 2019.


thought to have. By doing so, ‘those whose data may have been shared with the app by their friends’ were also included in the estimation.4
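Facebook’s estimate can be sketched as a simple back-of-the-envelope multiplication. The figures below are illustrative assumptions chosen for this sketch, not the platform’s exact inputs, and the calculation deliberately ignores overlap between friend sets, which is why such an estimate is an upper bound rather than a precise count.

```python
# Sketch of an affected-user estimate in the style described above.
# The figures are illustrative assumptions, not Facebook's actual inputs.
installers = 270_000   # assumed number of people who installed the App
avg_friends = 322      # assumed average number of Facebook friends per installer

# Each install exposes the installer's own data plus (on average) the data
# of avg_friends other people; overlapping friend circles are ignored here,
# so this over-counts and yields an upper-bound estimate.
indirectly_affected = installers * avg_friends
total_estimate = installers + indirectly_affected

print(f"Estimated affected users: {total_estimate:,}")
```

With these illustrative inputs the sketch yields roughly 87 million, which shows how a small pool of direct users can implicate a vastly larger pool of ‘friends’.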

When the Cambridge Analytica scandal emerged, internet users had the three aforementioned words on their minds, but members of the EU Parliament only had one on theirs: the GDPR. This commonly used abbreviation refers to the General Data Protection Regulation, a European legislative instrument that has applied since 25 May 2018.5 In a nutshell, this Regulation creates rights and obligations whenever an entity, whether a company or an individual, processes personal data of EU data subjects.6 As outlined in its recital 7, this regulatory instrument aims to achieve a ‘strong and more coherent data protection framework in the Union’, enhance legal certainty and give people more control over their personal data. Taking into account the instrument’s goals, the fact that it has only recently become applicable and that interdependent privacy has lately been the object of increased media attention, this thesis will strive to shed light on this phenomenon from a legal perspective and thereby explore whether and, if so, to what extent the GDPR has taken this interrelated dimension of privacy into account.

Concerning increased media attention: in December 2018, British lawmakers released a leaked Facebook document containing the firm’s internal e-mails from 2012 to 2015, in which interdependent privacy issues were yet again highlighted.7 Indeed, one of the e-mail conversations therein unveiled the presence of ‘whitelisting agreements’, through which Facebook allowed certain companies to maintain access to users’ friends’ data irrespective of the platform’s previous API changes, which had been aimed at ending such controversial access-to-friends practices. The importance of such practices was emphasised by one of the ‘whitelisted’ companies, which communicated that ‘the friends data we receive from users is integral to our product (and indeed a key reason for building Facebook verification into our apps).’8

Relying on someone to obtain someone else’s data thus seems to be an essential and common practice for certain (many) companies. Indeed, a few days later, another investigation, this

4 Chadwick P., ‘How many people had their data harvested by Cambridge Analytica?’ The Guardian (London, 16 April 2018) < https://www.theguardian.com/commentisfree/2018/apr/16/how-many-people-data-cambridge-analytica-facebook > accessed 22 February 2019.

5 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC [2016] OJ L119/1 (General Data Protection Regulation, GDPR).

6 See Art 3 GDPR for the Regulation’s territorial scope.

7 UK Parliament, ‘Note by Damian Collins MP, Chair of the DCMS Committee - Summary of key issues from the Six4three files’ <https://www.parliament.uk/documents/commons-committees/culture-media-and-sport/Note-by-Chair-and-selected-documents-ordered-from-Six4Three.pdf> accessed 22 February 2019.
8 Ibid., exhibit 84.


time revealed by the New York Times, exposed how Amazon and Yahoo were, amongst others, also granted continued access to Facebook users’ friends’ data.9

Notwithstanding these privacy scandals, other events shed light on some ‘benefits’ stemming from interdependent privacy. Indeed, in May 2018, a suspected US killer, known as the ‘Golden State killer’, was identified based on his relatives’ DNA disclosures.10

Interdependent privacy thus led to identification of the crime suspect. The author reporting on this story called this phenomenon the ‘privacy of the commons’ and explained it by reference to the proverb ‘three may keep a secret if two of them are dead’.11

In other words, no one owns or has full control over his/her personal data, taking into account that certain data relate to multiple data subjects and that anyone having access to these data could potentially share them with others. Interestingly, the Dutch police recently expressed its wish to have in place an open DNA databank comparable to the American one, which enabled the identification of the aforementioned crime suspect.12 Such a system in Europe, however, turns out to be controversial, taking into account that it may give rise to interdependent privacy violations under the GDPR.13 Indeed, whereas such interdependency is inherent to our right to privacy and thereby unavoidable in practice, it may be problematic in case privacy violations do arise.

Having regard to some undesirable consequences which interdependent privacy may give rise to, this thesis, besides finding out whether the GDPR has recognised something as ‘interdependent privacy’, will also analyse how this legal instrument applies to interdependent privacy situations and, ultimately, whether the legal safeguards provided therein are sufficient to protect our right to privacy in light of this phenomenon. In doing so, this thesis will focus on one particular scenario within interdependent privacy, which, as will be discussed, is an umbrella term that covers a broad range of situations. The scenario that will be explored, by means of three case studies, is the situation of users of information society services (ISS)

9 Dance G.J.X. et al., ‘As Facebook raised a Privacy Wall, It Carved an Opening for Tech Giants’ The New York Times (New York, 18 December 2018) < https://www.nytimes.com/2018/12/18/technology/facebook-privacy.html#click=https://t.co/p565d1TX5L > Accessed 22 February 2019.

10 Backer M.K., ‘You can’t opt out of sharing your data even if you didn’t opt in’ FiveThirtyEight (3 May 2018) <https://fivethirtyeight.com/features/you-cant-opt-out-of-sharing-your-data-even-if-you-didnt-opt-in/> accessed 22 February 2019.

11 Ibid.

12 Endedijk B. and van den Berg E., ‘Nederlandse politie wil gebruikmaken van particuliere DNA-databank in VS’, NRC (5 February 2019) < https://www.nrc.nl/nieuws/2019/02/05/politie-kan-bijna-iedereen-vinden-met-particuliere-dna-databank-a3653045 > accessed 22 February 2019;

13 Engelfriet A., ‘Nederlandse politie wil gebruikmaken van particuliere DNA-databank in VS’ (7 February 2019) <https://blog.iusmentis.com/2019/02/07/nederlandse-politie-wil-gebruikmaken-van-particuliere-dna-databank-in-vs/> accessed 22 February 2019.


sharing personal data of others, non-users of the service at stake, with providers of these services.

Person A makes use of a service provided by B. In his/her interactions with B, A discloses personal data of C, a non-user of B’s service. Based on this, B is able to access data of C and, in turn, use these data for its own purposes, such as by building a profile of C.

By analysing this scenario by means of three case studies, this thesis aims to find out whether and, if so, how the GDPR safeguards C’s right to data privacy.
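The data flow in this scenario can also be rendered as a conceptual sketch. The code below is a hypothetical illustration (all names and structures are invented for clarity, not taken from any real service): it models how a provider B could accumulate a profile of non-user C purely from what user A discloses, which is the mechanism behind the ‘shadow profiles’ discussed in section 1.4.4.

```python
from collections import defaultdict

class ServiceProvider:
    """Hypothetical provider B that aggregates data disclosed about non-users."""

    def __init__(self):
        # Profiles are keyed by the person the data relates to,
        # not by the user who shared it.
        self.profiles = defaultdict(dict)

    def receive_disclosure(self, sharing_user, data_subject, data):
        # B records data about C even though C never interacted with B.
        self.profiles[data_subject].update(data)
        self.profiles[data_subject].setdefault("disclosed_by", set()).add(sharing_user)

b = ServiceProvider()
# Person A uploads a contact list, disclosing non-user C's details to B.
b.receive_disclosure("A", "C", {"phone": "+31600000000", "name": "C"})
# B now holds a profile of C whose data originate from A, not from C.
print(b.profiles["C"]["disclosed_by"])
```

The point of the sketch is that C appears in B’s records without ever having been a party to the user–provider relationship, which is exactly the situation whose lawfulness under the GDPR the three case studies examine.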

1.1. Research question

Taking into account the above, this thesis will answer the following research question:

‘What is the legal arrangement, if any, for addressing interdependent privacy under the General Data Protection Regulation and to what extent does this arrangement provide sufficient guarantees for individuals’ right to privacy when users share personal data of non-users with information society services?’

Consequently, this thesis aims to explore interdependent privacy in light of the GDPR in order to find out whether this instrument sufficiently protects individuals’ right to privacy. To that end, the scope of this thesis will gradually be narrowed down to zoom in on one particular situation, namely the scenario sketched above, where users share personal data of others, non-users of the information society service at stake. Information society services are defined as ‘services normally provided for remuneration, at a distance, by means of electronic equipment for the processing and storage of data, and at the individual request of a recipient of a service’.14

This includes internet intermediaries such as Facebook, YouTube or eBay.15 Examples of situations where non-users’ data are disclosed through users of a particular service are: a friend who makes an Instagram story about a non-Instagram user; a daughter who posts a picture on her Facebook wall of her mum who is not on Facebook; the random backpacker you met years ago to whom you gave your mobile number and who has recently installed WhatsApp, LinkedIn or Tinder on his or her phone.

In order to reflect on this broad range of situations falling within the envisaged scenario, three case studies will be analysed in light of the law. The first concerns third party Apps gaining access to data of their users and these users’ social connections. In such a context, the App provider makes access to its service conditional upon such data disclosure. The second case study reflects on the situation of users of social networking sites (hereinafter SNS users) uploading their contact list in order to more easily connect with acquaintances. In other words, data transfer takes place in order to make use of the service’s feature, namely to ‘better connect’ with others, such as on LinkedIn. The third case study analyses SNS users disclosing personal data about others in the course of their normal use of the service i.e. in their interaction with others. This happens for example when posting a picture of someone else on Facebook or sharing a message about others on Twitter.

Analysing these three cases will bring us to the core of this thesis, namely the obligations and/or responsibilities of both users and service providers when the former share personal data of others with the latter. In other words, do these actors have certain responsibilities with regard to the transfer of non-users’ data? Interestingly, recent case law has shown that users may have some responsibilities when disclosing other people’s personal data. In a 2017 German lower court judgment, it was ruled that the use of WhatsApp is illegal under German data protection law if the user has not obtained the written consent of the persons stored in his or her mobile phone’s address book to pass these persons’ data to WhatsApp/Facebook.16

On 5 June 2018, the Court of Justice of the European Union (hereinafter CJEU) recognised that administrators of a Facebook page are controllers under EU data protection law and are therefore jointly responsible with Facebook when processing personal data of members of that page.17 This judgment renders things even more interesting as it illustrates that both personal data, as well as the responsibility for processing these, have relational and interdependent elements.

15 C-324/09 L’Oréal SA v eBay [2011] ECLI:EU:C:2011:474.

16 AG Bad Hersfeld (German local court), F 120/17 EASO, 15 May 2017.
17 C-210/16 Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein v Wirtschaftsakademie Schleswig-Holstein GmbH [2018] ECLI:EU:C:2018:388.


1.2. Sub-questions

In light of the aforementioned research question, different sub-questions will need to be answered. Each question will be further developed and elaborated on in distinct chapters. The first sub-question is the following: what is meant by ‘interdependent privacy’ and to what extent is this phenomenon a reality? This chapter will elaborate on the concept of interdependent privacy, distinguish between the different types covered by this broad and vague term and thereby delineate the contours of this thesis by specifying which type will specifically be dealt with. Moreover, it will investigate the reasons behind (increased) privacy interdependency, taking into account that interdependent privacy has always existed (CHAPTER 2).

Following that, the author will discuss what ‘interdependent privacy’ means for data privacy law in the EU. In other words, does an interdependent interpretation of privacy exist in EU law and does the GDPR acknowledge it? This part will (partly) reflect on the theoretical understanding of privacy in the EU and on how far this understanding has been mirrored in the GDPR (CHAPTER 3).

Notwithstanding the answer to the previous question, it is of paramount importance to find out how the GDPR applies to interdependent privacy situations. Indeed, whether this instrument recognises such a phenomenon should be distinguished from how it addresses it. Consequently, and in order to give a critical assessment of EU law in light of this phenomenon, chapter 4 will answer the following questions: how does the GDPR address interdependent privacy, in terms of responsibilities, when users of information society services share personal data of others who are non-users of these services? Moreover, does this GDPR construction provide sufficient guarantees for these non-users’ right to privacy? In order to answer these questions, some GDPR provisions, such as the ‘controller’ notion under Article 4(7), will be closely scrutinised alongside jurisprudence relevant to their interpretation, including the Lindqvist,18 Ryneš19 and Wirtschaftsakademie20 cases (CHAPTER 4).

18 C-101/01 Bodil Lindqvist [2003] ECLI:EU:C:2003:596.

19 C-212/13 Ryneš v Úřad pro ochranu osobních údajů [2014] ECLI:EU:C:2014:2428.
20 C-210/16 Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein v Wirtschaftsakademie Schleswig-Holstein GmbH [2018] ECLI:EU:C:2018:388.


Finally, in the last chapter, the author explores whether alternatives to the existing GDPR construction are conceivable, through which our right to privacy could enjoy enhanced protection. This question builds further upon chapter 4, as it addresses previously identified shortcomings under the GDPR and proposes different ways to address them (CHAPTER 5).

Importantly, this thesis is subdivided into three parts. The first part, containing chapters two and three, concerns the social and legal understanding of interdependent privacy. The second part, corresponding to chapter four, contains the legal analysis of this phenomenon under the GDPR. Finally, the last part, namely chapter five, discusses available (alternative) means to address interdependent privacy issues under the GDPR.

1.3. Relevance, aims and methodology of this thesis

Relevance

Taking into account that the GDPR has only recently become applicable and that, in our highly digitalised society, interdependent privacy arises on a large(r) scale, it is pertinent to take a closer look at this issue. Furthermore, as mentioned previously, the CJEU has only recently issued a judgment assigning a certain degree of responsibility to users of ISS under the former Data Protection Directive, which further illustrates the topic’s relevance. Whereas some academic research has specifically dealt with the phenomenon of ‘interdependent privacy’, it has mostly focused on the phenomenon as such, without pairing it with a legal analysis.21

Regarding existing legal studies surrounding this subject, these all differ from the present one in multiple ways. For example, Van Alsenoy devoted numerous articles to this topic22, which also formed the object of his PhD thesis23, but he never referred to it as ‘interdependent privacy’ and thereby did not really focus on the phenomenon in se. Moreover, at the time of his thesis’ writing, the GDPR was not yet applicable, nor had the CJEU issued its ruling in the Wirtschaftsakademie case, in which the notion of ‘(joint) controller’ was further developed and

21 See for example: Harkous H. and Aberer K., ‘“If you can’t beat them, Join them”, A Usability Approach to Interdependent Privacy in Cloud Apps’ (2017) CODASPY Proceedings of the Seventh ACM on Conference on Data and Application Security and Privacy; Biczok G. and Chia P.H., ‘Interdependent Privacy: let me share your data’ (2013), chapter 2.

22 Van Alsenoy B. et al., ‘Social Networks and Web 2.0: Are users also bound by Data Protection Regulations?’ (2009) Vol 2(1) Identity in the Information Society; Van Alsenoy B., ‘The Evolving Role of the Individual Under EU Data Protection Law’ (2015) Issue 23 ICRI Research Paper.

23 Van Alsenoy B., ‘Regulating Data Protection: The Allocation of Responsibility and Risk among actors involved in Personal Data Processing’ (2016).


in which the Court gave (some) indications regarding the allocation of responsibilities. Like Van Alsenoy, legal scholars Helberger and Van Hoboken wrote on that issue without a clear focus on ‘interdependent privacy’. The legal framework for their analysis was formed by the then-in-force Data Protection Directive and they merely focused on one particular case, namely the situation of photo-tagging on Facebook. Similarly, Symeonidis and others also decided to constrain their study to one particular case, within what they called ‘collateral damage’ instead of ‘interdependent privacy’, namely that of third party Apps gaining data access to Facebook users’ friends.24 Although their legal analysis did focus on the GDPR, they did not take into account the aforementioned CJEU ruling, nor did they thoroughly analyse the household exemption in light of the GDPR.25

Consequently, this thesis differs from all previously consulted academic work by (i) focusing on ‘interdependent privacy’ as a phenomenon, on the one hand, and analysing it in light of the law, on the other; (ii) using the GDPR as the legal framework for the legal analysis; and (iii) adopting a holistic approach through the analysis and comparison of three case studies within ‘interdependent privacy’. However, all previously mentioned studies will be further built upon and will therefore serve as the reference framework for this thesis.

Aim(s) and methodology

The primary aim of this study is an evaluative one, which consists in finding out to what extent the GDPR takes interdependent privacy into account and, in light of this, whether this instrument provides sufficient safeguards for individuals’ right to privacy. In order to do so, this thesis will analyse how the GDPR addresses interdependent privacy and, in particular, which guarantees it provides in situations where users share personal data of third parties with providers of ISS.

The secondary aims are both descriptive and normative. Descriptive because, before any evaluation can take place, the different GDPR provisions that are relevant to this topic will need to be described, alongside interpretative documents and jurisprudence shaping these rules. A central notion of this thesis will be the notion of ‘controller’ as it goes hand in hand with imposed responsibilities that are meant to protect individuals’ right to privacy. To

24 Symeonidis I. et al., ‘Collateral Damage by Facebook Applications: A Comprehensive Study’ (2018) Vol. 77 Computers & Security Journal.


complete the picture and close the loop, recommendations will be made regarding (possible) better alternatives supplementing the existing GDPR construction.

Consequently, the methodology used will be a combination of evaluative, descriptive and normative approaches. Moreover, the research will be conducted from an internal perspective and thereby use a traditional doctrinal method. Indeed, different legal rules will be interpreted in light of the interpretation given to them by data protection authorities, the CJEU, national courts and legal scholars.

1.4. Conceptual framework

For reasons of clarity and consistency, different terms need to be defined as these will regularly come up throughout this thesis.

1.4.1. Personal data

Taking a look at the title of this thesis as well as the GDPR’s material scope, which serves as the legal framework for the further descriptive and evaluative stages of this thesis, it is clear that the term ‘personal data’ is a pivotal element of this study.

The GDPR defines personal data as follows:

‘Any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person’.26

This definition comprises four components. It concerns (i) ‘any information’, which is a very broad term encompassing objective, subjective, false and true statements, whatever format these may have.27 Moreover, such information must (ii) ‘relate to’ an (iii) ‘identified or identifiable’ (iv) ‘natural person’.

26 Art 4(1) GDPR.

27 Article 29 Data Protection Working Party, 'Opinion 4/2007 on the Concept of Personal data' 01248/047/EN WP136, 20 June 2007, p. 6-9.


According to the Article 29 Working Party (hereinafter Art 29 WP), a data-protection advisory body now replaced by the European Data Protection Board (hereinafter EDPB), there are three alternative ways to find out whether the information at stake 'relates' to an individual. First, the data may say something about a person, in which case the content of the data is relevant. For example, where you share a picture of someone else on social media, the picture will be considered personal data of the person displayed on it. Another possibility is that the use of the data is aimed at either evaluating someone or treating that person differently from others. In such situations, the purpose of the data is relevant. For example, if you 'rate' someone on Airbnb or Uber, the rating may say a lot about the person at stake, such as his/her cleaning habits or driving manners. In essence, the whole purpose of such a rating is to 'evaluate' someone. Finally, consideration must sometimes be given to the impact of the data, where use of the data is likely to have consequences for a person's rights and interests.28 This is closely related to the previous example but goes one step further. Indeed, in the case of an Uber driver, the overall rating-review about him/her (especially where the rating is low) is likely to affect his/her career, whether in terms of employment, position or income. Importantly, the information at stake in the aforementioned examples also says something about the person posting the picture on social media or rating the Uber driver or Airbnb host through his/her personal account. Yet again, this highlights the interdependent nature of personal data, where data often relate to multiple data subjects.

Regarding the third condition, namely whether the data concern an 'identified or identifiable' individual, the minimum threshold is whether the individual whose data are being used can be 'singled out' and thereby becomes indirectly identifiable.29

This means that, by means of the data itself or in combination with other significant criteria, the individual can 'be recognized by narrowing down the group to which he belongs (age, occupation, place of residence, etc.)'.30

Consequently, when posting a group picture on your Facebook account, you do not need to 'tag' everyone in that picture in order for it to amount to personal data vis-à-vis each person appearing in it. The mere fact that each one of them can be identified based on certain peculiarities, by using all means 'reasonably likely to be used',

28 Ibid., p. 9-12.
29 Ibid.
30 Ibid., p. 13.


is enough for the data to be considered 'personal data'. Regarding the means 'reasonably likely to be used', recital 26 GDPR explains that 'account should be taken of all objective factors, such as the costs of and the amount of time required for identification, taking into consideration the available technology at the time of the processing and technological developments'.

Concerning the last criterion, namely that the information at stake must relate to a 'natural person', the GDPR does not, in principle, apply to information relating to deceased or legal persons. Consequently, if you write something on your Twitter account about 'your last experience with Deliveroo', this would not constitute personal data relating to the company. However, if you write something about a Deliveroo employee, this would be different with regard to that person.

In light of these four, rather broad, conditions, the threshold for data to amount to personal data appears to be rather low. As specified in the GDPR, pseudonymised data, internet protocol (IP) addresses and cookie identifiers will generally fall within the instrument's ambit.31 Anonymised data, by contrast, fall in principle outside the GDPR's scope, as anonymisation generally precludes a person from being identifiable any longer.32

1.4.3. “Others”

Since this thesis concerns data disclosure about 'others', it is important to explain what this term entails. It could be defined as follows: any person who is 'digitally connected' with the user of the ISS at stake. 'Digitally connected' encompasses a wide spectrum of situations. In a case where you are that 'other' person, it would include situations where a WhatsApp user has your phone number stored on his/her mobile phone, where a Facebook user is your 'Facebook friend', or where a Gmail user has your email address in his/her contact list. It would also include an SNS user who posts a picture of you on his/her account; in the latter situation, the posted picture would represent your 'digital connection' with that person. In light of this understanding, the term 'others' encompasses all kinds of social connections. It also covers situations where the 'other' person at stake is not familiar with the ISS user, for

31 Recital 26 jo. Recital 30 GDPR.
32 Recital 26 GDPR.


example in the case of public figures, whose pictures are regularly shared on social media. In such a situation, the public figure (e.g. a president or singer) will be the 'other' person.

1.4.4. Shadow profiles

In the situation sketched in the introduction (see scenario diagram), person B, the provider of a service, is able, among other uses of the data, to build a profile of person C, a non-user of his/her service. This is possible through data disclosure by its user, person A. The profile meant here is commonly known as a 'shadow profile'. In a study on the feasibility of Twitter creating shadow profiles, a 'shadow profile' was explained as follows:

‘profiles created with information about non-users that has been inferred from information shared by the users of the social network [the service]’33

Regarding Facebook, the same study explained the ability of SNS providers to build shadow profiles through:

‘hidden files on individuals with private information that has been inferred through the individual’s friends inside the social network.’34

To sum up, the term 'shadow profile' refers to a non-user's profile, built from data collected from the service's users.

33 Garcia D. et al., 'Collective aspects of privacy in the Twitter social network' (2018) Vol 7 (3) EPJ Data Science, p. 1.
34 Ibid., p. 2.


PART I: Social and legal understanding of 'interdependent privacy'

2. Interdependent privacy

In an article by Biczok and Chia, ‘interdependent privacy’ was explained as meaning that: ‘The protection of personal, relational and spatial privacy of individuals is increasingly dependent on the actions of others, rather than the individuals themselves, in the interconnected digital world’.35

The authors gave different examples of this phenomenon. These include photo tagging on Facebook and the latter's application programming interface (API), which allows third party Apps to gain access to Facebook users' profiles as well as to those of these persons' 'friends'. In the latter situation, the Facebook user's privacy, in relation to these third party Apps, does not merely depend on that person's own decisions. That person's privacy will also partly depend on the decisions of his or her friends in their interactions with these third party Apps.36

This inter-relational notion of privacy has recently received much media coverage through, inter alia, events such as the Cambridge Analytica scandal, and is increasingly being discussed in academic circles. For example, Barocas and Levy are currently devoting an entire article to this phenomenon, which they refer to as 'privacy dependency'. They explain this term as covering the different ways in which 'one person's privacy is implicated by the decisions and disclosures of others'.37 Interestingly, they distinguish between different types of 'privacy dependencies', which all require a different response, whether in terms of law, policy-making or advocacy.38 Their article distinguishes between privacy dependencies by similarity, difference, refraction, chill, and association.

Although this thesis will particularly focus on the last type of privacy dependency, namely ‘privacy dependency by association’, the other types will briefly be explained below as these

35 Biczok G. and Chia P.H., 'Interdependent Privacy: let me share your data' (2013), chapter 2.
36 Ibid.

37 Barocas S. and Levy K., ‘Privacy Dependencies’ [working paper – draft version - unpublished] (2018) presented at the Privacy Law Scholars Conference (PLSC 2018), p. 1.


all contribute to the idea that nowadays the right to privacy needs to take account of these inter-relational elements.

2.1. Different types of ‘interdependent privacy’

2.1.1. Inferences

Privacy dependency 'by similarity' concerns the possibility of drawing inferences based on shared characteristics between persons. In other words, the fact that a person discloses information about him or herself might reveal certain things about another person where the latter is 'understood to be similar [emphasis in original] to the former and thus likely to share the disclosed characteristics'.39

In 2014, in an article discussing Big Data, Barocas and Nissenbaum referred to this privacy phenomenon, in a rather eloquent way, as being the ‘tyranny of the minority’.40

The authors explained this tyranny as meaning that ‘the volunteered information of the few can unlock the same information about the many’.41

To put this in their own words: ‘the willingness of a few individuals to disclose certain information implicates everyone else who happens to share the more easily observable traits that correlate with the revealed trait’.42

Nissenbaum and Barocas cited the example of pregnant women and their shopping habits. Interestingly, a store was able to 'infer a rule about the relationship between purchases and pregnancy from what must have been a tiny proportion of all of its customers who actually decided to tell the company they recently had a baby'.43 Consequently, based on such an inference, all women who happened to share these shopping habits were considered to be pregnant. However, taking into account the number of people that currently share their personal information with all types of actors, whether online or in physical stores, one could say that it would be more appropriate to call it the 'tyranny of the majority'.

Interestingly, privacy dependency by ‘inferences’ is sometimes also referred to as ‘collective’ or ‘group’ privacy; in the sense that people are classified according to groups depending on their common characteristics. In aforementioned example, this group would be the group of

39 Ibid., p. 11.

40 Barocas S. and Nissenbaum H., 'Big Data's end run around anonymity and Consent' in Julia Lane et al., Privacy, Big Data and the Public Good – Frameworks for Engagement (Cambridge University Press, 2014), p. 16.
41 Ibid.
42 Ibid.
43 Ibid., p. 17.


‘pregnant women’. Mantelero, who devoted numerous studies to this privacy dimension,44 argued that ‘predictions based on correlations do not only affect individuals, which may act differently from the rest of the group to which they have been assigned [emphasis added], but also affect the whole group and set it apart from the rest of society’.45 Already in 1987, Simitis shed light on this group aspect of privacy in relation to dubious new forms of data collection. According to him, these new collection techniques altered individuals’ right to privacy in many ways. He stated: ‘Automated processing generates categorizations, especially when hits are assembled under a particular heading. An individual included in the computer-made list is necessarily labelled and henceforth seen as a member of a group, the peculiar features of which are assumed to constitute her personal characteristics’.46

2.1.2. Dependencies by differentiation, refraction or chill

According to Barocas and Levy, privacy dependency 'by difference' can be divided into three distinct categories.

The first category concerns dependency through a ‘process of elimination’, which can best be explained by reference to criminal investigations. In the case of two crime suspects, the privacy of these two will be mutually dependent on each other’s statements and privacy disclosures. The reasoning is as follows: ‘The observer can rely on deductive reasoning to determine certain things about B when A eliminates herself from the suspect set’.47

In other words, the alibi of a suspect may influence the other suspect’s incrimination.

The second category identified by the authors is dependency through ‘anomaly detection’. In order for this to exist, a large amount of data from different parties needs to be collected in order to establish a ‘norm’. This norm will permit other persons’ data to be tested against it, which, in turn, allows deviant or abnormal persons to be detected. This dependency rests on the idea that categorisation of a person as ‘deviant’ will depend on what is considered as ‘normal’, based on aggregations of other people’s data.

The last category concerns ‘unraveling’, which the authors explained as follows: ‘Disclosure of some type of information is voluntary, but the observer attaches value to disclosure with

44 See for example: Mantelero A., ‘Regulating Big data. The guidelines of the Council of Europe in the context of the European data Protection Framework’ (2017) 33 Computer Law & Security Review.

45 Mantelero A., ‘Personal Data for decisional purposes in the age of analytics: From an individual to a collective dimension of data protection’ (2016) 32 Computer Law & Security Review, p. 239.

46 Simitis S., ‘Reviewing privacy in an information society’ (1987) Vol 135 (3) University of Pennsylvania Law Review, p. 709.

47 Barocas S. and Levy K., 'Privacy Dependencies' [working paper – draft version - unpublished] (2018) presented at the Privacy Law Scholars Conference (PLSC 2018), p. 16.


some incentive or benefit if the discloser has some desirable attribute.’48

One of the examples they cite makes it very clear: ‘If most job applicants detail their employment histories on LinkedIn, not having a profile is a red flag to potential employers’.49

Here again, conclusions are drawn based on what 'most people do'. For example, if most people use their real name on their Facebook account, future employers may deduce that a job seeker using a fake name for his/her profile has 'something to hide'.

Another type of privacy dependency is dependency by 'refraction', which is well explained in the authors' article on 'refractive surveillance'.50 In such situations, 'data about A are subsequently used to make decisions about a differently situated B'.51 Unlike inferences, where privacy dependency is based on shared characteristics, 'refraction' concerns persons who depend on one another but do not share such similarities. An example of this dependency type, which formed the object of their study on 'refractive surveillance', concerns the collection of customers' data by retailers. By analysing the way customers act, the retailer is able to exert more control over his/her employees. For example, the fact that customers increasingly shop online may lead to more employees being dismissed. Or the fact that most customers shop between 4 and 6 pm may lead to modification of the employees' schedules. Consequently, the treatment of employees will depend on the customers' data.

Lastly, privacy dependencies can also occur by 'chill', which refers to the existence of a 'chilling effect'. This is the case where the behaviour of one person, and thereby his/her personal disclosure, is influenced by his/her expectation of 'similar treatment' as others. In the context of surveillance, this occurs when one person adapts his/her behaviour before any surveillance takes place, on the basis of the expectation of being surveilled in the same way as another person is being monitored. In other words, 'the chill that can occur when B sees (or believes) that A is being surveilled by an observer'.52

Privacy disclosure will thus depend on what other people have disclosed and what the consequences of their disclosures were.

48 Ibid., p. 21.

49 Ibid., p. 21.
50 Barocas S. and Levy K., 'Refractive Surveillance: Monitoring Customers to Manage Workers' (2018) 12 International Journal of Communication.

51 Barocas S. and Levy K., 'Privacy Dependencies' [working paper – draft version - unpublished] (2018) presented at the Privacy Law Scholars Conference (PLSC 2018), p. 23.


2.1.3. Dependencies by association

As mentioned previously, privacy dependency by association will be the focus of this thesis, through analysis of three particular and frequently occurring situations. This type of dependency occurs when data about one person is disclosed through another person by virtue of these people’s ‘social tie’.53 Basically, this is the case where data about you is shared by people you know, by people you ‘associate’ with.

Examples include the use of e-mail. Indeed, Benjamin Mako Hill, a software specialist, explained how Google holds a high percentage of the e-mails that were either sent with or received by a Gmail account.54 Consequently, irrespective of your own use of Gmail, your e-mail may end up with Google if the friend you are corresponding with makes use of a Gmail account. Such privacy dependency also holds true when using cloud storage services, such as Dropbox or Google Drive, as a user's decision to grant access to third party apps has been shown to inflict privacy losses on that user's cloud collaborators, irrespective of the latter's will.55

Regarding Twitter, an empirical study has demonstrated that 'data shared by the users of that service predicts personal information of non-users of the service',56 thereby making it possible for Twitter to build shadow profiles of non-users of its platform.57 This feature is, however, not unique to Twitter. Indeed, when answering questions from the European Parliament regarding privacy enquiries resulting from the Cambridge Analytica scandal, Facebook explicitly stated that: 'If you don't use Facebook, you can ask for any information we store about you via a form in the Facebook Help Centre'.58 Although Facebook explicitly denied building shadow profiles, this statement does not preclude its ability to do so. This example of 'shadow profiles' relates to the prominent example of 'dependency by association' to which Barocas and Levy made reference. They referred to the practice of 'social networks encouraging users to upload contact lists in order to "find friends" who already use the service – and to solicit participation by those who don't, by building

53 Ibid., p. 6.

54 Hill B.M., 'Google has most of my Email because it has all of yours' <https://mako.cc/copyrighteous/google-has-most-of-my-email-because-it-has-all-of-yours>

55 Harkous H. and Aberer K., ‘“If you can’t beat them, Join them”, A Usability Approach to Interdependent Privacy in Cloud Apps’ (2017) CODASPY Proceedings of the Seventh ACM on Conference on Data and Application Security and Privacy.

56 Garcia D et al., ‘Collective aspects of privacy in the Twitter social network’ (2018) Vol 7 (3) EPJ Data Science.

57 Ibid.

58 Facebook Brussels, 'Follow-questions from European Parliament' (23 May 2018) <http://www.europarl.europa.eu/resources/library/media/20180524RES04208/20180524RES04208.pdf> Accessed 22 February 2019.


profiles of these non-users’.59 This scenario will be the object of one of the discussed case studies.

Having regard to this non-exhaustive list of examples, and in light of currently available technologies and people's daily interaction with them, it can be deduced that interdependent privacy is a reality that occurs on a daily basis. Consequently, if we want our right to privacy to be protected, the legal instruments protecting it must take these inter-relational elements into account. However, before turning to that question and delineating the actual scope of our right to privacy under the GDPR, the next section will shed light on the underlying reasons that make 'interdependent privacy' such a reality.

2.2. Different reasons behind interdependent privacy

Importantly, it would be wrong to assume that interdependent privacy is a new phenomenon. Our degree of privacy has always depended, to a certain extent, on other people's interactions and disclosures. For example, authors of autobiographies may reveal details about others when confessing stories about their lives. This is well illustrated in the 2014 case Ruusunen v. Finland.60 This case concerned a book by a former girlfriend of Finland's Prime Minister, published during the period he held office. It described their intimate life and contained details that had not been previously disclosed. In that context, disclosure of the author's own private life also entailed disclosure of someone else's, i.e. the Prime Minister's. In order to determine whether the latter's right to privacy had been breached, the ECtHR balanced the author's right to freedom of expression against the Prime Minister's right to privacy.

In recent years, however, interdependent privacy has taken a whole new turn. The growing importance of the internet, paired with technological developments such as SNSes, has encouraged such interdependency on an unprecedented scale. Both the rise of peer-to-peer surveillance and the data-driven business models of companies may explain why our level of privacy dependency has increased, not to say has become unavoidable. In other words, without these developments, people would not be affecting each other's privacy as much as they do today.

59 Barocas S. and Levy K., 'Privacy Dependencies' [working paper – draft version - unpublished] (2018) presented at the Privacy Law Scholars Conference (PLSC 2018), p. 8.


2.3.1. Peer-to-peer surveillance

As argued by Albrechtslund, with the rise of SNSes our society has moved away from a vertical, hierarchical idea of surveillance towards a horizontal, non-hierarchical one.61 Consequently, the saying 'watching over someone' has become inapt, as our present-day conception of surveillance is no longer confined to situations where one watches the other from above.62 He also argues that the concept of surveillance has been broadened to include all forms of monitoring techniques, which entails 'data collection and technological mediation'.63

In other words, the original understanding of 'surveillance' has, in our networked society, been turned upside down. Whereas Albrechtslund refers to it as 'participatory surveillance',64 which has a positive connotation, Andrejevic refers to it as 'lateral surveillance'. According to the latter, lateral surveillance is 'not the top-down monitoring of employees by employers, citizens by the State but rather the peer-to-peer surveillance of spouses, friends and relatives'.65

It is ‘the use of surveillance tools by individuals, rather than by agents of institutions, public or private, to keep track of one another’.66

Against this background, where everyone can be closely monitored by peers and disclosures may backfire, one would expect SNS users to refrain from sharing information. However, Marwick pointed out that 'reciprocity' is an aspect that differentiates 'social surveillance' from typical surveillance. Unlike the latter, social surveillance presents the feature that 'each participant is both broadcasting information that is looked at by others, and looking at information broadcasted by others'.67

It is thereby a two-way process, where mutual control is exerted and mutual trust expected. Indeed, sharing information on SNSes is, according to Marwick, primarily an expression of trust towards peers.68 Contrary to what one may think, SNSes like Twitter do ‘encourage “digital intimacy”,

61 Albrechtslund A., ‘Online social networking as participatory surveillance’ (2008) Vol. 13 (3) First Monday, p. 6.

62 This implies hierarchy and comes from Bentham's 'Panopticon'.

63 Albrechtslund A., ‘Online social networking as participatory surveillance’ (2008) Vol. 13 (3) First Monday, p. 6.

64 Ibid., p. 7.

65 Andrejevic M., ‘The Work of Watching One another: Lateral Surveillance, Risk, and Governance’ (2004) Vol 2(4) People Watching People, p. 481.

66 Ibid., p. 488.

67 Marwick A.E., ‘The Public Domain: Social Surveillance in Everyday life’ (2012) Vol 9 (4) Cyber-Surveillance in Everyday Life, p. 379.


reinforcing connections and maintaining social bonds’.69

Having regard to this, it can be inferred that social surveillance may contribute to interdependent privacy in two ways. On the one hand, users' privacy seems to be dictated by the constant thought of being monitored by peers. This may result in users only disclosing personal information which is considered 'appropriate', so that their privacy is influenced by others.70 What is considered 'appropriate' will, however, differ from person to person. Standards will depend on the online behaviour of the users' peers, such as what information they decide to disclose. This line of reasoning should be seen in light of what Gurses called 'privacy as practice', where she explained privacy as 'the negotiation of social boundaries through a set of actions that users collectively or individually take with respect to disclosure, identity and temporality in environments that are mediated by technology. […] Privacy is negotiated through collective dynamics'.71

On the other hand, social surveillance seems to have created an atmosphere which stimulates information-sharing, based on feelings of trust and intimacy. As Albrechtslund pointed out, 'visibility becomes a tool of power that can be used to rebel against the shame associated with not being private about certain things'.72 His notion of 'participatory surveillance' reflects, by definition, the active participation of users. From this perspective, it may be inferred that peer-to-peer surveillance encourages the act of sharing and, one could even say, promotes 'exhibitionism'. In other words, the act of surveillance seems to produce the opposite effect: instead of deterring users from disclosing information, it encourages them to do so. This inevitably results in information disclosure about them, but also about others (taking into account that disclosure of one's own private life often also reveals details about someone else – see Ruusunen v. Finland above). By referring to the work of Nissenbaum, in which different dimensions of privacy were described, Marwick explicitly recognised this interdependent privacy issue in the context of social surveillance.73 In light of this, peer-to-peer surveillance seems to intrinsically contribute to the occurrence of interdependent privacy.

69 Ibid., p. 384.

70 Ibid., p. 380.

71 Gurses S., ‘Can you Engineer Privacy? On the potentials and challenges of applying privacy research in engineering practice’ (2014) Vol 57 (8) Communications of the ACM, p. 4

72 Albrechtslund A., ‘Online social networking as participatory surveillance’ (2008) Vol. 13 (3) First Monday, p. 7.

73 Marwick A.E., 'The Public Domain: Social Surveillance in Everyday life' (2012) Vol 9 (4) Cyber-Surveillance in Everyday Life, p. 380.


The fact that social surveillance may lead to increased disclosure by users and their peers should, however, also be seen in light of what one study called the 'social penetration theory'.74 This theory suggests that 'progressively increasing levels of self-disclosure are an essential feature of the natural and desirable evolution of interpersonal relationships from superficial to intimate'. According to it:

‘Similar to privacy, self-disclosure confers numerous objective and subjective benefits, including psychological and physical health. The desire for interaction, socialization, disclosure and recognition of fame (and conversely, the fear of anonymous unimportance) are human motives no less fundamental than the need for privacy. The electronic media of the current age provide unprecedented opportunities for acting on them. Through social media, disclosures can build social capital, increase self-esteem, and fulfill ego needs. […] Self-disclosure was even found to engage neural mechanisms associated with reward; people highly value the ability to share thoughts and feelings with others. Indeed, subjects in one of the experiments were willing to forgo money in order to disclose about themselves’.75

This desire for disclosure, inherent to human beings, paired with the fact that everyone's privacy, whether online or offline, is tied to that of others, may only magnify the presence of interdependent privacy.

2.3.2. Business models of companies: data instead of money

Another circumstance that may explain the expansion of interdependent privacy is the data-driven business models of companies, where data are used as their primary 'ingredient'. Different forms exist, which are not mutually exclusive. As identified by Schroeder, three types of business models should be distinguished: data facilitators, data users and data suppliers.

Data facilitators are companies that provide businesses with the necessary infrastructure and organizational expertise for appropriate data storage and analysis.76 This includes software or

74 Acquisti A. et al., 'Privacy and human behavior in the Age of information' (2015) Vol 347 (6221) Science, p. 510.
75 Ibid., p. 510-511.
76 Schroeder R., 'Big data business models: challenges and opportunities' (2016) 2 Cogent Social Sciences, p. 8-9.


hardware providers. However, these actors only contribute to interdependent privacy in an indirect way, as their businesses do not directly encourage people to share data.

Data users, on the other hand, rely on data sharing practices in order to improve their current product and services and, in turn, to increase their overall turnover.77 As phrased by Schroeder, this concerns companies ‘engaged in answering the question: how can data be used to create value within our business?’78

Such a model, where data is collected and subsequently built into a product or service, could for example, serve a shoe company that wishes to find out at which period of the year running shoes are most popular or what the average spending budget is for people aged between 15 and 18 years.

Data suppliers, as the word itself indicates, are companies supplying data to others. In doing so, they first collect data and, in turn, sell these as primary or secondary data.79 These companies can, for instance, serve businesses who wish to better target their audience. Importantly, two types of data suppliers exist: data analytics companies and data brokers.80 Regarding the former, it concerns companies that collect massive amounts of data and analyse these in order to draw ‘inferences’ (see section 2.1.1). The output is thus a ‘data summary, analysis, insight, advice or some other product derived from data’.81

Regarding the latter, this concerns companies selling data, as a product, to third parties. In the context of interdependent privacy, where SNS users disclose personal data about themselves and about others, these users may themselves be regarded as data suppliers, as they fulfil a similar role, albeit without being paid in exchange for such data. As emphasised by Schroeder, data brokerage is not new, but its field of application has expanded because of two factors.

The first reason, he argues, is that what constitutes 'data' has expanded tremendously because of evolving data-analytical techniques, which permit companies to also analyse unstructured data. In other words, the sources of data have become nearly infinite: any type of information, whether text or image, can now be treated as 'data'. Secondly, many transactions which would previously not have given rise to any data, as these could not be 'captured', are now digitally mediated and thereby trigger the creation of data.82

For example,

77 Ibid., p. 7.
78 Ibid., p. 8.
79 Primary data are data in their purest form, as they have been collected by the researcher directly, while secondary data have been collected and already interpreted by someone else.
80 Schroeder R., 'Big data business models: challenges and opportunities' (2016) 2 Cogent Social Sciences, p. 7-8.
81 Ibid.
82 Ibid.
