Academic year: 2021
(1)

The era of Data Protection by

Design-compliant robotic

companions, personal

assistants and sexual partners

Master thesis Riesa van Doorn E-mail: Riesamaya@gmail.com

Student number: 12390534 Mastertrack Informatierecht/Information Law Supervisor: Dhr. Dr. J.P. (João) Quintais

(2)

Sources of pictures cover sheet (from left to right):

- Robot Harmony: http://android-apps.com/news/sex-robot-maker-seeks-mass-production-in-second-factory-due-to-global-demand/

- Robot Buddy: http://www.robobuddy.com/web/robobuddy/buddy

- Robot Mabu: https://techcrunch.com/2015/06/12/catalia-health-gets-1-25-million-from-khosla-ventures-for-its-healthcare-robot/

(3)

Abstract

Currently, Human Interactive Robots (HIR) are increasingly taking on the roles of personal assistants or companions. By entering people’s private homes, they can process a considerable amount of personal data. However, this data processing poses risks to the users’ right to the protection of personal data. HIR developers should address these risks by implementing the GDPR’s Data Protection by Design obligation, which is currently still open to interpretation. Therefore, this study aims to explore how developers of HIR, namely social, healthcare and sex robots, could implement their Data Protection by Design obligation, as enshrined in article 25(1) of the General Data Protection Regulation.

This paper is divided into four chapters. First, chapter 2 briefly explains the background of HIR. It describes technological definitions and the essential characteristics of social, healthcare and sex robots. Following this, chapter 3 focuses on the user’s perspective. It clarifies the concept of the right to data protection, outlines which data the three HIR types process and explores whether data processing poses risks to the users’ data protection right. Subsequently, chapter 4 analyses article 25(1) of the GDPR. This chapter clarifies which DPbD obligations robotics manufacturers have to take into account and discusses which measures they should take to protect the personal data of HIR users more effectively. This chapter is the core of this thesis. It also provides recommendations on how other actors can influence HIR developers to create DPbD-compliant robots. Ultimately, chapter 5 summarizes all the findings.

(4)

Table of contents

1. Introduction ... 5

1.1 Background ... 5

1.2 Research question, scope and methodology ... 7

1.3 Structure ... 8

2. Background on HIR ... 9

2.1 Introduction ... 9

2.2 General definition robot ... 9

2.3 Technological features ... 10

2.3.1 HIR’s architectural structure ... 10

2.3.2 Internet of Robotic Things ... 11

2.4 Types of HIR ... 12

2.4.1 Social robots ... 12

2.4.2 Healthcare robots ... 13

2.4.3 Sex robots ... 14

2.5 Conclusion ... 15

3. Data protection concerns: the phenomenon of risks ... 15

3.1 Introduction ... 15

3.2 The right to data protection ... 16

3.3 Types of personal data HIR process ... 16

3.3.1 The definition of personal data ... 17

3.4 Risk assessment ... 19

3.4.1 Defining risks ... 19

3.4.2 Assessment criteria ... 20

4. Legal Analysis: Practical guidance towards DPbD-compliant HIR ... 25

4.1 Introduction ... 25

4.2 The concept of data protection by design ... 26

4.2.1 Article 25 of the GDPR ... 26

4.3 Practical guidelines for both existing and future HIR manufacturers ... 28

4.3.1 Existing non-sector specific guidelines ... 28

4.3.2 Strategies applied to HIR Manufacturers and concrete guidelines ... 36

4.4 Further guidance ... 44


(5)

1. Introduction

1.1 Background

Over the last decade, interest in artificial intelligence (AI) technologies has increased considerably.1 AI refers to self-learning systems that mimic intellectual processes characteristic of humans.2 In 2014, the intelligent chatbot Eugene Goostman was reported to be the first machine to pass the Turing Test.3 While some believe that this indicates that the chatbot obtained human-level intelligence, others disagree, arguing that Eugene merely demonstrated human-like communication abilities. Nevertheless, it might not take decades for AI to reach the human-intelligence level, because a great wave of new AI inventions is emerging.4 The recent Artificial Intelligence Market Research report predicts that the AI market will increase from $16.06 billion in 2017 to $208.49 billion by 2025.5 This could be partly explained by the fact that many AI technologies are considered useful for society as a whole: for instance, AI can increase productivity in businesses6 and even contribute to the achievement of the United Nations’ Sustainable Development Goals.7 However, AI developments have also caused human rights-related concerns.8 Theoretical physicist Stephen Hawking even believed that intelligent machines could be the end of humanity.9

1 K. Frankish et al., The Cambridge Handbook of Artificial Intelligence (Cambridge University Press 2014), p. 247.
2 B.J. Copeland, 'Artificial Intelligence | Definition, Examples, And Applications' (Encyclopedia Britannica, 2019)

<https://www.britannica.com/technology/artificial-intelligence> accessed 2 September 2019.

3 K. Warwick and H. Shah, ‘Passing the Turing Test Does Not Mean the End of Humanity’ (2016) 8 Cognitive Computation 409 <https://doi.org/10.1007/s12559-015-9372-6>, p. 409, 417.

4 Ibid.

5 Marketsandmarkets, 'Artificial Intelligence Market - 2025 | Marketsandmarkets' (2020)

<https://www.marketsandmarkets.com/Market-Reports/artificial-intelligence-market-74851580.html> accessed 2 January 2020; MarketWatch, ‘Artificial Intelligence Market 2019 Share, Trends, Segmentation And Forecast To 2025 | CAGR Of 36.2%' (2019) <https://www.marketwatch.com/press-release/artificial-intelligence-market-2019-share-trends-segmentation-and-forecast-to-2025-cagr-of-362-2019-08-12> accessed 2 September 2019.

6 International Federation of Robotics, 'Media Backgrounder Artificial Intelligence In Robotics' (International Federation of

Robotics 2018)

<https://ifr.org/downloads/papers/Media_Backgrounder_on_Artificial_Intelligence_in_Robotics_May_2018.pdf>.

7 M. Sadowski and R. Powell-Tuck, ‘AI & THE SUSTAINABLE DEVELOPMENT GOALS: THE STATE OF PLAY -

2030 Vision’ [2019] 2030 Vision Global Goals Technology Forum 27 < https://sustainability.com/our-work/reports/ai-and-the-sustainable-development-goals-the-state-of-play/>.

8 See for example: <https://www.rathenau.nl/sites/default/files/2018-02/Human%20Rights%20in%20the%20Robot%20Age-Rathenau%20Instituut-2017.pdf>.

9 R. Cellan-Jones, 'Hawking: AI Could End Human Race' (BBC, 2014) <https://www.bbc.com/news/technology-30290540>

accessed 2 September 2019; S. Gaudin, ‘Stephen Hawking fears robots could take over in 100 years’ (Computerworld, 2015) <https://www.computerworld.com/article/2922442/stephen-hawking-fears-robots-could-take-over-in-100-years.html> accessed 2 September 2019.

(6)

This thesis focuses on a particular AI area, namely robots that are able to interact or communicate with humans. There is as yet no established term for such robots; this study refers to them as Human Interactive Robots (HIR). This name is based on Human-Robot Interaction (HRI), the field of study that focuses on understanding, designing and evaluating robotic systems that interact or communicate with humans.10 The types that are slowly starting to enter or influence people’s domestic or personal environments play a key role in this study: social, healthcare and sex robots. These types are therefore occasionally already covered by the media, with a special emphasis on the way users begin to interact with them.11 In the near future, they will likely be a major part of their users’ everyday environments.12 Bill Gates even predicted that every home will have a robot by 2025.13

However, as mentioned previously, the use of HIR is not without the risk of some undesirable human rights consequences. Due to the data sharing that takes place – and which is often required for the use or improvement of the robots - the right to the protection of personal data, as enshrined in article 8 of the Charter of Fundamental Rights of the European Union (CFR or the Charter), might be affected or even violated. During the 38th International Conference of

Data Protection and Privacy Commissioners, the European Data Protection Supervisor (EDPS) therefore emphasized that the design and development of robotics should preserve the right to data protection.14 In order to encourage entities to respect this right of European data subjects, the General Data Protection Regulation (GDPR) became applicable in May 2018. A crucial provision is article 25(1), which obligates all data controllers,15 including HIR manufacturers,

to implement data protection rules by design. It requires them to take technical and

organizational measures at the design stage of their robots. A few examples of such measures are included in article 25(1). However, because the GDPR is such a new legal instrument, this

10 Goodrich M.A. and Schultz A.C., ‘Human-Robot Interaction: A Survey’ (2007) 1(3) Foundations and Trends in Human-Computer Interaction, p. 204.

11 See for example

<https://nos.nl/video/2310170-zorgrobot-sara-doet-spelletjes-en-bewegingsoefeningen-met-mevrouw-meijers-95.html>; <https://www.ad.nl/video/kanalen/rotterdams-dagblad~c434/series/korte-reportage~s1048/knuffelen-met-zeehondrobot-paro~p27508?referrer=https://www.google.com/>.

12C. Holder et al., ‘Robotics and Law: Key Legal and Regulatory Implications of the Robotics Age (Part i of II)’ (2016) 32 Computer Law and Security Review 383 <http://dx.doi.org/10.1016/j.clsr.2016.03.001>, p. 2.

13 B. Gates, 'A Robot In Every Home' (2008) 18 Scientific American

<https://www.cs.virginia.edu/~robins/A_Robot_in_Every_Home.pdf>.

14 European Data Protection Supervisor (EDPS), 'Artificial Intelligence, Robotics, Privacy And Data Protection, Room

Document For The 38Th International Conference Of Data Protection And Privacy Commissioners' (2016) <https://edps.europa.eu/sites/edp/files/publication/16-10-19_marrakesh_ai_paper_en.pdf>, p.17.

(7)

provision lacks a detailed description of how developers should apply it and is still open to interpretation.16

1.2 Research question, scope and methodology

Given these issues, central to this thesis is the following research question:

From the perspective of safeguarding the users’ right to the protection of personal data, how could manufacturers of Human Interactive Robots (HIR), such as social, healthcare and sex robots, implement their data protection by design obligation, as enshrined in article 25(1) of the General Data Protection Regulation?

The scope of this study is limited to the fundamental data protection right of HIR users, enshrined in article 8 of the Charter of Fundamental Rights of the European Union (CFR or Charter). Whether this right is sufficiently safeguarded depends on the extent to which robotics manufacturers, such as companies or research institutions, comply with their Data Protection by Design (DPbD) obligations. To interpret article 25 GDPR, relevant data protection principles in other provisions, such as the principle of data minimization in article 5 GDPR, must also be considered.

Although HIR may raise other interesting questions - for instance, the question of who is liable for damage caused by the robots - these questions fall outside the scope of this study. Additionally, the closely related right to respect for private and family life, as enshrined in article 7 of the Charter, also falls outside the scope of this study. In order to apply the data protection rules, it is not necessary to examine whether this right is infringed.17

Moreover, the research question of this study is a design question. The aim is to improve the safeguarding of the right to data protection of HIR users by providing practical guidelines to HIR developers. Taking into account that DPbD obligations should not only be effective in theory but also in practice, HIR developers must be informed about how to implement these obligations at the initial design stage and throughout the complete development process of the robots.
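Although this thesis is legal in nature, the kind of practical guidance it aims to provide can be made tangible with a small sketch. The Python fragment below illustrates one way a developer might operationalize the data minimization principle of article 5 GDPR mentioned above: before any record leaves the robot, every field that is not necessary for a specific processing purpose is dropped. All purposes, field names and values are hypothetical; this is an illustrative sketch, not a compliance recipe.

```python
# Illustrative sketch of data minimization: before a robot transmits a
# record, all fields not required for the stated purpose are dropped.
# Purpose names and field names are hypothetical examples.

# Fields each (hypothetical) processing purpose actually needs
PURPOSE_WHITELIST = {
    "medication_reminder": {"user_id", "timestamp", "medication_taken"},
    "navigation": {"timestamp", "room_map"},
}

def minimize(record: dict, purpose: str) -> dict:
    """Return a copy of the record containing only the fields
    that are necessary for the given purpose."""
    allowed = PURPOSE_WHITELIST[purpose]
    return {k: v for k, v in record.items() if k in allowed}

raw = {
    "user_id": "u42",
    "timestamp": "2020-01-01T09:00:00",
    "medication_taken": True,
    "audio_clip": b"...",   # not needed for a reminder log
    "location": "bedroom",  # not needed either
}

# Only user_id, timestamp and medication_taken survive
print(minimize(raw, "medication_reminder"))
```

The design point of such a purpose-bound whitelist is that the default is exclusion: a newly added sensor field is never transmitted until a developer consciously ties it to a purpose.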

16 B. Lee, ‘How does Privacy by Design Work in Practice?’ (CCI, 2016),

<https://www.corporatecomplianceinsights.com/privacy-design-work-practice/ > accessed 2 September 2019.

17FRA et al., Handbook On European Data Protection Law (3rd edn, Publications Office of the European Union 2014) p. 20.

(8)

In order to draw a conclusion on the main question, the following sub-research questions are examined:

1. What are the different types of HIR and what technology is involved?

2. What does the right to data protection entail and what types of personal data do HIR process?

3. What are the risks for data subjects when HIR process their personal data?

4. What does the DPbD obligation entail and how can developers be further guided on how to implement this obligation?

To be able to give a comprehensive answer to all the above-mentioned questions, this study applies the doctrinal research methodology. This method involves an analysis of the relevant primary sources of law. Firstly, article 8 of the Charter will be analyzed to clarify the concept of the right to data protection. This provides an understanding of the data protection rules which are guaranteed by the GDPR. Next, article 25(1) of the GDPR will be analyzed and interpreted. Additionally, the doctrinal method is based on literature review and case law concerning the protection of personal data of both the Court of Justice of the European Union (CJEU) and the European Court of Human Rights (EctHR). Article 53 of the CFR states that the CJEU can also recognize and consider other international legal instruments as minimum protection standards, such as the European Convention of Human Rights (ECHR) when interpreting a provision in the Charter. This is allowed because these two different legal instruments are intertwined.

1.3 Structure

This paper is divided into four chapters. First, chapter 2 briefly explains the background of HIR. It describes technological definitions and the essential characteristics of social, healthcare and sex robots. Following this, chapter 3 focuses on the user’s perspective. It clarifies the concept of the right to data protection, outlines which data the three HIR types process and explores whether data processing poses risks to the users’ data protection right. Subsequently, chapter 4 analyses article 25(1) of the GDPR. This chapter clarifies which DPbD obligations robotics manufacturers have to take into account and discusses which

(9)

measures they should take to protect the personal data of HIR users more effectively. This chapter is the core of this thesis. Ultimately, chapter 5 summarizes all the findings.

2. Background on HIR

2.1 Introduction

This chapter answers the question, “What are the different types of HIR and what technology is involved?” It starts by exploring the general definition of robots, including gaining insight into the architecture of HIR, which is crucial for understanding how the robots operate. Next, the specific features of three different types of HIR are described - namely social, healthcare and sex robots - with reference to concrete examples.

2.2 General definition robot

In 1921, the term “robot” was first coined by Czech playwright Karel Čapek. He used it in his play Rossum’s Universal Robots to refer to human-like machines.18 Currently, there are several

definitions of robots. According to the International Organization for Standardization (ISO), a robot is, for example, an actuated mechanism that is programmable in two or more axes. It can move within its environment, perform intended tasks, and it also has a degree of autonomy.19 Richards and Smart believe that ‘constructed systems displaying both physical

and mental agencies, but that are not alive in the biological sense’ can be designated as robots.20 Despite the fact that the term robot remains open to interpretation, it is widely

believed that a robot is an autonomous machine capable of performing human tasks.21

According to a report of the RobotLaw Project, a study concerning the regulation of robots that was funded by the European Commission, the following attributes result from such an understanding:

18 Encyclopedia Britannica, ‘Karel Čapek CZECH WRITER’ (2014) <https://www.britannica.com/biography/Karel-Capek>

accessed 20 September 2019.

19 ‘Robots And Robotic Devices — Vocabulary' (Iso.org, 2020) <https://www.iso.org/obp/ui/#iso:std:iso:8373:ed-2:v1:en>

accessed 20 September 2019; T. Wang et al.,'Current Researches And Future Development Trend Of Intelligent Robot: A Review' (2018) 15 International Journal of Automation and Computing

<http://html.rhhz.net/GJZDHYJSJZZ/20180503.htm>.

20 N. Richards and W. Smart, 'How Should The Law Think About Robots?', Robot Law (1st edn, Edward Elgar Publishing

2016), 3-22, at 6.

21 Oxford, ‘robot’ (2020) <https://www.oxfordlearnersdictionaries.com/definition/american_english/robot> accessed 29

November 2019; E. Palmerini et al.,‘Regulating Emerging Robotic Technologies in Europe: Robotics Facing Law and Ethics’ 1 <http://www.robolaw.eu/RoboLaw_files/documents/robolaw_d6.2_guidelinesregulatingrobotics_20140922.pdf>, p.15.

(10)

1. Their physical nature: robots are physically embodied machines that can perform tasks in the physical world;

2. Their independence of external human control: robots are able to execute a

programme to autonomously behave in a certain way, such as carrying out a certain task;22 and

3. Their human-like behaviour.23

All these features together distinguish a robot from other devices. As mentioned in the introduction, this thesis does not focus on all robots. It excludes industrial robots and focuses on HIR: physically embodied robots that are able to autonomously interact24 with their users.

2.3 Technological features

2.3.1 HIR’s architectural structure


22 R. Leenes et al., ‘Regulatory Challenges of Robotics: Some Guidelines for Addressing Legal and Ethical Issues’ (2017)

Law, Innovation and Technology. <https://doi.org/10.1080/17579961.2017.130492>, p. 5.

23 Ibid p.15.

24 including physically or establishing relationships.

(11)

[Figure: HIR’s architectural structure.]31

2.3.2 Internet of Robotic Things

In addition to the hybrid architecture, the adoption of the Internet of Things (IoT) technology in robotic systems appears to be another technological change. This technology is worth mentioning briefly because it makes HIR vulnerable to cyberattacks.32 In this way, it poses risks to the right to data protection of HIR users (discussed in chapter 3). IoT refers to a comprehensive network in which all sorts of physical objects, such as home appliances, are connected.33 By collecting data and interacting with each other via Wi-Fi or Bluetooth, IoT devices (known as smart devices) function without human intervention. They are automatically able to adjust to their users’ routines and preferences.34 Similarly, HIR can become part of the IoT network. ABI Research defined this phenomenon as the Internet of Robotic Things (IoRT).35 By connecting HIR to a cloud, they gain access to useful information for accomplishing several tasks,36 or they can perform computations in this cloud. This

31 Image retrieved from Murphy 2000, p. 7.

32 E. Catania. and A. La Corte. ‘IoT Privacy in 5G Networks’ (IoTBDS 2018) <https://doi.org/10.5220/0006710501230131>,

p. 124.

33 Keyur P. et al., ‘Internet of Things-IOT: Definition, Characteristics, Architecture, Enabling Technologies, Application &

Future Challenges’ (2016) 6 International Journal of Engineering Science and Computing 1 <http://ijesc.org/>, p. 6122.

34 <https://www.researchgate.net/publication/320532203_Internet_of_Things_IoT_Definitions_Challenges_and_Recent_Research_Directions>.

35 ABI Research, ‘IoRT’ (2014) <www.abiresearch.com/market-research/product/1019712-the-internet-of-robotic-things/>

accessed on 20 September 2019.

36 C. Turcu et al., ‘Integrating Robots into the Internet of Things’ (2012) 6 International Journal of Circuits, Systems and

Signal Processing 430 < https://pdfs.semanticscholar.org/da63/afa2ad27f31cc9646b64792770fd87ac362b.pdf

>.

(12)

connection also enables the robots to improve their movements and behaviors, such as sensing, communicating, tracking and monitoring.37
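The data flow described in this section can be illustrated with a minimal example. The Python fragment below shows, in purely hypothetical form, how a robot in an IoRT setting might bundle local sensor readings into a message destined for a cloud service over the home Wi-Fi network; the endpoint URL and all field names are invented for illustration.

```python
import json

# Hypothetical sketch of an IoRT upload: the robot bundles local sensor
# readings into a JSON message for a cloud service. The endpoint and all
# field names are invented for illustration only.
CLOUD_ENDPOINT = "https://cloud.example.com/robot/upload"  # hypothetical

def build_upload(robot_id: str, readings: dict) -> str:
    """Serialize sensor readings into the JSON message the robot
    would send to the cloud endpoint over Wi-Fi."""
    message = {
        "robot_id": robot_id,
        "readings": readings,  # e.g. temperature, noise level, position
    }
    return json.dumps(message)

payload = build_upload("buddy-01", {"temperature": 21.5, "noise_db": 40})
print(payload)
```

It is precisely this kind of continuous, machine-readable stream of home-sensor data that makes the connection both useful to the robot and relevant from a data protection perspective.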

2.4 Types of HIR

2.4.1 Social robots

Social robots are robots that can personally communicate and interact with people in a sociable, human-like way. Relating to and understanding their users are also characteristics of these types of robots. Vice versa, the user will also understand the machine and empathize with it;38

therefore, these robots may also serve as companions.39 Another aim of building social robots

is that they provide useful insights for understanding social intelligence, human sociality and social behavior disorders, such as autism. According to Carol Povey, the director of the National Autistic Society’s Centre for Autism, social robots are an effective tool for developing the social skills of autistic people, who are drawn to the robots because of their predictability.

Consequently, social robots even have the potential to influence their lives positively. Social robots tailor-made for autistic children are currently still in the prototype phase, for example robot Kaspar,40 or solely available for schools (robot Milo),41 and thus are not yet available for

parents to use at home.

However, other social robots, which are not solely aimed at autistic children, can also have a positive impact on them. An example of such a robot that is already available on the market is Buddy. Buddy is also suitable for the elderly and for whole families, both children and their parents. According to its developer, Blue Frog Robotics (BFR), the robot is an emotional companion and personal assistant that wins the heart of every family member.42 Due to its AI, voice and facial recognition software, Buddy is able to recognize and interact with each member. Its ability to behave and move around homes autonomously is facilitated by the hybrid architecture. However, users also have the possibility to control its movements remotely via its mobile application when they are not at home. Furthermore, Buddy can express certain

37 R. Batth et al., ‘Internet of Robotic Things: Driving Intelligent Robotics of Future - Concept, Architecture, Applications

and Technologies’ [2019] Proceedings - 4th International Conference on Computing Sciences, ICCS 2018 151, p. 153.

38 C.L. Breazeal, Designing Sociable Robots, (MIT 2002), p.1.

39 J. Broekens et al., ‘Assistive Social Robots in Elderly Care: A Review’ (2009) 8(2) Gerontechnology, p. 95.

40 Wood L.et al., ‘Developing Kaspar: A Humanoid Robot for Children with Autism’ [2019] International Journal of Social

Robotics <https://doi.org/10.1007/s12369-019-00563-6>.

41 For more information, see: <https://robots4autism.com/milo/>.

42 BFR,‘Buddy The Emotional Robot’ (2020) <https://buddytherobot.com/en/buddy-the-emotional-robot/>

(13)

emotions and perform multiple tasks, such as playing games with children, reminding users to accomplish their to-do lists, making video calls, securing homes, playing music, and taking pictures and videos, to name a few. Such a wide range of services is possible because Android apps can run on Buddy and the robot is part of an IoRT network. Via Wi-Fi or Bluetooth, it can retrieve data from its user’s other smart devices.43

2.4.2 Healthcare robots

Healthcare robots are robots that aim to promote or monitor health.44 Over the years, several

studies on the use of these robots in various healthcare settings, such as hospitals or nursing homes, have shown that the robots are beneficial for a wide range of healthcare-related activities. For instance, they can foster the functioning of vulnerable individuals, such as the elderly,45 by providing personal care, physical rehabilitation or therapeutic activities. In

order to adapt to the needs of individuals or perform coordinated actions, care robots process data acquired through sensor technology.46

In 2017, a research paper on the use of a robot assistant in supporting the treatment of diabetes was published. It stated that a robot that is part of an IoT-based eHealth platform would be successful in providing a health management service. When such a robot is connected with medical sensors through Bluetooth, it can collect relevant measurements of its user, monitor health data and use them for dialogues that support a healthy lifestyle.47
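To give a concrete impression of the kind of logic such an IoT-based coaching robot might run, a minimal sketch follows. The thresholds, function name and messages are invented for illustration and are not taken from the cited study.

```python
# Hypothetical sketch: a robot in an IoT-based eHealth platform checks a
# glucose reading received over Bluetooth and picks a supportive message.
# All thresholds, names and messages are invented for illustration.

def coach_message(glucose_mmol_l: float) -> str:
    """Map a (hypothetical) blood glucose reading to a dialogue prompt."""
    if glucose_mmol_l < 4.0:
        return "Your glucose looks low - consider a small snack."
    if glucose_mmol_l > 10.0:
        return "Your glucose is high - shall we review today's meals?"
    return "Your glucose is in range - well done!"

print(coach_message(5.6))
```

Even a rule this simple already processes health data within the meaning of the GDPR, which is why the design of such dialogue logic matters for the analysis in chapter 4.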

In 2018, a startup called Catalia Health (CH) made such an intelligent robotic coach available for patients.48 Robot Mabu aims to assist patients in managing their chronic diseases at home, particularly heart disease, arthritis or diabetes. By supporting its users to consistently follow the treatment plans recommended by their doctors, the health condition of patients can be improved. In order to use Mabu, the user must connect it to its own cellular network or its

43 Ibid; Indiegogo, ‘Buddy’ (2020) <https://www.indiegogo.com/projects/buddy-your-family-s-companion-robot#/> accessed

1 December 2019.

44 H. Robinson et al., ‘The Role of Healthcare Robots for Older People at Home: A Review’ (2014) 6 International Journal of

Social Robotics 575, p. 576.

45 Feil-seifer D. and Mataric MJ, ‘Human-Robot Interaction Major HRI Influences in Popular Culture’ [2009] Encyclopedia

of Complexity and Systems Science 4643, p.6.

46 R. Leenes et al., ‘“Nothing Comes between My Robot and Me”: Privacy and Human-Robot Interaction in Robotised

Healthcare’ (2019) Data Protection and Privacy, p. 97-99.

47 M. Al-Taee et al., ‘Robot Assistant in Management of Diabetes in Children Based on the Internet of Things’ [2017] 4(2)

IEEE Internet of Things Journal <https://doi.org/10.1109/JIOT.2016.2623767>.

48 S.C. Stuart, ‘Mabu’ (Pc Magazine 2019) <

(14)

user’s Wi-Fi network. This healthcare robot has a built-in camera and microphone to communicate with its user on a daily basis. It asks about concerns or needs and also

emphasizes the importance of medication by providing reminders through voice, SMS or its touchscreen.49 At the end of the month, Mabu provides a summary of the treatment. Having Mabu, therefore, reduces the need for chronic disease patients to visit a doctor solely to exchange basic information. Due to the implementation of machine learning technology, Mabu is capable of tailoring its coaching style to its user’s personality; for instance, through personality-based language.50

2.4.3 Sex robots

Recently, in July 2019, academics and professionals were invited to explore sex robot-related topics during the 4th International Congress on Love and Sex with Robots in Brussels.51 Indeed, there appeared to be both a growing demand for sex robots and a number of development initiatives over the last several years.52 Humans can use these robots for sexual stimulation. Their behavior and appearance, including features relating to human sexuality, are therefore strongly human-like.53

However, the growing industry of sex robots is a controversial topic. On the one hand, experts regard it as a favorable development that will mainly benefit the elderly, disabled people or others that have difficulties with satisfying their sexual needs.54 On the other hand, various

critics have expressed their concerns. One example is Richardson, who launched the “Campaign Against Sex Robots” in 2015. In her position paper, she presented various arguments, inter alia, that the robots will reduce human empathy, objectify women and increase inequality.55

49 P.R Cohen, R. Tumuluri, ‘Commercialization of Multimodal Systems’, The Handbook of Multimodal-Multisensor

Interfaces (3rd volume, ACM & CP 2019), p. 633.

50 CH, ‘Mabu’ (2020) <http://mymabu.com/about/> accessed on 3 October 2019; W. Thibodeaux, ‘inventions’ (Inc.,

2017)<https://www.inc.com/wanda-thibodeaux/117-million-people-could-end-up-using-this-adorabl.html> accessed on 3 October 2019.

51 See https://philevents.org/event/show/71218.

52 J. Illes, F.R. Udwadia, ‘Sex robots increase the potential for gender-based violence’, (the conversation, 2019)

<https://theconversation.com/sex-robots-increase-the-potential-for-gender-based-violence-122361> accessed on 3 October 2019.

53 Johnson D.G. and Verdicchio M., ‘Constructing the Meaning of Humanoid Sex Robots’ [2019] International Journal of

Social Robotics <https://doi.org/10.1007/s12369-019-00586-z>.

54 E. Di Nucci, ‘Sexual Rights and Disability’ (2011). Journal of Medical Ethics, p.158

< https://ssrn.com/abstract=1931595>.

55 K. Richardson, ‘The asymmetrical 'relationship': parallels between prostitution and the development of sex robots’ (2015)

(15)

Robot Harmony, an AI female, is an example of a sex robot. In 2017, it was developed by realistic sex doll company Abyss Creations (AC).56 According to the company, “she” is the

perfect companion.57 The robot consists of a robotic head, including facial expressions and

basic movements, and a body (including the vagina) made from silicone doll material. Although the robot is not yet able to walk or recognize faces (because it does not have a built-in camera), it can already respond to touch, since it is equipped with sensors. With a mobile app that is connected to Harmony's robotic system, users can customize the robot.58 They can choose external features, such as the body type, personality traits and emotions that they find attractive. Harmony is also capable of having personal conversations (via its microphones).59

2.5 Conclusion

This chapter aimed to provide an understanding of the concept of HIR. Examples are

provided to illustrate different types of HIR: Buddy for social robots, Mabu for health robots and Harmony for sex robots. Regardless of their functions, these robots have the same

architectural structure, namely the hybrid architecture. They are also able to connect to Wi-Fi and, increasingly, to other (smart) devices, to enhance their services. As a result, there can be certain risks for their users. The next chapter identifies these risks.

3. Data protection concerns: the phenomenon of risks

3.1 Introduction

This chapter provides an answer to the following questions:

1. What does the right to data protection entail and what types of personal data do HIR process?

2. What are the risks for data subjects when HIR process their personal data?

56Harmony is sold by its subsidiary company Realbotix.

57 Realbotix, 'Harmony' (2019) <https://www.realdoll.com/product/harmony-x/> accessed on 3 October 2019.
58 C. Bartneck, M. McMullen, 'Interacting with Anatomically Complete Robots' HRI '18: Companion of the 2018

ACM/IEEE International Conference on Human-Robot Interaction (ACM 2018), pp. 2-3 <https://doi.org/10.1145/3173386.3173387>.

59 N. Sharkey et al., 'Our Sexual Future With Robots' (Foundation for Responsible Robotics 2017), p. 4

<http://responsiblerobotics.org/2017/07/05/frr-report-our-sexual-future-with-robots/> accessed 3 October 2019; Realbotix, ‘Harmony’ (2019), <https://www.realdoll.com/product/harmony-x/> accessed on 3 October 2019.


In order to answer the first question, this study analyses the privacy policies of the HIR and case law. The GDPR and documents of the former Article 29 Working Party (WP29)60 are

examined to answer the second question.

3.2 The right to data protection

The right to the protection of personal data is guaranteed by article 8 of the CFR. The article requires the processing of personal data to be fair, for specified purposes. The processing can be based on consent or another legitimate basis that is enacted by law. It also states two rights of data subjects, namely their right to access their personal data and to rectify it, and that an independent authority should control compliance with the data protection rules.61 This data

protection right is closely related to the right to respect for private life (the right to privacy), enshrined in article 8 of the ECHR and article 7 of the CFR. By guaranteeing a private sphere for the free development of individual personalities and personal opinions, both human rights aim to protect corresponding values, such as human dignity and autonomy. The rights are also distinct from each other, notably in formulation and scope. The scope of the right to privacy is limited to situations involving an individual’s private life. In contrast, the scope of the right to data protection is broad. It includes any activity concerning the processing of personal data, regardless of whether there is any interference with the right to privacy.62 However, this does

not mean that the protection of personal data is an absolute right. Article 52 CFR allows limitations on the rights and freedoms that are protected by the Charter if they meet certain conditions: they should respect the essence of the data protection right, comply with the law, have a legitimate aim, and be necessary and proportionate to achieve this aim.63 In principle,

to protect the users’ data protection right, HIR developers should be aware that their robots process personal data automatically. Since certain types of personal data have a stricter protection regime, they should also consider which different types of personal data HIR process.

3.3 Types of personal data HIR process

60 Which has been replaced by the European Data Protection Board (EDPB).
61 Article 8 CFR.

62 FRA, p. 20.
63 Article 52 CFR.


3.3.1 The definition of personal data

The concept of personal data includes any information that is related to an identified or (directly or indirectly through identifiers) identifiable natural person.64 Its scope includes a

wide range of data. The CJEU, for instance, ruled in the case Digital Rights v Ireland that location data and traffic data of electronic communication networks, in other words metadata,65 are also personal data because a detailed image of one’s private life can be

derived from such data.66 Similarly, it considered IP addresses to be personal data in the case

Breyer v. Deutschland.67 Internet connectivity enables robots like Buddy, Mabu and Harmony

to process IP-addresses and metadata.

Personal data can be divided into non-sensitive and sensitive data. The latter are qualified as special categories of personal data.68 In contrast to non-sensitive data, sensitive data enjoy a

higher level of protection because the misuse of such data could have dire consequences on fundamental rights and freedoms of individuals.69 The right to privacy, freedom of speech,

freedom of religion, and protection from discrimination are examples of other fundamental rights that can be affected.70 Article 9 of the GDPR prohibits the processing of special

categories of personal data, unless one of the mentioned exceptions is applicable. It also states the types of data that are qualified as special categories, which include information

concerning sexual life, sexual orientation, biometric data and health. Health data include information on diseases or disabilities (such as autism), any clinical treatment, and physical or psychological state.71 Biometric data include information on physical, physiological or

behavioral characteristics, resulting from a specific technical process.72 The robots Buddy,

Mabu and Harmony all process information relating to identified natural persons, namely their user and possibly persons that are closely related to their user. These types of HIR all process sensitive data: Mabu asks for health data; both Mabu and Buddy use facial and voice recognition technologies to collect data relating to their users' physiological characteristics; Buddy can send medication reminders; and Harmony processes data concerning its users' sexual preferences. The websites of each robotic development company also mention other types of

64 Article 4(1) GDPR.
65 FRA, p. 327.

66 Joined cases C-293/12 and C-594/12, Digital Rights v. Ireland (CJEU, 8 April 2014), par. 32-36.

67 Case C-582/14, Patrick Breyer v Bundesrepublik Deutschland (CJEU, 19 October 2016), par. 44-49.
68 Recital 51 GDPR.

69 Article 29 Working Party (WP29) ‘Advice paper on special categories of data (“sensitive data”)’ (2011), p.4.

70 WP29, ‘Guidelines on Data Protection Impact Assessment (DPIA) and determining whether processing is “likely to result

in a high risk" for the purposes of Regulation 2016/679' (WP 248, 2017), p. 6.

71 Recital 35 GDPR.


data the robots collect and store. Additionally, the developer of Mabu mentions various types of data it processes in its privacy notice.73 These are compiled below, and categorised into

non-sensitive and sensitive data:

73 further discussed in section 4.3.

74 BFR, 'Robot' (2020) <http://www.bluefrogrobotics.com/robot/> accessed 1 December 2019.
75 Ibid.

76 CH, 'Privacy Policy' (2019) <http://www.cataliahealth.com/privacy/>; CH, 'How it Works' (2019) <http://www.cataliahealth.com/how-it-works/> accessed 17 December 2019.

Robot Buddy
Non-sensitive data: names, gender, age, address, telephone numbers, location, (smart) device and network data (including IP address), photographic images, information on its user's activities, interests, messages (content and metadata), music preferences, and all other self-provided data.74
Sensitive data: biometric data from facial and voice recognition technologies; health data, namely information on medication, information on the user's psychological state (emotions), possibly whether a child has autism, and all other self-provided health information.75

Robot Mabu
Non-sensitive data: names (first and last), e-mail address, gender, birth year, location (including zip code), interests, personality, and all other self-provided data enabling personalized conversations. (According to the privacy notice, these data are part of a non-exhaustive list.)76
Sensitive data: biometric data from facial and voice recognition technologies; health data, namely disease state (symptoms) and progression, clinical treatment (such as information on medication, including type, usage, adherence, side effects, and nutrition), psychological state (including stress and anxiety levels), physical state (such as weight and exercise), and all other self-provided health data. (According to the privacy notice, these data are part of a non-exhaustive list.)77

Robot Harmony
Non-sensitive data: name, address, financial data, demographic data (such as age and gender), device and network data, interests, and all other data provided by the user or by third parties, such as social networks (when the user provides consent).78
Sensitive data: information concerning sexual life and orientation, including what features users prefer in a partner, such as a body type and personality traits.79

Table 1: Types of personal data HIR process

77 Ibid.

78 General information the company states to collect. However, it is not clear what information its robot processes (see section 4.4). Realbotix, 'Privacy Policy' (2018) <https://www.realdollx.ai/Policy/PrivacyPolicy> accessed 17 December 2019.

79 Realbotix, 'Harmony' (2019) <https://www.realdoll.com/product/harmony-x/>.

3.4 Risk assessment

3.4.1 Defining risks

There is a relationship between data processing and risks for data subjects. Risks refer to undesirable occurrences that derive from a wide range of sources, such as external attacks on software systems. The term includes two components: the likelihood that a risk occurs and the severity of its effects. The internet-connected smart doll Cayla is a good example of such a risk materializing. Although speech-recognition technology enabled the doll to collect a significant amount of personal data of children and their parents, it was not fully secured. As a result, hackers could easily access the doll via Bluetooth and listen to conversations or speak directly to children. Therefore, German regulators classified the doll as an illegal surveillance device and banned its distribution in 2017.80 It can be argued that social, care and sex robots are also vulnerable to cybersecurity attacks, given that, much like Cayla, these HIR can connect to Wi-Fi and Bluetooth.

Considering the two components of risks, namely their likelihood and severity, the level of risks can range from low to high.81 The higher the severity of a risk, the more complex the

assessment and measures that should be undertaken to mitigate this risk.82 The next section

explores whether data processing of HIR has the potential to result in high risks.
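The interaction between the two components can be sketched in code. The following is a minimal, illustrative risk matrix; the three-step scales and the numeric thresholds are assumptions made for illustration, since neither the GDPR nor the WP29 prescribes concrete values.

```python
# Illustrative risk matrix combining the two components of a risk,
# likelihood and severity, into an overall level. The 1-3 scales and
# the thresholds below are assumptions for illustration only; they are
# not prescribed by the GDPR or the WP29 guidelines.

def risk_level(likelihood: int, severity: int) -> str:
    """Map likelihood and severity (each 1 = low .. 3 = high) to a level."""
    if not (1 <= likelihood <= 3 and 1 <= severity <= 3):
        raise ValueError("components must be on a 1-3 scale")
    score = likelihood * severity          # ranges from 1 to 9
    if score >= 6:                         # e.g. likely and severe
        return "high"
    if score >= 3:                         # e.g. rare but severe
        return "medium"
    return "low"

print(risk_level(1, 3))   # a rare but severe occurrence: medium
print(risk_level(3, 3))   # a likely and severe occurrence: high
```

On this toy scale, a measure that lowers either component, for instance encryption reducing the severity of a data breach, directly lowers the resulting level.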

3.4.2 Assessment criteria

In order to assist data controllers to assess the phenomenon of high risks, the WP29 issued guidelines in its working paper on Data Protection Impact Assessment (DPIA). It states that the following non-exhaustive list of criteria could be considered.83

Assess whether the processing:

1. Can potentially result in harming the data subject.84

Harm includes physical or material damage, such as financial loss or identity fraud, and non-material damage, such as discrimination or reputational damage.85

2. Impedes data subjects to exercise their rights or use a service.

This is the case when the processing aims to allow or refuse access to a service.

3. Is regarded as systematic monitoring of natural persons in a publicly accessible area,

which involves observing, monitoring and controlling individuals.86

80 P. Otermann, ‘Cayla doll’ (Guardian 2017)

<https://www.theguardian.com/world/2017/feb/17/german-parents-told-to-destroy-my-friend-cayla-doll-spy-on-children> accessed on 3 October 2019.

81 WP 248, p.10; A. Mantelero, ‘Comment to Article 35 and 36’, GDPR Commentary (Edward Elgar Publishing 2019,

Forthcoming), par B.I.1.

82 Ibid par A.

83 WP248; recital 75 GDPR; FRA, p.181.

84 This criterion is not explicitly mentioned by the WP29, but by Mantelero, par A, and implicitly in recital 75.

85 Ibid.


4. Involves sensitive data.

5. Concerns data of data subjects that are considered to be vulnerable persons, such as

patients, the elderly and children.87

6. Is meant for profiling.

Profiling concerns any form of automated processing, which entails using personal data for evaluating, analysing or predicting personal aspects, such as health, preferences, interests and behaviours.88

7. Happens on a large scale.

The following factors are relevant for this assessment: the number of data subjects, whether the processing activity concerns a variety of data, its geographical extent and duration.

8. Is based on new technologies.

AI, biometrics or facial recognition are technologies that are considered invasive.89

9. Makes automated decisions with legal or similar effects.

For example, discrimination against individuals.

10. Matches or combines datasets, obtained from multiple sources.

3.4.3 Criteria applied to the HIR

These criteria issued by the WP29 may be applied to Buddy, Mabu and Harmony, in order to assess their potential impact on data privacy rights.

87 WP248, p. 9.
88 Article 4(4) GDPR.

89 ICO, ‘risks,’<


I. Buddy

Some of the criteria mentioned above may apply to Buddy:

- Sensitive data

As mentioned in the previous section, Buddy processes biometric data and health data. Its function of taking up the role of companion or personal assistant enables Buddy to process such data.

- Vulnerable people

In society, children and the elderly are considered to be vulnerable. Since Buddy is suitable for families with (disabled) children and the elderly, it can process data of vulnerable persons.

- Potential damage

Given that Buddy processes sensitive data and the users can be vulnerable, disclosure of their data (by unauthorized persons) might lead to non-material damage, such as

psychological suffering or reputational damage.90

- Profiling

In order to recognize and personally interact with various persons, Buddy is able to automatically store, analyze and predict personal aspects of its individual users, such as their music preferences and gaming interests.

- Large scale & combining datasets

Due to its IoT technology, Buddy can retrieve personal data from its users’ other smart devices and include these data in its own dataset. Besides its connectivity, its ability to process data of each family member allows the robot to process both a considerable amount and a wide variety of personal data for a long period of time.

II. Mabu

The following criteria of the above-mentioned table may apply to Mabu:

90 Recital 75 GDPR; L. Stevens et al., ‘Dangers from Within?’ Data Protection and Privacy: (In) visibilities and


- Sensitive data

Mabu processes biometric data and a wide range of health data. Its function, taking up the role of a health coach, enables Mabu to process such data.

- Vulnerable people

Mabu assists patients with chronic diseases, who are considered to be vulnerable.

- Potential damage

Since Mabu processes sensitive data of vulnerable persons, exposure of their data (by unauthorized persons) might lead to non-material damage, such as psychological

suffering, reputational damage or discrimination.

- Profiling

Its AI technology enables Mabu to automatically evaluate and predict the health and behaviour of its users, for example, their progress with a treatment and factors that influence this, such as medication adherence. This is what enables Mabu to respond in an appropriate way to foster positive healthcare outcomes.

- Large scale & combining datasets

Mabu is able to process both a considerable amount and a wide variety of its users’

personal data, for a long period of time. Furthermore, it can retrieve medical data from the user’s healthcare provider.91

- Impeding data subjects to use a service

Since Mabu is only available through healthcare providers, the robot will most likely be reimbursed by health insurers.92 However, it may be that patients who do not comply with

their treatment will be disadvantaged by their health insurers. For instance, they may be excluded from certain services, such as using the robot, or charged higher premium rates.

III. Harmony

The following criteria of the above-mentioned table may apply to Harmony:

91 CH (2020) <http://www.cataliahealth.com/helpfaq/>.


- Sensitive data

Harmony processes information concerning sexual life and orientation.

- Vulnerable people

Harmony can fulfil the needs of disabled persons or the elderly.

- Potential damage

In society, using a sex robot is considered unusual rather than common. The

exposure of such information and sensitive sexual orientation data can, therefore, lead to non-material damage, such as psychological suffering, reputational damage or

discrimination.

- Profiling

Harmony is connected with an application, in which a user profile is made. The profile contains, among other things, information on what traits the user finds attractive so that the robot can automatically behave in a certain way. In order to respond appropriately, it can predict what the needs of its user are.

- Large scale

Harmony is able to process both a considerable amount and a wide variety of its users’ personal data, for a long period of time.

3.4.4 Conclusion

The findings of the risk assessments show that Mabu meets seven of the high-risk assessment criteria, and Buddy and Harmony each meet five. In its DPIA paper, the WP29 also provides examples of data processing cases that are likely to pose high risks. Cases that met at least three of the above-mentioned criteria (or a combination of two particularly infringing criteria, such as systematic monitoring of vulnerable people) were regarded as high-risk processing activities.93 This chapter, therefore, draws the conclusion that the processing of personal data

by robots Buddy, Mabu and Harmony is likely to result in high risks to the rights and

freedoms of their users. In order to reduce these risks, HIR manufacturers should implement


the DPbD obligation by taking adequate safety measures.94 The next chapter further elaborates the concept of DPbD.
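The screening logic used in this conclusion can be sketched as a simple check: count how many WP29 criteria a processing operation meets and flag it when the threshold is reached. The criteria names and the threshold of three follow this chapter's reading of WP248; they are illustrative only, not an official scoring method.

```python
# Sketch of the high-risk screening applied in this chapter: count the
# WP248 criteria a processing operation meets and flag it as likely
# high-risk at the threshold used in this thesis (three criteria).
# This illustrates the reasoning; it is not an official WP29 method.

CRITERIA = {
    "potential harm", "impedes rights or services", "systematic monitoring",
    "sensitive data", "vulnerable data subjects", "profiling",
    "large scale", "new technologies", "automated decisions",
    "matching or combining datasets",
}

def likely_high_risk(met, threshold=3):
    """Return True when the set of met criteria reaches the threshold."""
    unknown = set(met) - CRITERIA
    if unknown:
        raise ValueError(f"unknown criteria: {unknown}")
    return len(set(met)) >= threshold

# Mabu meets seven criteria according to this chapter's assessment:
mabu = {"sensitive data", "vulnerable data subjects", "potential harm",
        "profiling", "large scale", "matching or combining datasets",
        "impedes rights or services"}
print(likely_high_risk(mabu))   # True
```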

4. Legal Analysis: Practical guidance towards DPbD-compliant HIR

4.1 Introduction

Before the GDPR entered into force, embedding privacy or data protection safeguards into the core of information systems, known as Privacy by Design (PbD),95 was already considered to

be crucial for mitigating risks to data subjects. With the emergence of Privacy Enhancing Technologies (PETs) in the 1990s – technological solutions that aimed to mitigate privacy and data protection risks of individuals – the concept of “privacy design” also emerged, as

Cavoukian called it. According to Cavoukian, privacy design consists of seven Foundational Principles (see section 4.3), which stress that privacy or data protection should not be a remedial afterthought, but should be taken into account both from the start and throughout the full system development lifecycle.96 Complying with these principles requires, among other things, a proactive attitude: considering relevant privacy requirements in a system's architectural design and building them into systems by default, so that privacy is automatically protected.97 In 2010, the 32nd International Conference of Data Protection and

Privacy Commissioners adopted a resolution stating that PbD is an essential part of protecting privacy. Hence, they encouraged organizations to adopt the Foundational Principles.98 The

policy documents of European bodies, such as the European Data Protection Supervisor (EDPS), are in line with the resolution and also refer to the seven Principles.99

In article 25 of the GDPR, the term "Privacy by Design" is replaced by the term "Data Protection by Design" (DPbD). Although both terms have the same meaning, DPbD is a fully enforceable legal obligation, whereas PbD is not. This chapter further clarifies what this obligation entails. It also provides more detailed guidance on how HIR manufacturers

94Council of Europe, ‘Guidelines on the protection of individuals with regard to the processing of personal data in a world of Big Data’ (2017), p.3.

95 This concept was enshrined in the former Data Protection Directive (95/46/EC), which has been replaced by the EU General Data

Protection Regulation.

96 P. Tsormpatzoudi et al.,‘Privacy by Design: From research and policy to practice - the challenge of multi-disciplinarity’

Privacy Technologies and Policy (3rd edn Springer, 2015), p. 202.

97 European Data Protection Supervisor (EDPS), ‘Opinion 5/2018 Preliminary Opinion on privacy by design’ (2018), p. 4. 98 L.A. Bygrave, ‘Data Protection by Design and by Default: Deciphering the EU’s Legislative Requirements’ 4(2) Oslo Law

Review, p. 107.


should implement this obligation in practice and what current limitations might be.

Consequently, it answers the sub-questions: "What does the DPbD obligation entail, and how can developers be further guided in implementing this obligation?"

4.2 The concept of data protection by design

4.2.1 Article 25 of the GDPR

The first paragraph of article 25 implies that the legal obligation of DPbD entails that systems should be designed in such a way that they automatically respect data protection principles, such as data minimization, as far as possible. Therefore, it obliges controllers - persons or companies determining the purpose and means of the processing100 - to take appropriate and

effective technical and organizational measures, both before the processing (at the time of determining the processing means) and during the processing itself, while considering the following factors:

- the state of the art, which concerns which technological progress is currently available on the market;

- implementation costs, which refer to all resources the controller has in general, such as time or human resources;

- the nature, in other words the inherent characteristics, of the data processing;
- the scope, which examines the size and range of the processing activity;

- the context, which concerns the question under which circumstances the processing takes place;

- the purposes of the processing or its aims; and

- the risks, particularly taking into account their likelihood and severity.101

Controllers must integrate the necessary safeguards into the data processing mechanism to ensure that the requirements of the Regulation are met and that data subjects' rights are sufficiently protected.102

Additionally, the third paragraph of article 25 states that compliance with DPbD can be demonstrated with an approved certification mechanism.103 Although the certification

100 Article 4(7) GDPR. BFR (social purposes), CH (health purposes) and AC (sexual pleasure purposes) are responsible for

determining the means and purposes of the HIR.

101 European Data Protection Board (EDPB), ‘Guidelines 4/2019 on Article 25 Data Protection by Design and by Default’

(2019), p. 8-9.

102Article 25(1)(3) GDPR


mechanism is still in development and there are no accredited bodies that can provide certificates yet, it would be beneficial for HIR manufacturers to know how they can obtain a compliance certificate and avoid fines imposed by data protection authorities (DPAs). When considering article 25, it is also important to consider other provisions of the GDPR: article 5, in which the data protection principles are enshrined, and articles 12 to 22 and recital 4, which set out the rights and fundamental freedoms of data subjects (further elaborated in section 4.3).

The ECtHR and CJEU have not yet explicitly ruled on article 25 or referred to it. However, both Courts have indirectly nurtured the obligation. In 2008, the ECtHR already found that a lack of technical-organizational measures to secure the confidentiality of a patient's health data in a public hospital was not acceptable. In the case I v Finland, the hospital's health records system did not include a logging mechanism that could register every log-in and trace whether unauthorized third persons had consulted the records. Therefore, the Court ruled that Finland's positive obligation to ensure the right to respect for private life (article 8 of the ECHR) was violated.104 Thereafter, the CJEU decided in the Google v Spain case that

Google should make its search engine operations more privacy friendly by reconfiguring systemic aspects.105

Neither article 25 nor the case law prescribes specific technical and organizational measures, such as business strategies. One reason the current wording of article 25 is broad and technology-neutral is that very specific DPbD requirements could stifle innovation.106

Recently, on 30 October 2019, the Berlin DPA did, however, point to a more concrete measure. It imposed a €14.5 million fine on a German real estate company that retained personal data substantially longer than necessary. The company could not demonstrate appropriate measures, for instance the possibility to separate and delete data with different retention periods in its archiving system. Considering this, the DPA found that the company had violated article 25(1) of the GDPR.107

104 I v Finland (2008) App No. 20511/03 (ECtHR, 17 July 2008).
105 Case C-131/12, Google Spain v Spain (CJEU, 13 May 2014).

106 Koops B.J. and Leenes R., 'Privacy regulation cannot be hardcoded. A critical comment on the ‘privacy by design’

provision in data-protection law' (2014) 28(2) International Review of Law, Computers & Technology p. 162.<DOI:10.1080/13600869.2013.801589>.

107 EDPS, ‘Berlin Commissioner for DP Imposes Fine on Real Estate Company’ (2019),

< https://edpb.europa.eu/news/national-news/2019/berlin-commissioner-data-protection-imposes-fine-real-estate-company_nl> accessed 20 December 2019.
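The archiving shortcoming identified by the Berlin DPA can be illustrated with a small sketch: each record carries its own retention period, so that expired records can be separated and deleted without touching the rest. The field names and retention periods are hypothetical and only serve to show the design idea.

```python
# Sketch of a retention-aware archive of the kind the Berlin DPA found
# missing: every record carries its own retention period, so expired
# records can be separated and deleted while the rest are kept.
# Field names and retention periods are hypothetical.

from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Record:
    subject_id: str
    collected_on: date
    retention_days: int   # may differ per data category or legal basis

def purge_expired(archive, today):
    """Return only the records whose retention period has not lapsed."""
    return [r for r in archive
            if r.collected_on + timedelta(days=r.retention_days) > today]

archive = [
    Record("A", date(2015, 1, 1), retention_days=365),    # long expired
    Record("B", date(2019, 6, 1), retention_days=3650),   # still needed
]
remaining = purge_expired(archive, date(2019, 12, 1))
print([r.subject_id for r in remaining])   # ['B']
```

Tagging each record with its own period is what makes selective deletion possible; a single retention setting for the whole archive, as in the fined company's system, cannot satisfy storage limitation.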


4.3 Practical guidelines for both existing and future HIR manufacturers

According to the DPbD guidelines of the European Data Protection Board (EDPB), HIR are likely to be DPbD-compliant as long as their developers can demonstrate that the measures chosen for the processing are appropriate to safeguard data protection and that they have not implemented solely generic measures.108 Hence, this section provides practical guidelines that are suited to the operation processes of HIR described in chapters 2 and 3. These guidelines are developed by reviewing to what extent and how the privacy engineering method of Hoepman and others can be applied by HIR developers.109 Privacy engineering is a research field

consisting of a set of methodologies, tools and techniques to make systems privacy-friendly or DPbD-compliant.110 Besides Hoepman’s guidelines, there are a number of non-sector specific

guidelines that all aim to contribute to privacy engineering. However, this paper selected his Privacy Design Strategies because they focus on translating the GDPR's data protection principles into practical guidelines. The EDPB's DPbD guidelines confirm that promoting the implementation of all data protection principles is desirable.111 Additionally, the

guidelines are widely recognized in the literature; European agencies, such as the European Union Agency for Network and Information Security (ENISA), have referred to them.112 Since DPbD is a relatively new term, the older term PbD is more dominant in existing literature and guidelines and is, therefore, used more often in the next sections.

4.3.1 Existing non-sector specific guidelines

In order to understand current guidelines for privacy engineering, it is essential to gain insight into the earlier mentioned seven Foundational Principles, on which the principle of PbD is based. Although no universal definition explaining how to interpret these principles exists, several institutions and academics have endeavored to translate these into more concrete guidelines. Recently, in October 2019, the Spanish DPA (AEPD), for example, issued guidelines.

108 EDPB, p.7.

109 M. Colesky, J. Hoepman et al., 'A Critical Analysis of Privacy Design Strategies' [2016] 2016 IEEE SPW; AEPD; J. Hoepman, 'Privacy Design Strategies (The Little Blue Book)' (2018)

<https://www.cs.ru.nl/~jhh/publications/pds-booklet.pdf.

110 Y.S. Martin and J.M. Del Alamo, ‘A Metamodel for Privacy Engineering Methods’, CEUR Workshop Proceedings

(2017), pp. 41-42.

111 Lawfulness, fairness, transparency, purpose limitation, data minimization, accuracy, storage limitation, integrity and

confidentiality.


[Table: the seven Foundational Principles of Privacy by Design]113

113 Table made in Word. All information is paraphrased and retrieved from: A. Cavoukian ‘PbD’ (2011)

<https://www.ipc.on.ca/wp-content/uploads/Resources/7foundationalprinciples.pdf> accessed on 20 December 2019; PD, ‘A guide to PbD’ (2019) <https://www.aepd.es/sites/default/files/2019-12/guia-privacidad-desde-diseno_en.pdf> accessed on 20 December 2019.


Privacy Engineering Approach

Privacy Design Strategies, Privacy Design Patterns and Privacy Enhancing Technologies (PETs) enable engineers to incorporate these foundational principles (and the stricter GDPR requirements) in all six stages of the development lifecycle of a product, system or process (see figure 2). Design strategies can be used in the "concept development" and "analysis" stages of the development lifecycle. They clarify the approach to achieve the particular PbD goal. In the design phase, patterns are useful tools for solving general design problems.

Figure 2: Development process114

These design guidelines describe how software components should be organized so that strategies can be implemented in the system.115

After a system is designed, technical solutions or PETs can be applied to implement specific patterns and solve a particular privacy problem. Since a considerable number of PETs have been developed and are available to HIR developers, both as already implemented software components and as concrete standards ready to be embedded, this study does not focus on describing PETs.116 However, knowledge of data protection rules is essential before developers can select and implement a suitable PET. Therefore, this study solely clarifies the privacy design strategies and patterns and examines to what extent they can be tailored to HIR systems.

I. Privacy Design Strategies

114 Cycle diagram created on:

https://www.smartdraw.com/cycle-diagrams/examples/cycle-diagram-example-systems-development-life-cycle/, based on Hoepman’s system development lifecycle in Hoepman (2018), p.23.

115 ENISA, p. 17; EDPS, p.14; Hoepman (2018).

116 C. Bier and E. Krempel, ‘Common Privacy Patterns in Video Surveillance and Smart Energy’, ICCCT-2012 (2012), p. 610

<https://ieeexplore.ieee.org/document/6530407>; EDPS, pp. 16-17.


In order to provide concrete steps to implement PbD in the early stages, Hoepman developed eight privacy design strategies, which are based on the ISO 29100 Privacy Framework, guidelines of the Organization for Economic Co-operation and Development (OECD) and the GDPR. Because Hoepman’s guidelines do not explicitly mention on which GDPR principles they are based, this paper selected the relevant provisions (see table 2). His strategies are divided into data-oriented (technical nature) and process-oriented strategies (organizational nature). Whereas the first category aims to minimize the impact of the data processing itself, the second category focuses on the interaction between controllers and data subjects to ensure that their rights are safeguarded. HIR developers should not consider only one but all eight strategies.117

II. Privacy Design Patterns

ENISA’s report on DPbD, and thereafter the AEPD, linked the privacy design strategies to general patterns, both of which are included in table 2.118 This study deviates slightly from certain pattern-and-strategy combinations stated by ENISA or the AEPD where it found other combinations more suitable or relevant for HIR developers.

117 Colesky et al., pp. 17-18.

118 ENISA, pp. 19-22; AEPD, pp. 23-24.

119 All information in the table is retrieved from Hoepman, pp. 27-28 and ENISA (2014), pp. 19-22.

Table 2: Privacy Design Strategies and Patterns119

Data-oriented strategies

1. Minimize
Description: entails that the amount of personal data processed must be restricted as much as possible by excluding data that is unnecessary with regard to the processing purposes. Referred to in the GDPR as the principles of data minimization, purpose limitation and storage limitation: recitals 39, 49, 67, 78, 156; articles 5(1)(b)(c), 25(1), 47(d), 89(1).
Privacy design patterns:
- mechanisms that select only relevant data;
- mechanisms that strip out or destroy data that are no longer needed;
- anonymization: the process of eliminating all identifying information from a dataset, so that the data subject can no longer be directly or indirectly identified.120

2. Separate
Description: entails isolating or distributing the processing or storage of data logically, for example over separate databases or by defining views.121 Referred to in the GDPR: recitals 29, 70; article 4(5).
Privacy design patterns:
- applying a peer-to-peer model instead of a client-server model: whereas in the first model data processing takes place in endpoints, such as smartphones, in the second it takes place on a central server, such as a mail server;122
- homomorphic encryption: allows computations on data while it remains encrypted.123

3. Abstract
Description: entails limiting the amount of detail of personal data to the greatest extent possible. Controllers can, for example, summarize data (collect age instead of the date of birth) or process data of a group
Privacy design patterns:
- coarsening the granularity of location data;

120 FRA, p. 93.

121 Views are subsets of databases that are generated from queries. See <https://www.techopedia.com/definition/25126/view-databases>.

122 Techdifferences, ‘Difference’ (2017) <https://techdifferences.com/difference-between-client-server-and-peer-to-peer-network.html>.

123 K. Lauter et al., ‘Can Homomorphic Encryption Be Practical?’ (2011) Proceedings of the ACM Conference on Computer and Communications Security Workshop, p. 113 <https://doi.org/10.1145/2046660.2046682>.
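The homomorphic-encryption pattern listed under the Separate strategy can be illustrated with a toy additively homomorphic scheme. The sketch below implements the textbook Paillier cryptosystem with deliberately tiny primes; it is an educational illustration only, not a production PET (a real deployment would use a vetted cryptographic library and key sizes of thousands of bits):

```python
import math
import random

def L(x, n):
    """Paillier's L function: L(x) = (x - 1) / n."""
    return (x - 1) // n

def keygen(p, q):
    """Generate a Paillier key pair from two primes (toy sizes here)."""
    n = p * q
    lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)  # lcm(p-1, q-1)
    mu = pow(lam, -1, n)   # valid because we fix g = n + 1
    return (n, n + 1), (lam, mu)

def encrypt(pub, m):
    n, g = pub
    n2 = n * n
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:  # blinding factor must be invertible mod n
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    return (L(pow(c, lam, n * n), n) * mu) % n

pub, priv = keygen(11, 13)          # n = 143; toy primes, insecure
c1, c2 = encrypt(pub, 20), encrypt(pub, 22)
c_sum = (c1 * c2) % (pub[0] ** 2)   # multiplying ciphertexts adds plaintexts
print(decrypt(pub, priv, c_sum))    # 42, computed without decrypting c1 or c2
```

A server (or robot back-end) holding only `c1` and `c2` can thus compute an aggregate over encrypted sensor values without ever seeing the plaintexts, which is precisely why this pattern supports the Separate strategy.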
