The Privacy Risks of Household Appliances Connected Through the Internet of Things

Academic year: 2021


The Privacy Risks of Household Appliances Connected Through the Internet of Things Vanessa Simões de Azevedo

Student ID: 2111543

Thesis submitted to the Faculty of Governance and Global Affairs of Leiden University in partial fulfilment of the requirements for the MSc Crisis and Security Management

Abstract

[…] when ubiquitous devices pervade our physical world we cannot be certain when our actions are being monitored. (Williams, Nurse, & Creese, 2016, p. 6)

The introduction of the Internet of Things (IoT) into people’s lives has revolutionized how we interact with technology, leading society into the interconnected digital age. Accordingly, the spread of household appliances with IoT has changed the way privacy is perceived in private spaces, posing several security concerns related to the performance of ordinary tasks. When not properly addressed, such concerns have the potential to turn into emerging risks, particularly where consumers’ privacy is concerned. The enforcement of the General Data Protection Regulation (GDPR) reflected the worries of European Union (EU) citizens about privacy in the digitalized era, demanding transparency, access, consent and accountability in the processing of personal data. In this context, this thesis assesses to what extent the current regulation perceives household appliances connected through the IoT as an emerging risk, by contrasting the GDPR with the most prominent privacy risks related to the use of IoT appliances. In doing so, this research aims to clarify several implications of the large-scale data collection conducted by IoT devices. As a result, this thesis confirms that multiple emerging risks are not accurately addressed by the current regulation and proposes a set of recommendations for legal evolution in the privacy domain. Finally, it suggests further research to minimize the impacts of additional risks that may emerge with the technology’s massive adoption.

Keywords: Internet of Things (IoT), privacy risk, General Data Protection Regulation (GDPR), household appliances

Table of Contents

1. Introduction
2. Theoretical Framework
2.1 The Realms of Privacy
2.2 The Development of Household Appliances with IoT
2.3 Privacy Risks of IoT
2.4 The Theory of Emerging Risks
3. Methodology
3.1 Justification of Research Design and Logic of Case Selection
3.2 Operationalization
3.3 Validity Issues and Limitations
4. Analysis
4.1 Identification of Individuals Through Raw Data
4.2 Data Breaches
4.3 Inaccurate Definitions of ‘Data’ and ‘Goods’
4.4 Inadequate Authentication Methods
4.5 Unconsented Security Expiration Dates
4.6 Unaware Collection of Personal Data Through the Presence of Devices
4.7 Weaker Party
4.8 Uncontrolled Dissemination of Data to Secondary Appliances and Third Parties
4.9 Unaware Collection of Data Through Default Settings
4.10 Incorrect Data Circulating About Individuals
5. Discussion
6. Conclusion
References
Footnotes

1. Introduction

IoT technology is experiencing rapid growth in the number of devices connected to networks (Maple, 2017). IoT devices can be defined as items embedded with sensors that are connected to the Internet (Maple, 2017). In order to perform better and to optimize and customize the services they provide, these devices often collect vast amounts of personal data, potentially posing a threat to users’ privacy (Wachter, 2018). In that sense, owning IoT appliances can create several risks for consumers, which might be amplified by weak cybersecurity standards and the devices’ inability to anonymize data (Wachter, 2018).

Some of these risks were illustrated by the Mirai case. In August 2016 a piece of malware named Mirai was identified by a ‘whitehat security research group’; together, Mirai, its variants and its imitators were used to perform some of the most powerful distributed denial-of-service (DDoS) attacks in history (Kolias, Kambourakis, Stavrou, & Voas, 2017). The malware scans the Internet for IoT systems, selects vulnerable devices protected by hardcoded or factory-default usernames and passwords, infects them and transforms them into bots that report to a central control server and are later used to launch DDoS attacks (“Mirai botnet targets IoT devices,” 2016). The malware first spread to webcams, routers and DVRs, and afterwards used brute force to deduce the administrative credentials of other IoT devices connected to the same networks (Kolias et al., 2017).
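The weakness Mirai exploited can be shown with a minimal illustrative sketch (hypothetical function and credential list, not Mirai’s actual code): a device that still uses its factory-default username and password can be ‘guessed’ after only a handful of attempts drawn from a short, publicly known list of shipped defaults.

```python
# Illustrative sketch only: why factory-default credentials make
# brute forcing trivial. No networking is performed; the "device"
# is simulated as a stored username/password pair.

DEFAULT_CREDENTIALS = [  # a tiny sample of commonly shipped defaults
    ("admin", "admin"),
    ("root", "12345"),
    ("user", "user"),
]

def try_login(device_user, device_password, guesses):
    """Return the attempt number at which the guess list succeeds,
    or None if the device's credentials are not on the list."""
    for attempt, (user, password) in enumerate(guesses, start=1):
        if (user, password) == (device_user, device_password):
            return attempt
    return None

# A device left on its factory default falls on the second guess:
print(try_login("root", "12345", DEFAULT_CREDENTIALS))   # 2
# A device with a unique password resists this entire list:
print(try_login("root", "x7#qPz!", DEFAULT_CREDENTIALS)) # None
```

The sketch illustrates why the GDPR’s data protection by design and by default principle matters for connected appliances: a vendor-chosen unique password removes the device from the attacker’s trivially searchable space.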

The rapid spread of the Mirai malware exposed the risks stemming from the lack of transparency in the management of personal data by IoT devices and raised awareness of how crucial it is to tackle this issue. But Mirai was not the only critical case of hacked IoT devices in the past few years: the hacking of Wi-Fi-enabled Barbie dolls and of cars such as the Jeep Cherokee also helped to reveal the scope of the risks entailed in the attempts to “interconnect physical systems and embed them in a larger network of devices and appliances” (Tanczer, Steenmans, Elsden, Blackstock, & Carr, 2018, p. 1).

After the occurrence of data breaches like these, the approval of the GDPR (“GDPR General Data Protection Regulation,” 2016) by the EU was considered a huge step towards protecting consumers’ privacy. With the increased exchange of data, retailers are being provided with substantially more information about consumers, which has encouraged the recent development of new EU data protection regulations that aim to guarantee an adequate level of digital privacy for European citizens.

In order to keep up with the technological changes that have impacted “how data is collected, stored, shared, and transferred” (O’Brien, 2016, p. 81), the EU replaced the Data Protection Directive (“Directive 95/46/EC,” 1995) from 1995 with the GDPR. The old directive had introduced new rights for individuals, as well as enhanced accountability and responsibility regarding the collection and processing of personal data by companies (Kelly, 2014). In this regard, the directive comprised, for instance, data protection and data processing principles (Kelly, 2014).

Accordingly, the GDPR was designed to harmonize data protection standards among EU member states and enhance individual rights, with a more coherent structure to enforce stronger penalties (O’Brien, 2016). Applicable from 25 May 2018, the document is a pioneering regulation that considers data protection a fundamental right, ensuring clear language in written privacy policies, the need for affirmative consent from users before their data can be handled by businesses, stronger rights, more transparency and stronger enforcement of users’ rights (“A New Era for Data Protection in the EU - What Changes after May 2018,” n.d.).

Despite the growing concerns regarding privacy regulations, the amount of personal information exchanged online is being boosted by the introduction of the IoT into daily aspects of people’s lives, particularly through the use of connected appliances in private homes (H. Weber, 2015). For example, IoT devices make it possible to determine the identity and the behavioural patterns of individuals. The risks the technology can pose become even more problematic when combined with the monetization of personal data and the ethical issues it entails. Hence, if the IoT technology is not precisely regulated in time, we could face a large-scale privacy issue before long.

It is known that the GDPR implements “new data protection standards relevant for IoT devices” (Wachter, 2018, p. 437), but does it address all the potential data protection issues of the technology? Does it consider all the IoT’s social implications and bring them into the public debate? Does it contemplate all the data protection risks of connecting household appliances to the IoT network? By specifically regulating personal data, is it possible that the GDPR leaves a gap concerning the gathering of raw data by IoT appliances?

Therefore, the aforementioned concerns drive the present explanatory research, which is aimed at answering the following question:

To what extent does the current regulation perceive household appliances connected through the IoT as an emerging risk?

This thesis is developed as a single case study. It aims to analyse the privacy risks posed by household appliances with IoT. The choice of household appliances reflects the data collection revolution brought by IoT devices into private spaces that were once unmonitored. In other words, once such devices entered private spaces, the risks they pose went beyond data protection and reached the scope of privacy, consolidating the collection of data that can identify individuals within their private lives. Hence, this research contrasts the privacy risks of the technology identified in the literature with the GDPR, in order to indicate exactly what the regulation covers in the EU and what potential gaps it leaves when applied to IoT devices.

Hence, this research aims to answer to what extent EU citizens can trust the current regulation to protect them from the privacy risks originating from the use of household appliances with IoT. The development of multiple types of IoT devices adds significant value for individuals and businesses, but also causes risks (H. Weber, 2015), a number of which are intrinsic to IoT systems (Wachter, 2018). Therefore, the relevance of this analysis is that it transcends the general acknowledgement that the IoT technology is revolutionary and recognizes it as an emerging technology, which can therefore pose emerging risks (Barr & Raimbault, 2011). Consequently, it exposes the importance of further investigating the impact of these risks on consumers’ privacy, particularly in private spaces, increasing public awareness of the dangers posed by IoT household appliances. Thus, this thesis highlights the GDPR’s inability to cover multiple privacy risks and adds to the literature by exposing the particularities of IoT household appliances, a matter that is currently under-researched given its relevance to today’s world.

Accordingly, this analysis aims to inspire the creation of innovative solutions to the privacy risks it exposes and to assist the development of the IoT technology with stronger privacy and data protection requirements. Therefore, this thesis may decrease the tension between privacy and the development of IoT (Williams et al., 2016), allowing the technology to expand and become the ‘digital revolution of the twenty-first century’; similarly, since guaranteeing consumers’ acceptance of IoT is directly related to its successful commercialization (Park, Kim, & Jeong, 2018), this thesis may contribute to increasing the technology’s acceptance by the general public, consequently improving its capitalisation. Also, considering that the majority of people do not trust the companies responsible for managing their personal data (B. Scibelli, 2013), this research can assist companies in preventing trust issues that may emerge with the spread of the technology by bringing transparency to the collection and processing of data by household appliances with IoT.

The present research is outlined as follows: the second chapter conceptualizes the keywords of the analysis, elucidates the theory and the hypothesis that were applied and presents a literature review on the topic; the third chapter introduces the methods used in the research and the logic of the case selection, as well as the justification and operationalization of the research design, and evaluates potential validity issues and limitations of the analysis; the fourth chapter presents the content analysis; the fifth chapter discusses the findings and proposes recommendations for legal evolution; finally, the conclusion summarizes the results and answers the research question.

2. Theoretical Framework

This chapter conceptualizes privacy, consumers’ privacy and data protection, and clarifies the introduction of the right to privacy and the right to data protection in the EU’s legal framework. The concepts are crucial for this research, since the GDPR refers specifically to data protection and the right to data protection, aiming to improve consumers’ privacy and EU citizens’ privacy in general. Subsequently, it illustrates the risks posed by household appliances connected through the IoT. The final part elucidates the theory applied in this research, which identifies IoT as an emerging technology that may engender emerging risks.

2.1 The Realms of Privacy

Although “privacy is notoriously hard to capture” (Koops, Newell, Timan, Chokrevski, & Gali, 2017, p. 487), the concept has been formulated in several ways over the years (Moore, 2008). Thus, the literature provides “a rich discussion on the nature, definition, and conceptualization of privacy” (Hallam & Zanella, 2017, p. 218) in a twofold manner: by developing “a unitary conception of privacy in the form of a unified conceptual core” (Koops et al., 2017, p. 487) and “typological or pluralist conceptions of privacy by making meaningful distinctions between different types of privacy” (Koops et al., 2017, p. 487). As such, Hildebrandt argues “privacy is closely related to the protection of one’s identity” (as cited in McDermott, 2017a, p. 3), while Hallam and Zanella (2017) claim it can be divided “into different categories, namely information privacy, social privacy, psychological privacy, and physical privacy” (p. 218). Accordingly, En Yap (2001) identifies three dimensions of privacy concerns: ‘control over aspects of the self known to others’ – having control over who knows about you; ‘freedom from surveillance’ – not being monitored or observed; and ‘freedom from others’ – not being disturbed. Nonetheless, despite the fact that privacy is commonly conceptualized in a ‘de-contextual and abstract manner’, it is crucial to analyse it ‘locally and in context’ (Timan, Newell, & Koops, 2017).

Consumer information has always been of strategic value to companies (B. Scibelli, 2013), which creates one more obstacle to the protection of privacy. Understanding the demand for products involves knowing customers, so as to focus research and development on the market’s needs and, consequently, reduce costs: by tracking customers’ personal information, businesses can use personalized marketing to make products more appealing to the public (B. Scibelli, 2013). Similarly, possessing such information allows companies to create further-reaching promotions and programs to reward customer loyalty, to assess the effectiveness of marketing operations and even to customize advertising strategies (Miltgen, Henseler, Gelhard, & Popovič, 2016). Overall, this plays a significant role when it comes to “long-term consumer satisfaction and commercial success” (Miltgen et al., 2016, p. 4659).

Still, several authors have pointed to the lack of awareness about privacy issues related to the disclosure of personal data. Consumers often accept privacy violations and are frequently portrayed as “passive marketplace participants who lack sufficient knowledge about how to protect their privacy” (En Yap, 2001, p. 58). Moreover, research shows they are usually not well informed about the issues related to disclosing personal information (En Yap, 2001). Thus, consumers can be “willing to share personal information with minimal expectations or reward” (B. Scibelli, 2013, p. 43) and be “enticed into wanting to share their information based on the intricacy of the personalization of the product, and trust in the company” (B. Scibelli, 2013, p. 43).

In that sense, the literature identifies a phenomenon called the privacy paradox: a “discrepancy between consumers’ intentions to protect their privacy and the actual privacy protection practices they engage in to manage threats to their privacy” (En Yap, 2001, p. 57). The privacy paradox exposes a contradiction between privacy concerns and actual behaviour (Miltgen et al., 2016; Williams et al., 2016): it was used to prove that “privacy concerns primarily affect the distant-future intentions, which do not directly affect behavior” (Hallam & Zanella, 2017, p. 224). Thus, despite being aware and becoming more concerned about the risks of sharing personal data, consumers are not taking sufficient measures to protect themselves (En Yap, 2001) and it is possible the disparity will grow with the expansion of IoT (Williams et al., 2016).

However, this does not mean that consumers want to recklessly disclose personal information to companies: most people are concerned about the exposure of their privacy and personal information to the technology industry. Consumers’ concerns about the use of personal data are directly related to how individuals use internet products and services. In a study conducted in 2013, the consumers who were least active had the highest concerns about the disclosure of their personal data (B. Scibelli, 2013). The study also reported a gap between willingness and choice with respect to the exchange of personal information and the desire to be involved with technology trends and innovations (B. Scibelli, 2013): participants felt they did not have a choice and that the only way to be involved with technological trends was to disclose their data. Hence, consumers require more control over the management of their personal information.

Additionally, consumers’ trust is frequently jeopardized when companies are not transparent about the uses of collected data, for instance by selling it to third parties or failing to inform consumers about information collection, and privacy concerns can become effective inhibitors of the adoption of technology (Miltgen et al., 2016). Consequently, the EU’s data protection rules have been substantially challenged in the last decade by the advancement of technology (Lindqvist, 2018). Globalization, along with fast technological developments, has led to a substantial increase in the scale of data collected and shared about users, creating unprecedented challenges to the protection of personal data (“GDPR General Data Protection Regulation,” 2016). In the last few years this tendency was intensified by the implementation of data analysis platforms, as data started to be considered a key economic asset (Janeček, 2018). These and many other concerns were addressed by the GDPR, applicable from May 2018. The regulation was implemented to protect consumers’ privacy, which can be defined as “consumers’ control over their personal information” (Miltgen et al., 2016, p. 4659). Hence, the legal framework focuses on empowering consumers and increasing their trust in the companies that collect their personal data.

The right to privacy, an “attempt to map privacy as a legal notion” (Koops et al., 2017, p. 493), appears in a number of EU’s legal instruments, including the European Convention on Human Rights (ECHR) (“European Convention on Human Rights,” 1953) and the Charter of Fundamental Rights of the European Union (“Charter of Fundamental Rights of the European Union,” 2000) (Custers, Dechesne, Sears, Tani, & van der Hof, 2018). As such, the right to privacy was developed by EU agencies in order to secure “a very wide and legally binding understanding of the notion of ‘respect for private life’” (Koops et al., 2017, p. 494).

Similarly, the first time an autonomous fundamental right to data protection was enforced by the EU was with the implementation of the EU’s Charter of Fundamental Rights in 2009 (McDermott, 2017). Accordingly, Yvonne McDermott (2017) claims the right to data protection endeavours to preserve the fundamental right of privacy, but the formulation of a data protection right distinct from the right to privacy was unprecedented in the European legal order.

However, some analysts assert the EU “has neither adequately justified the introduction of the right to data protection in the EU legal order nor explained its content” (as cited in McDermott, 2017, p. 1). In other words, the distinction between privacy and data protection is often not very clear in EU legislation, since the jurisprudence considers “privacy to be at the core of data protection” (Kokott & Sobotta, 2013, p. 223) and some courts “tend to treat data protection as an expression of the right to privacy” (Kokott & Sobotta, 2013, p. 222). Still, it is possible to distinguish privacy from data protection by the scope of each right, meaning “the information covered by the respective right” (Kokott & Sobotta, 2013, p. 225): whereas the former lies in the private life, which “does not necessarily include all information on identified or identifiable persons” (Kokott & Sobotta, 2013, p. 225), the latter specifically covers this type of information.

The EU’s most recent regulation on data protection confirms the tendency to associate data protection with the protection of personal data: the legal framework exposes how the Union is striving to ensure “the protection of natural persons with regard to the processing of personal data” (“GDPR General Data Protection Regulation,” 2016, article 1, paragraph 1), as well as the protection of the “fundamental rights and freedoms of natural persons and in particular their right to the protection of personal data” (“GDPR General Data Protection Regulation,” 2016, article 1, paragraph 2). Thus, the GDPR specifically targets the protection of personal data for all citizens under the umbrella of the regulation.

2.2 The Development of Household Appliances with IoT

In light of the EU’s new legal framework, the introduction to the market of devices connected through the IoT brings distinct privacy issues and concerns. Considered “one of the most significant disruptive technologies of this century” (Caron, Bosua, Maynard, & Ahmad, 2016, p. 5), the technology is expected to exert a massive impact on individuals’ everyday lives and on society as a whole (Caron et al., 2016): it promises multiple benefits for a number of stakeholders, such as individuals, organizations and third parties (Caron et al., 2016). By 2020, IoT was estimated to be embedded in hundreds of billions of devices across industries (Kamin, 2017).

The IoT has been defined in the literature in many different ways. Kamin (2017) elucidates it as “an open and partially standardized technology infrastructure that enables interaction between devices over the Internet using unique addressing schemes” (pp. 10–11). Likewise, Weber (2015) characterizes the IoT as “an emerging global Internet-based information architecture that facilitates the exchange of goods and services” (p. 618). A more detailed framing is provided by the Article 29 Working Party (WP29): “[A]n infrastructure in which billions of sensors embedded in common, everyday devices – ‘things’ as such, or things linked to other objects or individuals – are designed to record, process, store and transfer data and, as they are associated with unique identifiers, interact with other devices or systems using networking capabilities” (as cited in Lindqvist, 2018a, p. 47). The Institute of Electrical and Electronics Engineers’ special report, on the other hand, describes it as “a network of items – each embedded with sensors – which are connected to the Internet” (as cited in Maple, 2017).

As such, IoT devices amplify the extent and the scope of the products and services that need to guarantee security (Tanczer et al., 2018). With a large range of application areas encompassing the agriculture, energy, finance, automotive, manufacturing and health sectors, the IoT technology is considered “the amalgamation of diverse and interconnected technologies” (Tanczer et al., 2018, p. 1). Nevertheless, the provision of innovative services by IoT devices is challenging ‘traditional perceptions of privacy’ and “stretch[ing] existing social norms about privacy in public and private spaces” (Tene & Polonetsky, 2013, p. 63). In many of its uses, the data collected is sensitive and might include, for instance, users’ location or physiological data (Jayaraman, Yang, Yavari, Georgakopoulos, & Yi, 2017). An IoT device, such as a smart coffee machine or toothbrush, “will send data from the device in the home out to the Cloud, leaving their private nature uncertain” (Timan et al., 2017, p. 245). Therefore, the collection of data by household appliances is especially problematic since it blurs the boundaries between public and private spaces: “sometimes we perform acts in public space that are private, […] while we also sometimes do things in the home that are public” (Timan et al., 2017, p. 8). Hence, privacy risks that formerly applied only to public spaces are now part of the domestic environment as well.

The IoT’s increased level of heterogeneity, along with the wide scale of its systems, gives rise to several expected and a few unforeseen security threats (S. Sicari, Rizzardi, Grieco, & Coen-Porisini, 2015). Some of the attacks that can be launched against these devices by exploiting their inherent vulnerabilities are: “message modification and/or alteration, traffic analysis, Denial of Service (DoS), Distributed DoS, eavesdropping, Sybil attacks, etc.” (Riahi Sfar, Natalizio, Challal, & Chtourou, 2018, p. 119). Moreover, IoT devices can suffer physical damage inflicted by an “environmental threat, [an] employee error, or a physical attack” (Kamin, 2017, p. 17), which can cause malfunctions, data theft and bad data. Besides, IoT devices require users to consistently trust that data platforms will store and share their information in an ethical manner (Sabrina Sicari, Rizzardi, Miorandi, & Coen-Porisini, 2018), even though technological developments are not properly supported by ‘clear ethical guidelines’ in every industry (Tene & Polonetsky, 2013).

As such, before being massively implemented, the IoT should ensure a secure environment with regard to “security of communication/authentication, integrity of data and devices, privacy of users and personal data, and trustworthiness of the environment and of the involved parties” (Borgia, 2014, p. 11). In other words, it needs to provide data anonymity, include authentication and authorization mechanisms, guarantee data protection and confidentiality in the handling of users’ personal information, and earn users’ trust (S. Sicari et al., 2015).

Nonetheless, the originality and freshness of IoT appliances may “distract consumers from the quantities of data they are disclosing” (Williams et al., 2016, p. 3) and hinder risk perception. Still, risk perception plays an important role in the IoT’s acceptance in society: according to Park et al. (2018), it has a negative effect on customers’ intentions to adopt the technology, which is even stronger when “the product is new, expensive, or technically complex” (Park et al., 2018, p. 2356) and “information or experience […] is insufficient” (Park et al., 2018, p. 2356). Therefore, the unique characteristics of IoT devices raise users’ concerns about the threats these products bring to their lives, especially when introduced into their own homes. For instance, since it is capable “of transmitting large amounts of data upon request or in an automated fashion” (Kamin, 2017, p. 17), the technology increases the scale of the damage caused by potential data breaches.

Among the challenges identified in the literature, privacy has been designated one of consumers’ largest concerns in the information age (Caron et al., 2016). In that sense, the use of ‘sensor-rich devices’ with IoT has substantially increased the volume and speed of shared data, making consumers’ privacy harder to safeguard (Kamin, 2017). Hence, addressing the IoT’s vulnerabilities in order to regulate the technology is proving to be challenging, as many risks to privacy have been identified in the literature. The ten most prominent risks that can jeopardize the adoption of household appliances with IoT are elucidated in the following sections. These risks are then contrasted with the current regulation at a later stage of the analysis.

2.3 Privacy Risks of IoT

The most remarkable risks brought by the use of IoT household appliances are outlined as follows:

 Identification of individuals through raw data

The capacity of IoT devices to collect large amounts of data that can be used to identify consumers and recognize their behavioural patterns is a major concern. The increasing use of IoT household appliances collecting raw data (data directly collected from a source and not yet processed) on a daily basis within private spaces facilitates the identification of individual patterns. However, current regulations tend to be particularly focused on personal data and do not consider the possible applications of raw data, which can be combined with analytical methods to retrieve people’s identities (Weber, 2015). Hence, the definitions employed in data protection and privacy regulations might be insufficient to protect users.

 Data breaches

The adoption of multiple, collaborative appliances in the private space turns everyday objects into potential security threats (Caron et al., 2016). The inevitable presence of security vulnerabilities in IoT devices creates extensive possibilities for data breaches, which can occur in several forms, such as wireless hacking, attacks on physical equipment and the manipulation of false data: due to the technology’s capacity for transferring large volumes of data, blocking malicious traffic becomes trickier (Caron et al., 2016). In addition, many IoT devices operate with constrained resources and “communicate using weak or immature protocols” (Williams et al., 2016, p. 5), thereby compromising data confidentiality. Likewise, since a number of IoT gadgets have energy and processing power constraints, “resource-intensive communication protocols are infeasible” (Williams et al., 2016, p. 5), thereby restricting the encryption of transmissions. Also, the spread of low-budget appliances in the market fosters the development of low-quality products that prioritize functionality and aesthetics over security: “efforts to reduce manufacturing costs have implications for what might be viewed as the non-essential functionality of the device” (Williams et al., 2016, p. 5).

 Inaccurate definitions of ‘data’ and ‘goods’

The way current regulations define ‘data’ and ‘goods’ can pose a risk, since the absence of definitions, or the use of incomplete ones, may compromise users’ privacy. In that sense, data considered non-personal now may be classified as personal data in the future, which can hamper the formulation of privacy and data protection regulations today (Janeček, 2018). Similarly, the unclear new meaning of ‘goods’ has been accused of jeopardizing the IoT’s contractual regulation. Due to the characteristics of the technology, IoT devices encompass products engaged simultaneously in the online and offline domains (Lindqvist, 2018): “IoT data management moves from being a classic ‘offline’ system, where storage/query processing/transaction operations are managed offline, to be an ‘online/offline’ system, where processing and analysis are also real-time” (Borgia, 2014, p. 17). Thus, even if regulations establish upgraded privacy measures addressing both concepts, the lack of a clear conceptualization of ‘data’ and ‘goods’ can result in incorrect interpretations by legal authorities, potentially leaving IoT appliances exempt from compliance.

 Inadequate authentication methods

Inadequate authentication methods can compromise users’ privacy and trust in IoT appliances. In that sense, the monetization of data has made it more challenging to preserve anonymity and provide appropriate authentication to users, since identifying users became valuable for companies, which started tracking people’s preferences for marketing purposes. Therefore, new authentication methods need to be constantly developed and standardized to ensure secure logins and protect individual anonymity (Caron et al., 2016). Thus, authentication technology ought to evolve at the same pace as IoT’s capabilities for collecting sensitive data, allowing IoT appliances, particularly household appliances collecting sensitive information, to provide reliable authentication frameworks. If that is not the case, consumers run the risk of having their identities easily exposed and becoming much more vulnerable to virtual attacks.

 Unconsented security expiration dates

Market forces may lead to an unconsented expiration date for the security and reliability of IoT devices, considering customers are not informed of the frequency of software updates before or during the purchase of appliances. Hence, since “software updates are an expensive fixed cost for a vendor” (Williams et al., 2016, p. 6), the market may increase appliances’ vulnerabilities by not releasing patches on a regular basis. The risk is magnified by the proliferation of cheap IoT devices in people’s homes, which decreases prices and renders sales less lucrative. Thus, if companies do not have a profitable initial sale or simply choose not to invest enough money to implement patches regularly, IoT devices might become increasingly vulnerable and eventually compromise the data confidentiality of users (Williams et al., 2016), who will most likely be unaware of the matter.

 Unaware collection of personal data through the presence of devices

One of the biggest challenges the IoT faces under current regulations is that users are not always systematically notified when their personal data is being collected by IoT devices, since they may not be fully aware of the presence of sensors in the environments they visit (Chow, 2017). Hence, the system has a “potentially unobtrusive nature of data collection” (Chow, 2017, p. 74) that defies an elementary premise of privacy: personal data can only be collected with a user’s choice and awareness. However, having access to ‘free and well-informed consent’ is often challenging when using IoT appliances (Wachter, 2018), since “there is no opportunity for notice and choice in smart publics or any smart shared space” (Timan et al., 2017, p. 255).

 Weaker party

IoT contracts commonly render ‘data subjects’, namely consumers and users, the ‘weaker party’ by not allowing them to refuse the terms while continuing to use the products (Lindqvist, 2018). In other words, the imbalance such terms create between consumers and companies is considered unfair, given that not agreeing with smart devices’ contract terms prevents individuals from using the purchased products for their original purposes (Lindqvist, 2018). Since consumers cannot choose the conditions they wish to comply with, they become vulnerable to all of a device’s data requests and are usually unaware of whether rendering all the requested information was essential. Hence, consumers are not informed of data requests before purchases and cannot choose what data to provide after purchasing IoT appliances, therefore being obliged to comply with all data requests to prevent devices from becoming useless.

 Uncontrolled dissemination of data to secondary appliances and third parties

Control over the diffusion of data to third parties, as well as from device to device, can be difficult to track. Consequently, even if consumers consent to the use of their data by one appliance, they might not be aware if other appliances will also have access to their data and subsequently share it with third parties: “in an interconnected web of applications and sensors, it appears that the control of information diffusion will almost be impossible” (Caron et al., 2016, p. 9). Thus, the introduction of IoT household appliances has the potential to significantly undermine consumers’ awareness over which third parties and devices linked to the same network have access to their data, compromising previously given consent and increasing data vulnerability.

 Unaware collection of data through default settings

Appliances’ default settings may lead to the unaware collection of data, considering settings are rarely modified by users. In that context, several controllers choose to obtain consent by asking users to agree to terms of service, but checking the box does not mean users are properly informed, since terms are generally difficult to comprehend and reading “the terms of services one encounters in a day is unrealistic” (Timan et al., 2017, p. 252). Furthermore, the monetization of data contributes to the widespread implementation of default settings that “might not respect privacy and rely on the inertia of users to support data collection” (Williams et al., 2016, p. 4) in the private space. In addition, ordinary users might find it difficult to reconfigure devices, particularly if they have “limited interfaces or require advanced technical knowledge” (Williams et al., 2016, p. 4), and will most likely not reconfigure each household appliance once the technology is disseminated across many homes. This, aligned with the development of extremely heterogeneous networks resulting from the manufacturing of an unprecedented number of IoT devices, reduces the likelihood of standardization and the probability that regular users will fully master the operating systems of IoT appliances (Williams et al., 2016).

 Incorrect data circulating about individuals

The disclosure of inaccurate information about consumers might jeopardize people’s lives and potentially have irreversible consequences. This is usually caused by external factors or by the malfunctioning of products, since some devices may start spewing out bad data when malfunctioning (Lawson, 2016); thus, “ensuring data quality is a major critical requirement in IoT” (Bertino, 2016, p. 2). Products processing sensitive data pose an even higher threat, since the potential impact may be irreparable for users. In that sense, according to Gürses, Troncoso and Diaz (2011), disclosures of sensitive data might lead to individuals becoming targets and victims of harassment by their social environments as well as by different organizations, such as the police.

Nonetheless, it is important to highlight the potential of the IoT technology for increasing household efficiency when operated for private purposes (Weber, 2015). The possibilities are endless, ranging from smart contracts for the use of electricity to the programming of autonomous orders for goods. Unfortunately, the technology still creates significant challenges regarding privacy, security and trust, which are considered the remaining barriers to its full development (Jayaraman et al., 2017). Hence, in order to anticipate risks, regulatory agencies must be capable of adapting and coping with the complications that may arise with IoT’s massive adoption. Its rapid spread worldwide is one of the reasons why it is so important to analyse the risks embedded in how we regulate IoT devices. In a future in which data collection is embedded in billions of insecure devices, privacy might not be an option (Williams et al., 2016).

Although the literature on the IoT continues to broaden, little research has so far been published on the risks and challenges of ensuring data protection specifically for household appliances connected through the IoT. Moreover, research on the subject generally takes the form of economic assessments and technical analyses (Tanczer et al., 2018). Therefore, this thesis aims to correlate the risks the technology entails with the current regulation’s potential for ensuring privacy and data protection to consumers in the EU.


2.4 The Theory of Emerging Risks

The subject of emerging technologies is receiving ‘increasing attention’, which is validated by “the growing number of publications dealing with the topic and news articles mentioning emerging technologies” (Rotolo, Hicks, & Martin, 2015, p. 1827). Despite the existence of considerable disagreement about how to accurately define an emerging technology, Rotolo, Hicks and Martin (2015) identify it “as a radically novel and relatively fast-growing technology characterized by a certain degree of coherence persisting over time and with the potential to exert a considerable impact on the socio-economic domain(s)” (p. 1828). Thus, the inherent characteristics of the IoT support its identification as an emerging technology.

Still, emerging technologies “may potentially hide unexperienced risks” (Paltrinieri & Khan, 2016, p. 19), which can be framed as emerging risks. In that sense, the concept of emerging risks “has been widely discussed in both scientific and business communities” (Mazri, 2017, p. 2053), and a substantial variety of risks can be considered under the umbrella of emerging risks (Mazri, 2017). However, Barr and Raimbault (2011) identify a risk as emerging when it conforms with one of the following categories:

 A risk that did not exist before appears as a result of the emergence of a new technology (nanotechnologies, new information and communication techniques and so on) or a change in lifestyle and/or a mode of production.

 The existence of a risk that previously was undetectable is brought to the surface thanks to breakthroughs in scientific knowledge.

 An issue is transformed into a new risk as a result of changes in the perception of society […]. (Barr & Raimbault, 2011, pp. 1–2)

Therefore, according to the first category, the IoT technology can be considered an emerging technology that poses emerging risks. The previous section validates this assumption, since it outlines a variety of risks posed by household appliances with IoT. Accordingly, the impacts of these risks have been illustrated several times by worldwide hackings of IoT devices in the past few years, for instance through the use of the Mirai malware to launch unexpected DDoS attacks.

Hence, this thesis applies the theory of emerging risks previously conceptualized by Barr and Raimbault (2011) to assess privacy risks originated from the massive spread of household appliances with IoT. Consequently, the theory exposes the urgent need to identify how the current regulation addresses the technology and protects consumers against the emerging risks it poses.

As such, the GDPR is considered to represent “the most significant change in global privacy law for 20 years” (O’Brien, 2016, p. 81). Thus, the document was selected to identify potential data protection breaches in household appliances connected through the IoT. In this context, this analysis aims to classify the risks to privacy posed by the technology and examine which ones are or are not covered by the GDPR. Since it may be difficult to enforce the GDPR requirements in the IoT domain (Lindqvist, 2018), it is currently unclear whether the regulation can be applied to prevent the privacy risks posed by the technology. Therefore, the following hypothesis will be tested:

Hypothesis: Household appliances connected through the IoT create privacy risks that are not covered by the GDPR.

Thereupon, once the risks have been compared to the legal framework, it will become clear whether key aspects are missing in the regulation. If the hypothesis is confirmed, this analysis will identify which risks brought by household appliances with IoT should be addressed to protect consumers and avoid potential data breaches. Hence, this project aims to explore to what degree the GDPR is able to satisfy the technology’s demands and guarantee the optimization of data protection for European citizens with respect to household appliances connected through the IoT.

3. Methodology

This chapter illustrates how this research is conducted. It elucidates the justification for the research design and the logic of case selection, presents the research’s operationalization and the codebook used in the analysis, and finally estimates potential validity issues and limitations the research may incur.

3.1 Justification of Research Design and Logic of Case Selection

Selecting the optimal research method is directly dependent on the particularities of the issue to be analysed (Flyvbjerg, 2006). In that sense, this qualitative research adopts a case study design to provide a thorough assessment of how the current regulation perceives household appliances with the IoT technology. An advantage of the case study is that “it can ‘close in’ on real-life situations and test views directly in relation to phenomena as they unfold in practice” (Flyvbjerg, 2006, p. 235). Hence, producing ‘concrete context-dependent knowledge’ can be valuable to the field of social science (Flyvbjerg, 2006). Overall, the choice for this case study was based upon two factors: the increasing importance and vulnerability of the IoT technology, and the extended impact the GDPR had on companies after its implementation. More specifically, the selection of household appliances was based on the reasoning that they can be ‘specially problematic’ and that, as an information-oriented selection, it is helpful “[t]o maximize the utility of information from small samples and single cases” (Flyvbjerg, 2006, p. 230). Finally, a case study suits this research since it “is most suited for explorative research” (Siggelkow, 2007, p. 21).

Furthermore, this project aimed to address a significant issue that will most likely increase in importance in the near future: the extent to which household appliances connected through IoT are required to offer data protection to users. In a society in which personal information is shared increasingly faster and relentlessly, having clear regulations to protect fundamental rights is key to safeguard social cohesion and stability. As a result, it is crucial to foster the development of analyses like the one presented here, that can enhance and clarify our knowledge about data protection regulations’ current challenges. Still, more research is needed to address the consequences of data monetization by companies using the IoT technology and their ability to profile consumers and identify their interests remotely.

3.2 Operationalization

This research is operationalized through a content analysis anchored in the theory of emerging risks, which elucidates that, as an emerging technology, the IoT can pose emerging risks to society. Hence, the analysis contrasts the GDPR with the privacy risks identified in the literature review (Section 2.3) to determine how household appliances with IoT may harm privacy in private spaces. The process was developed by using the outlined IoT risks to define the categories and the most suitable indicators for the content analysis, as shown in Table 1. Consequently, the content analysis aims to determine whether or not these risks are currently regulated by the GDPR. Likewise, it becomes possible to identify what should change for the regulation to address the potentially uncovered risks.


Table 1

Codebook adopted in the study to categorize the privacy risks of household appliances connected through the IoT

Code 1: Identification of individuals through raw data
Definition: Raw data can be used, along with analytical methods and combination, to retrieve the identity of people.
Examples: “This document concerns the use of personal data.”; “This method can be applied to non-personal data to identify natural persons.”
Indicators: Paragraphs mentioning what types of data are comprised by the document; paragraphs mentioning identification methods applicable to non-identifiable data.

Code 2: Data breaches
Definition: Security vulnerabilities of IoT appliances due to their unique characteristics, aligned with the market’s influence on production, may originate data breaches.
Examples: “After the occurrence of a data breach it is necessary to…”; “Minimum security requirements are…”
Indicators: Paragraphs describing guidelines to be applied after data breaches occur; paragraphs describing mandatory security requirements.

Code 3: Inaccurate definitions of ‘data’ and ‘goods’
Definition: The definitions of ‘data’ and ‘goods’ may not encompass the unique characteristics of IoT appliances; also, they might become broader in the future, compromising data protection in the present.
Examples: “‘Goods’ mean offline systems…”; “The definition of ‘data’ applied here is elucidated by…”
Indicators: Paragraphs that define ‘data’ and/or ‘goods’; paragraphs that determine which existent definitions of ‘data’ and/or ‘goods’ are considered.

Code 4: Inadequate authentication methods
Definition: Authentication methods may be incapable of ensuring anonymity and secure logins.
Examples: “Services must provide anonymity preferences or adequate secured logins…”
Indicators: Paragraphs addressing authentication methods, anonymity requirements and/or login security.

Code 5: Unconsented security expiration dates
Definition: Market forces may hamper the frequency of software updates, thereby creating security expiration dates for IoT devices without the awareness of customers.
Examples: “Software updates are required on a yearly basis…”; “Consumers need to be informed about the frequency of patches…”
Indicators: Paragraphs specifying software update requirements; paragraphs specifying the need to inform users about the frequency of software updates.

Code 6: Unaware collection of personal data through the presence of devices
Definition: Individuals may not be aware of the existence of data collection devices in the environments they visit.
Examples: “The collection of personal data must be clearly informed to subjects…”; “The collection of personal data requires the explicit consent of data subjects…”
Indicators: Paragraphs related to consumers’ awareness about the collection of personal data; paragraphs related to the need of explicit consent before data collection.

Code 7: Weaker party
Definition: Consumers customarily do not have the choice of deciding what data to provide after purchasing IoT appliances and are not informed of data requests before purchases; also, they may not know if all data requests have valid purposes.
Examples: “Service providers must guarantee customers the choice to partially comply with data requests…”; “The purposes for all data requests must be clear…”; “Service providers need to inform potential customers of data requests in advance…”
Indicators: Paragraphs regarding the possibility of using purchased devices without providing all the requested data; paragraphs regarding the need to specify the purposes of all data requests; paragraphs regarding the necessity to inform customers about data requests before purchases.

Code 8: Uncontrolled dissemination of data to secondary appliances and third parties
Definition: Data may be shared between appliances and third parties without consumers’ consent, resulting in unawareness over who is accessing their data, for what purposes and for how long.
Examples: “Consent to the diffusion of data to other devices or third parties must be explicit…”
Indicators: Paragraphs remarking the distribution of data to secondary appliances and/or third parties.

Code 9: Unaware collection of data through default settings
Definition: Users may lack the technical knowledge to modify default settings that might be configured to support data collection, or simply not be aware of the existence of invasive data collection processes by default.
Examples: “Default settings must display the most relevant privacy options…”
Indicators: Paragraphs addressing default settings with respect to privacy.

Code 10: Incorrect data circulating about individuals
Definition: The online disclosure of inaccurate data, particularly sensitive data, can have irreversible consequences for individuals’ lives.
Examples: “Data subjects have the right to rectify their data…”; “In cases where inaccurate data about data subjects cause harm, compensations should be determined…”; “Particularly sensitive data, such as ethnic origin and sexual orientation, shall be processed with specific protection…”
Indicators: Paragraphs guaranteeing users can rectify their personal data; paragraphs guaranteeing remedy for individuals directly affected by the circulation of inaccurate data; paragraphs guaranteeing special protection for particularly sensitive data.

This study analyses the GDPR’s paragraphs. The choice for paragraphs was based on the regulation’s format: each chapter encompasses several articles, which are elucidated by paragraphs. Since each article may address more than one risk, analysing each paragraph provides an accurate understanding of how paragraphs’ subjects correlate with the outlined categories. The coding rules applied in the research are as follows:

1. Paragraphs can refer to more than one category;

2. Only the most relevant paragraphs for each category will be highlighted in the analysis section.

The first rule was created based on the document’s structure: its paragraphs tend to specify several subparagraphs, which can refer to more than one category. As such, paragraphs usually outline the most relevant information that can be related to indicators, but, in a few cases, subparagraphs can be connected to more than one category with a similar level of importance. Hence, assigning these paragraphs to only one category would result in a weaker assessment. The second rule was defined so that the analysis section provides enhanced and clarified results. Therefore, fragments considered to have a stronger link to indicators are thoroughly explained in the chapter dedicated to the analysis, allowing for a more specific and condensed assessment. Mentioning all paragraphs would result in a rather repetitive analysis and take the focus away from the most significant information that can be retrieved and linked to IoT devices.

Accordingly, this analysis outlines non-binary results: since it is possible the regulation only covers some of the risks partially, providing binary results in the ‘yes’ or ‘no’ format would not be enough to answer the research question. Hence, the results incorporate a third possible answer named ‘partially covered’ in cases where the regulation does not entirely frame risks in the categories ‘covered’ and ‘not covered’. Therefore, the risks that are not covered by the regulation represent the emerging risks exposed in the theory; in other words, each risk that is not properly addressed by the GDPR accounts for an emerging risk of IoT household appliances that is able to compromise consumers’ privacy.
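As an illustration only, the three-valued coverage classification described above can be sketched as a small data structure. The category names follow Table 1 and the two sample classifications follow the analysis chapter; the `Coverage` enum, the `assessment` mapping and the `emerging_risks` helper are hypothetical constructs introduced for this sketch, not part of the thesis’s method.

```python
from enum import Enum

class Coverage(Enum):
    """Three-valued result used instead of a binary yes/no answer."""
    COVERED = "covered"
    PARTIALLY_COVERED = "partially covered"
    NOT_COVERED = "not covered"

# Sample classifications taken from the analysis chapter (Sections 4.1 and 4.2).
assessment = {
    "Identification of individuals through raw data": Coverage.NOT_COVERED,
    "Data breaches": Coverage.PARTIALLY_COVERED,
}

def emerging_risks(assessment):
    """Risks classified as 'not covered' represent the emerging risks
    exposed by the theory."""
    return [risk for risk, coverage in assessment.items()
            if coverage is Coverage.NOT_COVERED]
```

Under this sketch, a risk that is only partially covered is recorded as such rather than forced into a binary outcome, mirroring the non-binary results format adopted in the analysis.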

3.3 Validity Issues and Limitations

Research in all its forms is subject to constraints; thus, it is crucial to assess the potential validity issues and limitations it might entail. In this context, one of the more common criticisms raised by academics about case studies is that this type of analysis is not suitable for generating ‘theoretical (context-independent) knowledge’, which is supposed to be “more valuable than concrete, practical (context-dependent) knowledge” (Flyvbjerg, 2006, p. 221). However, this is a common misunderstanding, since choosing a case study can be an effective method against the damaging tendency of distancing oneself from the object of study (Flyvbjerg, 2006). In that sense, Flyvbjerg (2006) reminds us that ‘the force of example’ is often underestimated: the author advocates the social relevance of case studies for contributing more concrete and practical knowledge. Accordingly, this thesis aims to develop a concrete and practical approach that promotes the relevance of more suitable privacy regulations capable of safeguarding people against the unnoticed risks of emerging technologies, such as the IoT.

Similarly, a common concern expressed in the literature is that “case studies […] provide little basis for scientific generalization” (Yin, 1994, p. 10). In this regard, the risks analysed in this thesis are not specific to household appliances with IoT, but can also apply to other types of IoT devices. Thus, the choice for household appliances reflects the fact that these appliances amplify the scope of privacy risks from public spaces to private spaces, impacting consumers in a place where they did not have to worry about privacy risks before: their own homes. Hence, the relevance of this project is that it provides an analysis replicable to other types of IoT devices, avoiding the issue of generalization that is usually related to single case studies.

Another possible validity issue is related to the ‘lack of selectivity’ and to the difficulty of exclusively presenting details associated with conceptual arguments (Siggelkow, 2007). Consequently, researchers can become biased and start to consider a wide range of information more interesting than it actually is, which might compromise the objectivity of their work (Siggelkow, 2007). So, it is essential to remain focused and rule out less relevant details that might confuse readers and jeopardize the study. Nonetheless, it is important to note it can be challenging to identify which words, phrases and expressions are more significant than others when doing explanatory research, particularly when dealing with numerous sources.

Finally, the difficulty embedded in assessing future risk scenarios can also be considered a limitation of this study. Despite being based on risks identified in the past by experts in the technology, it is important not to consider this elucidation as flawless since analysts may have identified some of these risks based on assumptions about future scenarios that will not necessarily take place.


4. Analysis

This chapter contrasts the risks described in the literature review with the GDPR and assesses whether or not they are addressed by the regulation. Accordingly, each risk is classified into one of the following categories: ‘covered’, ‘not covered’ and ‘partially covered’. The analysis is delineated as follows:

4.1 Identification of Individuals Through Raw Data

The first category aimed to shed light on the possibility of retrieving users’ identities from raw data through analytical methods and combination. Since the majority of IoT appliances participate in the collection of raw data, this risk becomes particularly problematic with the massive spread of the IoT technology. Therefore, when regulations do not specifically address the collection of raw data, countless IoT devices are not subject to the applicable laws in the scope of data protection. In addition, the lack of control over the handling of raw data could allow the unrestrained storing of raw data in extensive databases without public awareness, originating new issues such as unsupervised users’ profiling. Furthermore, when data protection is exclusively associated with the collection of personal data, awareness over the retrieving of identifiable information through raw data is most likely to decrease, leaving consumers unmindful of the invasive data collection conducted by IoT appliances they handle.

In this respect, the first article of the GDPR specifies that the framework “lays down rules relating to the protection of natural persons with regard to the processing of personal data and rules relating to the free movement of personal data” (“GDPR General Data Protection Regulation,” 2016 Chapter I, Article 1, Paragraph 1). Accordingly, the regulation defines ‘personal data’ as “any information relating to an identified or identifiable natural person (‘data subject’)” (“GDPR General Data Protection Regulation,” 2016 Chapter I, Article 4, Point 1). In that sense, it makes reference to the Charter of Fundamental Rights of the European Union and the Treaty on the Functioning of the European Union (“Treaty on the Functioning of the European Union,” 2012) to reiterate that “the protection of natural persons in relation to the processing of personal data is a fundamental right” (“GDPR General Data Protection Regulation,” 2016 Recital 1), which “must be considered in relation to its function in society and be balanced against other fundamental rights” (“GDPR General Data Protection Regulation,” 2016 Recital 4). Hence, the GDPR specifies in which contexts it covers the processing of personal data, for instance when it is wholly or partly conducted by automated means (“GDPR General Data Protection Regulation,” 2016 Chapter I, Article 2, Paragraph 1) and when it does not take place during ‘a purely personal or household activity’ (“GDPR General Data Protection Regulation,” 2016 Recital 18).

Nevertheless, the GDPR elucidates that “the principles of data protection should apply to any information concerning an identified or identifiable natural person” (“GDPR General Data Protection Regulation,” 2016 Recital 26), which can be viewed as a rather ambiguous consideration: despite stating that “account should be taken of all the means reasonably likely to be used […] to identify the natural person directly or indirectly” (“GDPR General Data Protection Regulation,” 2016 Recital 26), the recital endorses the regulation’s focus on personal data, ignoring that raw data can also comply with this requirement. Additionally, personal data that ‘have undergone pseudonymisation’ is subjected to the framework as long as it “could be attributed to a natural person by the use of additional information” (“GDPR General Data Protection Regulation,” 2016 Recital 26), which is precisely the case of raw data as well.

Likewise, the GDPR refers to the possibility of associating natural persons with online identifiers from the devices they own, which could be “combined with unique identifiers and other information received by the servers” (“GDPR General Data Protection Regulation,” 2016 Recital 30) and used for the identification and profiling of users. Hence, even though the framework repeatedly specifies that it only applies to the processing of personal data, the GDPR recognizes the existence of processes that can be applied to non-identifiable data in order to identify individuals, which are analogous to the ones that render raw data sensitive.

The analysis showed the risk of identifying individuals through raw data is ‘not covered’ given the GDPR only tackles personal data and clearly cannot be applied to raw data, which is largely used by the IoT technology.

4.2 Data Breaches

The second category’s objective was to analyse whether the regulation is able to provide an accurate remedy to IoT appliances’ unique technological vulnerabilities, thus minimizing the frequency of data breaches and enforcing guidelines on how to manage incidents to reduce the likelihood of data exposure. In this respect, the adoption of multiple household appliances in people’s routines increases the volume of data transferred in the networks and expands the likelihood of security breaches. Unfortunately, the introduction of IoT appliances in the market also means that, due to products’ constrained resources, the encryption of transmissions is compromised in several devices and that functionality and design may be prioritized over security, leaving consumers with multiple vulnerable devices constantly collecting sensitive data.

In this regard, the document exclusively addresses data breaches involving the disclosure of personal data, namely a ‘personal data breach’, defined as “a breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data transmitted, stored or otherwise processed” (“GDPR General Data Protection Regulation,” 2016 Chapter I, Article 4, Point 12). Subsequently, the GDPR elucidates potential consequences of data breaches and stipulates that, after becoming aware, controllers must notify occurrences to supervisory authorities within 72 hours “unless the personal data breach is unlikely to result in a risk to the rights and freedoms of natural persons” (“GDPR General Data Protection Regulation,” 2016 Chapter IV, Section 2, Article 33, Paragraph 1). Under these circumstances, data subjects must also be informed without undue delay (“GDPR General Data Protection Regulation,” 2016 Chapter IV, Section 2, Article 34, Paragraph 1). Accordingly, “the facts relating to the personal data breach, its effects and the remedial action taken” (“GDPR General Data Protection Regulation,” 2016 Chapter IV, Section 2, Article 33, Paragraph 5) shall be documented, allowing the verification of compliance by the supervisory authority.
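The 72-hour window of Article 33(1) runs from the moment the controller becomes aware of the breach. As a minimal illustration only, the timing rule can be expressed in code; the function names and their use here are this section’s own hypothetical sketch, not anything prescribed by the GDPR.

```python
from datetime import datetime, timedelta, timezone

# Article 33(1): notification within 72 hours of the controller becoming aware.
NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(became_aware: datetime) -> datetime:
    """Latest moment a controller may notify the supervisory authority."""
    return became_aware + NOTIFICATION_WINDOW

def is_timely(became_aware: datetime, notified_at: datetime) -> bool:
    """True if the notification falls within the 72-hour window."""
    return notified_at <= notification_deadline(became_aware)
```

Note that the regulation also permits a late notification if it is “accompanied by reasons for the delay”, a nuance this simple timing check does not capture.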

Hence, in order to comply with the regulation, IoT household appliances must be able to notify the controller or the supervisory authority whenever a personal data breach takes place, prompting a deliberation on whether consumers need to be informed of the incident. Following this reasoning, the gravity of the data breach, its adverse effects and its consequences will be taken into account by the supervisory authority assessing the occurrence (“GDPR General Data Protection Regulation,” 2016 Recital 87). Nevertheless, IoT devices might not be designed to successfully comply with the law.
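The notification logic described above can be made concrete in a minimal sketch. This is purely illustrative: the GDPR does not prescribe any implementation, and the names used here (`BreachRecord`, `notification_deadline` and the two risk flags) are hypothetical constructs mapping onto the thresholds of Articles 33(1) and 34(1).

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class BreachRecord:
    detected_at: datetime          # when the controller became aware
    likely_risk_to_subjects: bool  # Article 33(1) threshold
    high_risk_to_subjects: bool    # Article 34(1) threshold
    facts: str                     # Article 33(5): facts, effects, remedies

def notification_deadline(breach: BreachRecord) -> datetime:
    """The supervisory authority must be notified within 72 hours of awareness."""
    return breach.detected_at + timedelta(hours=72)

def must_notify_authority(breach: BreachRecord) -> bool:
    """Article 33(1): notify unless the breach is unlikely to result in risk."""
    return breach.likely_risk_to_subjects

def must_notify_subjects(breach: BreachRecord) -> bool:
    """Article 34(1): inform data subjects only when the risk is high."""
    return breach.high_risk_to_subjects
```

The sketch makes the asymmetry of the regulation visible: the documentation duty of Article 33(5) applies to every breach, while the two notification duties depend on risk assessments the GDPR leaves to the controller and the supervisory authority.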

Additionally, the document stipulates that personal data must be processed with proper security and confidentiality through “appropriate technical or organizational measures” (“GDPR General Data Protection Regulation,” 2016 Chapter II, Article 5, Paragraph 1). Thus, Article 32 (“GDPR General Data Protection Regulation,” 2016 Chapter IV, Section 2, Article 32, Paragraph 1) enumerates a few measures that shall be applied ‘as appropriate’, for instance the pseudonymization and encryption of personal data. The regulation is cautious and also extends this requirement to processors: “the controller shall use only processors providing sufficient guarantees to implement appropriate technical and organisational measures” (“GDPR General Data Protection Regulation,” 2016 Chapter IV, Section 1, Article 28, Paragraph 1). Overall, however, the GDPR refrains from specifying mandatory security requirements, leaving security measures open to controllers’ judgment calls and to supervisory authorities’ case-by-case deliberation. Thus, it is difficult to properly assess which security measures need to be embedded in IoT appliances in order to guarantee data protection and compliance.
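To illustrate one of the measures Article 32 names ‘as appropriate’, the following is a minimal, assumed sketch of pseudonymization using a keyed hash. The function name and the device identifier are hypothetical; in practice the secret key would have to be stored separately from the pseudonymized data, and key management is out of scope here.

```python
import hashlib
import hmac

# Placeholder key: in a real deployment this would be provisioned and
# stored separately from the pseudonymised records.
SECRET_KEY = b"stored-separately-from-the-data"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier (e.g. an appliance serial number)
    with a keyed hash.

    Without the key, the output cannot be attributed to a data subject;
    with the key, the controller can still link records consistently.
    """
    digest = hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()

# The same input always maps to the same pseudonym under the same key,
# so records remain linkable without exposing the raw identifier.
assert pseudonymise("fridge-SN-0042") == pseudonymise("fridge-SN-0042")
```

The sketch also shows why pseudonymization alone is not anonymization: anyone holding the key can re-identify the records, which is precisely why the GDPR still treats pseudonymized data as personal data.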

All in all, the risk of a data breach is ‘partially covered’ by the GDPR, since the regulation defines how to proceed in case of a personal data breach, but it does not set compulsory security requirements to prevent them from happening.

4.3 Inaccurate Definitions of ‘Data’ and ‘Goods’

The definitions of ‘data’ and ‘goods’ are crucial to the drafting of data protection law. Since legal concepts are continually updated to ensure a correct framing, definitions that are adequate today may soon be considered insufficient, leaving individuals only partially protected by the regulation. In this respect, IoT devices pose a new challenge to the conceptualization of ‘goods’ and ‘data’ because of their simultaneous engagement in the online and offline domains, which is typically not envisaged by law. Thus, if not accurately defined, these concepts might compromise the extent to which IoT appliances are subject to privacy regulations and substantially undermine consumers’ privacy.

Accordingly, the analysis showed that the GDPR merely specifies the meanings of ‘personal data’, ‘genetic data’, ‘biometric data’ and ‘data concerning health’, and that the framework does not conceptualize ‘goods’, nor does it mention which already established definition must be considered when interpreting the regulation.

Hence, the risks posed by inaccurate definitions of both concepts are ‘not covered’ by the regulation, since neither ‘data’ nor ‘goods’ is defined in the document.

4.4 Inadequate Authentication Methods

The implementation of appropriate authentication methods and anonymity is key to minimizing the possibility of data theft. However, in order to achieve the standardization of authentication methods and guarantee a more reliable scenario for data protection, secure authentication methods need further development and upgrading. The release of numerous household appliances with IoT into the market, aligned with the monetization of data, poses major obstacles to the correct implementation of data authentication and anonymization methods, considering the increasing collection of sensitive data by multiple devices and the complexity of ensuring the minimization of data collection. Moreover, guaranteeing anonymity from the first interaction with clients and defining in which cases it is imperative to offer anonymity to users require heavy investments in advanced technologies, which is often not feasible for businesses. To tackle these issues, the GDPR should specify acceptable login security, authentication methods and anonymity requirements, which would allow the IoT technology to flourish while preserving privacy and reassuring consumers’ trust.
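As an illustration of what a lightweight authentication method for a constrained appliance could look like, the following is an assumed sketch of a keyed challenge-response handshake. The key name and function names are hypothetical, and a real deployment would rely on an established protocol (for instance TLS with client certificates) rather than a hand-rolled scheme; the sketch only demonstrates the principle of proving key possession without ever transmitting the key.

```python
import hashlib
import hmac
import secrets

# Placeholder per-device secret, assumed to be provisioned at manufacture.
DEVICE_KEY = b"per-device-secret-provisioned-at-manufacture"

def issue_challenge() -> bytes:
    """Server side: generate an unpredictable nonce for this session."""
    return secrets.token_bytes(16)

def device_response(challenge: bytes, key: bytes = DEVICE_KEY) -> bytes:
    """Device side: prove possession of the key without transmitting it."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def verify(challenge: bytes, response: bytes, key: bytes = DEVICE_KEY) -> bool:
    """Server side: recompute and compare in constant time
    to avoid timing attacks."""
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)
```

Because each challenge is a fresh random nonce, a captured response cannot be replayed in a later session, which is one of the properties a standardized authentication requirement for IoT appliances would need to mandate.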

In that sense, the regulation establishes:

[…] the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk, including inter alia as appropriate:

(a) the pseudonymisation and encryption of personal data;

(b) the ability to ensure the ongoing confidentiality, integrity, availability and resilience of processing systems and services;

(c) the ability to restore the availability and access to personal data in a timely manner in the event of a physical or technical incident;

(d) a process for regularly testing, assessing and evaluating the effectiveness of technical and organisational measures for ensuring the security of the processing. (“GDPR General Data Protection Regulation,” 2016 Chapter IV, Section 2, Article 32, Paragraph 1)
