
New regulations for informed consent;

the impact of opt-choice and ambiguity

on privacy decisions

By Matthijs van der Linden

Student: Matthijs van der Linden Student id: 10060235

Supervisor UvA: Mieke Kleppe

Supervisor KPMG: Ali Ougajou

Date: 22/07/2016

In partial fulfilment of the requirements for the degree of:

Master of Science in Information Studies, track: Business Information Systems (BIS), University of Amsterdam


Abstract

As information and communication technologies develop, the diffusion of personal information increases. New technological developments such as the Internet of Things and social networking call for new data protection regulations. To address modern-day issues, the European Commission published the new General Data Protection Regulation (GDPR) on the 14th of April, 2016. The new requirements for ‘informed consent’ in particular received enormous attention in the new regulation: privacy statements containing pre-ticked boxes that assume individuals consent by default (opt-out choices) and ambiguous text are forbidden. The current research investigated the effects of these new requirements on the willingness of Dutch citizens to disclose information. In this research, 208 respondents were randomly assigned to one of four conditions: unambiguous-opt-in, unambiguous-opt-out, ambiguous-opt-in and ambiguous-opt-out. A questionnaire was designed to measure privacy concern (PC), perceived privacy risk (PR) and willingness to provide information (WPI). Results indicate that an unambiguous privacy statement increases privacy concern and perceived privacy risk. Increased privacy concern and perceived privacy risk decrease the willingness to provide personal information. Neither ambiguity nor opt-choice was found to influence WPI directly. Respondents presented with an unambiguous privacy statement read the privacy statement more often and consented less often to the processing of personal information. From the results it is concluded that an unambiguous privacy statement can contribute to a more informed consent. However, more research on framing methods is needed in order to encourage individuals to read the statements.


Table of Contents

1. Introduction
1.1 Informed consent
1.2 The economics of privacy
1.3 Rational choice theory
1.4 Problems with informed consent
1.4.1 Information asymmetries
1.4.2 Transaction cost: rational ignorance
1.5 Behavioral economics of privacy
1.6 Framing
1.7 Present-bias
1.8 Status quo bias
2. Methods
3. Results
3.1 Hypotheses testing
3.2 Exploratory analyses
4. Conclusion and Discussion


1. Introduction

The global adoption of the internet and the more recent popularity of social networks and Internet of Things technologies have significantly increased the amount of personal information that consumers disclose, increasing the possibilities for privacy and data protection complications. In a society where both people and ‘smart’ objects are connected through the internet, privacy issues are bound to arise (Lodder & Wisman, 2016). While the communication streams between ordinary computers already contain personal information, the new information streams resulting from Internet of Things (henceforth: IoT) technologies provide new, more privacy-sensitive data. The location and speed of a car, for example, can be communicated through a network to third parties that might value this data for commercial purposes. Another example is the smart meter, which can share private and detailed information about household activities with utility providers (Cecchet et al., 2010). The large amount of data resulting from the increased usage of connected devices has enormous business value. However, the processing of private information can also raise privacy concerns among consumers, ultimately resulting in reputational damage for organizations (Hirsch, 2013). As the demand and concern for privacy seem to increase, it is increasingly important for organizations to adjust their privacy strategy accordingly. Companies are constantly balancing privacy concerns, regulatory changes and technological innovations (Chan et al., 2005).

As consumers seem to be increasingly concerned with privacy, the regulatory landscape is also drastically changing. The existing European privacy legislation (European Data Directive) dates back to 1995, long before the adoption of the internet, social networking and IoT technologies changed the data processing procedures drastically. While the existing regulatory framework was established under the assumption that data was stored and processed in a centrally kept and easily-traceable registry, current data processing procedures often take place with multiple multitasking actors storing data using cloud-computing technologies (De Hert & Papakonstantinou, 2012).

These new technological developments ask for a new generation of data protection regulations (Tene, 2011). In order to harmonize privacy legislation and increase the protection of privacy for individuals, the EU published the new General Data Protection Regulation (GDPR) on the 14th of April, 2016. The goal of the reform package, as stated by the European Commission, is “to give citizens back control over their personal data, and to simplify the regulatory environment for business” (Protection of personal data, European Commission, 2016). Whereas under the original privacy regulations the consequences of non-compliance were very low, the new regulation will enable governments to fine organizations up to 4% of their annual turnover.

The requirements for ‘informed consent’ in particular are viewed as an important means of giving citizens control over personal information (De Hert & Papakonstantinou, 2016). The GDPR will force organizations to be transparent towards EU citizens about the processing of personal information, asking for ‘informed consent’ in a clear and understandable fashion. Moreover, companies are not allowed to use opt-out procedures with pre-checked boxes, assuming consumers consent by default. The final GDPR reform package affects virtually all processing of European citizens’ personal data. It is argued that the GDPR will directly affect the way European citizens live and do business (De Hert & Papakonstantinou, 2016; European Commission, 2016; Imperiali, 2012).

Strict consent requirements are viewed as the consumer’s last defense to control the processing of personal information and the resulting loss of privacy (De Hert & Papakonstantinou, 2016). In the scientific literature it is argued that the current privacy statements asking for consent are failing, as no one reads them. It is widely accepted that the statements are too long and vague, making the cost of reading them too high (Cranor & McDonald, 2008). Several studies suggest that even when a privacy notice is clearly presented, consumers do not always adjust their behavior accordingly (Acquisti & Grossklags, 2007; Good et al., 2007; Spiekermann et al., 2001). According to Zuiderveen Borgesius (2013), even if people would read a privacy statement, they would not understand it. Additionally, Zuiderveen Borgesius argues that in order for consent requirements to be effective, the privacy statement should not only be clear, but should also offer opt-in choices without take-it-or-leave-it conditions: consumers have to actively agree with the statement, e.g. by checking a box, and the service cannot be denied when consumers do not give consent.

Although the arguments concerning privacy statements and consent requests in the literature might seem plausible, very little experimental research has been done to support these claims. It remains unclear whether understandable privacy statements with opt-in and without take-it-or-leave-it choices will make consumers read the privacy statement, affect consumers’ privacy attitudes and ultimately affect the disclosure of personal information online. To address this gap, the current research investigates the effect of opt-in and ambiguity on the willingness to provide personal information on the internet. As the new General Data Protection Regulation forces companies to be more transparent about their personal information processing procedures, this might affect citizens’ intention to disclose personal information. Based on the above, this research aims to investigate the underlying assumptions of the new regulatory requirements concerning ‘informed consent’. Specifically, it is investigated whether the requirements for transparency/unambiguity towards consumers about personal information processing, along with the prohibition of opt-out choices, affect consumers’ willingness to disclose personal information. For this purpose, the following research question is defined:

‘How do the consent requirements concerning opt-choice and ambiguity in the GDPR regulation affect the willingness to provide personal information of Dutch citizens?’

To answer the research question, the changes in the General Data Protection Regulation and the problems with the previous regulations regarding ‘informed consent’ are discussed. To gain further insight into the shortcomings of previous informed consent requirements, the privacy decision-making process is viewed from both an economic and a behavioral economic perspective.

1.1 Informed consent

In data privacy law, there is an enormous focus on ‘informed consent’ as a way to protect the privacy of individuals. Around the world, privacy regulations stress the importance of obtaining informed consent from individuals before processing their information. In the United States, for example, the Privacy Bill of Rights states that personal information can only be accessed with the approval and knowledge of the individual. Similarly, the Federal Trade Commission argues that clear notice and opt-in consent are essential when obtaining personal information (Zuiderveen Borgesius, 2015a). Individual consent is the legal ground on which the vast majority of information processing is based. Therefore, the consent requirements as stated in the GDPR are viewed as one of the most important aspects of the regulation. Consent requirements can be seen as the last weapon individuals have to protect their control over the processing of personal information (De Hert & Papakonstantinou, 2016).

Consent requirements can give a certain control to the individual, ensuring that information processing will only take place in accordance with the specifications approved by the individual. However, existing consent requirements have proved unsuccessful in achieving this goal. In recent years, companies have installed consent procedures using privacy statements in order to comply with privacy regulations. These privacy agreements are often incomprehensible to consumers and are designed solely with the intention of complying with legal requirements. Moreover, European Member States have interpreted the EU Directive differently and defined different legal requirements for privacy policies, making it hard for organizations to comply (De Hert & Papakonstantinou, 2012).

In order to harmonize and improve consent requirements, consent has received a huge focus within the GDPR. In the GDPR proposal designed by the European Commission, an ambitious attempt has been made to define consent requirements in order to empower individuals and make consent notices clearer, truly informing individuals before they consent (De Hert & Papakonstantinou, 2016).

In Article 4(11) of the GDPR, consent is defined as: “any freely given, specific, informed and unambiguous indication of the data subject's wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her” (General Data Protection Regulation, 2016, p. 113). Although most of the wording of this definition is an exact copy of the definition in Directive 95/46/EC, the words ‘unambiguous’ and “by a statement or by a clear affirmative action” have been added. These additions change the requirement for consent significantly, ruling out opt-out (pre-ticked box) consent regimes (De Hert & Papakonstantinou, 2016). In line with this, recital 32 elaborates on this notion: “Consent should be given by a clear affirmative act establishing a freely given, specific, informed and unambiguous indication of the data subject's agreement to the processing of personal data relating to him or her, such as by a written statement, including by electronic means, or an oral statement. This could include ticking a box when visiting an internet website, choosing technical settings for information society services or another statement or conduct which clearly indicates in this context the data subject's acceptance of the proposed processing of his or her personal data. Silence, pre-ticked boxes or inactivity should not therefore constitute consent” (General Data Protection Regulation, 2016, p. 18).

According to Zuiderveen Borgesius (2015a), in order for regulatory measures ensuring informed consent to be effective, it is crucial that both opt-out and take-it-or-leave-it choices are banned. It is argued that a company should not be allowed to deny a service to an individual when he or she does not give consent for the processing of personal information that is irrelevant to the functioning of the service (henceforth: take-it-or-leave-it choices). For example, in reaction to the national cookie laws, based on the e-Privacy Directive, companies have found clever ways to make individuals consent to tracking. By using ‘cookie walls’, websites force individuals to consent to the use of cookies, denying access to anyone who does not give consent (Zuiderveen Borgesius, 2013). To avoid this, Article 7 states that organizations should be able to prove whether consent is freely given and: “When assessing whether consent is freely given, utmost account shall be taken of whether, inter alia, the performance of a contract, including the provision of a service, is conditional on consent to the processing of personal data that is not necessary for the performance of that contract” (General Data Protection Regulation, 2016, p. 122). Another problem with the current situation is the fact that people consent to privacy policies that are often confusing and incomprehensible. People don’t bother to read the policies and, if they did, they wouldn’t understand them (Zuiderveen Borgesius, 2013; Ben-Shahar & Schneider, 2011). In response to this, the GDPR states that the request for consent should be “intelligible”, “easily accessible” and written in “clear and plain language” (General Data Protection Regulation, 2016, p. 122).

The above changes show that fundamental lessons have been learned from the failing consent requirements in Directive 95/46/EC. However, it remains unclear what the effect of these new requirements will be. In order to gain further insight, the following sections analyze the problems of consent requirements from an economic and behavioral economic perspective.


1.2 The economics of privacy

In 2010, researcher Brett Mills studied a BBC wildlife documentary in order to assess whether the documentary considered (or ignored) animal ethics and animal privacy. Mills argues that animals often do not want to be filmed, and that “production teams use newer forms of technology to overcome species’ desire not to be seen” (p. 1). Obviously, animals cannot give their consent to filming, leaving open the possibility of privacy infringement by humans. Therefore, Mills suggests separating what is private from what is not private in the animal world, just like we do for humans (Brandimarte & Acquisti, 2012; Mills, 2010).

Although the ideas proposed by Mills might seem ridiculous, they perfectly illustrate the ambiguity of the privacy concept. Over the last decades, researchers in various fields have struggled to define privacy and to identify the concepts that constitute it. One of the most influential researchers on the topic of information privacy, Acquisti (2010a), views privacy from the perspective of economics. Economics, as defined by Robbins & Robbins (1969), is “the science which studies human behavior as a relationship between ends and scarce means which have alternative uses” (as cited in Zuiderveen Borgesius, 2013, p. 21). According to Acquisti, privacy considerations can be understood as trade-offs. The role of trade-offs and incentives in privacy has gained significant attention in academic research over the years. This field of ‘economics of privacy’, as described by Acquisti, is concerned with understanding and measuring the trade-offs related to privacy (Acquisti, 2010a; Zuiderveen Borgesius, 2013). In this respect, privacy is defined as “an interaction, in which the information rights of different parties collide. The issue is of control over information flow by parties that have different preferences over information permeability.” (Noam, 1997, p. 2).

While the research field of the economics of privacy is relatively novel, economic trade-offs resulting from privacy concerns are nothing new. Through technological improvements in general, and the world-wide adoption of the internet in particular, the collection and processing of data has become cheaper than ever. Brandimarte & Acquisti (2012) state that because of these developments, the trade-offs, along with their opposing interests, become more apparent. When using a social network, for example, users disclose personal information online in return for free usage of the service. The information internet users disclose can contribute to their online experience, for instance by showing more relevant advertisements through behavioral tracking. However, when this information is used to offer individual users different prices for products, it can considerably disadvantage users. Although these trade-offs are becoming apparent, many internet users are still not aware that they pay for free services with their personal information (Acquisti, 2010b).

1.3 Rational choice theory

In economics, human behavior is explained using rational choice theory. Rational choice theory assumes that humans are rational agents (homines economici), weighing costs and benefits to ultimately maximize utility. Additionally, it is assumed that people are always able to do so: that people always have a choice that maximizes their utility (Brandimarte & Acquisti, 2012). The economics of privacy and rational choice theory have been very influential in privacy and data protection legislation. As mentioned earlier, in both the US and the EU, notice and choice (consent) are viewed as the foundation of privacy protection. One scholar who contributed enormously to the adoption of this view in privacy legislation was Professor Westin. According to Westin, the majority of people favor a notice-and-choice approach in privacy policies. Westin states that the majority of people are ‘privacy balancers’, consciously weighing the benefits of disclosing information against the resulting risks. In doing so, consumers supposedly examine whether organizations use fair and responsible information practices and make decisions accordingly (Hoofnagle & Urban, 2014).

Two influential researchers who were among the first to adopt the economic perspective on information privacy were Posner and Stigler. However, contrary to Westin, Posner (1977) and Stigler (1980) opposed the idea of protecting privacy through regulations, stating that existing privacy protection regulations are perverse measures that distort the effectiveness of both consumers and organizations when trying to maximize utility. As Posner argues, there is a conflict between an individual’s right to conceal information and the opposing party’s right to receive information. The more control individuals have over their personal information, the higher the likelihood that they present themselves untruthfully in relationships. This would counter the right of other parties to receive accurate information and therefore lead to inefficiencies in society. In line with this thought, Stigler (1980) argues that governmental interference with privacy is redundant, as all parties are rational agents that rationally weigh the costs and benefits of disclosing information. Under normal conditions of competition, maximal efficiency and utility would be reached as a result of the trade-offs related to privacy (Brandimarte & Acquisti, 2012).

As advancements in technology offered new possibilities for the disclosure as well as the processing of personal information, the topic of privacy gained renewed attention among economists in the 1990s. Varian (1996), for example, was concerned with the new technological possibilities for privacy infringement. However, according to him, too much regulation of the diffusion of personal information could ultimately disadvantage the consumer. As consumers seek to maximize utility, they would be willing to disclose personal information in return for a certain gain, i.e. relevant advertisements and offerings. When not enough information is disclosed, this might negatively affect the profits of both consumers and organizations. However, unlike Posner and Stigler, Varian acknowledges that although there might be an agreement between organizations and individuals about the disclosure of personal information, the organization might sell this information to third parties, resulting in added costs for consumers that were not initially expected.

Today, the economics literature on privacy mainly focuses on investigating consumers’ considerations when making a trade-off between the cost of protecting and the benefit of disclosing personal information. Corresponding to rational choice theory, Culnan & Armstrong (1999), for example, argue that information disclosure is the result of a trade-off called ‘the privacy calculus’, adding that the intention to disclose information is positively affected by procedural fairness: when consumers view the procedures concerning the processing of information as fair, they are more likely to disclose personal information (Dinev & Hart, 2006).

Dinev & Hart (2006) created a model and questionnaire to “measure the beliefs that influence the behavioral intention to disclose the personal information necessary to successfully complete a transaction on the Internet” (p. 62). In this research, the beliefs were categorized as ‘risk beliefs’ and ‘confidence and enticement beliefs’ (see Table 1). The research provides empirical evidence showing that perceived internet privacy risk (PR) and internet privacy concerns (PC) can negatively influence the willingness to provide personal information (PPIT). According to the researchers, “A higher level of Internet privacy concerns is related to a lower level of willingness to provide personal information to transact on the Internet“ and “A higher level of perceived Internet privacy risk is related to a lower level of willingness to provide personal information” (p. 72). However, when consumers weigh the costs and benefits, the combined influence of internet trust (T) and personal internet interest (PI) can outweigh consumers’ perceived internet privacy risk (PR) and internet privacy concerns (PC).

Table 1. Constructs in the extended privacy model

| Construct category | Construct | Acronym | Definition |
|---|---|---|---|
| Willingness to act | Willingness to provide personal information to transact on the internet | PPIT | Willingness to provide personal information required to complete transactions on the internet |
| Risk beliefs | Perceived internet privacy risk | PR | Perceived risk of opportunistic behavior related to the disclosure of personal information submitted over the internet |
| Risk beliefs | Internet privacy concerns | PC | Concerns about opportunistic behavior related to the personal information submitted over the internet |
| Confidence and enticement beliefs | Internet trust | T | Trust beliefs reflecting confidence that personal information submitted to websites will be handled competently, reliably and safely |
| Confidence and enticement beliefs | Personal internet interest | PI | Personal interest or cognitive attraction to internet content overriding privacy concerns |

Note: Adapted from “Extended Privacy Calculus Model for E-Commerce Transactions”, Information Systems Research, 17(1), p. 64, by Dinev & Hart, 2006.
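The cost-benefit logic of the extended privacy calculus can be sketched as a simple additive model: risk beliefs (PC, PR) pull willingness down, while confidence and enticement beliefs (T, PI) pull it up. The weights below are made-up illustrative values, not estimates from Dinev & Hart; this is only a sketch of the trade-off the model describes.

```python
# Hypothetical sketch of the privacy calculus: willingness to disclose (PPIT)
# falls with risk beliefs (PC, PR) and rises with confidence/enticement
# beliefs (T, PI). All weights are invented for illustration only.

def privacy_calculus(pc, pr, t, pi,
                     w_pc=-0.5, w_pr=-0.4, w_t=0.6, w_pi=0.5):
    """Return an illustrative PPIT score from 1-7 Likert-style inputs."""
    return w_pc * pc + w_pr * pr + w_t * t + w_pi * pi

# A concerned user (high PC/PR, low T/PI) vs. an enticed user (low PC/PR, high T/PI):
concerned = privacy_calculus(pc=6, pr=6, t=2, pi=2)
enticed = privacy_calculus(pc=2, pr=2, t=6, pi=6)

# Confidence and enticement beliefs can outweigh risk beliefs, as the model predicts.
assert concerned < enticed
print(round(concerned, 1), round(enticed, 1))  # → -3.2 4.8
```

With these assumed weights, the same person flips from unwilling to willing as trust and interest rise, which is the "outweighing" effect described above.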


There are some problems with the assumptions of economics and the corresponding rational choice theory. For example, it is assumed that consumers are capable of rationally assessing the costs of personal information disclosure. In practice, surveys show that consumers are rarely aware of the contents of privacy statements and do not know what happens to their personal information, ruling out the possibility of a truly rational and informed decision (Hoofnagle & Urban, 2014; Zuiderveen Borgesius, 2013). In the following sections, the problems of the existing informed consent regulations are discussed from an economic perspective.

1.4 Problems with informed consent

Since the existing privacy regulations for informed consent are based on the assumptions of economics, and it is widely acknowledged that these regulations are not successful, it is essential to gain further insight into the shortcomings of privacy regulations from the economic perspective (Hoofnagle & Urban, 2014; Zuiderveen Borgesius, 2013). In this section, two important complications with informed consent will be discussed: information asymmetries and transaction costs.

1.4.1 Information Asymmetries

Information asymmetries can be defined as “a situation where one party possesses information about a certain product characteristic and the other party does not.” (Luth, 2010, as cited in Zuiderveen Borgesius, 2013, p. 27). In economics, information asymmetries are considered a threat to the effectiveness of the free market. For the correct functioning of the free market and for utility maximization, it is essential that both parties in a transaction possess equal information about the relevant parameters of a good or service (Grundmann, 2002). Information asymmetries can limit the extent to which individual rational agents can make truly informed decisions. By analogy, consider a buyer and seller of a car, both seeking to maximize utility. As the seller is already familiar with the car, he or she possesses more information about it than the buyer. Because the buyer is not fully informed about possible complications, he or she will offer the price an average car with those characteristics is worth. A possible outcome is that the buyer pays a higher price than the car is actually worth. At the same time, this also means that sellers of good cars get offered too little for their vehicle. By consequence, the market fails because goods and services are not evaluated based on quality, but on price (Akerlof, 1995; Grundmann, 2002; Zuiderveen Borgesius, 2013).
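The adverse-selection dynamic in the used-car analogy can be made concrete with a toy numeric example. All values below are assumed for illustration; the point is only that an uninformed buyer's rational offer underpays good sellers and overpays bad ones.

```python
# Toy "market for lemons" illustration of the car analogy above (values assumed).
# The buyer cannot tell good cars from lemons, so they offer the expected value
# of an average car given their beliefs about the quality mix.

good_value, lemon_value = 10_000, 4_000   # true worth to a fully informed buyer
share_good = 0.5                          # buyer's belief: half the cars are good

# Uninformed buyer's rational offer: the expected value over the quality mix.
offer = share_good * good_value + (1 - share_good) * lemon_value
print(offer)  # → 7000.0

assert offer < good_value   # owners of good cars are offered too little...
assert offer > lemon_value  # ...while lemon owners are overpaid
```

When good sellers refuse the low offer and exit, the average quality (and hence the rational offer) drops further, which is the market failure the paragraph describes.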

In consumer privacy, information asymmetries also pose a threat. Results from numerous surveys show that consumers are often not aware of the extent to which their information is processed. Cranor & McDonald (2008), for example, found that only 40% of respondents thought that their email providers scanned emails, while 62% of respondents reported being very uncomfortable with providers scanning their e-mails. When asked for consent, consumers initially possess less information about the processing of personal information than the provider of the good or service. Although it is possible to read the privacy policies, consumers rarely do so, and even if they did, they would most likely not understand them (Zuiderveen Borgesius, 2013). According to Acquisti & Grossklags (2007), “The complexity of the privacy decision environment leads individuals to arrive at highly imprecise estimates of the likelihood and consequences of adverse events, and altogether ignore privacy threats and modes of protection” (p. 365). In the case of consumer privacy, the rational consumer cannot determine the costs of disclosing personal information and weigh them against the benefits. Therefore, a truly informed, rational decision is impossible to make.

The solution to the problem of information asymmetries seems obvious: provide consumers with all the information. To avoid information asymmetries, current legislation requires organizations to do so. However, these requirements are unsuccessful; the following section explains why.

1.4.2 Transaction cost: rational ignorance

When the cost of informing oneself about a certain decision exceeds the benefits of that decision, ignorance can be a rational choice (Acquisti & Grossklags, 2007). In the case of privacy decisions, individuals rarely take the effort to inform themselves by reading privacy policies. Even for an individual concerned with privacy protection, reading privacy policies would take up an enormous chunk of time. Not only are privacy policies too long, they are often very hard to understand for the majority of people. While privacy policies are required to ensure transparency, they generally fail to make data processing procedures clearer. Research shows that even lawyers have trouble understanding the contents of privacy policies. And even if people fully understood the privacy policies, it would be hard to determine the consequences and likelihood of privacy infringement, as companies are able to collect data from different sources that, in combination, can constitute a detailed profile (Verhelst, 2012; Zuiderveen Borgesius, 2013).

In their research, Cranor & McDonald (2008) attempted to quantify the cost of reading all the privacy policies an individual encounters. Based on existing literature on the monetary value of time, they estimate that the national cost in the US, if everyone were to read the privacy policies they encounter, would be approximately $781 billion a year, corresponding to about 40 minutes a day per person.
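A back-of-the-envelope version of such an estimate shows how "40 minutes a day" scales to a national figure. The online population and the dollar value of an hour below are illustrative assumptions chosen to land near the reported total, not the paper's exact inputs.

```python
# Illustrative reconstruction of a Cranor & McDonald-style reading-cost estimate.
# The parameter values are assumptions for this sketch, not the paper's inputs.

minutes_per_day = 40          # reading time per person, from the estimate above
online_population = 214e6     # assumed number of US internet users
value_of_hour = 15.0          # assumed blended dollar value of an hour of time

hours_per_year = minutes_per_day / 60 * 365        # ~243 hours per person
national_cost = hours_per_year * value_of_hour * online_population

print(f"${national_cost / 1e9:.0f} billion per year")  # → $781 billion per year
```

With these assumed parameters the total lands in the same ballpark as the ~$781 billion figure in the text, illustrating why the transaction cost of reading is considered prohibitive.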

Some researchers have proposed solutions to protect privacy based on free market principles instead of regulation. Laudon (1996), for example, proposed a ‘National Information Market’, where people can sell their private information in exchange for small payments. Cranor & McDonald (2008) state that the time needed to read a privacy statement can itself be seen as a form of payment. Instead of receiving payments, consumers now pay for their privacy protection with their time.

Because consumers are unwilling to pay for their privacy protection with their time, privacy policies are unsuccessful in protecting privacy. According to the European Commission, the cost of reading privacy policies can be diminished by requiring privacy policies to be more concise and clear (GDPR, 2016). However, the effect of clear and unambiguous privacy policies remains unclear; therefore, this paper empirically tests the assumptions proposed in the literature. Several studies suggest that even when a privacy notice is clearly presented, consumers do not always adjust their behavior accordingly (Acquisti & Grossklags, 2007; Good et al., 2007; Spiekermann et al., 2001).

Using insights from behavioral economics, the next section provides a further understanding of the problems with informed consent. Based on these insights, new solutions can be deduced to improve informed consent requests.


1.5 Behavioral economics of privacy

In response to the shortcomings of traditional economics, a more psychological approach to rational choice theory gained popularity: behavioral economics (Kahneman, 2003). As defined by Acquisti & Grossklags (2007), “Behavioral economics studies how individual, social, cognitive, and emotional biases influence economic decisions” (p. 368). Because of the complex environment of privacy decisions, individuals are likely to be affected by numerous behavioral and cognitive boundaries described in the behavioral economics literature.

One of the first critics of the assumptions of traditional economic theory was Herbert Simon (1955). According to Simon, actual human behavior often does not correspond to the behavior one would expect based on rational choice theory. A 'rational' human is assumed to be able to calculate the exact consequences of a decision: a rational man is always capable of determining which outcome is better than another, comparing the probabilities of the different outcomes and making decisions accordingly. In his research, Simon proposed an alternative to the rational choice model, taking into account the boundaries of the accessible information in the environment and the cognitive (computational) capacity of man. Simon calls this 'bounded rationality': rational choice that takes into account the cognitive limitations of the decision-maker, as well as the limitations of the environment (Simon, 1955; Zuiderveen Borgesius, 2013).

Kahneman & Tversky (1981) extensively researched the different aspects of decision making, taking into account the circumstances that cause individuals to make decisions that violate the rules of rational choice and the corresponding utility maximization. In their papers, the two researchers describe how different biases, heuristics and framing effects can cause individuals to make decisions against their own interest.

1.6 Framing

The way in which a situation is framed strongly affects the decisions an individual makes. The attractiveness of options, and the resulting preferences of individuals, change as the options are framed differently. By analogy, consider traveling through the mountains: while travelling, the relative heights of the mountain peaks seem to vary as your position changes. A mountain that initially seemed smaller than its neighbor now seems to be larger. When noticing this, one must conclude that some estimations of relative heights are likely to be false (Kahneman & Tversky, 1981).

Kahneman & Tversky (1981) highlight different examples of cases where decisions are strongly affected by the presented frame. For example, the value difference between $10 and $20 is perceived as larger than the difference between $110 and $120. Another important aspect of the so-called 'value function' is the difference between losses and gains. People respond more extremely to losses than to gains. As Kahneman & Tversky state, “The displeasure associated with losing a sum of money is generally greater than the pleasure associated with winning the same amount” (p. 5). Because people respond more extremely to losses than to gains, people are more likely to choose an option that is framed as preventing a loss than one framed as acquiring a gain.
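Both properties can be sketched with a prospect-theory value function. The exponent and loss-aversion coefficient below are the commonly cited Tversky & Kahneman (1992) estimates, used here purely for illustration:

```python
# Illustrative prospect-theory value function.
# ALPHA and LAMBDA are the often-quoted 1992 estimates, assumed here.

ALPHA = 0.88      # diminishing sensitivity
LAMBDA = 2.25     # loss aversion

def value(x: float) -> float:
    """Subjective value of a gain (x > 0) or loss (x < 0)."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

# Diminishing sensitivity: $10 -> $20 feels bigger than $110 -> $120.
assert value(20) - value(10) > value(120) - value(110)

# Loss aversion: losing $10 hurts more than winning $10 pleases.
assert abs(value(-10)) > value(10)
```

The concavity of the function produces the first effect; the extra weight on losses produces the second.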

In privacy legislation, specifically concerning consent requirements, these insights from behavioral economics can be valuable. When asking consumers for their consent, the way in which the consent request is framed is essential for its approval or disapproval by the consumer. For example, Good et al. (2007) found that consumers believe a privacy statement to be more trustworthy when the statement is framed in ambiguous and vague language than when it is clear and detailed. Numerous studies show that people are largely unaware of the extent to which their information is processed, and would be very uncomfortable with it if they did know (Zuiderveen Borgesius, 2013). According to Dinev & Hart (2006), privacy concerns and requirements to provide personal information are two of the most important factors that discourage users from making use of online services.

Based on the insights from both economics and behavioral economics, the hypotheses below are defined using the 'risk belief' constructs from the extended privacy calculus model developed by Dinev & Hart (2006), as depicted in Table 1. Based on the notion of transaction costs, along with framing effects, the following hypotheses are defined:

H1: An unambiguous privacy statement will increase Privacy Concern (PC)

H2: An increase in Privacy Concern will decrease the Willingness to Provide Information (WPI) (Dinev & Hart, 2006)

H3: An unambiguous privacy statement will decrease the Willingness to Provide Information (WPI)


H4: An unambiguous privacy statement will increase Perceived Privacy Risk (PR)

H5: An increase in Perceived Privacy Risk will decrease the Willingness to Provide Information (WPI) (Dinev & Hart, 2006)

H6: The effect of ambiguity on WPI is mediated by PC

H7: The effect of ambiguity on WPI is mediated by PR

1.7 Present-bias

In order to gain further understanding of the decision-making process when confronted with a privacy statement, another psychological phenomenon should be discussed: the 'present bias'. In the 1960s and 1970s, behavioral psychologist Walter Mischel conducted what came to be called 'the Stanford marshmallow experiment'. In this experiment, the researcher gave children a marshmallow. After doing so, the children were told that if they did not eat the marshmallow immediately, the researcher would come back a little later with an extra marshmallow. Although most children seemed to want an extra marshmallow, the majority could not resist eating the marshmallow immediately. According to Mischel, children's ability to delay gratification - to not eat the first marshmallow immediately - is predictive of their cognitive, social and mental health status in the future. A vast number of longitudinal studies in the 40 years after Mischel's first research on delayed gratification confirmed these findings (Mischel et al., 2011). In summary, people tend to be focused on the present, even when this will result in disadvantages in the future. In behavioral economics, this phenomenon is called the present bias (Zuiderveen Borgesius, 2013). For example, research indicates that present-biased individuals have higher credit card debts and lower socio-economic status (Meier & Sprenger, 2010).

When individuals are asked to accept a privacy statement, the present bias provides an explanation for their behavior. Present-biased individuals might accept privacy statements for immediate gratification (access to a service), even when the consequences will disadvantage them in the future. People are likely to consent because they are inclined to choose immediate gratification (Zuiderveen Borgesius, 2013).
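In behavioral economics the present bias is often modeled with quasi-hyperbolic (beta-delta) discounting. The sketch below, with illustrative parameter values, shows the characteristic preference reversal: a smaller immediate reward beats a larger delayed one, but the preference flips once both rewards lie in the future:

```python
# Minimal beta-delta (quasi-hyperbolic) discounting sketch of the
# present bias. BETA and DELTA are illustrative parameter choices.

BETA = 0.7    # present bias: extra discount on anything not immediate
DELTA = 0.99  # standard per-period discount factor

def discounted(utility: float, delay: int) -> float:
    """Present value of a reward received `delay` periods from now."""
    if delay == 0:
        return utility
    return BETA * (DELTA ** delay) * utility

# Today: a small immediate reward beats a larger one tomorrow...
assert discounted(10, 0) > discounted(12, 1)
# ...but with both rewards pushed 30 days out, the preference reverses.
assert discounted(10, 30) < discounted(12, 31)
```

The immediate reward of access to a service plays the role of the first marshmallow; the delayed privacy cost plays the role of the forgone second one.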


1.8 Status quo bias

According to rational choice theory, people make decisions based on their preferences. In practice, however, the decision-making process is different. Many decisions are influenced by the status quo bias: people are inclined to stay with the default option. A substantial body of empirical research shows that the status quo bias is an important influence on decision making (Samuelson & Zeckhauser, 1988). Samuelson and Zeckhauser (1988) found that default options were a good predictor of an individual's choice when filling in a questionnaire. Moreover, the more answer choices in the questionnaire, the bigger the effect of the default option.

The status quo bias is also an important influence on privacy decisions. When consumers are asked for consent, whether the boxes 'I agree'/'I disagree' are pre-ticked is thought to have a big impact on privacy decisions. Understanding the status quo bias is important in the fierce discussion about opt-in versus opt-out systems. In essence, choosing between opt-in and opt-out systems is choosing who profits from the status quo bias. Organizations will likely prefer opt-out, as the pre-ticked box will increase the likelihood of collecting data. Likewise, consumers and privacy advocates will prefer opt-in, to prevent personal data from being processed (Zuiderveen Borgesius, 2013; 2015b). In their research, Lai & Hui (2006) found that providing respondents with an opt-in frame evoked positive attitudes towards participation in online activity, even when respondents were initially highly concerned with privacy.
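The claim that the default-setter profits can be made concrete in a toy model. The share of users who simply keep the default and the baseline preference for consenting are invented numbers, used only to show the mechanism:

```python
# Toy model of the status quo bias in consent decisions.
# STICK_WITH_DEFAULT and PREFER_CONSENT are illustrative assumptions.

STICK_WITH_DEFAULT = 0.6   # share of users who never change the box
PREFER_CONSENT = 0.5       # consent rate among users who actively choose

def consent_rate(default_is_consent: bool) -> float:
    """Overall consent rate given which option is pre-selected."""
    active = (1 - STICK_WITH_DEFAULT) * PREFER_CONSENT
    passive = STICK_WITH_DEFAULT if default_is_consent else 0.0
    return passive + active

opt_out = consent_rate(True)    # pre-ticked 'I agree' box
opt_in = consent_rate(False)    # box starts unticked
assert opt_out > opt_in         # the party choosing the default profits
```

With these numbers the opt-out design yields an 80% consent rate against 20% under opt-in, even though underlying preferences are identical in both conditions.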

As mentioned, the new General Data Protection Regulation states that 'Silence, pre-ticked boxes or inactivity should not constitute consent' (General Data Protection Regulation, 2016, p. 18). Based on the status quo bias, it is assumed this new consent requirement will decrease the likelihood that someone gives consent to the processing of personal information. In line with the findings of Lai & Hui (2006), it is hypothesized that opt-out choices evoke negative attitudes. Based on the above-mentioned literature, the following hypotheses are defined:

H8: A privacy statement with an opt-out choice will increase Privacy Concern (PC)

H9: A privacy statement with an opt-out choice will increase Perceived Privacy Risk (PR)

H10: A privacy statement with an opt-out choice will decrease the Willingness to Provide Information


H11: The effect of opt-choice on WPI is mediated by PC

H12: The effect of opt-choice on WPI is mediated by PR

H13: A privacy agreement with an opt-in choice decreases the likelihood that people give consent to the processing of personal information

To make the hypothesized relations clearer, a graphical representation is presented in Figure 1. For each arrow, the type of influence (positive or negative) and the corresponding hypothesis are denoted.

Figure 1. Hypothesized relations


2. Methods

In order to investigate how the new consent requirements, specifically opt-in vs. opt-out and unambiguous vs. ambiguous, affect an individual's willingness to provide information, a questionnaire was designed based on the extended privacy calculus model by Dinev & Hart (2006). Before filling in the questionnaire, respondents were presented with different privacy statements depending on the condition. Respondents were randomly assigned to one of four conditions: unambiguous-opt-in, unambiguous-opt-out, ambiguous-opt-in and ambiguous-opt-out.

The first condition (unambiguous-opt-in) included a privacy statement that complies with the new requirements as stated in the General Data Protection Regulation (2016). The privacy statement in this condition contained plain and unambiguous language, clearly explaining what personal information would be processed and for what purposes, using an opt-in choice (no pre-ticked boxes). In the second condition, the unambiguous-opt-out condition, the privacy statement contained the same unambiguous content, now using an opt-out choice: the box 'I agree' was pre-ticked.


The content of the first two conditions was generated using a tool called 'Privacyverklaring generator', developed by the Dutch government (Privacyverklaring generator, 2016). The independent variable 'unambiguity' was operationalized by creating a privacy statement in plain and clear language that contains only the information most relevant to the consumer. To increase the likelihood of finding an effect, an extreme case of privacy infringement was simulated. For example, it was stated that first and last name, sex, date of birth, location data, email address and browsing habits were recorded, analyzed and sold to third parties (see Figure 2). To decrease the size of the statement, and consequently increase the likelihood that it would be read, some of the less alarming and less relevant content of the initially generated statement was left out.

Figure 2. Unambiguous-opt-out privacy statement

In the third condition, the ambiguous-opt-in condition, a traditional, ambiguous privacy statement was used with an opt-in choice. This privacy statement contained the same content as the first two conditions, formulated in ambiguous language. In the fourth condition, the ambiguous-opt-out condition, the privacy statement contained the same ambiguous content, now using an opt-out choice. In the third and fourth condition, ambiguity was operationalized by generating a privacy statement that contained unclear and vague language along with a large amount of less relevant information. The privacy statements used for the third and fourth conditions were based on the privacy statement used by the survey software Qualtrics.

After agreeing or disagreeing with the privacy statement, respondents were asked to answer a questionnaire measuring the constructs Perceived Privacy Risk (PR) and Privacy Concern (PC), adopted from the extended privacy calculus model and questionnaire developed by Dinev & Hart (2006). In order to maintain validity and reliability, the original questionnaire was kept intact as much as possible. Because the questions had to be tailored to the purpose of the research, questions such as 'What do you believe is the risk for regular internet use due to the possibility that..' were changed into 'What do you believe is the risk of participating in this survey due to..' (see Appendix A). The questions corresponding to the constructs Internet Trust (T), Personal Internet Interest (PI) and willingness to provide personal information to transact on the Internet (PPIT) were hard to tailor to the purpose of this research. To prevent loss of reliability by amending the questions, these constructs were left out of the questionnaire. Instead of asking individuals about their willingness to provide information, this questionnaire measured willingness to provide information by asking individuals to actually provide personal information. Empirical research shows there tends to be a significant discrepancy between privacy attitudes and privacy behavior, also referred to as 'the privacy paradox' (Awad & Krishnan, 2006). By directly asking individuals to disclose personal information, the results from the questionnaire are unlikely to be affected by this paradox. WPI was measured by asking respondents to provide their first and last name, date of birth, e-mail address and birthplace.

To test the hypothesized relations, statistical analyses were conducted. In order to assess whether opt-choice (opt-in vs. opt-out) and ambiguity (unambiguous vs. ambiguous) directly affect Privacy Concern, Perceived Privacy Risk, Willingness to Provide Information and consent decisions, an ANOVA was performed. To assess whether the effect of ambiguity and opt-choice on WPI was mediated by PC and PR (as hypothesized in H6, H7, H11 and H12), a mediation analysis was conducted using the INDIRECT method as proposed by Preacher & Hayes (2008).
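For reference, the F statistic behind such a one-way ANOVA can be computed directly. The sketch below uses made-up numbers, not the study's data, and the thesis itself presumably relied on a statistics package rather than hand computation:

```python
# Stdlib sketch of a one-way ANOVA F statistic on made-up numbers.

def one_way_anova_f(groups: list[list[float]]) -> float:
    all_values = [v for g in groups for v in g]
    grand_mean = sum(all_values) / len(all_values)

    # Between-group and within-group sums of squares
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    ss_within = sum((v - sum(g) / len(g)) ** 2
                    for g in groups for v in g)

    df_between = len(groups) - 1
    df_within = len(all_values) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Two clearly separated toy groups give a large F ratio.
f = one_way_anova_f([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])
print(round(f, 2))  # 13.5
```

A large F means the variation between conditions is large relative to the variation within them, which is what the significance tests reported in the results section formalize.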

Additionally, some exploratory analyses were performed. A Chi-square test was performed to analyze the relation between ambiguity, opt-choice and consent. Additionally, a t-test was performed to investigate the relation between Willingness to Provide Information (WPI) and consent. Ultimately, a Chi-square test was performed to analyze the effect of ambiguity and opt-choice on the response to the question "Have you read the privacy statement?".

3. Results

3.1 Hypotheses testing

To test the hypotheses, statistical analyses were performed. In total, the questionnaire yielded 208 usable responses from respondents with an average age of 34 years. Of this sample, 72 percent were male and 28 percent female; 71.2 percent held a master's degree, 18.8 percent a bachelor's degree and the rest did not hold any degree in higher education.

In hypothesis 1, it was expected that an unambiguous privacy statement would increase Privacy Concern. As expected, the ANOVA revealed a significant effect of ambiguity on Privacy Concern (F(1, 204) = 25.407, p < 0.05). This effect indicates that when participants are presented with an unambiguous privacy statement, their Privacy Concern increases.

As hypothesized in H2, the mediation analysis showed a significant effect of Privacy Concern on Willingness to Provide Information (see Figure 3).

As stated in hypothesis 3, an unambiguous privacy statement was expected to decrease the Willingness to Provide Information (WPI) compared to an ambiguous statement. However, results from the ANOVA show no significant effect of ambiguity on WPI (F(1, 204) = 4.428, ns). In hypothesis 4, it was expected that an unambiguous privacy statement would also increase Perceived Privacy Risk. Results from the ANOVA indeed revealed a significant effect of ambiguity on Perceived Privacy Risk (F(1, 204) = 9.791, p < 0.05). This effect indicates that when participants are presented with an unambiguous privacy statement, the Perceived Privacy Risk increases significantly.

As hypothesized in H5, the mediation analysis showed a significant effect of Perceived Privacy Risk on Willingness to Provide Information (see Figure 3). To check whether Privacy Concern and Perceived Privacy Risk mediated the effect of ambiguity on WPI (H6, H7, H11, H12), a mediation analysis was performed. The mediation analysis, as designed by Preacher & Hayes (2008), indicates no indirect effect of ambiguity on WPI through PC (CI = 0.082 to 0.988) or PR (CI = 0.065 to 0.578), meaning Privacy Concern and Perceived Privacy Risk do not mediate between the dependent and independent variables in this study. All the effects of the analysis are visualized in Figure 3.

Figure 3. Results from mediation analysis
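The percentile-bootstrap logic behind the Preacher & Hayes INDIRECT procedure can be sketched as follows. Note two simplifications: the data are synthetic, and the b path is estimated here by simple regression rather than by regressing the outcome on the mediator while controlling for the independent variable, as the full procedure does:

```python
import random

# Minimal percentile-bootstrap check of an indirect effect a*b,
# in the spirit of Preacher & Hayes' INDIRECT macro. Synthetic data.

def slope(x, y):
    """OLS slope of y on x (single predictor)."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

def indirect_effect(x, m, y):
    """a (x -> m) times b (m -> y); simple-regression approximation."""
    return slope(x, m) * slope(m, y)

random.seed(1)
n = 200
x = [random.random() for _ in range(n)]
m = [0.8 * xi + random.gauss(0, 0.1) for xi in x]    # x drives m
y = [-0.5 * mi + random.gauss(0, 0.1) for mi in m]   # m drives y

boots = []
for _ in range(1000):
    sample = [random.randrange(n) for _ in range(n)]
    boots.append(indirect_effect([x[i] for i in sample],
                                 [m[i] for i in sample],
                                 [y[i] for i in sample]))
boots.sort()
lo, hi = boots[24], boots[974]   # 95% percentile interval
print(lo < 0 and hi < 0)         # CI excluding zero signals mediation
```

In this synthetic setup the confidence interval lies entirely below zero, so mediation would be concluded; in the thesis data the reported intervals did not support mediation.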

It was also expected that an opt-out choice would increase Privacy Concern (H8). However, the ANOVA showed no significant effect of opt-choice (opt-in vs. opt-out) on Privacy Concern (F(1, 204) = 1.380, ns). Again, unlike hypothesized in H9, no significant effect of opt-choice on Perceived Privacy Risk was found (F(1, 204) = 1.213, ns). In H10, it was expected that a privacy statement with an opt-out choice would decrease the Willingness to Provide Information (WPI). However, no significant effect of opt-choice on WPI was revealed (F(1, 204) = 1.268, ns).


Figure 4. Supported and unsupported hypotheses

H1: An unambiguous privacy statement will increase Privacy Concern (PC). Supported.

H2: An increase in Privacy Concern will decrease the Willingness to Provide Information (WPI). Supported.

H3: An unambiguous privacy statement will decrease the Willingness to Provide Information (WPI). Unsupported.

H4: An unambiguous privacy statement will increase Perceived Privacy Risk (PR). Supported.

H5: An increase in Perceived Privacy Risk will decrease the Willingness to Provide Information (WPI). Supported.

H6: The effect of ambiguity on WPI is mediated by PC. Unsupported.

H7: The effect of ambiguity on WPI is mediated by PR. Unsupported.

H8: A privacy statement with an opt-out choice will increase Privacy Concern (PC). Unsupported.

H9: A privacy statement with an opt-out choice will increase Perceived Privacy Risk (PR). Unsupported.

H10: A privacy statement with an opt-out choice will decrease the Willingness to Provide Information (WPI). Unsupported.

H11: The effect of opt-choice on WPI is mediated by PC. Unsupported.

H12: The effect of opt-choice on WPI is mediated by PR. Unsupported.

H13: A privacy agreement with an opt-in choice decreases the likelihood that people give consent to the processing of personal information. Unsupported.

3.2 Exploratory analyses

A Chi-square test was performed to analyze the relation between ambiguity, opt-choice and consent. Results indicate a significant relationship between ambiguity and consent, χ2(1, N = 208) = 31.62, p < 0.05. Of the respondents presented with an ambiguous privacy statement, 9.8% disagreed, compared to 45.7% of the respondents presented with an unambiguous privacy statement. Unlike hypothesized (H13), no significant relationship was found between opt-choice and consent. Additionally, a t-test was performed to investigate the relation between Willingness to Provide Information (WPI) and consent. There was a significant difference between the WPI scores of respondents who agreed (M = 2.9, SD = 2.3) and respondents who did not agree (M = 1.2, SD = 2.1) with the privacy statement; t(206) = 5.08, p < 0.05.

At the end of the survey, respondents were asked if they had read the privacy statement. A Chi-square test was performed to analyze the effect of ambiguity and opt-choice on the response to this question. There was a significant relationship between the ambiguity of the privacy statement and reading the privacy statement (χ2(1) = 39.47, p < 0.05). Of the respondents presented with an unambiguous privacy statement, 67.8% reported to have read the statement. Of the respondents presented with an ambiguous privacy statement, only 23.9% reported to have read the statement. No significant relationship was found with the opt-choice variable.
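The chi-square statistic for such a 2x2 table (condition by response) can be computed directly. The counts below are invented for illustration, not the study's data:

```python
# Stdlib sketch of the 2x2 chi-square test of independence used in the
# exploratory analyses, on invented counts.

def chi_square_2x2(table: list[list[float]]) -> float:
    row = [sum(r) for r in table]
    col = [sum(c) for c in zip(*table)]
    total = sum(row)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row[i] * col[j] / total
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# Identical response rates in both conditions -> statistic of zero.
print(chi_square_2x2([[50, 50], [50, 50]]))  # 0.0
# Very different rates -> a large statistic.
print(chi_square_2x2([[90, 10], [40, 60]]))
```

The statistic is then compared against the chi-square distribution with one degree of freedom to obtain the p-values reported above.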

A large proportion of respondents (46%) did not continue the survey after being presented with the privacy statement. A Chi-square test was performed to analyze the effect of ambiguity and opt-choice on whether or not respondents finished the questionnaire. No significant relationship was found.

4. Conclusion and Discussion

The goal of this thesis was to investigate the assumptions underlying the new consent requirements (as stated in the new European data protection regulation) and their effect on consumers' willingness to provide personal information. In the new General Data Protection Regulation, it is assumed that strict consent requirements, asking consumers for consent in an unambiguous fashion using opt-in choices, will contribute to a more 'informed' consumer, ultimately influencing consumers' willingness to provide personal information. In this research, the following question was investigated: 'How do the consent requirements concerning opt-choice and ambiguity in the GDPR affect the willingness of Dutch citizens to provide personal information?'

To answer the research question, a questionnaire was designed containing privacy statements corresponding to the old and new requirements. Specifically, the assumptions of the opt-in and ambiguity requirements were tested by randomly assigning participants to one of four conditions. After agreeing or disagreeing with the privacy statement, Privacy Concern and Perceived Privacy Risk were measured, after which the willingness to provide information was assessed.

As hypothesized, respondents presented with an unambiguous privacy statement reported an increased perceived privacy risk. Similarly, participants presented with an unambiguous privacy statement reported increased privacy concern. These results are in line with research by Good et al. (2007), who found that consumers believe a privacy statement to be more trustworthy when it is framed in ambiguous and vague language than when it is clear and detailed. The clear and detailed privacy statement in the current research was not perceived as trustworthy, given the high privacy concern and perceived privacy risk reported.

Correspondingly, significantly more respondents read the privacy statement when the content was unambiguous. Transaction cost theory, as described in behavioral economics, can provide an explanation: possibly, because the transaction costs of reading the unambiguous privacy statement were lower than those of the ambiguous statement, respondents were willing to incur them. In line with the hypotheses and the behavioral economics literature, the framing of the privacy statement seemed to have an impact as well: as the unambiguous privacy statement contained strong and clear language informing respondents that their data would be processed and sold, a high perceived privacy risk and privacy concern were reported. Correspondingly, when participants were presented with an unambiguous privacy statement, they less often gave consent to the processing of their information, compared to an ambiguous privacy statement. Respondents who agreed with the privacy statement (gave consent) provided more personal information than respondents who did not agree. Contrary to expectation, the ambiguity of the privacy statement did not seem to affect respondents' willingness to provide personal information, measured through the amount of personal information respondents provided in this study.

The results from this study seem to be in line with research by Culnan & Armstrong (1999), who found that the intention to disclose information is positively affected by procedural fairness: when respondents were told the procedures concerning the processing of information were fair, they were more likely to disclose personal information. In the current research, respondents presented with the unambiguous privacy statement were explicitly informed about seemingly unfair processing procedures. Although no effect was found on the willingness to provide personal information, respondents did disagree significantly more often with the unambiguous privacy statement. Possibly, respondents disagreed because they found the procedures to be unfair. Similar to the results of Dinev & Hart (2006), results in this study also indicate that an increase in privacy concern and perceived privacy risk decreases the willingness to provide personal information.

In conclusion, the new consent requirements (ambiguity and opt-choice) do not seem to affect the willingness to provide personal information as operationalized in this research. However, ambiguity does seem to influence perceived privacy risk, privacy concern, consent decisions and whether or not respondents read the privacy statement. Although no direct effect of ambiguity or opt-choice on WPI was found, the reported results do indicate a greater awareness as a result of the new ambiguity requirements. As respondents with an unambiguous privacy statement read the statement more often and agree with it less, the new ambiguity requirements in the GDPR can contribute to a more 'informed' consent.

There might be an additional factor accounting for the absence of an effect of the independent variables on the dependent variable WPI. The way the dependent variable 'Willingness to Provide Information' was operationalized has its flaws. Whether respondents provide personal information is less relevant when the respondents do not consent to the processing of that information. When respondents disagreed with the privacy statement, providing personal information had a different meaning to them, as the personal information could not be processed and sold to third parties. Therefore, the validity of the operationalization of 'Willingness to Provide Information' is questionable. Respondents might be comfortable with disclosing personal information, as long as nothing is done with this information.

Given the limited timeframe of this research, respondents were recruited within the available network of the researcher. This provided a convenience sample that is not representative of Dutch citizens, as the majority of respondents were highly educated males who were 34 years old on average. Several studies show that both age and educational level influence privacy attitudes, possibly making the results of the current study hard to generalize. For example, Phelps et al. (2000) found a significant relationship between educational level and privacy concern: higher educated people are more concerned with privacy. Age also seems to have an effect on privacy-related concepts. For example, van den Broeck et al. (2015) found that older age groups reported higher privacy concerns than younger age groups. To prevent any effect of age and educational level, future research could replicate the current study using a sample that is representative of the population.

Although privacy concern was measured in this research, no distinction was made between different categories of respondents' initial privacy concern. Possibly, some respondents are generally more concerned with privacy than others, which could influence the results significantly. In future research, we recommend categorizing the initial privacy concern of respondents using the three categories used by Westin (as cited in Kumaraguru & Cranor, 2005), namely: fundamentalist, pragmatist and unconcerned. By using these categories, the influence of initial privacy concern on the results can be analyzed, contributing to a deeper understanding of the topic.

Another limitation is related to the design of the questionnaire. A large proportion of respondents (46%) decided not to continue with the questionnaire when presented with the privacy statement. No significant difference was found between the four conditions. A possible explanation for the large share that left the questionnaire could be that individuals are used to take-it-or-leave-it choices: when you do not agree with a privacy statement, you cannot continue. In this case, respondents did have the possibility to disagree with the privacy statement and continue to the questionnaire. Possibly, a large group of respondents automatically assumed that they had to agree with the statement and quit the questionnaire when they did not agree. Nevertheless, a sufficient number of participants did complete the questionnaire, providing a decent sample size (N = 208).

While the assumptions of the new data privacy regulation were tested, the privacy statements designed for this research did not completely resemble real-life statements. In order to increase the likelihood of an effect, an extreme case of privacy infringement was simulated, presenting the information in the most understandable fashion. To accomplish this, some of the less alarming content of the privacy statement generated by the 'Privacyverklaring generator' developed by the Dutch government was eliminated. Possibly, if privacy statements contained all the generated content, the transaction costs would be too high for consumers to read them, as the statements would be too long. Therefore, the actual effect of the new GDPR ambiguity requirements might be more modest than the results of this study indicate. However, the goal of this research was merely to test the assumptions about opt-choices and unambiguous privacy statements. As the wording in the GDPR requiring 'unambiguous and understandable' privacy statements can itself be viewed as ambiguous, interpretations of this law may vary in practice. Organizations might find ways of complying with legislation while still keeping the likelihood high that consumers do not read the privacy statement. Once the consent requirements are implemented by organizations, future research could investigate how the actual new privacy statements influence privacy attitudes and information disclosure behavior in practice.

Another limitation of this study is the way in which the independent variable 'ambiguity' was operationalized. In this study, ambiguity was operationalized by changing the text from vague to clear and by leaving less relevant text out of the privacy statement. As the unambiguous privacy statement contains less text than the ambiguous statement, the effect of 'ambiguity' might be caused by the different lengths of the privacy statements, not by the wording of the content. In this study, the 'ambiguity' variable actually contained two distinct variables: wording and length. Future research could attempt to measure these variables separately to investigate their individual effects on privacy decisions.

The effect of an unambiguous privacy statement is likely dependent on the information processing procedures described in the statement. For the purpose of this study, the privacy statement contained descriptions of maximal privacy infringement. However, not all services process personal data as extensively. Milne & Culnan (2004), for example, found that reading a privacy statement could increase trust, instead of perceived risk and concern. Further research could investigate when and why consumers view certain descriptions of information processing procedures as trustworthy or not.


The findings in this research provide new insights into the effects of consent requirements and corresponding framing methods on privacy-related decisions. These findings can help scientists and legislators understand the impact of consent requirements. Future research could investigate how differently designed privacy statements, using different types of wording and graphics, can encourage consumers to read their content. As privacy concerns differ between groups, this research will hopefully inspire further investigation into the source of these differences and into how to inform consumers of all ages and educational levels about privacy. This will ultimately contribute to the goal of giving control over personal information back to the individual.

5. References

Acquisti, A. (2010a). From the economics to the behavioral economics of privacy: A note. In Ethics and Policy of Biometrics (pp. 23-26). Springer Berlin Heidelberg.

Acquisti, A. (2010b). The economics of personal data and the economics of privacy.

Acquisti, A., & Grossklags, J. (2007). What can behavioral economics teach us about privacy. Digital Privacy: Theory, Technologies and Practices, 363-377.

Akerlof, G. (1995). The market for “lemons”: Quality uncertainty and the market mechanism (pp. 175-188). Macmillan Education UK.

Awad, N. F., & Krishnan, M. S. (2006). The personalization privacy paradox: an empirical evaluation of information transparency and the willingness to be profiled online for personalization. MIS quarterly, 13-28.

Ben-Shahar, O., & Schneider, C. E. (2011). The failure of mandated disclosure. University of Pennsylvania Law Review, 647-749.

Brandimarte, L., & Acquisti, A. (2012). The economics of privacy. The Oxford Handbook of the Digital Economy, 547-571.

Chan, Y. E., Culnan, M. J., Greenaway, K., Laden, G., Levin, T., & Smith, H. J. (2005). Information privacy: Management, marketplace, and legal challenges. Communications of the Association for Information Systems, 16(1), 12.

Culnan, M. J., & Armstrong, P. K. (1999). Information privacy concerns, procedural fairness, and impersonal trust: An empirical investigation. Organization Science, 10(1), 104-115.


De Hert, P., & Papakonstantinou, V. (2012). The proposed data protection Regulation replacing Directive 95/46/EC: A sound system for the protection of individuals. Computer Law & Security Review, 28(2), 130-142.

de Hert, P., & Papakonstantinou, V. (2016). The new General Data Protection Regulation: Still a sound system for the protection of individuals? Computer Law & Security Review, 32(2), 179-194.

Dinev, T., & Hart, P. (2006). An extended privacy calculus model for e-commerce transactions. Information Systems Research, 17(1), 61-80.

European Commission, http://ec.europa.eu/justice/data-protection/

Good, N. S., Grossklags, J., Mulligan, D. K., & Konstan, J. A. (2007, April). Noticing notice: a large-scale experiment on the timing of software license agreements. In Proceedings of the SIGCHI conference on Human factors in computing systems (pp. 607-616). ACM.

Grundmann, S. (2002). Information, party autonomy and economic agents in European contract law. Common Market Law Review, 39(2), 269-393.

Hirsch, D. D. (2013). In Search of the Holy Grail: Achieving Global Privacy Rules Through Sector-Based Codes of Conduct. Ohio State Law Journal,74(6).

Hoofnagle, C. J., & Urban, J. M. (2014). Alan Westin's privacy homo economicus.

Imperiali, R. (2012). The Data Protection Compliance Program. J. Int'l Com. L. & Tech., 7, 285.

Kahneman, D. (2003). A perspective on judgment and choice: Mapping bounded rationality. American Psychologist, 58(9), 697.

Kumaraguru, P., & Cranor, L. F. (2005). Privacy indexes: a survey of Westin's studies.

Lai, Y. L., & Hui, K. L. (2006, April). Internet opt-in and opt-out: Investigating the roles of frames, defaults and privacy concerns. In Proceedings of the 2006 ACM SIGMIS CPR Conference on Computer Personnel Research: Forty four years of computer personnel research: achievements, challenges & the future (pp. 253-263). ACM.

Laudon, K. C. (1996). Markets and privacy. Communications of the ACM, 39(9), 92-104.

Lodder, A. R., & Wisman, T. (2016). Artificial Intelligence Techniques and the Smart Grid: Towards Smart Meter Convenience While Maintaining Privacy. Journal of Internet Law, 19(6), 20-27.

McDonald, A. M., & Cranor, L. F. (2008). The cost of reading privacy policies. ISJLP, 4, 543.

Meier, S., & Sprenger, C. (2010). Present-biased preferences and credit card borrowing. American Economic Journal: Applied Economics, 193-210.


Mills, B. (2010). Television wildlife documentaries and animals' right to privacy. Continuum: Journal of Media & Cultural Studies.

Milne, G. R., & Culnan, M. J. (2004). Strategies for reducing online privacy risks: Why consumers read (or don't read) online privacy notices. Journal of Interactive Marketing, 18(3), 15-29.

Mischel, W., Ayduk, O., Berman, M. G., Casey, B. J., Gotlib, I. H., Jonides, J., ... & Shoda, Y. (2011). 'Willpower' over the life span: Decomposing self-regulation. Social Cognitive and Affective Neuroscience, 6(2), 252-256.

Mischel, W., Shoda, Y., & Peake, P. K. (1988). The nature of adolescent competencies predicted by preschool delay of gratification. Journal of personality and social psychology, 54(4), 687.

Molina-Markham, A., Shenoy, P., Fu, K., Cecchet, E., & Irwin, D. (2010, November). Private memoirs of a smart meter. In Proceedings of the 2nd ACM Workshop on Embedded Sensing Systems for Energy-Efficiency in Building (pp. 61-66). ACM.

Noam, E. M. (1997). Privacy and self-regulation: Markets for electronic privacy. Privacy and Self-Regulation in the Information Age, 21-33.

Phelps, J., Nowak, G., & Ferrell, E. (2000). Privacy concerns and consumer willingness to provide personal information. Journal of Public Policy & Marketing, 19(1), 27-41.

Posner, R. A. (1977). The right of privacy. Ga. L. Rev., 12, 393.

Preacher, K. J., & Hayes, A. F. (2008). Asymptotic and resampling strategies for assessing and comparing indirect effects in multiple mediator models. Behavior research methods, 40(3), 879-891.

REGULATION (EU) 2016/679 OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), 27/04/2016.

Robbins, L. (1969). An essay on the nature and significance of economic science.

Samuelson, W., & Zeckhauser, R. (1988). Status quo bias in decision making. Journal of Risk and Uncertainty, 1(1), 7-59.

Simon, H. A. (1955). A behavioral model of rational choice. The quarterly journal of economics, 99-118.

Spiekermann, S., Grossklags, J., & Berendt, B. (2001, October). E-privacy in 2nd generation e-commerce: Privacy preferences versus actual behavior. In Proceedings of the 3rd ACM Conference on Electronic Commerce (pp. 38-47). ACM.

Stigler, G. J. (1980). An introduction to privacy in economics and politics. The Journal of Legal Studies, 9(4), 623-644.


Tene, O. (2011). Privacy: The new generations. International Data Privacy Law, 1(1), 15-27.

Tversky, A., & Kahneman, D. (1981). The framing of decisions and the psychology of choice. Science, 211(4481), 453-458.

Van den Broeck, E., Poels, K., & Walrave, M. (2015). Older and wiser? Facebook use, privacy concern, and privacy protection in the life stages of emerging, young, and middle adulthood. Social Media + Society, 1(2), 2056305115616149.

Varian, H. R. (1996). Economic aspects of personal privacy. Privacy and Self-Regulation in the Information Age.

Verhelst, E. W. (2012). Recht doen aan privacyverklaringen.

Zuiderveen Borgesius, F. J. (2013). Consent to Behavioural Targeting in European Law: What are the Policy Implications of Insights from Behavioural Economics? Amsterdam Law School Research Paper, (2013-43).

Zuiderveen Borgesius, F. J. (2015a). Informed consent: We can do better to defend privacy. IEEE Security & Privacy, 13(2), 103-107.

Zuiderveen Borgesius, F. J. (2015b). Improving privacy protection in the area of behavioural targeting. Available at SSRN 2654213.
