
The effect of transparency in implementations of the General Data Protection Regulation on consumers’ online privacy concern and behaviour

Date of submission: 22 June 2018

Qualification: MSc Business Administration

Track: Digital Business

Name: Sunny de Blok

Student number: 10385290

Email: sunny.deblok@student.uva.nl

Institution: University of Amsterdam


Statement of originality

This document is written by Sunny de Blok, who declares to take full responsibility for the contents of this document. I declare that the text and the work presented in this document are original and that no sources other than those mentioned in the text and its references have been used in creating it. The Faculty of Economics and Business is responsible solely for the supervision of completion of the work, not for the contents.

Acknowledgements

I would like to express my gratitude towards the people who have helped me during the process of writing this thesis. Primarily, I would like to thank my thesis supervisor, Abhishek Nayak, who helped me create the outline of this project and provided constructive feedback. Furthermore, I would like to thank Pieter Erik de Ridders, who, as a legal officer, gave me many in-depth insights into the General Data Protection Regulation. Additionally, I would like to thank my second reader for taking the time to read and evaluate this thesis. Last but not least, I would like to thank Edo Santema and Bram Walda, who have been very supportive during the writing process of this project.


Abstract

Concern about the collection and management of personal data online has grown to the extent that governments have needed to intervene. Studies have been conducted on privacy regulations and their effect on consumers, but as the GDPR has been enforced only recently, no prior studies have focused on this new legislation. Therefore, the purpose of this study was to determine a causal relationship between the GDPR and consumers. Specifically, we looked into the effect of transparency in the implementation of the GDPR on consumers’ online privacy concern and behaviour. Previous research provided evidence of the relationship between transparency and online privacy concern and behaviour, so we predicted that transparency in the implementation of the GDPR would affect consumers’ online privacy concern and behaviour. A sample of internet users (N=281) filled out an online questionnaire in which we manipulated online consents. No relation was found between transparency in the implementation of the GDPR and consumers’ online privacy concern and behaviour; based on these findings, our model is not supported. However, results showed relations between online privacy attitude, online privacy concern and online privacy behaviour. Moreover, the relation between concern and behaviour is moderated by trust in the company. This indicates that online businesses should not only focus on being compliant with the GDPR, but also on gaining trust to improve their consumers’ online privacy experiences. Future research is needed to gain insight into the generalizability of these results across multiple ways of implementing the GDPR.


Table of contents

Statement of originality
Acknowledgements
Abstract
1 Introduction
1.1 Problem statement
1.2 Research objective and method
1.3 Structure of research
2 Literature review
2.1 Privacy regulations
2.2 General Data Protection Regulation
2.3 Transparency and awareness
2.4 Online privacy concern
2.5 Sensitivity of industry
2.6 Online privacy attitude
2.7 Trust
2.8 Privacy risk profile
2.9 Conceptual model
3 Methodology
3.1 Pre-test
3.2.1 Experiment design
3.2.2 The experiment
3.3 Sample characteristics
3.4 Measures of the variables
4 Results
4.1 Data Preparation
4.2 Comparing experimental groups to background characteristics
4.3 Comparing means
4.4 Manipulation check
4.5 Testing hypotheses
4.5.1 Online privacy concern
4.5.2 Online privacy behaviour
4.5.3 Summary of results
5 Discussion
5.1 General discussion
5.2 Theoretical and practical implications
5.3 Limitations and future research
6 Conclusion
References
Appendix 1: GDPR Articles
  Art. 15: Right of access by the data subject
  Art. 17: Right to erasure (‘right to be forgotten’)
  Art. 32: Security of processing
Appendix 2: Conditions (Conditions 1-8)
Appendix 3: Pre-test
Appendix 4: Situations
  Car rental company YouCar (low sensitive industry)
  Dentist De Wit (high sensitive industry)
Appendix 5: Questionnaire items


1 Introduction

1.1 Problem statement

As the daily usage of computers, network technologies and internet increases, so do the concerns about the collecting and management of personal data (Caudill & Murphy, 2000). By collecting personal data, businesses retrieve information on consumers’ interests and preferences, which helps them to tailor their products and services (Liu & Arnett, 2002). Because personal data is collected, stored and managed in online databases, companies are now able to develop customer profiles through analysis, manipulation, and integration of the data (Schwaig, Kane & Storey, 2006). Although companies can provide personalized services and products, these techniques present a possible threat to the consumer’s privacy.

Since the beginning of the 1970s, attention has been focused on information privacy. By 1986, privacy was named one of the four ethical issues of the information age (Mason, 1986). Governments have introduced several international policy requirements that address areas of particular concern, including computer monitoring, public data and access to it, personal information privacy, and individuals’ attitudes towards privacy on the internet (Schwaig, Kane & Storey, 2006). These regulations limit the collection and management of personal data to protect citizens from companies invading their privacy.

The Data Protection Directive (Directive 95/46/EC), also known as the Privacy Directive, is a European directive that regulates the processing of personal data within the European Union (EU). The directive was in force from 1995 until it was replaced by the General Data Protection Regulation (GDPR) in 2016. The GDPR is a regulation intended to strengthen and unify data protection for all individuals within the EU. It was adopted by the European Parliament, the Council of the EU and the European Commission in April 2016 and became enforceable on 25 May 2018 (European Commission, n.d.).


Research has examined the readiness of companies for the GDPR and the effects of privacy regulations on online privacy concerns. Since the majority (60%) of companies do not seem to be ready for the enforcement of the GDPR, and over a third (36%) of companies worry that failure to comply with the GDPR will have a severe impact on their brand reputation, it is essential to understand the effect of implementing the GDPR on consumers’ online privacy concern and behaviour.

In the academic field, researchers have been focusing on privacy policies and their effects on consumers’ awareness, attitude, concern, and behaviour. Dommeyer and Gross (2003) and Miltgen and Smith (2015) have studied this theme in depth. They found positive correlations between consumers’ level of awareness of the regulation and their level of perceived privacy protection, trust in the company, and online privacy concern. One can say that privacy legislation positively influences consumers’ trust in their online privacy. However, because the GDPR has been enforced only recently, it remains unclear to what extent the implementation of the GDPR will impact consumers’ online privacy concern and behaviour. This study aims to fill that literature gap.

1.2 Research objective and method

In this paper, we look into the effect of the GDPR on consumers’ online privacy concern and behaviour. More specifically, we research the effect of transparency in the implementation of the GDPR on consumers’ online privacy concern and online privacy behaviour. While testing these main effects, the following variables are tested as moderators in our model: industry sensitivity, online privacy attitude, trust in the company and online privacy risk profile. By looking into this, the following questions can be answered: Does transparency of companies’ compliance with the GDPR influence consumers’ online privacy concern and behaviour? Can customers’ online privacy attitude or the sensitivity of the industry change the effect of transparency in the implementation of the GDPR on their online privacy concern?


Can customers’ online privacy risk profile or their trust in the company change the effect of transparency in the implementation of the GDPR on their online privacy behaviour? Answering these questions is relevant now because the two-year transition period of the GDPR is over and its enforcement presumably has a large effect on online consumers’ privacy concerns and behaviour. This effect can have a tremendous impact on online industries. This paper will answer the following research question:

RQ: To what extent does transparency in the implementation of the General Data Protection Regulation affect consumers’ online privacy concern and behaviour?

To answer the research question, an experimental survey is employed. We manipulate online environments to test the effect of transparency in the implementation of the GDPR. To do so, we selected three GDPR articles: right to access, right to be forgotten and security of data processing. In the experiment, we also test the effect of industry sensitivity.

This study builds on the existing literature on the relationship between privacy regulations and consumers’ concern and behaviour. Unlike other studies, this study focuses on the implementation of one regulation in particular. The effects of the GDPR on online businesses have attracted a lot of media attention over the last year. However, little to no attention has been given to the effect of the GDPR on the customers of these businesses. This study brings causal insights into the effect of the enforcement of privacy regulations, the GDPR specifically, on consumers’ online privacy concern and behaviour. Besides filling a literature gap, the results of this study can give managers insights into the importance of complying with the GDPR guidelines. Once managers realise the urgency of implementing the GDPR correctly, fewer fines will be issued. Moreover, the positive impact on their customers of providing them with assurance about the use of their data can bring the business competitive advantage.

1.3 Structure of research

This study begins with an extensive review of the previous academic literature. Hereafter, the design and methodology of the study are explained. The results of the experiment are presented and analysed in the fourth chapter. Ultimately, we discuss both the academic and managerial implications of the findings, offer suggestions for future research, and conclude and summarize the paper.


2 Literature review

2.1 Privacy regulations

Several factors influence a firm’s data privacy strategy. Sarathy and Robertson (2003) created a model that shows how these factors affect privacy strategy. According to their model, the approach to privacy is affected by precursors such as national history and culture and global societal trends. Besides that, there are external factors such as existing and pending legislation and the importance and sensitivity of the data being gathered. While several firm-specific factors and cost-benefit filters can influence the choice of privacy strategy, regulations imposed by the government are beyond a firm’s control. Governments have been enforcing regulations to protect their citizens’ privacy for several decades. There are several levels of government involvement in corporate privacy management. The model of Milberg, Smith, and Burke (2002) describes five levels of government involvement (low to high): self-help, voluntary control, data commission, registration and licensing.

In 1980, the privacy principles of the United States were adopted by the OECD Council. The guidelines aimed to form a basis for privacy legislation across member states. These principles formed the basis of the European Commission Data Protection Directive (Directive 95/46/EC), which is the core of European privacy regulation (Campbell, Goldfarb & Tucker, 2015). The guidelines summarize eight fundamental principles: collection limitation, data quality, purpose specification, use limitation, security safeguards, openness, individual participation, and accountability. The Data Protection Directive was in force from 1995 until it was replaced by the General Data Protection Regulation (GDPR) in 2016, which became enforceable on 25 May 2018.

2.2 General Data Protection Regulation

The GDPR applies to organizations that process the personal data of individuals residing in the EU, regardless of whether the organization is located within the EU. The GDPR aims to give citizens back control of their personal data, and to simplify the regulatory environment for business. It strengthens and harmonizes the rules for protecting individuals’ privacy rights and freedoms within and, under certain conditions, outside the EU territory. Companies that do not comply with the GDPR can be fined up to 20 million euros or 4% of their global turnover (European Commission, n.d.). The GDPR will result in restrictions on commercial data use and will increase spending on compliance. On the other hand, implementing the GDPR strengthens customer trust and confidence in the business. In the long term, the GDPR is a step towards safeguarding the data security rights of EU citizens. To be compliant with the GDPR, companies have to follow the regulation, which comprises 99 articles (European Union, 2016). The articles that are explicitly there for the rights of the data subject are:

1) Transparency and modalities:
   a. Transparent information, communication, and modalities for the exercise of the rights of the data subject.
2) Information and access to personal data:
   a. Information to be provided where personal data are collected from the data subject,
   b. Information to be provided where personal data have not been obtained from the data subject,
   c. Right of access by the data subject.
3) Rectification and erasure of personal data:
   a. Right to rectification,
   b. Right to erasure (‘right to be forgotten’),
   c. Right to the restriction of processing,
   d. Right to data portability.
4) Right to object and automated individual decision-making (including profiling):
   a. Right to object,
   b. Automated individual decision-making, including profiling.

A study by Senzing (2018) assessed the readiness of companies regarding the implementation of the GDPR. The study found that 60% of companies in general are not compliant with the GDPR. Based on its evaluations, Senzing (2018) reports that 24% of companies appear at “GDPR risk”, 36% appear “GDPR challenged”, and only 40% appear “GDPR ready”. Besides Schwaig et al. (2006) showing a difference in compliance amongst different industries, the study of Senzing (2018) showed that, regarding compliance with the GDPR specifically, there is a noticeable difference in readiness between SMEs and large enterprises. The findings show that 13% of SMEs, as opposed to only 8% of large companies, are “not confident they can justify where their data is stored”. Overall, only a third of the companies stated they are “very confident” that all databases that contain personal data are GDPR compliant. Companies are worried that their compliance will affect their reputation: over a third (36%) of all companies in the study think that failure to comply with the GDPR will have an impact on their brand reputation. This concern is greater for large companies: 56% of the large companies anticipate an impact on their brand, compared to 36% of SMEs and 27% of micro businesses (Senzing, 2018).

2.3 Transparency and awareness

Information privacy has become a prominent subject since the 1970s, which indicates that internet consumers are concerned about their privacy. The concept of information privacy “deals with the rights of those people whose information is shared” (Okazaki, Li, and Hirose, 2009, p. 64). It is defined as “the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others” (Westin, 1967, p. 7). The concept becomes increasingly relevant, especially in online digital environments, where the automated and often invisible process of collecting consumers’ personal information occurs. This personal information can be stored indefinitely for later use (Okazaki et al., 2009). Consumers are becoming aware of the fact that “companies are collecting consumer information and using it for marketing purposes, or monitoring consumers in a way that could be perceived as intrusive” (Udo, 2001, p. 166).

Privacy awareness refers to the extent to which an individual is aware, or has knowledge, of what data is being collected (Park & Chung, 2017). Ginosar and Ariel state that privacy awareness “indicates an individual’s level of knowledge of Internet technologies and risks related to that” (2017, p. 949). According to Okazaki et al., awareness of privacy practices is the “degree to which consumers worry about their awareness of organizational information privacy practices” (2009, p. 65). Trepte, Teutsch, Masur, Eicher, Fischer and Hennah (2014) divide privacy awareness into four dimensions: (1) knowledge about the activities of the organization in question, (2) knowledge of technical aspects of online privacy and data protection, (3) knowledge of online data protection laws, and (4) knowledge of personal strategies to maintain privacy regulation. To become aware of their privacy risks, consumers need to read the privacy policies and information on organizations’ websites. The literature review of Ginosar and Ariel (2017) points out that half of internet users read these policies, which provide them insights on whether they can cope with the risks or not.

Research on the effect of privacy regulation on consumers is rare. Researchers that did focus on this literature gap are Dommeyer and Gross (2003) and Miltgen and Smith (2015). Miltgen and Smith found a positive correlation between the level of awareness of the regulation and the level of perceived privacy regulatory protection. In their research, perceived regulatory privacy protection refers to “an individual’s perceptions regarding the existence and adequacy of provisions and systems for protecting his or her personal data” (2015, p. 743). When it comes to research on online privacy in general, several studies found that consumers with higher awareness of their online privacy are more cautious when it comes to sharing personal data (Christofides, Muise & Desmarais, 2012; Bartsch & Dienlin, 2016). Another study, from Debatin, Lovejoy, Horn, and Hughes (2009), similarly found a significant positive relation between understanding of privacy settings and (more cautious) use of those settings.

According to Campbell et al. (2015), government regulations do not attempt to limit the collection and management of data, but to add transparency to it. We argue that online privacy awareness is triggered by the amount of information disclosed by the company. Therefore, transparency plays an important role when it comes to online privacy behaviour. Online privacy behaviour could be specified as visiting a website, registering online, or disclosing personal data to a website. Because previous research has demonstrated the relationship between online privacy concern and online privacy behaviour (Dienlin & Trepte, 2015), we suggest that a higher level of transparency leads to less online privacy concern and more positive online privacy behaviour.

Hypothesis 1. Higher transparency (presence of GDPR or not) will lead to lower online privacy concern.

Hypothesis 2. Higher transparency (presence of GDPR or not) will lead to more positive online privacy behaviour.

2.4 Online privacy concern

Privacy concern has been described as “the desire to keep personal information out of the hands of others” (Buchanan, Paine, Joinson & Reips, 2007, p. 158). According to Dienlin and Trepte (2015), privacy concern captures the negative attitude that people feel when third parties are copying personal rights, information or behaviour. Their findings show an indirect effect between online privacy concern and online privacy behaviour, which implies that people who have privacy concern are more sceptical regarding their online behaviour. We assume this effect will be significant not only indirectly but also directly. We also assume that online privacy concern will mediate the effect of transparency on online privacy behaviour.

Hypothesis 3. Higher online privacy concern will lead to more negative online privacy behaviour.

Hypothesis 4. Online privacy concern will mediate the effect of transparency (presence of GDPR or not) on online privacy behaviour.

2.5 Sensitivity of industry

Previous studies showed differences in the level of compliance with privacy regulations within industries. One particular study examined the privacy policies of the Fortune 500 to assess their regular information practices and the degree to which they complied with fair information practices (FIP) (Schwaig, Kane & Storey, 2006). The research showed that the hardware, software and telecommunication industries accounted for 55% of the participation in privacy seal programs, even though they made up only 13% of the companies in the sample used (Schwaig et al., 2006). An explanation for the differences between industries is that the United States has industry-specific regulatory rules (Culnan, 2000). Therefore, consumers’ reactions to online privacy will differ per industry sector. According to Malhotra, Kim, and Agarwal (2004), it is important for professionals to understand privacy concern in general and specifically within an industry sector. In their article, they state that “the practical boundary of information privacy in real life varies with numerous factors including industry sectors” (Malhotra et al., 2004, p. 337). The results of their study show that an individual’s privacy concern will be influenced by industry sector as well. Besides that, a request for more sensitive information reduces trust and increases perceived risk. Based on these findings, we assume that consumers will be more concerned about their online privacy in industries that are highly sensitive (e.g. the financial and medical sectors).


Hypothesis 5a. Consumers are more concerned about the sensitivity of their data if the industry is highly sensitive.

Hypothesis 5b. Industry sensitivity will positively moderate the relation between transparency and online privacy concern.

2.6 Online privacy attitude

According to Dienlin and Trepte (2015), privacy concern can be related to the concept of attitudes. An attitude is “an evaluative integration of cognition and affects experienced in relation to an object” (Crano & Prislin, 2006, p. 347). In general, attitudes towards online privacy can be both positive and negative. The main differences from privacy concern are polarity and scope: online privacy concerns are unipolar, whereas privacy attitudes are bipolar. Privacy concerns measure whether, for example, people are afraid that their bank account will be hacked, which results in a negative feeling. Privacy attitudes measure whether, for example, people think it is to their advantage or disadvantage to use online banking, which can result in a positive or negative feeling. Regarding scope, privacy attitudes can be applied to every single online privacy action, such as creating a social media account or posting pictures (Dienlin & Trepte, 2015). Several studies have researched the gap between attitude and behaviour (Kaiser, Byrka & Hartig, 2010). It has been shown that subjective norms, peer pressure and situational constraints can have a substantial influence on respondents’ answering behaviour. Respondents might withhold their real opinions or even provide false answers if they perceive strong situational constraints and norms forcing them to do so. According to previous studies, privacy attitudes seem to be largely built on second-hand experiences (European Commission, 2012; Trepte, Dienlin & Reinecke, 2013). It appears that a lack of personal experience is the reason why there is no significant effect between attitude and behaviour. Based on these findings, we reasoned that there is no effect of online privacy attitude on online privacy behaviour. Because someone’s attitude towards online privacy can be positive or negative based on personal or second-hand experiences, and privacy concern can also be positive or negative based on previous experiences, we assume that there is an effect between online privacy attitude and online privacy concern.

Hypothesis 6a. Consumers are more concerned about the sensitivity of their data if their attitude towards online privacy is more critical.

Hypothesis 6b. Attitude towards online privacy will positively moderate the relation between transparency and online privacy concern.

2.7 Trust

Several studies are dedicated to the online privacy concern of consumers. The research of Ginosar and Ariel (2017) explains that consumers are mainly concerned with: 1) the collection of personal information without informed consent, 2) the degree to which information is shared with third parties, and 3) the extent to which data is used for secondary purposes without the consumer knowing while using the internet. Miltgen and Smith (2015) view perceived privacy protection as a salient factor in determining consumers’ trust in companies. They found results that supported positive correlations between perceived privacy regulatory protection, online privacy concern, and preferences for regulatory protections. Miltgen and Smith’s results support their hypothesis of a positive relationship between trust and privacy risk concern: “Higher levels of trust in entities associated with information privacy will be associated with lower levels of privacy risk concerns” (2015, p. 744).

To address consumers’ online privacy concern, companies are producing software and technologies to help consumers protect their anonymity and keep their financial information (such as credit card numbers) safe. Companies are doing this because privacy concern is one of the main reasons that consumers give for avoiding purchasing online (Udo, 2001). Matt and Peckelsen build further on this and state that “people tend to proceed into protective actions such as refraining from information disclosure and falsifying information when such privacy concern occur” (2016, p. 4833). According to Park (2013), the level of privacy awareness affects consumers’ trust in online activities. Park also found that trust affects the willingness to disclose personal data and thus influences online privacy behaviour. Mayer, Davis, and Schoorman state that “Trust becomes manifest in the willingness of a party to be vulnerable to the actions of another, based on the expectations that the other will perform a particular action important to the truster” (1995, p. 712). Also, the study of Sheehan and Hoy (1999) shows that when online privacy concern increases, users register less frequently and provide incomplete information, possibly because they have less trust in the website.

In line with these findings, we suggest that trust influences privacy behaviour directly, and that trust moderates the relationship between online privacy concern and online privacy behaviour.

Hypothesis 7a. Consumers with more trust in the company will have a more positive privacy behaviour.

Hypothesis 7b. Trust will positively moderate the relation between online privacy concern and online privacy behaviour.

2.8 Privacy risk profile

In the marketing literature, perceived privacy risk is defined as beliefs about uncertainty and consequences regarding privacy (Pavlou, 2003). Okazaki et al. (2009) researched how privacy risk influenced opening, reading or responding to mobile advertising. They assumed that people are reluctant to release personal information if they expect adverse outcomes. Their results showed that online privacy concern increases mobile users’ perceived privacy risk in mobile advertising. It also showed that when mobile users are concerned about their online privacy, the likelihood of negative online behaviour increases. Though the research of Okazaki et al. (2009) focuses on mobile advertising, their findings support the relationship between privacy risk and online privacy concern, as well as a relationship between privacy risk and online privacy behaviour. Based on these findings, we formulate the following hypotheses:

Hypothesis 8a. Consumers with a higher online privacy risk profile will have more negative online privacy behaviour.

Hypothesis 8b. Online privacy risk profile will positively moderate the relation between online privacy concern and online privacy behaviour.

2.9 Conceptual model

Following from the literature review, several hypotheses are formulated, which are reflected in the conceptual model and summarized in Table 1.

H1 Higher transparency (presence of GDPR or not) will lead to lower online privacy concern.

H2 Higher transparency (presence of GDPR or not) will lead to more positive online privacy behaviour.

H3 Higher online privacy concern will lead to more negative online privacy behaviour.

H4 Online privacy concern will mediate the effect of transparency (presence of GDPR or not) on online privacy behaviour.

H5a Consumers are more concerned about the sensitivity of their data if the industry is highly sensitive.

H5b Industry sensitivity will positively moderate the relation between transparency and online privacy concern.

H6a Consumers are more concerned about the sensitivity of their data if their attitude towards online privacy is negative.

H6b Attitude will negatively moderate the relation between transparency and online privacy concern.


H7a Consumers with more trust in the company will have a more positive privacy behaviour.

H7b Trust will positively moderate the relation between online privacy concern and online privacy behaviour.

H8a Consumers with a higher online privacy risk profile will have more negative online privacy behaviour.

H8b Online privacy risk profile will positively moderate the relation between online privacy concern and online privacy behaviour.

Table 1: Summary of hypotheses


3 Methodology

The purpose of this study is to bring causal insights into the effect of the enforcement of the GDPR on consumers’ online privacy concern and behaviour. More specifically, the effect of transparency in the implementation of the GDPR is tested. To test this effect, a survey was employed. To understand to what extent transparency affects the dependent variables, we manipulated online environments with either no implementation of the GDPR (no transparency), or three different implementations of the GDPR, each highlighting one of the three selected GDPR articles: right to access, right to be forgotten and security of data processing (transparency). These articles were selected based on what they add to the GDPR compared to the previously enforced privacy regulation. The original articles can be found in Appendix 1. In this chapter, we give an overview of the pre-test, followed by the experimental design, procedure and sample characteristics. Finally, we explain the operationalization of the measurements.

3.1 Pre-test

Before conducting the main experiment, we ran a pre-test (N=27). The goals of this pre-test were 1) to indicate which (manipulated) consents in the online environments would be suitable for the actual experiment, 2) to determine whether the situation explained in the experiment was clear, and 3) to determine which industries were experienced as low or highly sensitive. The questionnaire of the pre-test can be found in Appendix 3. The pre-test was carried out through an online survey via Qualtrics with a within-subjects design. We chose this method to make the environment as similar as possible to the actual experimental environment. Self-selection sampling was used when recruiting respondents for the pre-test.

The first step was to determine which consents would be suitable for the experiment. We tested a 2 x 4 design: transparency in the implementation of the GDPR (no GDPR article highlighted, right to access highlighted, right to be forgotten highlighted, security of processing highlighted), written in technical and in layman style (see Table 4).

Condition 1: no GDPR article highlighted
Conditions 2, 3, 4: right to access, right to be forgotten, security of processing highlighted (technical wording)
Conditions 5, 6, 7: right to access, right to be forgotten, security of processing highlighted (layman wording)
Table 4: Conditions in the pre-test

At first, each participant was shown all seven conditions of the stimulus material, followed by an open question about what they had noticed. We asked open questions first so that the respondents would not be primed. The test showed us that 85.2% of the respondents saw the correct differences between the stimulus material. After this, they were asked dichotomous questions about whether they had seen the GDPR articles in the stimulus material. On the basis of the following statements, it was examined whether there was a clear difference between the stimuli of the three selected GDPR articles: "In image #, a privacy statement is shown”, “In image # the privacy policy emphasizes the following (right to access, right to be forgotten, security of processing)”, “The privacy statement in image # is clear to me”. Table 5 summarizes the results of the manipulation per condition. The test showed us that 59.1% recognized the correct highlighted article in manipulation 5 and 81.8% thought it was clear (versus 62.5% and 79.2% in manipulation 2), 76.0% recognized the correct highlighted article in manipulation 6 and 80.0% thought it was clear (versus 65.2% and 78.3% in manipulation 3), and 95.2% recognized the correct highlighted article in manipulation 7 and 71.4% thought it was clear (versus 79.2% and 66.7% in manipulation 4). Based on these figures, it became clear that the manipulations were ready for the experiment and that the manipulations written in layman style were experienced as clearer than the manipulations written in technical style. After the respondents answered questions about the manipulations, they were informed about the differences between the consents in technical and layman wording and were asked which style they preferred. This test confirmed that respondents preferred layman wording (50.0%) over technical wording (15.4%), while 34.6% did not know which style they preferred. Therefore, we chose to use layman wording in the experiment.

Condition   Recognized privacy policy   Correct highlighted article   Clarity of consent
1           n.a.                        n.a.                          57.1%
2           88.9%                       62.5%                         79.2%
3           88.5%                       65.2%                         78.3%
4           92.3%                       79.2%                         66.7%
5           84.6%                       59.1%                         81.8%
6           96.2%                       76.0%                         80.0%
7           80.8%                       95.2%                         71.4%
Table 5: Results of the pre-test: manipulation checks per condition

Next, the pre-test determined whether the situation explained in the experiment was clear to the respondents. The test showed us that 96.2% of respondents found the explained situation of the online consumer clear and that 80.8% of the respondents thought the visualized online environments represented the situation clearly. Therefore, we did not change the situations or the visualizations after the pre-test.

Finally, the pre-test determined which industries were experienced as low or highly sensitive. Respondents were asked to rank eight types of industries from low to highly sensitive (see Table 6). The respondents experienced car rental services as the least sensitive (weighted average: 5.55) and medical services as the most sensitive (weighted average: 3.09). To exclude pre-existing biases against or in favour of existing companies, we chose to create situations for fictional companies in the experiment.

Ranking   Industry          Weighted average
1         Medical           3.09
2         Travelling        3.39
3         Banking           3.70
4         Housing           3.91
5         E-commerce low    4.48
6         E-commerce high   5.09
7         Car rental        5.55
Table 6: Ranking of the industries (lower weighted average = more sensitive)
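For illustration, the weighted averages in Table 6 correspond to the mean rank position each industry received (rank 1 = most sensitive). A minimal sketch in Python of how such weighted averages can be computed, using hypothetical ranking data:

```python
import pandas as pd

# Each row is one pre-test respondent's ranking of the industries
# (1 = most sensitive, 8 = least sensitive); the data here are hypothetical.
rankings = pd.DataFrame([
    {"Medical": 1, "Banking": 3, "Travelling": 2, "Car rental": 7},
    {"Medical": 2, "Banking": 1, "Travelling": 4, "Car rental": 6},
    {"Medical": 1, "Banking": 4, "Travelling": 3, "Car rental": 7},
])

# The weighted average per industry is the mean rank position;
# sorting ascending puts the most sensitive industries first.
weighted_average = rankings.mean().sort_values()
print(weighted_average)
```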

3.2.1 Experiment design

The study was conducted in the form of an experiment, executed through an online survey. An experiment was used because the aim was to determine a causal relationship between transparency in the implementation of the GDPR and consumers’ online privacy concern and behaviour. Gathering data via an experiment provides high internal validity, because effects can be statistically attributed to the manipulation (Saunders et al., 2016).

This experiment had a 2 (industry sensitivity: low sensitive, high sensitive) x 4 (transparency in the implementation of the GDPR: no GDPR article highlighted, right to access highlighted, right to be forgotten highlighted, security of data processing highlighted) between-subjects factorial design (conditions summarized in Table 3). The experiment included two control groups (conditions 1 and 5) to rule out alternative explanations and to guarantee that changes in the dependent variables could only be assigned to the manipulated independent variables, instead of other unknown variables. An overview of the conditions can be found in Appendix 2.

                          No GDPR article   Right to access   Right to be forgotten   Security of processing
Low sensitive industry    1                 2                 3                       4
High sensitive industry   5                 6                 7                       8
Table 3: Research design: conditions
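To make the assignment procedure concrete, the following is a minimal sketch of random assignment to the eight cells of this 2 x 4 design. It is purely illustrative: in the study itself, randomization was handled by Qualtrics.

```python
import random

# The two factors of the between-subjects design (Table 3).
industries = ["car rental (low sensitive)", "dentist (high sensitive)"]
consents = ["no GDPR article", "right to access",
            "right to be forgotten", "security of processing"]

# Conditions 1-8, numbered as in Table 3 (industry-major order).
conditions = [(ind, con) for ind in industries for con in consents]

def assign(respondent_id: int):
    """Randomly assign one respondent to one of the eight conditions."""
    index = random.randrange(len(conditions))
    industry, consent = conditions[index]
    return respondent_id, index + 1, industry, consent

print(assign(1))  # e.g. (1, 6, 'dentist (high sensitive)', 'right to access')
```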


3.2.2 The experiment

The experiment was conducted online, using Qualtrics. In the experiment, randomly assigned respondents were exposed to different (manipulated) online environments (Appendix 2). After obtaining informed consent, respondents were given the self-administered online questionnaire. Depending on their condition, participants were either registering on the website of car rental company YouCar (low sensitive industry) or dentist De Wit (high sensitive industry) (Appendix 4). They were asked to read the privacy policy of the website and then saw the registration form that included the manipulated consent. After this, they were asked to answer questions regarding the transparency in the implementation of the GDPR, their online privacy attitude, online privacy concern, online privacy behaviour, trust in the company and privacy risk profile on five-point Likert scales (1 = Fully agree, 5 = Fully disagree; 1 = Very concerned, 5 = Not concerned at all). The questions were formulated based on existing scales (Appendix 5). Additionally, they were asked demographic questions about their gender, age and highest level of education, as these variables could serve as control variables. The total duration of the testing procedure was about 10 minutes.

3.3 Sample characteristics

In total, 281 people (50.18% male, 65.83% between 18 and 29 years old) participated voluntarily in the experiment. Table 2 summarizes the respondents’ profiles regarding gender, age and highest level of education.

                              N     %
Gender
  Male                        141   50.18%
  Female                      140   49.82%
Age
  18-29 years old             185   65.83%
  30-49 years old             63    22.42%
  50+ years old               33    11.74%
Highest level of education
  High school or lower        43    15.30%
  University bachelor         126   44.84%
  University master or PhD    112   39.86%
Table 2: Demographics (N=281)

The sampling technique used is non-probability sampling, specifically convenience sampling. The only criterion for participation in this research was that respondents had to be 18 years or older, so convenience sampling was a sufficient method. The survey was distributed via several online forums (such as indiehackers.com and reddit.com) and social media, requesting volunteers. Data collection lasted two weeks.

3.4 Measures of the variables

Most of the measures we use are adapted from published studies. An overview of the measures can be found below. All questionnaire items can be found in Appendix 5.

Online privacy concern

We modified some scale items from an established scale (Lwin et al., 2007) to measure the construct online privacy concern. Online privacy concern is operationalized with the four following statements: 1) How concerned are you that your personal data may be used for purposes other than the reason that you provided the information for? 2) How concerned are you about your online personal privacy on this website? 3) How concerned are you about the fact that this website might know/track the sites you visited? 4) How concerned are you about this website sharing your personal information with other parties? These statements are measured on a 5-point Likert scale (1 = Very concerned, 5 = Not concerned at all). In the current sample, Cronbach’s alpha coefficient indicated a good internal consistency (α = .81). The corrected item-total correlations indicate that all the items have a good correlation with the total score of the scale (all above .30).
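For reference, Cronbach’s alpha is computed as α = k/(k−1) · (1 − Σ s²_i / s²_t), where k is the number of items, s²_i the variance of item i and s²_t the variance of the summed scale. A minimal sketch of how the reported reliability statistics can be computed, assuming the four concern items are columns of a data frame (the responses below are hypothetical):

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def corrected_item_total(items: pd.DataFrame) -> pd.Series:
    """Correlation of each item with the sum of the remaining items."""
    return pd.Series({
        col: items[col].corr(items.drop(columns=col).sum(axis=1))
        for col in items.columns
    })

# Hypothetical 5-point Likert responses for the four concern items.
concern = pd.DataFrame({
    "concern_1": [1, 2, 2, 4, 5], "concern_2": [2, 2, 3, 4, 4],
    "concern_3": [1, 3, 2, 5, 4], "concern_4": [2, 1, 3, 4, 5],
})
print(cronbach_alpha(concern))        # reported in the thesis: .81
print(corrected_item_total(concern))  # reported: all above .30
```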


Online privacy behaviour

We modified some scale items from an established scale (Lwin et al. 2007) to measure the construct online privacy behaviour. Online privacy behaviour is operationalized with the three following statements: 1) I would register on this website. 2) I would agree to provide personal information to this website. 3) I would visit this website without concern. These statements are measured on a 5-point Likert scale (1 = Fully agree, 5= Fully disagree). In the current sample, Cronbach’s alpha coefficient indicated a good internal consistency (α =.79). The corrected item-total correlations indicate that all the items have a good correlation with the total score of the scale (all above .30).

Transparency in the implementation of the GDPR

Transparency in the implementation of the GDPR is operationalized by showing a consent with no GDPR articles highlighted, or a consent with one of the three selected GDPR articles highlighted.

Industry sensitivity

Industry sensitivity is operationalized by exposing participants to either an industry with low sensitivity (car rental company YouCar) or an industry with high sensitivity (dentist De Wit). These industries were selected based on the outcomes of the pre-test (chapter 3.1).

Online privacy attitude

We modified some scale items from an established scale (Okazaki et al., 2009) to measure the construct online privacy attitude. Online privacy attitude is operationalized with the three following statements: 1) Companies seeking information online should disclose the way the data are collected, processed, and used. 2) A good consumer online privacy policy should have clear and conspicuous disclosure. 3) It is very important to me that I am aware of and knowledgeable about how my personal information will be used by websites. In the current sample, Cronbach’s alpha coefficient indicated a reasonable internal consistency (α = .69). The corrected item-total correlations indicate that all the items have a good correlation with the total score of the scale (all above .30).

Trust in the company

We modified some scale items from an established scale (Hong & Cho, 2009) to measure the construct trust in the company. Trust in the company is operationalized with the following statements: 1) I think that the information offered by this site is sincere and honest, 2) I think I can have confidence in the promises that this website makes, 3) This website does not make false statements, 4) This website is characterised by the frankness and clarity of the services that it offers to the consumer. These statements are measured on a 5-point Likert scale (1 = Fully agree, 5= Fully disagree). In the current sample, Cronbach’s alpha coefficient indicated a good internal consistency (α =.79). The corrected item-total correlations indicate that all the items have a good correlation with the total score of the scale (all above .30).

Online privacy risk profile

We modified some scale items from an established scale (Okazaki et al., 2009) to measure the construct online privacy risk profile. Online privacy risk profile is operationalized with the following statements: 1) It would be risky to give personal information to online companies. 2) There would be high potential for information loss associated with giving personal information to online companies. 3) Providing online firms with personal information would involve many unexpected problems related to privacy. 4) I would not feel safe giving personal information to online companies. In the current sample, Cronbach’s alpha coefficient indicated a good internal consistency (α = .83). The corrected item-total correlations indicate that all the items have a good correlation with the total score of the scale (all above .30).

Demographics

Gender, age, and highest level of education were assessed with a self-report questionnaire. Gender has been reported on two levels: male and female. Age has been measured on three levels: 18-29 years old (young), 30-49 years old (middle), 50 years or older (old). Highest level of education was also measured on three levels: high school or lower (low), university bachelor (middle), university master or PhD (high).


4 Results

The statistical analyses were performed with IBM SPSS Statistics 25. This chapter describes the statistical analysis and the results of this research. First, it is explained how the data were prepared for analysis. Next, the reliability and correlations are described, and means are compared. After this, the hypotheses are tested with a PROCESS analysis for direct and indirect effects in moderated mediation models with multiple moderators.

4.1 Data Preparation

After removal of incomplete surveys (N=179), 281 complete surveys remained. There was no need to recode reversed scale items. Two demographic variables (age and highest level of education) needed to be recoded because some categories contained too few participants (see chapter 3.3). Furthermore, dummies were computed for the independent variable transparency and the control variables age and highest level of education.
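A minimal sketch of these preparation steps (recoding and dummy coding), with hypothetical column names; the actual preparation was done in SPSS:

```python
import pandas as pd

# Hypothetical raw data with the variables described above.
df = pd.DataFrame({
    "age_group": ["18-29", "30-49", "50+", "18-29"],
    "education": ["low", "middle", "high", "middle"],
    "transparency": ["no GDPR", "GDPR", "GDPR", "no GDPR"],
})

# Dummy-code transparency (0 = no GDPR article highlighted, 1 = highlighted).
df["transparency_d"] = (df["transparency"] == "GDPR").astype(int)

# Dummies for the recoded control variables; drop_first avoids
# perfect multicollinearity by using one category as the reference.
dummies = pd.get_dummies(df[["age_group", "education"]], drop_first=True)
df = pd.concat([df, dummies], axis=1)
print(df.head())
```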

The conditions were fairly equally distributed amongst the participants (Table 7): condition 1 consists of 28 surveys, condition 2 of 42, condition 3 of 37, condition 4 of 35, condition 5 of 34, condition 6 of 38, condition 7 of 34 and condition 8 of 33.

Condition                                                   N
1: car rental, no GDPR article highlighted                  28
2: car rental, article right to access highlighted          42
3: car rental, article right to be forgotten highlighted    37
4: car rental, article security of processing highlighted   35
5: dentist, no GDPR article highlighted                     34
6: dentist, article right to access highlighted             38
7: dentist, article right to be forgotten highlighted       34
8: dentist, article security of processing highlighted      33
Table 7: Distribution of conditions

To examine the structure of the scales, a Principal Axis Factoring analysis with Varimax rotation was conducted on the 17 items. The scree plot (Figure 2) shows that the first five factors account for most of the total variability in the data (56.31%, given by the eigenvalues). The eigenvalues of the first five factors are all greater than 1.

Figure 2: Scree Plot Factor Analysis

Appendix 6 shows the factor loadings after rotation. The loadings suggest that the first component is represented by online privacy attitude, the second by online privacy concern, the third by online privacy behaviour, the fourth by trust in the company, and the fifth by online privacy risk profile. Only one factor loading suggests that an item assigned to the component online privacy behaviour (0.343) would fit better in the component trust in the company (0.403) (Appendix 6). As this is the only item that does not have the highest factor loading in its own component, it was not reassigned to another component.
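The factor-retention logic behind the scree plot can be illustrated with the eigenvalues of the item correlation matrix; under the Kaiser criterion, factors with eigenvalues greater than 1 are retained. A minimal sketch with simulated stand-in data (the extraction and Varimax rotation themselves were done in SPSS and are omitted here):

```python
import numpy as np
import pandas as pd

# Simulated stand-in for the 281 x 17 item matrix used in the thesis.
rng = np.random.default_rng(0)
items = pd.DataFrame(rng.normal(size=(281, 17)))

# Eigenvalues of the correlation matrix, sorted in descending order.
eigenvalues = np.linalg.eigvalsh(items.corr().to_numpy())[::-1]

# Kaiser criterion: retain factors with eigenvalues > 1
# (in the thesis data, the first five eigenvalues exceeded 1).
retained = (eigenvalues > 1).sum()
explained = eigenvalues[:retained].sum() / eigenvalues.sum()
print(retained, f"{explained:.2%} of total variance")
```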


Furthermore, Cronbach’s alpha was checked for the scales online privacy attitude, online privacy concern, online privacy behaviour, trust in the company and online privacy risk profile. The results of this test can be found in Appendix 5. All the scales had good reliability. Besides that, the corrected item-total correlations indicated that all the items have a good correlation with the total score of the scale (all above .30). In none of the scales would deleting an item substantially improve reliability.

4.2 Comparing experimental groups to background characteristics

To ensure that the experimental groups had no significant differences, Chi-Square tests were conducted. The tests showed that all groups are similarly distributed: no significant differences were found in gender (Χ2 = 9.292, p = .232), age (Χ2 = 8.076, p = .885), or highest level of education (Χ2 = 13.300, p = .503). Because the groups are comparable to each other, differences between them cannot be traced back to background characteristics.
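A minimal sketch of such a randomization check in Python, using a chi-square test of independence on a condition-by-gender crosstab (hypothetical data; the reported tests were run in SPSS):

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical survey data with one row per respondent.
df = pd.DataFrame({
    "condition": [1, 2, 3, 4, 5, 6, 7, 8] * 10,
    "gender": ["male", "female"] * 40,
})

# Chi-square test of independence: are gender proportions
# comparable across the eight experimental conditions?
crosstab = pd.crosstab(df["condition"], df["gender"])
chi2, p, dof, expected = chi2_contingency(crosstab)
print(f"X2 = {chi2:.3f}, df = {dof}, p = {p:.3f}")
```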

4.3 Comparing means

The means, standard deviations, and correlations of the primary variables of this study are reported in Table 8, which shows the correlations between the variables and their strength. For this analysis, the independent variables (transparency in the implementation of the GDPR, industry sensitivity, online privacy attitude, trust in the company, online privacy risk profile), the dependent variables (privacy concern, privacy behaviour) and the control variables (gender, age, highest level of education) are included. Because the demographic variables (gender, age, highest level of education) are significantly correlated with the dependent variables in the model, they are included in the analysis as control variables (Table 8).


Table 8 also shows several significant correlations between the variables from the model. First, trust in the company is correlated with online privacy behaviour (r = .53, p < .01): the higher participants score on trust in the company, the more likely they are to score higher on online privacy behaviour. This is not surprising, as it suggests that when participants have more trust in a company, they show more positive online privacy behaviour (visiting, registering and providing personal information to the website). Furthermore, online privacy risk profile is correlated with online privacy concern (r = .49, p < .01): participants who score higher on online privacy risk profile are more likely to score higher on online privacy concern, as we predicted. Also, online privacy attitude is correlated with online privacy concern (r = .33, p < .01): if participants have a negative attitude about online privacy, they are more likely to be concerned about it. Furthermore, trust in the company is negatively correlated with online privacy risk profile (r = -.20, p < .01) and online privacy concern (r = -.30, p < .01): the higher participants score on trust in the company, the lower they tend to score on online privacy risk profile and online privacy concern. Also, online privacy risk profile is negatively correlated with online privacy attitude (r = -.38, p < .01) and online privacy behaviour (r = -.37, p < .01): the higher participants score on online privacy risk profile, and thus the more concerned they are about their privacy in general, the lower they score on online privacy attitude and online privacy behaviour in the experiment. Online privacy attitude is negatively correlated with online privacy behaviour (r = -.31, p < .01): participants who score higher on online privacy attitude are more likely to score lower (and thus more negative) on online privacy behaviour. Lastly, online privacy concern is negatively correlated with online privacy behaviour (r = -.44, p < .01), as expected: participants who score high on online privacy concern score lower (and thus more negative) on online privacy behaviour.


                                  M     SD    1      2      3     4     5     6      7      8      9
1. Gender a                       1.50  .50   -
2. Age                            1.46  .70   .04    -
3. Highest level of education     2.25  .70   .02    .10    -
4. Transparency b                 .78   .42   .03    .01   -.02   -
5. Industry sensitivity           .49   .50   .02    .06    .09  -.06   -
6. Trust in the company           4.01  1.35  -.06  -.17** -.04   .08   .06   -
7. Online privacy risk profile    4.71  1.33  .07    .20** -.08  -.05   .04  -.20**  -
8. Online privacy attitude        5.96  1.00  -.03  -.17** -.02  -.05   .06  -.11   -.38**  -
9. Online privacy concern         5.30  1.88  -.14*  .28**  .08  -.01  -.02  -.30**  .49**  .33**  -
10. Online privacy behaviour      4.31  1.50  -.09  -.25**  .06   .04   .00   .53** -.37** -.31** -.44**
** p < .01, * p < .05
a Gender (0 = Male, 1 = Female)
b Transparency (0 = No GDPR, 1 = GDPR)
Table 8: Means, standard deviations, and correlations
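A minimal sketch of how a correlation matrix with significance stars, as in Table 8, can be produced (hypothetical data; variable names are illustrative):

```python
import pandas as pd
from scipy.stats import pearsonr

def correlation_table(df: pd.DataFrame) -> pd.DataFrame:
    """Lower-triangle Pearson correlations with significance stars."""
    cols = df.columns
    out = pd.DataFrame("", index=cols, columns=cols)
    for i, a in enumerate(cols):
        for b in cols[:i]:
            r, p = pearsonr(df[a], df[b])
            stars = "**" if p < .01 else "*" if p < .05 else ""
            out.loc[a, b] = f"{r:.2f}{stars}"
    return out

# Hypothetical scale scores for three of the model variables.
data = pd.DataFrame({
    "trust": [4, 5, 3, 2, 4, 5, 1, 3],
    "concern": [5, 4, 6, 7, 4, 3, 7, 6],
    "behaviour": [4, 5, 3, 2, 5, 5, 2, 3],
})
print(correlation_table(data))
```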

4.4 Manipulation check

An independent-samples t-test was used to check whether the consent was perceived significantly differently across the transparent groups (GDPR articles highlighted) and non-transparent groups (no GDPR articles highlighted). Immediately after the participants had seen the consent, five control questions were asked to check whether the form was clear and detailed. Besides that, manipulation questions were asked to check whether the participants noticed the difference between the highlighted articles of the GDPR (right to access, right to be forgotten, security of processing) in the consent.

The groups that were assigned to the manipulations with transparent consents (GDPR articles highlighted) rated the clarity of the consent higher (M = 4.35, SD = 1.99) than the non-transparent groups (no GDPR articles highlighted) (M = 3.63, SD = 2.01). This difference was significant (t (279) = 2.48, p = .014).


The transparent groups rated the detailedness of the consent slightly higher (M = 3.25, SD = 1.89) than the non-transparent groups (M = 3.02, SD = 1.95), but this difference was not significant (t (279) = .84, p = .400).

The transparent groups rated the perceived right to access in the consent higher (M = 3.53, SD = 2.02) than the non-transparent groups (M = 2.81, SD = 1.93). This difference was significant (t (279) = 2.51, p = .013).

The transparent groups rated the perceived right to be forgotten in the consent higher (M = 3.20, SD = 1.99) than the non-transparent groups (M = 2.58, SD = 1.81). This difference was significant (t (106.47) = 2.33, p = .022).

The transparent groups rated the perceived security of processing in the consent higher (M = 3.54, SD = 1.77) than the non-transparent groups (M = 3.21, SD = 1.86), but this difference was not significant (t (279) = 1.28, p = .201).

Three of the five control questions showed significant differences. Participants assigned to the non-transparent group (no GDPR articles highlighted) did not differ significantly from participants assigned to the transparent group (GDPR articles highlighted) on perceived detailedness or perceived security of processing.

A one-way between-subjects ANOVA was conducted to continue the manipulation check, comparing the four experimental conditions. This test showed no significant effect on perceived clarity of the consent (F (3, 277) = 2.25, p = .083). There was also no significant effect on perceived detailedness of the consent (F (3, 277) = 0.71, p = .544). Finally, the results showed no significant effect on perceived security of processing (F (3, 277) = .75, p = .524).

However, the one-way ANOVA did find some significant effects: there was a significant effect on the perceived right to access (F (3, 277) = 15.35, p < .001) and on the perceived right to be forgotten (F (3, 277) = 14.98, p < .001).
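A sketch of such a one-way ANOVA, again reusing the hypothetical df from the earlier sketches and assuming a condition column that labels the four experimental groups:

# Hypothetical sketch of the one-way ANOVA on perceived right to access;
# the 'condition' column (four experimental groups) is an assumption.
from scipy import stats

groups = [g["perceived_right_to_access"].to_numpy()
          for _, g in df.groupby("condition")]
F, p = stats.f_oneway(*groups)
print(f"F = {F:.2f}, p = {p:.3f}")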

This means that not all manipulations were experienced as expected from the pre-test; the manipulation of transparency was only partly successful. This should be taken into consideration when testing the hypotheses.

4.5 Testing hypotheses

The PROCESS macro was used to run the analysis in SPSS, since it can estimate direct and indirect effects in moderated mediation models with multiple moderators. We used model 45 (Hayes, 2013) to test a conditional indirect effect of transparency in the implementation of the GDPR on online privacy behaviour through online privacy concern. This indirect effect was tested with the following moderators: industry sensitivity and online privacy attitude on the path to online privacy concern, and trust in the company and online privacy risk profile on the path to online privacy behaviour. Besides the indirect effect, the direct effect of transparency in the implementation of the GDPR on online privacy behaviour was tested. The model is visualized in the conceptual model (chapter 2.9).
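At its core, PROCESS model 45 estimates two OLS regressions with interaction terms: a mediator model for online privacy concern and an outcome model for online privacy behaviour. The sketch below approximates those two equations in Python with statsmodels; the variable names are assumptions, and the bootstrapping of conditional indirect effects that PROCESS adds is omitted.

# Simplified approximation of the two regressions behind PROCESS model 45;
# variable names are hypothetical and bootstrapped indirect effects are omitted.
import statsmodels.formula.api as smf

# Mediator model (Table 9): concern regressed on transparency,
# the moderated a-paths and the covariates
m_model = smf.ols(
    "concern ~ transparency * industry_sensitivity + transparency * attitude"
    " + gender + C(age_group) + C(education)",
    data=df).fit()

# Outcome model (Table 10): behaviour regressed on concern, the moderated
# b-paths, the direct effect of transparency and the covariates
y_model = smf.ols(
    "behaviour ~ concern * trust + concern * risk_profile + transparency"
    " + gender + C(age_group) + C(education)",
    data=df).fit()

print(m_model.summary())
print(y_model.summary())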

4.5.1 Online privacy concern

In Table 9 the results for the direct and indirect effects of transparency in the implementation of the GDPR on online privacy concern with moderation of online privacy attitude, industry sensitivity and the control variables gender, age and highest level of education can be found.

                                             B       t      p        F      p      R2
Online Privacy Concern
Overall model                                                      6.53   .000*   .19
Transparency                                .12     .07   .942
Industry sensitivity                       -.94   -2.05   .041*
Transparency X Industry sensitivity         .91    1.77   .078
Online privacy attitude                     .66    2.71   .007*
Transparency X Attitude                    -.85    -.32   .751
Gender                                      .50    2.41   .016*
Age young (versus age old)                -1.09   -3.19   .001*
Age middle (versus age old)                -.51   -1.35   .177
Education low (versus education high)      -.37   -1.12   .239
Education middle (versus education high)   -.22    -.92   .359

* = significant at the p < .05 level
Table 9: PROCESS (Hayes) results, dependent variable: Online Privacy Concern

The model (transparency in the implementation of the GDPR, online privacy attitude, industry sensitivity, gender, age and highest level of education) explained a significant portion of the variance in online privacy concern scores, R2 = .19, F (10, 270) = 6.53, p<.001.

Transparency in the implementation of the GDPR does not significantly predict scores on online privacy concern, b = .12, t (281) = .07, p > .05. The null hypothesis cannot be rejected: there is no difference between the transparency groups (no GDPR versus GDPR in the consent) and no direct effect on online privacy concern. Therefore, H1 cannot be supported; there is no direct effect of transparency in the implementation of the GDPR on online privacy concern.

Industry sensitivity (low versus high sensitivity) does significantly predict scores on online privacy concern, b = -.94, t (281) = -2.05, p < .05. This means that participants in the high-sensitivity condition score lower on online privacy concern than participants in the low-sensitivity condition. However, H5a predicted a positive effect of industry sensitivity on online privacy concern, whereas the observed effect is negative. Therefore, H5a cannot be supported; there is no positive direct effect of industry sensitivity on online privacy concern. Furthermore, there is no significant interaction effect between transparency in the implementation of the GDPR and industry sensitivity on online privacy concern, b = .91, t (281) = 1.77, p > .05. Therefore, the null hypothesis cannot be rejected and H5b cannot be supported: industry sensitivity does not positively moderate the relation between transparency in the implementation of the GDPR and online privacy concern.

Online privacy attitude does significantly predict scores on online privacy concern, b = .66, t (281) = 2.71, p < .01. This means that participants who score higher on online privacy attitude also score higher on online privacy concern: the more negative the attitude about online privacy, the higher the online privacy concern. Therefore, the null hypothesis can be rejected and H6a can be supported: consumers are more concerned about the sensitivity of their data if they are more aware of online privacy. Furthermore, there is no significant interaction effect between transparency in the implementation of the GDPR and online privacy attitude on online privacy concern, b = -.85, t (281) = -.32, p > .05. The null hypothesis cannot be rejected and H6b cannot be supported: online privacy attitude does not positively moderate the relation between transparency in the implementation of the GDPR and online privacy concern.

The PROCESS analysis also shows the effect of the covariates. Gender significantly predicts scores on online privacy concern, b = .50, t (281) = 2.41, p < .05: women (1) are more likely to be concerned about their online privacy than men (0). Age young (18-29 years old) also significantly predicts scores on online privacy concern, b = -1.09, t (281) = -3.19, p < .01, with age old (50 years or older) as the reference category. Young consumers thus score significantly lower on online privacy concern than old consumers, meaning they are less concerned about their online privacy. Age middle (30-49 years old) does not significantly predict scores on online privacy concern, b = -.51, t (281) = -1.35, p > .05, with age old as the reference category: age middle does not score significantly differently on online privacy concern from age old. Educational level low (high school or lower) does not significantly predict scores on online privacy concern, b = -.37, t (281) = -1.12, p > .05, with educational level high (university master or PhD) as the reference category. Educational level middle (university bachelor) also does not significantly predict scores on online privacy concern, b = -.22, t (281) = -.92, p > .05, with educational level high as the reference category. Neither educational level low nor middle thus differs significantly from educational level high on online privacy concern.
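The age and education covariates above are dummy coded against a reference category, so each coefficient is a contrast with that reference level. A minimal sketch of this coding, with hypothetical column and level labels:

# Hypothetical sketch of reference-category dummy coding: 'old' and 'high'
# are dropped, so each remaining dummy is a contrast against that reference.
import pandas as pd

age_d = pd.get_dummies(df["age_group"], prefix="age").drop(columns="age_old")
edu_d = pd.get_dummies(df["education"], prefix="edu").drop(columns="edu_high")
df = df.join([age_d, edu_d])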

4.5.2 Online privacy behaviour

In Table 10 the results can be found for the direct and indirect effects of transparency in the implementation of the GDPR and online privacy concern on online privacy behaviour, with moderation of trust in the company and online privacy risk profile, and the control variables gender, age and highest level of education.

                                                  B       t      p        F      p      R2
Online Privacy Behaviour
Overall model                                                          19.51   .000*   .44
Online Privacy Concern                          -.56   -3.03   .003*
Transparency                                    -.01    -.38   .970
Trust                                           -.11    -.68   .497
Online Privacy Risk Profile                     -.10    -.66   .510
Online Privacy Concern X Trust                   .11    3.89   .000*
Online Privacy Concern X Online Privacy
  Risk Profile                                  -.01    -.46   .644
Gender                                          -.08    -.56   .579
Age young (versus age old)                       .46    2.01   .045*
Age middle (versus age old)                      .43    1.72   .087
Education low (versus education high)           -.41   -1.93   .055
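The significant Online Privacy Concern X Trust interaction in Table 10 can be probed with a pick-a-point approach: computing the conditional effect of concern on behaviour at low, mean and high values of trust. A sketch using the hypothetical outcome model fitted in the earlier PROCESS approximation (the coefficient labels are assumptions):

# Hypothetical pick-a-point probe of the concern-by-trust interaction,
# using the y_model from the earlier sketch.
b_concern = y_model.params["concern"]
b_interaction = y_model.params["concern:trust"]

for k, label in [(-1, "low"), (0, "mean"), (1, "high")]:
    w = df["trust"].mean() + k * df["trust"].std()
    effect = b_concern + b_interaction * w
    print(f"{label} trust ({w:.2f}): effect of concern on behaviour = {effect:.2f}")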
