
Master Thesis Business & ICT

Tim Reijnders

Privacy Enhancing Technologies

Case study: OV-Chip card


timreijnders@gmail.com November 2010

S1464248

Intellectual property disclosure:

Nothing from this paper may be reproduced, republished, distributed, transmitted, displayed, broadcast or otherwise exploited in any manner without the express prior written permission of the author.

Under no circumstances can the author, the University of Groningen or Accenture be held liable for any damages, whether direct, indirect, special or consequential, for lost revenues, lost profits, or otherwise, arising from or in connection with the content of this paper.

Accenture Nederland B.V. Technology Consulting Group


ACKNOWLEDGEMENTS

This MSc thesis is the conclusion of the Master of Business and ICT in Business Administration Sciences at the University of Groningen.

The goal of this research is to provide a framework that can increase the privacy of an information system with the use of privacy enhancing technologies, by answering the following research question: How can Privacy Enhancing Technologies increase the privacy of users of information systems, and specifically of the OV-Chip card?

This thesis was written under the supervision of the University of Groningen and the Security service line within Accenture. I would especially like to thank Shay Uzery from Accenture for his considerable effort in supporting and reviewing my work during the process of writing my thesis. I would also like to thank Laura Maruster from the University of Groningen for the time she spent supervising and providing feedback.

Furthermore, I would like to thank Patrick, Wouter, Hillebrant, Martijn and Hans for their company and laughs during the writing of our theses.

I would like to thank my girlfriend and friends for their moral support. Finally, I would like to thank my parents for their moral and financial support during the little over six years I studied in Groningen.


EXECUTIVE SUMMARY

This research shows how to increase the privacy of an information system using privacy enhancing technologies. It identifies 13 requirements in the literature that an information system should adhere to in order to collect, store and process personal information in a privacy-safe way. These privacy requirements provide a benchmark for organizations to evaluate which privacy requirements should be addressed within their information system. To address these privacy issues, this research describes 11 different Privacy Enhancing Technologies using the categorization by Koorn et al. (2004).


TABLE OF CONTENTS

ACKNOWLEDGEMENTS
EXECUTIVE SUMMARY
TABLE OF CONTENTS
TABLE OF FIGURES
INDEX OF TABLES
1. INTRODUCTION
1.1 TOPIC AND CONTEXT
1.2 RESEARCH QUESTIONS
1.3 RESEARCH DESIGN
1.4 SCOPE
2. PRIVACY
2.1 THE NEED FOR PRIVACY
2.2 BUSINESS PRIVACY INCENTIVES
2.3 DEFINING PRIVACY AND ANONYMITY
2.4 PRIVACY REQUIREMENTS
2.4.1 COLLECTION LIMITATION
2.4.2 DATA QUALITY
2.4.3 PURPOSE SPECIFICATION
2.4.4 USE LIMITATION
2.4.5 SECURITY SAFEGUARDS
2.4.6 NOTICE/OPENNESS
2.4.7 ACCESS/PARTICIPATION
2.4.8 ACCOUNTABILITY
2.4.9 CHOICE/CONSENT
2.4.10 ANONYMITY AND PSEUDONYMITY
2.4.11 EASE OF ADOPTION
2.4.12 EASE OF COMPLIANCE
2.4.13 USABILITY
2.4.14 USER PREFERENCE
2.4.15 NEGOTIATION
2.4.16 RESPONSIVENESS
2.4.17 ENFORCEMENT/REDRESS
2.4.18 ONWARD TRANSFER
2.4.19 SECLUSION
2.4.20 LOCALITY AND PROXIMITY
2.4.21 CONFIDENTIALITY
2.4.22 TRAFFIC DATA
2.4.23 PREVENTING HARM
2.5 DISCUSSION
2.6 CONCLUSION
3. PRIVACY ENHANCING TECHNOLOGIES FOR INFORMATION SYSTEMS
3.1 INFORMATION SYSTEMS
3.2 DEFINE PRIVACY ENHANCING TECHNOLOGIES (PET)
3.3 THE PET STAIRCASE
3.3.1 GENERAL PET MEASURES
3.3.1.1 Encryption
3.3.1.2 Access Security
3.3.1.3 Data Minimization
3.3.1.4 Authorization Technologies
3.3.1.5 Quality Enhancing Technologies
3.3.1.6 Retention Management
3.3.2 SEPARATION OF DATA
3.3.2.1 Separate Identity and Pseudo-Identity
3.3.2.2 Identity Protector
3.3.3 PRIVACY MANAGEMENT SYSTEMS
3.3.3.1 P3P & EPAL
3.3.3.2 PISA
3.3.3.3 Privacy Rights Management
3.4 NON-ADDRESSED PRIVACY REQUIREMENTS
3.5 PRIVACY REQUIREMENTS FRAMEWORK
3.5.1. EVALUATE CURRENT SITUATION
3.5.2. PET SELECTION
3.5.3. PRIVACY AUDITS
4. OV-CHIPKAART
4.1 INTRODUCTION
4.2 OV-CHIP CARD INFORMATION SYSTEM
4.2.1. LEVEL 0
4.2.2. LEVEL 1
4.2.3. LEVEL 2
4.2.4. LEVEL 3
4.2.5. LEVEL 4
4.3 PRIVACY FRAMEWORK
4.3.1. EVALUATION CURRENT SITUATION
4.3.1.1 Collection, storage and processing of Personal Information
4.3.1.3 Identify Gaps in Privacy Protection
4.3.2. PET SELECTION
4.3.3. PRIVACY AUDITS
4.4 FOCUSED PET ARCHITECTURE
5. CONCLUSIONS AND RECOMMENDATIONS
5.1 CONCLUSIONS
5.2 SCIENTIFIC RELEVANCE
5.3 MANAGERIAL RELEVANCE
5.4 LIMITATIONS
5.5 DIRECTIONS FOR FURTHER RESEARCH
6. GLOSSARY
7. REFERENCES
APPENDIX A


TABLE OF FIGURES

Figure 1: Research Model
Figure 2: The PET-Staircase, translated from Dutch (Koorn et al., 2004)
Figure 3: Authorization Process
Figure 4: Privacy Framework
Figure 5: The Structure of the OV-Chip card System


INDEX OF TABLES

Table 1: Short Overview of the Privacy Principles
Table 2: Overview Privacy Requirements in Literature
Table 3: Final Privacy Requirements
Table 4: PET & Privacy Requirements
Table 5: Summary of PI Collected, Stored and Processed within the 5 Levels
Table 6: Privacy Threats
Table 7: PET Solutions


1. INTRODUCTION

This chapter provides the introduction to this MSc thesis. Section 1.1 provides information about the topic and context of this thesis. The research questions are presented in section 1.2. The research design is addressed in section 1.3. Section 1.4 defines the scope of the research.

1.1 TOPIC AND CONTEXT

Privacy and security are two factors in today’s society that seem to be in conflict (Aquilina, 2010). The increasing digitalization of people’s lives enables governments and companies to follow and contact people in a fashion that could lead to violations of their privacy. Society is struggling to find a balance between the need for privacy and the need for security (Muller et al., 2007).

The rise of information technologies provides the government with a growing spectrum of possibilities to increase the security of its citizens. These Security Enhancing Technologies (SETs) create an environment which is more secure: people are monitored and possible problems are easily discovered. However, society needs to bear in mind that privacy is a fundamental right, as defined in Article 12 of the Universal Declaration of Human Rights of the United Nations (United Nations, 1948). According to Blarkom et al. (2003), informational privacy has two distinct characteristics:

1. The right to be left alone

2. The right to decide oneself what to reveal about oneself

In the digitalizing world, privacy will become more important and in need of protection. Fischer-Hübner (2001) identifies four ways in which privacy protection can be achieved: (1) protection by government laws, (2) protection by privacy-enhancing technologies, (3) self-regulation for fair information practices by codes of conduct, and (4) privacy education of consumers and professionals. These types of privacy protection often overlap and need to work together in order to provide adequate privacy protection. The focus of this research is on privacy protection by privacy enhancing technologies.

Today there are several directives in Europe that regulate the protection of privacy. The most important is Directive 95/46/EC, the Data Protection Directive (European Parliament, 23 November 1995). The Directive, in effect since 1998, requires all EU member states to introduce the necessary legislation to protect the right to privacy with respect to the collection, processing, storage and transmission of personal data. Among the objectives of the Directive is the free flow of personal data between European countries, as well as the transfer of personal data to countries outside Europe on the condition that they have enforced an appropriate level of data protection (Lioudakis et al., 2007).


PET can protect the privacy of consumers when they are using an electronic service. The government can use several of these technologies to better protect the privacy of its citizens. Companies can use PET to gain the trust of consumers and thus build a competitive advantage over their competitors. Revocable privacy (Stadler, 1996) is a concept that can enable the government and companies to strike a balance between the need for privacy and the need for security. Instead of having an organization that uses and controls the information of consumers, there will be a trusted third party (TTP) that has control over the data. A TTP is an entity that decides whether data can be retrieved if certain predetermined conditions are fulfilled. It is paramount to the revocable privacy concept that revoking privacy is only used when demanded and under very strict predetermined conditions.

The OV-Chip card is a system that enables people to use and pay for public transport in the Netherlands. It is a system that was initiated and is supervised by the government to increase the use and safety of the public transport system. One of the challenges regarding the OV-Chip card is the protection of the privacy of the consumers. In this thesis we use the OV-Chip card as a case study that focuses on the collection, storage and processing of personal information by Trans Link Systems1 (TLS) and public transport organizations (PTOs) and the associated privacy threats. It illustrates the implementation of the privacy framework developed in this paper and shows how it can increase the privacy of the OV-Chip card information system.

1 Trans Link Systems was founded by the five largest public transportation companies in the Netherlands to achieve the implementation of the OV-Chip card.

1.2 RESEARCH QUESTIONS

With respect to the aforementioned topics of security and privacy the following problem statement can be derived.

Is the privacy of consumers of information systems, especially in case of the OV-Chip card, adequately protected?

From the problem statement the following research question and sub-questions are derived:

How can Privacy Enhancing Technologies increase the privacy of users of information systems, and specifically of the OV-Chip card?

The following sub-questions are developed:

1. What are the privacy issues that concern the collection, storage and processing of personal information of people using an information system?

2. What are Privacy Enhancing Technologies (PET) and how can PET guarantee the privacy of people using an information system?

3. How can privacy requirements be incorporated in a PET-based information system architecture for the OV-Chip card?

4. How can PET address the privacy concerns of users of the OV-Chip card information system?

1.3 RESEARCH DESIGN


This research will provide a list of requirements information systems have to implement in order to adequately protect the privacy of their users. Furthermore, this research will discuss the PET that enable an information system to adhere to the aforementioned privacy requirements. After that, a framework is developed which helps companies to change their conventional information system into a privacy-safe information system. As a case study, this research will propose a focused architecture for the OV-Chip card information system that incorporates the list of requirements aimed at the implementation of privacy enhancing technologies. This study is done on the basis of a literature review and interviews with experts in the field. No statistical data are gathered to verify the findings. In this thesis, information about security and privacy is gathered from the available literature sources and expert interviews, and is analyzed following a qualitative research method.

Figure 1: Research Model

Figure 1 provides the research model used in this project. This research model is based on the research methodology described by Verschuren and Doorewaard (2007). Their method helps to define the research questions and to create an overview of the activities that need to be performed.

The research model in Figure 1 is divided into five parts. Each part portrays a deliverable and/or an activity.

1. A literature study on the requirements an information system should adhere to in order to become a privacy-safe information system. Several requirements, guidelines and principles are discussed in the literature. The basis for this research is an extensive search in the academic literature and privacy regulations.

2. The second part of the research model will provide a merged list of privacy requirements distilled from the literature. These privacy requirements need to be addressed in order to increase the privacy of an information system. The merged list of privacy requirements is validated by experts in the field. Semi-structured interviews are held with these experts to find out which requirements are important and which are not. Next to the merged list of privacy requirements, several PET identified in the literature are investigated.

3. In the framework the merged list of privacy requirements and the PET will be combined. The framework will show what PET are needed to address the privacy requirements.

4. A focused architecture for the OV-Chip card is proposed, in order to illustrate the application of the framework. To build the OV-Chip card case, several semi-structured interviews are held with experts involved with implementing the OV-Chip card in the Netherlands.

1.4 SCOPE

The research will focus on the privacy requirements for back-end information systems that collect, store and process personal information for companies and the government. Furthermore, it will focus on the Privacy Enhancing Technologies that should be implemented in these information systems.


2. PRIVACY

As early as 1890, Warren and Brandeis evaluated the concept of privacy in modern society (Warren & Brandeis, 1890). They acknowledged that privacy as a fundamental right was endangered by technological advances. Today the scientific debate about privacy in an increasingly digital society is as important as ever.

This chapter introduces the privacy concept and explains which privacy principles have to be considered when developing an information system. Paragraph 2.1 explains why privacy is needed. Why do people consider privacy to be an important value? Paragraph 2.2 discusses the benefits of privacy protection from a company’s perspective. Paragraph 2.3 defines privacy and anonymity. Paragraph 2.4 provides a list of privacy requirements derived from the literature and applicable legislation that explain the principles information systems should adhere to. In paragraph 2.5, the importance of these privacy requirements is discussed. From this discussion the eventual list with 13 privacy requirements is constructed.

2.1 THE NEED FOR PRIVACY

On average, in 2003, 60% of all EU15 citizens were concerned, to a greater or lesser degree, about the broad issue of the protection of privacy. This figure shows a small increase on the figures recorded in an identical poll seven years previously. Virtually two-thirds (64%) of EU15 citizens polled tended to agree that they were worried about leaving personal information on the internet (The European Opinion Research Group, 2003). This research and others (Business Week/Harris Poll, 2000; Cranor et al., 1999) have indicated that citizens are increasingly concerned about threats to their privacy.

2 EU15 is short for the member countries in the European Union prior to the accession of ten candidate countries on 1 May 2004.

The aforementioned researchers indicate that EU15 citizens are concerned about their privacy due to the increasing digitalization of society and a larger need for a secure society. This paragraph will explain why privacy is important to people and will provide insight into the directions in which their privacy can be improved.


Consumers may perceive a lack of control over the solicitation, storage, use and disclosure of various types of personal information (Hann et al., 2007). Such perceptions could lead to a decreasing use of transactions that involve personal information solicitation (Culnan, 1993).

2.2 BUSINESS PRIVACY INCENTIVES

From a company perspective there is much to gain from implementing sound privacy measures. From the research discussed in paragraph 2.1, it is evident that consumers value their privacy. By implementing sound privacy protection, companies can gain in two ways. First, they can gain a competitive advantage, since consumers value their privacy and as such value companies that guard it. Second, they are less likely to have to endure the implications of a privacy breach.

A majority of organizations have lost personal information, and among these organizations the biggest causes are internal and therefore something they could potentially control. This suggests that accountability for, and ownership of, how sensitive data is used may be lacking in many organizations. A survey conducted by Accenture (2010) polled 5,500 business leaders in 19 countries; fifty-one percent of the participants were in management positions. According to this research, fifty-eight percent of the executives polled said they have lost sensitive personal information, and for nearly 60 percent of those who have had a breach, it was not an isolated event.

Companies have much to lose if they do not implement sound privacy protection measures. As organizations find their data management activities receiving more scrutiny from a privacy perspective, information systems managers should be aware of exposures and be accountable to their organizations (Straub & Collins, 1990; Smith et al., 1996).

For companies, privacy breaches can have serious implications, with substantial financial costs attached to responding to and remedying the breach (Accenture, 2010), namely:

• Regulatory enforcement and lawsuits if privacy regulations are not complied with;

• Erosion of shareholder value: Campbell et al. (2003) find a highly significant negative market reaction to information security breaches involving unauthorized access to confidential data;

• Inability to conduct business or, in the most extreme case, a collapse of political and economic stability.

2.3 DEFINING PRIVACY AND ANONYMITY

In order to build on the concept of privacy a definition is needed. A well-known and influential definition is given by Westin (1967):

Privacy is the claim of individuals to determine for themselves when, how and to what extent information about them is communicated to others

In recent years, increasing attention has been paid to the issue of information privacy, including (a) the rights of individuals to control information about themselves and (b) the organizational policies and practices required to protect the same rights (Stone et al., 1983). They define information privacy as:

The ability of the individual to personally control information about one's self.


to increase his or her level of control over personal information (Awad & Krishnan, 2006). Control of personal information requires that an individual manages the outflow of information as well as the subsequent disclosure of that information to third parties (Awad & Krishnan, 2006).

To increase the control of personal information one should try to increase its anonymity within information systems. Goldberg et al. (1997) divide anonymity into two cases: persistent anonymity (or pseudonymity), where the user maintains a persistent online persona which is not connected with the user's physical identity, and one-time anonymity, where an online persona lasts for just one use. The key concept here is that of linkability; when using persistent anonymity a number of messages can be linked to each other, whereas with one-time anonymity none of the messages can be linked (Goldberg et al., 1997).
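To make the linkability distinction concrete, here is a minimal Python sketch (our illustration, not taken from the thesis): a persistent pseudonym derived with an HMAC stays stable across messages, while a one-time persona is fresh randomness. The secret key and the e-mail address are invented placeholders.

```python
import hashlib
import hmac
import secrets

SERVER_SECRET = secrets.token_bytes(32)  # held by the pseudonym issuer, never published

def persistent_pseudonym(identity):
    """Same identity -> same pseudonym: messages are linkable to each other,
    but not to the physical identity without SERVER_SECRET."""
    return hmac.new(SERVER_SECRET, identity.encode(), hashlib.sha256).hexdigest()[:16]

def one_time_persona():
    """A fresh random persona per use: no two messages can be linked."""
    return secrets.token_hex(8)

user = "alice@example.com"  # hypothetical user
print(persistent_pseudonym(user) == persistent_pseudonym(user))  # True: linkable
print(one_time_persona() == one_time_persona())                  # False: unlinkable
```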

2.4 PRIVACY REQUIREMENTS

This paragraph will provide a review of the literature on fair information practice principles and combine these insights to formulate a list of privacy requirements an information system should adhere to. The final privacy requirements list states the requirements a privacy-safe information system should address. In many papers different concepts and principles are used that sometimes conflict and often overlap each other (Cate, 2006). At the end of the paragraph Table 2 provides an overview of the literature sources used and the privacy requirements that were mentioned in those sources. The list with privacy requirements is validated during interviews with experts in the field.

Because of technology advances, lower data storage costs, the rise of the internet and the emergence of major data brokerage companies, the volume of personal data that is being collected and shared by organizations is growing exponentially (Accenture, 2010). To protect consumers from privacy risks associated with an increasing digitalization, several researchers have provided principles that an information system should adhere to (Fischer-Hübner, 2001; Wang & Kobsa, 2009; Langheinrich, 2001; Lioudakis et al., 2007; Borking, 2010).

The most important source of privacy guidelines is the OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data (OECD, 1980). The privacy requirements that are part of the OECD Guidelines are shown in Table 2. However, these guidelines are in need of enhancement, since the technical possibilities have changed dramatically since the 1980s (Clarke, 2000).

The OECD Guidelines influence privacy legislation worldwide, of which the European Directive 95/46/EC (European Parliament, 23 November 1995) is the most influential. The Directive, in effect since 1998, requires all member states to introduce the necessary legislation to protect the right to privacy with respect to the collection, processing, storage and transmission of personal data. Among the objectives of the Directive is the free flow of personal data between the European countries, as well as the restriction of personal data transfer to only those countries outside Europe that have enforced an appropriate level of data protection (Lioudakis et al., 2007).


The party responsible for the processing of personal data has to take appropriate technical measures to protect it against loss and against any form of unlawful processing. Moreover, the article states that unnecessary collection and further processing of unnecessary data should be avoided (Koorn et al., 2004).

Several researchers (Fischer-Hübner, 2001; Langheinrich, 2001; Borking & Raab, 2001; Borking, 2010) describe the functional privacy principles stated by the Organization for Economic Co-operation and Development (OECD, 1980). These principles constitute the basis for the definition of the functional requirements for a system that manages privacy on a large scale. Furthermore, many authors have supplemented the OECD principles with additional privacy requirements.

Borking (2010) describes the OECD principles and adds choice/consent, limiting linkability, challenging compliance, confidentiality, traffic data, location data and unsolicited communications as requirements for information systems. Wang & Kobsa (2009) considered the privacy principles developed by the OECD and extended these with anonymity-related principles and other desirable principles for privacy enhancement. Other authors that have researched privacy principles and have adopted different sets of requirements for their research are Langheinrich (2001), Lioudakis (2007) and Kobsa (2007). Next to the academic literature, other institutions have researched privacy requirements as well. The APEC Privacy Framework (Asia-Pacific Economic Cooperation, 2004), the EU Data Protection Directive Principles (European Parliament, 23 November 1995), the FTC Privacy Principles (Federal Trade Commission, 2000) and the Safe Harbor Privacy Principles (U.S. Department of Commerce, 1999) have all adopted a certain combination of privacy requirements for information systems. In the next section these articles and directives are combined, from which this research derives the 13 privacy requirements that are necessary for a privacy-protected information system. Table 1 provides a short overview of the privacy requirements that will be discussed in the remainder of this paragraph.

1 Collection Limitation
2 Data Quality
3 Purpose Specification
4 Use Limitation
5 Security Safeguards
6 Notice/Openness
7 Access/Participation
8 Accountability
9 Choice/Consent
10 Anonymity and Pseudonymity
11 Ease of adoption
12 Ease of compliance
13 Usability
14 User preference
15 Negotiation
16 Responsiveness
17 Enforcement/Redress
18 Onward Transfer
19 Seclusion
20 Locality and Proximity
21 Confidentiality
22 Traffic Data
23 Preventing Harm

Table 1: Short Overview of the Privacy Principles

Each privacy requirement is defined below. In paragraph 2.5 the privacy requirements are discussed and their importance is explained. An extensive list of papers and other sources in which these requirements have been used is given in Table 2 at the end of this paragraph.

2.4.1 COLLECTION LIMITATION


The collection of personal data should be limited to what is strictly necessary to use the system. Furthermore, personal data should be obtained in a lawful manner and, where appropriate, with the knowledge of the data subject. Kobsa (2007) describes the minimization principles that should be considered for the collection limitation principle and the use limitation principle (see 2.4.4). These minimization principles provide a specific basis for requirements an information system should adhere to.

• Information systems should collect and use only the personal information that is strictly required for the purposes stated in the privacy policy.

• Information systems should not store personal information longer than needed for the stated purposes.

• Information systems should implement systematic mechanisms to evaluate, reduce, and destroy unneeded personal information on a regular basis, rather than retaining it indefinitely.

• Before deploying new activities and technologies that might impact personal privacy, they should be carefully evaluated for their necessity, effectiveness, and proportionality. One should always consider privacy implications when deploying a new activity.
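As an illustration of how the first minimization principle might be enforced in practice, the sketch below whitelists the fields a stated purpose strictly requires and drops everything else before storage. The purposes and field names are assumptions made for the example.

```python
# Map each stated purpose to the fields it strictly requires (illustrative).
REQUIRED_FIELDS = {
    "newsletter": {"email"},
    "shipping": {"name", "address", "postal_code"},
}

def collect(purpose, submitted):
    """Keep only the fields the stated purpose strictly requires;
    everything else is dropped instead of being stored."""
    allowed = REQUIRED_FIELDS[purpose]
    return {field: value for field, value in submitted.items() if field in allowed}

form = {"email": "a@example.com", "name": "Alice", "birthdate": "1990-01-01"}
print(collect("newsletter", form))  # {'email': 'a@example.com'}
```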

2.4.2 DATA QUALITY

This principle ensures that the data and its processing adhere to certain quality standards. Borking (2010) distinguishes six aspects that should be considered to keep the data at a high quality level. (1) The length of the retention period has to be in line with the use of the data. (2) The use of diacritics has to be defined; the data should be recorded in a uniform way across the information system. (3) The data in the information system has to be checked and cleaned on a periodic basis to ensure its quality. (4) It has to be ensured that all corrections of personal information are propagated in a timely manner to all parties that have received or supplied the inaccurate data. (5) Decisions that are taken automatically by the information system should have proficient control mechanisms. (6) The personal information should be sufficiently accurate and up-to-date for the intended purposes.

2.4.3 PURPOSE SPECIFICATION

To use certain services on the internet, it is often required to register and submit a lot of personal information, much of which is not necessary for the website to provide the service. The purpose for which this personal data is gathered should be specified and legitimate (Lioudakis et al., 2007). The use of this personal data should be limited to these specified purposes. The purposes should be specified explicitly within the information system, and it is important that there are control mechanisms which only allow use if the requirements are fulfilled.

2.4.4 USE LIMITATION

The processing of personal data should only be allowed if it is necessary for the tasks falling within the responsibility of the data processing entity (Lioudakis et al., 2007), as specified in the purpose specification guideline or by the authority of law. The minimization principles described by Kobsa (2007), referred to in the collection limitation section, are also relevant for the use limitation requirement.

2.4.5 SECURITY SAFEGUARDS


Personal data should be protected by reasonable security safeguards against such risks as loss or unauthorized access, destruction, use, modification or disclosure of data. Cryptography and authentication schemes as defined by Chaum (1983) could result in adequate security.

2.4.6 NOTICE/OPENNESS

There should be a general policy of openness about developments, practices and policies with respect to personal data. Means should be readily available of establishing the existence and nature of personal data, and the main purposes of their use, as well as the identity and usual residence of the data controller (OECD, 1980). Users should be able to view, in a clear and concise way, for what purposes information is collected and used and for how long this data is retained. People that provide personal information in order to use a service should receive a single and concise overview of their privacy rights. Privacy policies of different companies dealing with the same personal information often differ. Kobsa (2007) describes three principles that are related to the notice/openness requirement: (1) whenever any personal information is collected, explicitly state the precise purpose for the collection and all the ways that the information might be used, (2) explicitly state how long this information will be stored and used, consistent with the minimization principles, and (3) make these privacy policy statements clear, concise, and conspicuous to those responsible for deciding whether and how to provide the data.

2.4.7 ACCESS/PARTICIPATION

The users have the right to be informed, to be notified and the right to correct, erase or block incorrect personal data.

According to Mason (1986), there are four critical areas that should be protected by information policies: (1) the individual should have the right to keep personal data private, (2) the individual should have the responsibility for the authenticity of the personal data, (3) the individual should maintain the ownership of the personal data, and (4) an individual should have the right to access the personal data.

2.4.8 ACCOUNTABILITY

A data controller should be accountable for complying with measures which give effect to the principles stated above (OECD, 1980). This requirement is frequently left out of scientific papers and data protection directives. However, the ownership of and accountability for data and processes is an important requirement, which is not fully captured by the previous requirements.

2.4.9 CHOICE/CONSENT

The enactment of the EU Data Protection Directive added the choice and consent requirement. Before this requirement data collectors only had to announce and declare data collection. This requirement states that data collectors should receive explicit consent from the data subject.


However, no PKI has actually achieved widespread usage, which makes the verification of signatures, as well as their revocation, difficult (Langheinrich, 2001).

2.4.10 ANONYMITY AND PSEUDONYMITY

Important aspects to consider that are related to the OECD guidelines are principles from the field of identity management. Wang & Kobsa (2009) state the following five principles as separate privacy requirements. These concepts are defined in the security requirements of ISO 15408 (the Common Criteria), as stated in the information security handbook by Tipton & Krause (2007). Furthermore, the article by Pfitzmann & Hansen (2005) provides solid definitions of these anonymity-related subjects as well.

1. Anonymity

Anonymity ensures that a user may use a resource or service without disclosing the user’s identity. The requirements for anonymity provide protection of the user’s identity. Anonymity is not intended to protect the subject identity. Pfitzmann & Hansen (2005) define anonymity as the state of being not identifiable within a set of subjects, the anonymity set. There are three anonymity sets described by Pfitzmann & Hansen (2005): (1) the sender anonymity set, (2) the recipient anonymity set, and (3) the relation anonymity set. These sets represent the actors of a certain group that remain anonymous to each other. The relation anonymity set has the property that it is unlinkable (Acquisti et al., 2008).

2. Pseudonymity

Pseudonymity ensures that a user may use a resource or service without disclosing the user identity, but can still be accountable for that use (Pfitzmann & Hansen, 2005).

3. Unobservability

Unobservability ensures that a user may use a resource or service without others, especially third parties, being able to observe that the resource or service is being used.

4. Unlinkability

Unlinkability ensures that a user may make multiple uses of resources or services without others being able to link these uses (Pfitzmann & Hansen, 2005).

5. Deniability

Deniability means that users are able to deny some of their characteristics or actions, and that others cannot verify the veracity of this claim (Wang & Kobsa, 2009).

These five principles often overlap. For example, when achieving full anonymity, the user is by definition not observable, is unlinkable and has deniability. If an information system uses pseudonymity techniques, the user also has a form of unobservability, unlinkability and deniability. We consider these three principles as characteristics of the Anonymity and Pseudonymity requirement. Therefore, this requirement states that the information system should implement a form of anonymity or pseudonymity. It is by definition not possible to implement both principles at the same time for the same part of the information system. Which principle suits better will depend on the information system.


Full anonymity can, however, be an undesirable feature. Anonymous crimes could be conducted without leaving any trace, and the suspect would not run any risk of getting caught. These problems regarding the tradeoff between privacy and security can be addressed by implementing revocable privacy technologies, which are further elaborated on in paragraph 3.3.2.

2.4.11 EASE OF ADOPTION

The ease of adoption of privacy protection mechanisms could be harmed by the reliance of these mechanisms on other infrastructures or technologies. This principle relates to the readiness of organizations to adopt the examined privacy protection (Wang & Kobsa, 2009).

2.4.12 EASE OF COMPLIANCE

The ease of compliance takes into account the ease of fulfilling legal requirements by adopting a specific privacy protection solution (Wang & Kobsa, 2009).

2.4.13 USABILITY

The privacy protection solution should be easy for users to use (Wang & Kobsa, 2009).

2.4.14 USER PREFERENCE

The privacy preferences should be tailored to each individual user’s preferences (Wang & Kobsa, 2009).

2.4.15 NEGOTIATION

This principle calls for the support of negotiation between a user and a website, during which they can reach an agreement on privacy practices that the website may employ for the respective user (Wang & Kobsa, 2009).

2.4.16 RESPONSIVENESS

The privacy protection solution should respond promptly to changes in users' privacy decisions (Wang & Kobsa, 2009).

2.4.17 ENFORCEMENT/REDRESS

It is generally agreed that the core principles of privacy protection can only be effective if there is a mechanism in place to enforce them (Federal Trade Commission, 2000). Effective privacy protection must include mechanisms for assuring compliance with the safe harbor principles, recourse for individuals to whom the data relate who are affected by non-compliance with the principles, and consequences for the organization when the principles are not followed (U.S. Department of Commerce, 1999). The Safe Harbor principles distinguish three factors such mechanisms should include at a minimum: (a) readily available and affordable independent mechanisms by which an individual's complaints and disputes can be investigated and resolved, and damages awarded where the applicable law or private sector initiatives so provide; (b) follow-up procedures for verifying that the attestations and assertions businesses make about their privacy practices are true and that privacy practices have been implemented as presented; and (c) obligations to remedy problems arising out of failure to comply with these principles by organizations announcing their adherence to them, and consequences for such organizations. Sanctions must be sufficiently rigorous to ensure compliance by organizations (U.S. Department of Commerce, 1999).

2.4.18 ONWARD TRANSFER

Onward transfer is the requirement that member states enact laws prohibiting the transfer of personal data to nonmember states that fail to ensure an ‘adequate level of protection’ (Cate, 1995).


2.4.19 SECLUSION

According to Wang & Kobsa (2009), seclusion means that users have the right to be left alone and thus not receive communications they did not request. Borking (2010) describes the right not to receive unsolicited communications, which is effectively the same notion. An example of a violation of this principle is spam mail.

2.4.20 LOCALITY AND PROXIMITY

A new trend in today’s mobile world is the collection of locality and proximity data. Users need to give their permission to store their personal locality and proximity data.

2.4.21 CONFIDENTIALITY

The Confidentiality requirement states that the confidentiality of personal information needs to be ensured (European Parliament, 2002). In particular, listening, tapping, storage or other kinds of surveillance or interception without the consent of users are prohibited. An exception is, of course, made for legally authorized institutions.

2.4.22 TRAFFIC DATA

Traffic data is information that relates to the transport of personal information. The requirement states that if traffic data is processed and stored, it should be deleted or anonymized when it is no longer necessary. In effect, it relates a different kind of personal information to the privacy requirements.

2.4.23 PREVENTING HARM

Preventing harm is a requirement that is added in the APEC framework (Asia-Pacific Economic Cooperation, 2004).

Table 2: Overview Privacy Requirements in Literature

The sources compared in Table 2 are the OECD Guidelines (1980), the APEC Privacy Framework, the EU Data Protection Directive, the FTC Privacy Principles, the Safe Harbor Principles, Borking (2010), Wang & Kobsa (2009), Lioudakis (2007), Kobsa (2007) and Langheinrich (2001). The total indicates in how many of these sources each privacy principle is mentioned.

1 Collection Limitation: 6
2 Data Quality: 8
3 Purpose Specification: 5
4 Use Limitation: 6
5 Security Safeguards: 8
6 Notice/Openness: 10
7 Access/Participation: 10
8 Accountability: 4
9 Choice/Consent: 9 (3)
10 Anonymity and Pseudonymity: 3
11 Ease of adoption: 1
12 Ease of compliance: 1
13 Usability: 1
14 User preference: 1
15 Negotiation: 1
16 Responsiveness: 1
17 Enforcement/Redress: 4
18 Onward Transfer: 3
19 Seclusion: 2 (4)
20 Locality and Proximity: 2
21 Confidentiality: 1
22 Traffic Data: 1
23 Preventing Harm: 1

(3) Not specifically mentioned in the requirements section but used in methodology.
(4) Seclusion is called unsolicited communications.


2.5 DISCUSSION

This discussion will review the privacy requirements mentioned in the literature and discuss their importance. As seen in Table 2, there is a large discrepancy in the number of times certain privacy requirements are mentioned in the literature. The more often a requirement is discussed in the literature, the more important it is for the privacy of an information system. However, we need to bear in mind some limitations. Of course, not all available literature is mentioned in the table. We have strived to incorporate the most important literature sources and guidelines/frameworks. In line with this reasoning we have omitted papers that only mention the OECD Guidelines from the table, since they do not provide more insight into different requirements. Furthermore, some papers will have a more significant impact than others.

When considering the number of times privacy requirements appear in the table, we have chosen a threshold of three. Privacy requirements which are mentioned three times or more are included in the final privacy requirements table. Next to this quantitative approach, the remaining privacy requirements are discussed. From discussions with experts in the field we infer that certain privacy requirements could prove to be important. In the following discussion we will show which privacy requirements are left out and for what reason. At the end of this paragraph, a new and concise overview of the privacy requirements is developed, which contains the privacy requirements best suited to provide a benchmark for increasing the privacy of an information system.

The following privacy requirements meet the threshold of three or more and will be part of the final privacy requirements: (1) Collection Limitation, (2) Data Quality, (3) Purpose Specification, (4) Use Limitation, (5) Security Safeguards, (6) Notice/Openness, (7) Access/Participation, (8) Accountability, (9) Choice/Consent, (10) Anonymity and Pseudonymity, (17) Enforcement/Redress, and (18) Onward Transfer.

Ease of Adoption, Ease of Compliance, Usability and Responsiveness all relate to usability features of PET. These requirements are not as important as the requirements discussed earlier, but should still be considered. Ease of Adoption and Ease of Compliance refer to the ease of implementation within the environment. Usability and Responsiveness are technical features that enable the PET to work in the environment. They reflect the environmental and technical possibilities of implementing PET within an information system. If not considered, the PET solution could end up not being used to improve the privacy of the information system. For further reference these four requirements will be referred to as the Usability requirement, which ensures that the PET that are implemented will work in the environment. These requirements are desirable to consider when implementing PET in an information system, but are not privacy requirements as such. The implementation of this requirement ensures that PET function adequately, but the privacy of the information system is not necessarily increased. To summarize, the usability requirements are not part of the final privacy requirements, but need to be considered when implementing PET.


specify the amount of privacy they prefer. It provides users with more power than merely agreeing or disagreeing with the privacy measures taken by the service provider, which is covered by the Choice/Consent principle. The two requirements, User Preference and Negotiation, will be combined into the privacy requirement Privacy Preferences.

Seclusion states that users have the right to be left alone. This requirement is quite similar to the Choice/Consent requirement. The only difference is that the Choice/Consent requirement states that people have a choice whether to opt in or opt out, whereas the Seclusion requirement states that the default should be that users do not receive any communication they did not request. If users have the ability to either opt in or opt out, this requirement does not serve a purpose and can therefore be omitted from the privacy requirements list.

The proximity and locality requirement and the sensitive data requirement are both requirements specifically aimed at a certain type of personal information. It is unnecessary to create separate requirements for different types of personal information because the privacy requirements are applicable to all sorts of personal information. The privacy of users does not increase if a specific requirement is added for each type of personal information. All personal information collected and processed by service providers should adhere to the privacy requirements mentioned in this paper. Hence, these two requirements will not be part of the final privacy requirements table.

The Confidentiality requirement is already covered by other privacy requirements. It states that personal information is confidential and can only be collected or used with consent or on legal grounds. First, personal information is by definition confidential information. Second, the choice/consent and purpose specification requirements state that the user knows what information is used for what purposes. The access/participation requirement states that it is clear who has access to the personal information. Therefore, the confidentiality requirement does not add any value to the existing privacy requirements and is omitted from the eventual privacy requirement list.

The Traffic Data requirement concerns the collection, storage and use of traffic data. The definition of personal information does not consider traffic data to be personal information. However, when a sufficient amount of traffic data is combined, the privacy of the subject could be decreased: with large amounts of traffic data, the identity of the subject could be accurately determined. When increasing the privacy of an information system, designers should consider traffic data. However, if traffic data is treated as personal information, the privacy requirements discussed should provide an adequate level of protection. Therefore, the traffic data requirement can be omitted from the final list of privacy requirements.


2.6 CONCLUSION

The discussion in paragraph 2.5 leads to the final Table 3 with 13 privacy requirements an information system should implement to increase the privacy of its users.

1 Collection Limitation
2 Data Quality
3 Purpose Specification
4 Use Limitation
5 Security Safeguards
6 Notice/Openness
7 Access/Participation
8 Accountability
9 Choice/Consent
10 Anonymity and Pseudonymity
11 Privacy Preferences
12 Enforcement/Redress
13 Onward Transfer

Table 3: Final Privacy Requirements


3. PRIVACY ENHANCING TECHNOLOGIES FOR INFORMATION SYSTEMS

As mentioned earlier, developments in information technology provide more possibilities to collect, store, process and distribute personal data. Legislation is not able to catch up with these developments to provide sufficient privacy protection (Fischer-Hübner, 2001). According to Blarkom et al. (2003) the same technologies that appear to threaten citizens’ privacy can also be used to protect it. These technologies are known under the name Privacy Enhancing Technologies (PET).

In this chapter we investigate the PET that can address the privacy requirements from Table 3. Paragraph 3.1 will provide insight into information systems and the personal information used within these systems. Paragraph 3.2 will formally define PET. Paragraph 3.3 provides a categorization of several PET using the categorization provided by Koorn et al. (2004). This will indicate which PET are effective and in which cases they can be implemented. In paragraph 3.4, the privacy requirements that are not addressed by PET are discussed. In paragraph 3.5, the privacy requirements framework is developed. It links the privacy requirements from chapter 2 with the PET from chapter 3.

3.1 INFORMATION SYSTEMS

Hes & Borking (2000) state that information systems serve to provide the information required for performing goal-oriented activities. They acknowledge two types of information systems that have a different perspective on privacy. Conventional information systems have little or no privacy protection in place. Privacy-safe information systems deal with their users' personal information according to privacy requirements set in rules and regulations. Most information systems are still conventional systems that use identity in every process in the information system. The change to a privacy-safe information system dramatically decreases the number of times identity is used in the processes of the information system.

3.2 DEFINE PRIVACY ENHANCING TECHNOLOGIES (PET)

The concept of PET is attracting growing interest in the privacy world. Many studies have been undertaken to explore how PET could be implemented to improve the privacy of users on the internet and of information systems. It is the goal of PET to address privacy requirements and, where possible, to minimize the collection of personal information (Rotenberg, 2001).

To understand the concept of Privacy Enhancing Technologies we need a formal definition. According to Blarkom et al. (2003), PET can be defined as a system of ICT measures protecting informational privacy by eliminating or minimising personal data, or by preventing unnecessary or undesired processing of personal data, all without the loss of the functionality of the information system.

3.3 THE PET STAIRCASE

Figure 2: The PET-Staircase, translated from Dutch (Koorn et al., 2004)

Figure 2 shows the PET-Staircase developed by Koorn et al. (2004). It categorizes several PET measures in relation to their effectiveness. At the first level, general PET measures are shown that are relatively easy to implement and provide a base level of protection of personal information. The second column introduces PET that relate to the separation of data; these pseudonymity technologies are more involved to implement than the general PET measures. The third column discusses privacy management systems. These systems address several privacy requirements within one system. Privacy Management Systems are an effective way of increasing a user's privacy, but are harder to implement: they require substantial adaptations to the information system before implementation is possible. The fourth column regards the total anonymization of an information system. In an anonymous information system no personal information is collected or processed. However, this column will not be a part of this thesis, because most information systems are not able to function without the use of personal information. The PET staircase model is useful for this paper because it categorizes different PET and shows their effectiveness. It provides us with a way to categorize several PET and show which privacy requirements they solve. Furthermore, this model provides the best starting point to implement a privacy-safe architecture within the information system.


3.3.1 GENERAL PET MEASURES

General PET measures are technologies that provide a basic level of security for the information system. They often address one privacy requirement at a time. General PET measures comprise encryption, access security, data minimization, authorization, quality enhancing and retention management technologies.

3.3.1.1 ENCRYPTION

Encryption is the act of transforming information using an algorithm to make it unreadable without a key to decrypt the information.

Two general types of encryption exist. In symmetric key encryption each computer has a secret key that it can use to encrypt or decrypt a packet of information before it is sent. For this type of encryption it is important to know in advance which computers will send information to each other. In public key encryption the computers do not have to know each other (Diffie & Hellman, 1976). Public key encryption uses two different keys at once: a private and a public key. The private key is known only to the user, and the public key is made widely available. When the user performs an action, he can sign the document with his private key. The service provider can then use the public key to verify the signature.
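To make the signing flow above concrete, here is a minimal sketch using the third-party Python cryptography package; the choice of RSA with PSS padding and SHA-256 is ours, not something prescribed in this thesis.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()  # made widely available

message = b"I agree to the terms."
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH)

signature = private_key.sign(message, pss, hashes.SHA256())  # user signs with the private key
public_key.verify(signature, message, pss, hashes.SHA256())  # raises InvalidSignature if tampered
print("signature verified")
```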

SSL is a popular implementation of public key encryption. It later became part of an overall security protocol called Transport Layer Security (TLS). Both TLS and SSL use certification authorities to check that (1) the certificate comes from a trusted party, (2) the certificate is valid, and (3) the certificate is related to the site it is coming from.

3.3.1.2 ACCESS SECURITY

Access Security ensures that data collected and processed by service providers does not fall into the hands of unauthorized people. Examples of these PET are firewalls, routers, tokens, and password protection.

3.3.1.3 DATA MINIMIZATION

Data minimization is not implemented using technologies; it merely consists of protocols. Of course, these protocols are eventually implemented within the information system's procedures. Before implementing data minimization protocols it is paramount to document what personal information is absolutely required within the information system. Only required personal information should be collected, stored and processed. All other personal information should not be collected, or should be deleted if it is already in the information system. Data minimization consists of three steps: (1) do not collect data that is not absolutely necessary, (2) minimize the number of locations data is stored on, and (3) delete data if it is no longer necessary to retain. An example of data minimization is developed by Schweidel et al. (2008). Companies often keep data in order to track consumer behavior. However, many companies are not allowed to track consumers on an individual basis. Schweidel et al. (2008) suggest that companies aggregate the data they need to track consumer behavior and translate this data into histograms. The original data can be deleted, while the company retains the ability to adapt to its consumers' behavior. It is important to carefully determine what data matters for the company's performance.
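The aggregation idea attributed to Schweidel et al. (2008) could look roughly as follows: derive the histogram the company actually needs, then delete the individual-level records. The data values are invented for illustration.

```python
from collections import Counter

# Individual-level data: (customer_id, items_bought) pairs.
purchases = [("c1", 2), ("c2", 5), ("c3", 2), ("c4", 1), ("c5", 2)]

# Aggregate into the histogram the company needs for behavioral analysis.
histogram = Counter(items for _, items in purchases)

del purchases  # the identifying raw data is no longer retained

print(histogram)  # Counter({2: 3, 5: 1, 1: 1})
```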


(2004). When the service provider needs personal information, but not necessarily all the personally identifiable information, some data can be deleted. For example, an email address is no longer necessary after the user has opted out of the mailing service. Another solution is to delete only parts of a personal information field. For example, a service provider could choose to store only the first 3 digits of a postal code, which gives an indication of the area but does not directly lead to a street. If the service provider needs the entire set of personal information, but the exact value of the data is not important, one could categorize the data. For example, if a service provider needs to know whether a person is older than 16, the information system could answer this question with 'yes' or 'no', without telling the exact age.
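The two generalization tricks just mentioned, truncating a postal code and answering an age threshold instead of storing the exact age, might be sketched like this (the Dutch postal code format and the thresholds are illustrative):

```python
from datetime import date

def postal_area(postal_code):
    """Keep only the first 3 digits: enough for the area, not for a street."""
    return postal_code[:3]

def is_older_than(birthdate, years):
    """Answer 'older than N?' with yes/no instead of revealing the exact age."""
    today = date.today()
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day))
    return age > years

print(postal_area("9712AB"))                # '971'
print(is_older_than(date(1990, 5, 4), 16))  # True
```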

Another data minimization measure concerns the linkages between information systems of different companies. For information systems it is crucial to understand which data is transferred to partners. The amount of personal data that is sent to the information systems of partners should be minimized. Furthermore, it should be documented what personal information is sent to the partners.

3.3.1.4 AUTHORIZATION TECHNOLOGIES

Authorization is the process of giving users certain rights on the basis of their personal or group characteristics. Anonymous credentials are the digital equivalent of a paper-based credential (Neven, 2008). A credential could be a passport, driver's license or any kind of document with a signature. Anonymous credentials do tell something about the owner's status, but do not reveal the owner's identity. For example, an anonymous credential proves that the owner has a driver's license without providing any personal information. The following example makes clear what an anonymous credential actually does.

The user engages in an issue protocol with an issuer, which could be a TTP, to obtain a valid credential on a certain set of attributes (Neven, 2008). For example, the user receives the credential that he is older than 18 years. This credential is only valid under the issuer's public key, of which only the issuer knows the secret key. When the user wants to show a verifier that he is older than 18, he submits a request containing his attributes, and the issuer certifies the fact that the user is older than 18. See Figure 3 for a graphical representation.

Figure 3: Authorization Process
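The issue-and-show flow can be illustrated with a deliberately simplified sketch. Real anonymous credential schemes add blinding and zero-knowledge proofs so that even the issuer cannot link showings to issuances; the toy below only demonstrates the core idea that the issuer signs an attribute, not an identity, and that the verifier checks the signature under the issuer’s public key. The textbook-sized RSA numbers are an assumption for readability and are far too small for real use.

import hashlib

# Toy RSA key pair for the issuer (requires Python 3.8+ for pow(e, -1, m)).
p, q = 61, 53
n = p * q                           # issuer's public modulus
e = 17                              # issuer's public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # issuer's secret exponent

def h(message: bytes) -> int:
    return int.from_bytes(hashlib.sha256(message).digest(), "big") % n

# Issue: the issuer certifies an attribute, not an identity.
attribute = b"age_over_18=true"
credential = pow(h(attribute), d, n)   # issuer's signature on the attribute

# Show: the verifier checks the claim under the issuer's public key.
assert pow(credential, e, n) == h(attribute)
print("credential accepted; no identifying data was exchanged")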

3.2.1.5 QUALITY ENHANCING TECHNOLOGIES

Quality enhancing technologies ensure that the data collected is complete, up to date and accurate. These technologies are important because only trustworthy information ensures that correct decisions are made (Koorn et al., 2004). Data quality can be increased by verifying personal information when users access a service. For example, if the data quality is insufficient, a service provider could send personal information to a user’s old address after the user has moved.


Correct decisions can only be based on accurate, up-to-date and complete information. Furthermore, when a change to personal data is made, companies should submit the change to the associated companies’ information systems. However, in order to exchange personal information between companies, the user’s privacy should be adequately protected.
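Verification on access can be as simple as time-stamping when a user last confirmed a data item and prompting again once the confirmation is stale. The sketch below assumes a hypothetical twelve-month freshness window; the field names are illustrative.

from datetime import datetime, timedelta

MAX_AGE = timedelta(days=365)  # assumed freshness window

record = {"address": "Hoofdstraat 1", "last_verified": datetime(2009, 5, 1)}

def needs_reverification(rec, now=None):
    """True when the user should be asked to confirm the data again."""
    now = now or datetime.now()
    return now - rec["last_verified"] > MAX_AGE

if needs_reverification(record):
    # In a real system: ask the user to confirm or correct the address
    # at login, propagate the change, then refresh the timestamp.
    record["last_verified"] = datetime.now()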

3.2.1.6 RETENTION MANAGEMENT

Retention management weighs privacy concerns against economic concerns to determine the retention time. From a privacy perspective, retention periods should be as short as possible. Retention management ensures that these retention periods are enforced by automatic processes within the information system, so that deleting data does not require any human action. Retention management addresses part of the collection limitation requirement, which demands that the storage of personal information is limited.
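A retention sweep can be sketched as a scheduled job that removes every record whose retention period has expired. The eighteen-month period below is an illustrative assumption, not a prescribed value.

from datetime import datetime, timedelta

RETENTION = timedelta(days=548)  # assumed retention period (~18 months)

records = [
    {"id": 1, "collected_on": datetime(2008, 1, 10)},
    {"id": 2, "collected_on": datetime(2010, 9, 30)},
]

def sweep(recs, now=None):
    """Keep only records whose retention period has not yet expired."""
    now = now or datetime.now()
    return [r for r in recs if now - r["collected_on"] <= RETENTION]

records = sweep(records)  # run on a schedule; no human action needed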

3.2.2 SEPARATION OF DATA

This category describes pseudonymity techniques from the field of identity management, which allow people to use a service while remaining anonymous. Both digital pseudonyms and the identity protector will be discussed.

3.2.2.1 SEPARATE IDENTITY AND PSEUDO-IDENTITY

Digital pseudonyms let people use a service under a pseudonym, which keeps their personal information private. Hes & Borking (2000) define digital pseudonyms as:

‘A method of identifying an individual through an alternate or pseudo-identity, created for this particular purpose.’

Digital pseudonyms are crucial to increasing the privacy of an information system: they enable people to use a service while remaining anonymous. Digital pseudonyms build on blind signature techniques. A blind signature is a form of digital signature where the content of the message is blinded before it is signed. As with normal digital signatures, a blind signature can be publicly verified against the original, unblinded message. Blind signatures are often used where the signer and the message author are different parties; examples are election systems and the digital cash schemes introduced by Chaum (1983).
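The classic construction is Chaum’s RSA blind signature, sketched below with toy parameters (a real deployment would use large keys and proper message padding, and the blinding factor would be drawn at random). The signer never sees the message it signs, yet the resulting signature verifies against the unblinded message.

import hashlib
import math

# Toy RSA key pair for the signer (far too small for real use).
p, q = 61, 53
n, e = p * q, 17
d = pow(e, -1, (p - 1) * (q - 1))

def h(msg: bytes) -> int:
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

msg = b"one digital coin"
r = 2021                                  # author's blinding factor
assert math.gcd(r, n) == 1                # must be invertible modulo n

blinded = (h(msg) * pow(r, e, n)) % n     # author blinds the message
blind_sig = pow(blinded, d, n)            # signer signs without seeing msg
sig = (blind_sig * pow(r, -1, n)) % n     # author removes the blinding

# Anyone can verify the signature against the original message.
assert pow(sig, e, n) == h(msg)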

Revocable Privacy

The privacy and security tradeoff mentioned in chapter 2.3 can be addressed using the techniques known under the name ‘revocable privacy’ (Wright et al., 2002; Hoepman, 2008). Revocable privacy is described by Hoepman (2008) as:

‘A system implements revocable privacy if the architecture of the system guarantees that personal data is revealed only if a predefined rule has been violated.’

Revocable privacy is a mechanism which ensures that people can use a service with the benefits of anonymity, while misusers can be traced and held accountable for their misuse. It ensures the balance between the users’ right to privacy and the legitimate concerns of public authorities and organizations (Claessens et al., 2003). The requirements which define misuse need to be described before the users can be held accountable.


These requirements must be known to the user and the service provider alike. A TTP is a good way to address the problem that people can misbehave while anonymous: the TTP has the ability to reveal the user’s identity to the service provider under previously agreed-upon terms. Hoepman (2008) describes several ways to implement revocable privacy. The first is spread responsibility, which entails that one or more TTPs verify whether the predetermined conditions for releasing personal data have been met, and release the data only if they have. The second variant is the self-enforcing architecture, where the rules, or conditions, are hard-coded into the information system; only if these rules are violated will the information system reveal personal information. This variant abolishes the need for one or more TTPs, which could otherwise be a weak point. However, the self-enforcing architecture cannot be implemented for every information system or every set of predetermined conditions. Three cryptographic techniques that can enable revocable privacy are described by Stadler (1996). Fair blind signatures were explained earlier in this paper. Publicly verifiable secret sharing is used to share information among several participants such that only qualified subsets of them can recover it (Stadler, 1996); in other words, several trustees can share the secret revocation key, but the revocation can only be executed if at least two trustees work together. Auditable tracing makes the tracing itself detectable by the users of the system, and can therefore prevent unlawful tracing by a TTP, because abuse will eventually be discovered.
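The trustee idea behind secret sharing can be illustrated with a deliberately simplified 2-of-2 scheme: the revocation key is split with XOR so that neither trustee alone learns anything about it, and revocation requires both to cooperate. This toy sketch omits the public verifiability that Stadler’s scheme provides, and the key itself is a hypothetical placeholder.

import secrets

revocation_key = b"example-revocation-key"  # hypothetical secret

share_1 = secrets.token_bytes(len(revocation_key))               # trustee 1
share_2 = bytes(a ^ b for a, b in zip(revocation_key, share_1))  # trustee 2

# Only by combining both shares can the key be reconstructed.
recovered = bytes(a ^ b for a, b in zip(share_1, share_2))
assert recovered == revocation_key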

Köpsell et al. (2006) describe five requirements that summarize the attributes of full traceability and full anonymity. If these requirements are fulfilled, the benefits of privacy and security can be obtained at the same time.

1. It must be possible to disclose the identity of the sender who violated the predetermined conditions, without the sender’s help.

2. A revocation should not affect the anonymity of other users, only of the user who violated the predetermined conditions.

3. It has to be impossible for the user or the TTP to lie about the user ID that is linked to the message which violated the predetermined conditions.

4. The link between the user id and the message should not be revealed to anyone other than the law enforcement agency.

5. The revocation scheme has to be compatible with the trust model, which indicates that at least a predetermined number of anonymizers have to cooperate and that not too many users are misbehaving; the latter would decrease the anonymity of the remaining users.

Idemix

Idemix is the most efficient unlinkable anonymous credential system (Camenisch & Lysyanskaya, 2001). It was developed by IBM Research at the Zürich Research Laboratory and enables strong authentication while offering strong privacy features at the same time (Camenisch & Herreweghen, 2002; Camenisch & Lysyanskaya, 2001). With Idemix, which is short for Identity Mixer, users can prove certain facts to a service provider, such as being older than 18, without revealing their true identity.
