
A Data Governance Framework in an Insurance Company

Solvency II Compliance

MASTER THESIS

Marcia Eve Hek (5699657)

23rd of July 2014

Supervisor: Dr. E. Peelen

Secondary Supervisor: Dr. M.L. van der Veen

Master: Business Studies IP

Institute: University of Amsterdam

Academic year: 2013-2014


ABSTRACT

Insurance companies in Europe need data governance to implement accountabilities and to ensure that the quality of their data meets high standards. This thesis develops an appropriate data governance framework for a large Dutch insurance company, in order to assess its data governance maturity and to ensure data quality on a continuous basis. The Solvency II Data Governance Framework was derived from four existing data governance frameworks and adjusted to comply with the Solvency II regulation. Two advisors validated the Framework and Structure, after which 11 interviews were held within the insurance company. Based on the framework, the data governance maturity of the insurance company was determined and the quality of the current data was analysed. The research found that data governance in the Dutch insurance company in question has not yet reached maturity, and that the data quality measured against the Solvency II data quality criteria needed improvement. The empirical research indicated that the framework and structure are suitable for ensuring the quality of the data in the daily operational processes and for managing data quality better in the future. The Solvency II Data Governance Framework contains the data governance structure and components, which interact with data management. Insurance companies can formalize and structure an organisation-specific data governance framework based on these research findings.

Keywords:

Data Governance, Solvency II, Data Quality Management, Data Management, Data Governance Framework, Data Governance Structure


“It always seems impossible, until it is done”


Table of Contents

1 INTRODUCTION ...7

1.1 PROBLEM STATEMENT...8

1.2 RESEARCH OBJECTIVE AND RESEARCH QUESTION...10

1.3 RESEARCH METHODOLOGY...12

1.4 STRUCTURE OF THE THESIS...15

2 LITERATURE REVIEW ...16

2.1 SOLVENCY II REGULATION...16

2.1.1 Solvency II and data quality ...18

2.1.2 Solvency II and data quality assessment...19

2.1.3 Solvency II and data governance...21

2.2 RISK MANAGEMENT...21

2.3 DATA MANAGEMENT...23

2.3.1 Data quality ...23

2.3.2 Data quality dimension...25

2.3.3 Data quality analysis ...27

2.4 DATA GOVERNANCE...28

2.4.1 Data governance framework ...29

2.4.2 Roles and responsibilities ...30

2.4.3 Data ownership ...31

3 DEVELOPMENT DATA GOVERNANCE FRAMEWORK ...32

3.1 COMPARISON DATA GOVERNANCE FRAMEWORKS...32

3.2 CONCEPTUAL SOLVENCY IIDATA GOVERNANCE FRAMEWORK...39

3.3 DESCRIPTION COMPONENTS SOLVENCY IIDATA GOVERNANCE FRAMEWORK...41

3.4 CONCEPTUAL SOLVENCY IIDATA GOVERNANCE STRUCTURE...46

3.5 RESULTS VALIDATION CONCEPTUAL SOLVENCY IIDATA GOVERNANCE FRAMEWORK...47

3.6 THE SOLVENCY IIDATA GOVERNANCE STRUCTURE...49

4 DATA GOVERNANCE MATURITY ASSESSMENT...52

4.1 DATA COLLECTION...52

4.2 DATA OPERATIONALISATION...53

4.2.1 Knowledge about Solvency II ...54

4.2.2 Data governance and data quality ...55

4.2.3 The Solvency II Data Governance Framework ...57

4.2.4 The Solvency II Data Governance Structure ...59

4.3 RESULTS DATA GOVERNANCE MATURITY ASSESSMENT...60

4.3.1. Knowledge about Solvency II...60

4.3.2. Data governance and data quality ...61

4.3.3. The Solvency II Data Governance Framework ...64

4.3.4. The Solvency II Data Governance Structure...66

4.4 RECOMMENDATIONS...66

5 DATA QUALITY RESEARCH DESIGN AND RESULTS ...69

5.1 DATA QUALITY RESEARCH DESIGN...69

5.2 MEASUREMENT CONTROLS ON SOLVENCY II RISK DATA ELEMENTS...70

5.2.1 Data collection ...70

5.2.2 Analysis ...72

5.2.3 Results ...73

5.3 BUSINESS RULES ANALYSIS...75


5.3.2 Analysis ...77

5.3.3 Results ...77

6 CONCLUSION AND DISCUSSION ...79

6.1 CONCLUSION...79

6.2 RECOMMENDATIONS...80


List of Figures, Tables and Appendices

Figure Name of Figure Page

Figure 1 Research Structure 14

Figure 2 Illustration of a draft of a Data Governance Framework 33

Figure 3 The Data Governance Institute (DGI) Data Governance Framework 35

Figure 4 The IBM Data Governance Unified Process Overview 36

Figure 5 Conceptual Solvency II Data Governance Framework 40

Figure 6 The Solvency II Data Governance Framework 48

Figure 7 The Solvency II Data Governance Structure 50

Figure 8 Data Quality Analysis Process 70

Figure 9 Results control measures on Solvency II risk data elements 75

Table Name of Table Page

Table 1 Overview Solvency II data quality requirements 19

Table 2 Overview of the data quality category 25

Table 3 Data quality conditions per dimension 27

Table 4 Overview selected Data Governance Frameworks 32

Table 5 Analysis of the selected Solvency II Data Governance Frameworks 38

Table 6 Roles and responsibilities Solvency II Data Governance Structure 47

Table 7 Overview additional roles and responsibilities Solvency II Data Governance Structure 51

Table 8 Interview results data governance maturity assessment 67

Appendix Name of Appendix Page

Appendix A Overview of the steps and principles from Delta Lloyd 89

Appendix B Undiversified Required Capital per Insurance Risk 90

Appendix C Overview Mortality Risk Solvency II risk data elements 90

Appendix D Description of the variables in Data Quality Review Template 91

Appendix E Overview results control measures on Solvency II risk data elements 91

Appendix F Overview of the business rules in the PAA system 92

Appendix G Overview analysed business rules 93

Appendix H Overview results business rules analysis 94

Appendix I Overview interviewees at Delta Lloyd 95


1 INTRODUCTION

Organisations are constrained to continuously adapt their business models (Wende, 2007; Wende and Otto, 2007). Economic intensification demands structured and harmonized business processes across different organisations and countries, while customers are asking for individual products and services (Kagermann and Österle, 2006). The attendant constraints clearly have an impact on an organisation's structure, business processes, business data and IT strategy (Wende and Otto, 2007).

Solvency II is the updated set of regulatory requirements for insurance companies that operate in the European Union (EU) (Doff, 2008). The Solvency II Directive describes new rules and requirements for all insurance companies that operate in the EU. Its objective is to increase the protection of insurance policyholders in the European Union and to enable better coverage of all the risks run by an insurance company (Lorent, 2008). EIOPA1 requires that internal processes and procedures be implemented in an insurance company to ensure the quality of the data used for the calculation of its solvency. Data are the basis for the performance of the risk management and financial management of organisations. EIOPA has issued specific advice for data used for the Solvency Capital Requirement (SCR) or Technical Provisions (TP), which can have an impact on the outcomes. This advice explains that it is important that reliable data sources be used. Poor data quality can have an impact (Khatri and Brown, 2010) on the financial reports that are sent to the regulatory authorities. For this reason, the Solvency II Directive formulates standards and guidelines for ensuring the quality of data. Thus, by adopting a data governance framework, insurance companies could benefit from an effective

1 The organisation that governed the Solvency II Directive, the Committee of European Insurance and Occupational Pensions Supervisors (CEIOPS), was replaced by the European Insurance and Occupational Pensions Authority (EIOPA) in January 2011. EIOPA is part of a European System of Financial Supervisors that comprises three European Supervisory Authorities, one for the banking sector, one for the securities sector and one for the insurance and occupational pensions sector, as well as the European Systemic Risk Board (EIOPA, 2007).


governance of the quality of data (Cheong and Chang, 2007). This thesis conducts a qualitative case study to develop or refine an existing data governance framework, which can be applied in the risk management department of an insurance company.

1.1 Problem statement

Organisations are becoming aware of data as an asset (Khatri and Brown, 2010). Nevertheless, organisations make only minimal adjustments to improve the quality of their data, because poor data quality is not seen as a large problem, even though they experience a significant amount of stress as a result of it (Friedman, 2006). Organisations in different industries such as banking, insurance, finance, government and health care are integrating their business process structure based on products, functionality and geographies (Wang et al., 1995). Due to the integration of information systems within and across organisational boundaries, business data are integrated and exchanged (Waal and Jonge, 2012). Organisations are collecting and storing more data than ever before (Watts et al., 2009). Organisations in a dynamic environment are asked to respond quickly and effectively to changes in that environment (Waal and Jonge, 2012). As a prerequisite for an adequate response, accurate data are required: information disclosed from the growing amounts of data retrieved from the information systems (Waal and Jonge, 2012). These data are stored in various information systems and in complex ways serve as input to the organisations' decision-making processes (Watts et al., 2009). Cheong and Chang (2007) have argued that reports and decision-making can only be as good as the quality of the data. According to them, variables that impact data quality are (1) the use of various information systems, (2) the fact that the actual processes of using, collecting and maintaining data take place at various levels in an organisation, and (3) the fact that in the development phase of an information system


the data quality aspect is overlooked (Cheong and Chang, 2007). Many organisations are not aware of what data they have, how critical those data are, which sources exist for critical data or the degree of redundancy of their data assets (Khatri and Brown, 2010, p. 151). In the same way, Friedman (2006) stated that organisations have data quality issues, but they do not know how to solve them or how to optimize the quality. The formalization of data quality is new to many organisations (Bitterer and Newman, 2007). Wang and Strong (1996), Ballou and Tayi (1989) and Redman (1992) have noted that many databases are not error free and that information systems contain a surprisingly large number of errors. The functional data could be inaccurate, uncontrolled, outdated and incomplete, which can have a significant impact on an organisation's social and economic status (Wang et al., 1995; Wang and Strong, 1996). This poor data quality can have an impact on the operational levels – e.g. customer dissatisfaction and increased costs2 – and strategic levels – e.g. decision-making processes regarding customers, government and competitors3 – of an organisation (Khatri and Brown, 2010). Therefore, data quality must be integrated in every process or step of information system design and implementation, with the adoption of governance, monitoring and auditing (Piprani and Ernst, 2008). Within insurance companies poor data quality can be found in (1) the various data sources across disparate information systems, (2) the discrepancy between the development methodologies and the data quality assurance, and (3) the data that are collected, managed and used at different organisation levels (Cheong and Chang, 2007).

2 Thomas C. Redman (1998).


Poor data quality can have far-reaching effects and consequences, e.g. customer dissatisfaction, increased operational costs, less effective decision-making and a reduced ability to develop and execute a strategy (Redman, 1998). Woodall et al. (2011) have stated that the quality of data in an organisation is paramount to its success. Poor data quality can contribute to disastrous and even life-threatening consequences, for example the incident in 1988 in which an Iranian civilian aircraft was shot down by mistake (Fisher and Kingma in Woodall et al., 2011). Another example, given by Wang and Strong (1996), concerns the case of an organisation that could not access the data of one single customer, because this particular customer had many different customer numbers. In order to be able to continuously rely on high-quality data, organisations need to take initiatives to gain control of their data. Data free of defects possess features that data consumers desire (Redman, 2001). The data can be managed more effectively and successfully by adopting a data governance framework (Cheong and Chang, 2007). A data governance framework consists of the structure, policies, processes, roles, responsibilities and tools required to improve and manage data quality (Wende, 2007).

1.2 Research objective and research question

Within Delta Lloyd there is no formally implemented and documented data governance framework. The data governance components used within the organisation, e.g. processes, the data dictionary and metrics, have not yet been structured and formalized. The roles and accountabilities, e.g. Executive Sponsor and Data Stewards, are also not explicit in the organisation. Quality of data is of the utmost importance for all kinds of decisions and reporting, but the present thesis limits itself to the subject of data governance and data quality in the context of the Solvency II regulation.


In the research literature various data governance frameworks have been developed and modified by researchers from different perspectives. Some of the data governance frameworks contain certain specific components – e.g. roles and decision-making – of an overall data governance framework, while others take a more generic approach – e.g. people and processes, assessment, architecture. The objective of this thesis is to assess the data governance maturity and to analyse the data quality in an administration system within Delta Lloyd N.V.

This leads to the following research question:

What data governance framework will aid in assessing the maturity of data governance and in ensuring the data quality of Delta Lloyd with respect to Solvency II?

To answer the research question the following sub-questions are formulated:

1) What data governance frameworks or models are distinguished in current literature?

2) How do the data governance frameworks meet the Solvency II requirements for data policy?

3) How can the current governance frameworks be improved with the application in risk management in mind?

4) What is meant by data quality in an insurance company with respect to Solvency II rules and requirements?

5) What is the current data quality and what is the maturity of the data governance?

6) Which recommendations can be made to improve data quality at Delta Lloyd N.V. on a sustainable basis from the perspective of Solvency II?


1.3 Research methodology

The research question and sub-questions will be answered with the help of an empirical study. The aim is to assess the maturity of data governance and data quality within Delta Lloyd with respect to risk management. The research was conducted with a qualitative approach. According to Yin (1984) and Eisenhardt (1989), the case study typically combines data collection methods such as archives, interviews, questionnaires and observations. Additionally, a case study is appropriate when the subject is new (Eisenhardt, 1989). This research is based on a combination of subjects, namely data governance, data management and the Solvency II regulation, and for that combination the case study is particularly appropriate. For this research an insurance company, Delta Lloyd N.V., was selected. This company4 operates in the Netherlands, Belgium and Germany in the insurance and banking sector and uses a multi-brand, multi-channel strategy. Delta Lloyd N.V. offers a broad range of general insurance products, principally in the Netherlands, to both private and corporate customers. As an insurance company Delta Lloyd feels the responsibility to abide by the Solvency II rules and requirements regarding data quality. Additionally, Delta Lloyd has not yet structured a governance framework with a focus on data quality.

The study commenced with a review of the literature on the variables relevant to data governance and data management. Subsequently, the Solvency II related literature, e.g. regulatory documents from EIOPA, consultancy papers and academic research papers, was reviewed. Finally, the internal documents of Delta Lloyd about the Solvency II project were reviewed. Based on these sources the Solvency II Data Governance Framework and Structure were developed. Two advisors5 were consulted to validate the conceptual Solvency II Data Governance Framework on usability and applicability

4 Source: Delta Lloyd website.

5 The advisors consulted were H. Wandt, Principal Advisor at Human Inference, and R. van Gerven, Advisor at Praktisch (Risico) Management B.V. Both advisors were consulted on location.


aspects. Based on their knowledge and experience the Framework was further formalised and structured. The final Solvency II Data Governance Framework was used in the empirical study within Delta Lloyd N.V. To answer the research question, multiple research methods were used: (1) a data governance maturity assessment, (2) a measurement of control measures on relevant Solvency II risk data elements and (3) an analysis of the current data quality in a Policy Administration System6. The research study concludes with recommendations on where and how to improve the data quality. The approach of the interviews, data quality analysis and measurement is described in more detail in the relevant sections.

The data governance maturity assessment was conducted by holding face-to-face interviews. The interviews were held with one IT Director, an Information Manager, two employees of an Information Management department and seven business representatives (consultants, risk managers, members of the Board of Management). All interviewees were working at Delta Lloyd at the time. The interviews were scheduled at the end of November and the first week of December 2013 and took place at the offices of the organisation. This approach involved a set of carefully formulated questions, with the intention of taking each interviewee through the same sequence and thereby asking them the same questions. During the interviews a semi-structured questionnaire was used. The gathered information was treated with the utmost care and kept confidential; the outcome of the research was published anonymously, and the interviewees were informed of this confidentiality. The interviews were recorded and a report was made after each interview. These reports were checked with the respondents before the interview results were published. In


addition, a data quality analysis and measurement were performed on the current state of the data in the information system to establish the level of quality. The quality analysis and measurement provided general guidance for assessing data as a useful component of the framework. By performing the data quality analysis and measurement in the Policy Administration System, inconsistencies and imperfections in the data were identified. Incompatibilities in the data can result in lost revenue, customer dissatisfaction and regulatory fines7. To determine the quality in the Policy Administration System several methods of analysis and measurement were used. First, the existing business rules on the historical data were analysed. Second, the applicable control measures on the relevant Solvency II risk data elements were measured. The quality analysis and measurement were conducted on the basis of the data used for a single (individual) risk – Mortality Risk – in the Policy Administration System8. The research study concludes with recommendations on where and how to improve data quality. The research structure is represented in figure 1.

[Figure 1 summarises the research structure: a literature review of Solvency II and data governance documents feeds the development of the Conceptual Data Governance Framework; validation by advisors yields the Final Data Governance Framework, which is applied in the Data Governance Maturity Assessment and the Data Quality Analysis; these lead to the Conclusion and Recommendations.]

Figure 1. Research Structure

7 The Data Governance Unified Process (Soares, 2010).
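To make the business rules analysis concrete, the sketch below applies a handful of rules to policy records and reports violations per record. It is a minimal illustration under assumed field names and rules; the thesis does not disclose the actual schema or business rules of the Policy Administration System.

```python
# Minimal sketch of a business-rules check on policy records.
# Field names and rules are illustrative assumptions, not the actual
# business rules of the Policy Administration System.
from datetime import date

RULES = {
    "birth_date_present": lambda p: p.get("birth_date") is not None,
    "birth_date_not_in_future": lambda p: p["birth_date"] <= date.today(),
    "sum_insured_positive": lambda p: p.get("sum_insured", 0) > 0,
}

def check_policy(policy: dict) -> list:
    """Return the names of the business rules this policy record violates."""
    violations = []
    for name, rule in RULES.items():
        try:
            if not rule(policy):
                violations.append(name)
        except (KeyError, TypeError):
            # a missing or ill-typed field counts as a violation of the rule
            violations.append(name)
    return violations

policies = [
    {"birth_date": date(1975, 4, 12), "sum_insured": 100_000},
    {"birth_date": None, "sum_insured": -1},
]
for i, policy in enumerate(policies):
    print(i, check_policy(policy))
```

Aggregated over a whole portfolio, such per-rule violation counts give the kind of scores a data quality review reports per control measure.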


1.4 Structure of the thesis

This thesis is structured as follows: chapter 2 contains the literature review concerning the Solvency II regulation, data governance, data management and risk management; the development of the data governance framework, including an analysis of four existing data governance frameworks, is discussed in chapter 3; chapter 4 describes the research design and the interview results of the data governance maturity assessment; chapter 5 presents the data quality research design, the data analysis and measurement, and their results. The final chapter contains a summary of the thesis as well as its conclusion and a discussion of its managerial and research contribution, including recommendations, limitations and suggestions for future research.


2 LITERATURE REVIEW

The research question described in chapter 1 contains three important terms: first Solvency II, second data governance framework and third data quality. This chapter provides information about Solvency II in order to understand the basics of the regulation regarding data quality. Next, the terms data management and risk management are clarified in relation to the Solvency II regulation and data quality.

2.1 Solvency II regulation

The European Insurance and Occupational Pensions Authority (EIOPA) was established in response to changes in the structure of the supervision of the financial sector in the European Union. In 2000 the European Union launched the Solvency II project as the single European supervisory framework for the insurance industry9. The Solvency II project, as a revision of the prudential regulation, was aimed at fully reflecting the latest developments in the insurance sector (Lorent, 2008).

Solvency I, introduced in 1973, calculated solvency based on liability volumes. As the supervisory elements in the Solvency I framework had become outdated, the initiative was taken to revise the insurance supervision regulations, in parallel with the Basel II10 developments in banking (Doff, 2008). While the Solvency I Directive was aimed at revising and updating the existing EU solvency regime, Solvency II has a much wider scope: it fundamentally reforms the supervisory structure and practice (Doff, 2008). The objective of the Solvency II regulation is to protect the policyholders of insurance companies (Lorent, 2008).

9 De Nederlandsche Bank, Solvency II - General Notes, September 2012.
10 Regulation for the banking sector.


Other goals are, for example, (1) establishing financial stability, (2) stable markets and (3) risk management standards that must apply across the EU. Solvency II takes a risk-based economic approach, in contrast to Solvency I. Under Solvency II insurance companies are encouraged to manage their risks, value their liabilities using economic principles and reserve capital to absorb risks (Doff, 2008). The Solvency II regulation is a more sophisticated way of calculating the solvency of an insurance company. It forces insurance companies to act, think, test and improve with regard to their internal risk management and internal control environment11. The Solvency II regulation involves the updated set of regulatory requirements for insurance organisations that operate in the European Union and is based on the three-pillar framework12 that also exists in Basel II. Organisations affected by the Solvency II regulation are the insurance and reinsurance companies operating in the European Union with gross premium income exceeding € 5 million or gross technical provisions higher than € 25 million. The aim of the Solvency II regulation is to adjust the existing prudential regime for insurance and reinsurance companies in Europe (Doff, 2008). The Solvency II requirements do not only state the required amount of capital that an insurer should hold to protect its policyholders; they also encourage insurers to properly manage their risks, e.g. Market risk, Operational risk, Credit risk and Underwriting (Life, Non-life and Health) risk.

11 Delta Lloyd Group, 2010

12 Solvency II is based on three interconnected pillars: Pillar I focuses on financial requirements, e.g. market-consistent valuation of the balance sheet, including insurance liabilities and assets; Pillar II sets out requirements for the governance and risk management of insurers, as well as for the effective supervision of insurers; and Pillar III focuses on disclosure and transparency requirements.


2.1.1 Solvency II and data quality

According to EIOPA, the term data refers to all information, e.g. numerical, census or classification information - not qualitative information - which is directly or indirectly needed in order to carry out a valuation of the Technical Provisions (Article 86f)13. More explicitly, data relates to numerical data, internal or external, that are used for the calculation of the Solvency Capital Requirement (SCR) and Technical Provisions, from the time they are received from sources such as a (policy) administration system or an external data provider (Delta Lloyd14).

EIOPA published a number of consultation papers dealing with data quality, the most prevalent being Article 86f Standards for Data Quality. The Solvency II Directive emphasizes the importance of data quality, specifically for the data that are used to calculate the Solvency Capital Requirement (SCR) and Technical Provisions (TP); it does not make this mandatory for all data. Thus, the quality should be based on 'fit for purpose' for a limited part of the data used within an organisation. Furthermore, the Solvency II Directive does not formulate a specific data quality approach; instead, it dictates a set of data quality requirements for insurance companies. These data quality requirements were set by EIOPA in the articles CEIOPS' Advice for Level 2 Implementing Measures on SII: Technical Provisions - Article 86f Standards for Data Quality and Articles 120 to 126 Tests and Standards for Internal Model Approval, and are summarised in table 1.

13 EIOPA: Level 2 Technical Provisions - Article 86f Standards for Data Quality (former Consultation Paper 43).
14


Table 1. Overview Solvency II data quality requirements

Internal process and procedure (Articles 120-126: 5.134, 5.175)
• Define internal processes on the identification, collection, processing and retention of data
• Compile a data dictionary used in the internal model, covering each attribute's source, characteristics and usage

Data quality assessment and improvement (Article 86f: 3.21, 3.22, 3.37; Articles 120-126: 3.35)
• Perform data quality assessments by means of the three data quality criteria and implement a process for identifying and resolving data deficiencies

Data quality monitoring (Article 86f: 3.35; Articles 120-126: 5.178, 5.179)
• Define a data policy which sets out the organisation's approach to managing data quality
• Define data quality controls and demonstrate the fulfilment of the three criteria to ensure completeness, accuracy, validity and timeliness of the data
• Establish guidelines for the frequency of the data update process
• Define periodical data quality checks based on the data quality standards, dimensions and expert opinion

Data governance (data update) (Article 86f: 3.32; Articles 120-126: 5.152, 5.173b, 5.182, 5.183)
• Establish a continuous process to manage changes and data updates, which can have a material impact on the (model) output
• Ensure a sufficient quality of data on a continuous basis by adopting internal systems and procedures, e.g. data quality management, internal processes on identification, collection and processing of data, and the role of internal/external auditors
• Describe the processes which the undertaking has in place for checking and validating data quality, and specify the actions to be taken in the event that data are not, or do not continue to be, accurate, complete and appropriate
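As a small illustration of the data dictionary requirement above (Articles 120-126, 5.175), each attribute can be recorded with its source, characteristics and usage. The sketch below assumes a simple record layout; the regulation prescribes the content, not a format, and the example attribute is hypothetical.

```python
# Minimal sketch of a data dictionary entry per Articles 120-126 (5.175):
# each attribute is documented with its source, characteristics and usage.
# The record layout and the example values are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class DataDictionaryEntry:
    attribute: str        # name of the data element
    source: str           # originating system or external data provider
    characteristics: str  # type, unit, allowed range, granularity
    usage: str            # where the attribute feeds the SCR/TP calculations

entry = DataDictionaryEntry(
    attribute="sum_insured",
    source="Policy Administration System",
    characteristics="decimal, EUR, must be greater than zero",
    usage="input to the Mortality Risk capital calculation",
)
print(entry)
```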

2.1.2 Solvency II and data quality assessment

The three essential criteria to assess data quality, according to Article 86f Standards for Data Quality15, are appropriateness, completeness and accuracy. These are described as follows:

Appropriateness

“Data are considered to be appropriate if they are suitable for the intended purpose.” If the data are appropriate, they do not contain biases that make them unfit for a specific purpose.

15 EIOPA: Level 2 Technical Provisions - Article 86f Standards for Data Quality (former Consultation Paper 43).



Thus, the data are appropriate for the valuation of technical provisions and the setting of assumptions, and they must be representative and suitable for managing the relevant risks.

Completeness

“Data are considered to be complete if they allow for the recognition of all the main homogeneous risk groups within the insurance or reinsurance portfolio.” The data are considered to be complete if they have sufficient granularity to allow for the identification of trends and a full understanding of the relevant items for the intended purpose.

Accurateness

Data are considered to be accurate if they are free from material mistakes, errors and omissions. Data are accurate if the data recording is adequate, performed in a timely manner and kept consistent over a certain period of time.

EIOPA does not prescribe how often the assessment should be performed. However, the Directive requires periodic data quality monitoring, in line with the principle of proportionality. In particular, this concerns monitoring of the performance of the relevant information systems and the channels used to collect, store, transmit and process data. The assessment described in article 86f can be based on data quality performance indicators, but expert appraisal needs to play a key role in the analysis.
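As an illustration of how the three criteria can be translated into data quality performance indicators, the sketch below computes simple scores over a toy dataset. The column names, validity rules and the mapping of each score to a criterion are assumptions; as noted above, such indicators complement rather than replace expert appraisal.

```python
# Illustrative indicators for the three Solvency II criteria on a toy dataset.
# Column names and rules are assumptions; real assessments pair such scores
# with expert appraisal (article 86f).
import pandas as pd

df = pd.DataFrame({
    "policy_id": [1, 2, 3, 4],
    "sum_insured": [100_000, 250_000, None, 80_000],
    "risk_group": ["term_life", "term_life", "annuity", None],
})

# Completeness: share of non-missing values per column
completeness = df.notna().mean()

# Accuracy proxy: share of recorded values passing a validity rule
accuracy = (df["sum_insured"].dropna() > 0).mean()

# Appropriateness proxy: every record maps to a known homogeneous risk group
appropriateness = df["risk_group"].isin(["term_life", "annuity"]).mean()

print(completeness, accuracy, appropriateness, sep="\n")
```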


2.1.3 Solvency II and data governance

Based on article 86f, insurance companies should have adequate internal processes and procedures in place to cover the information systems used for data quality management and the collection, storing and processing of the data. Additional requirements are written in articles 120 to 126, describing the details of the insurance companies' processes for checking and validating data quality. According to articles 120 to 126, the insurance company must specify the actions to be taken in the event that data are not, or do not continue to be, accurate, complete and appropriate.

2.2 Risk management

In the literature various definitions of the term risk are used. As represented by the International Actuarial Association16 (2004), risk is “the chance of something happening that will have an impact upon objectives and it is measured in terms of consequences and likelihood”. This definition of risk will be used in this thesis.

Additionally, several definitions of the term risk management are represented in the literature. It was not until the 1960s that the term risk management became formally positioned in the financial world, where principles were developed and guidelines were established (D'Arcy and Brogan, 2001). Mehr and Hedges (1963) - the fathers of risk management (D'Arcy and Brogan, 2001) - describe risk management as “the management of those risks for which the organisation, principles, and techniques appropriate to insurance management are useful”17. With those risks Mehr and Hedges (1963) refer to pure

16 Founded in 1895, and reformed in 1998 with a new constitution, the International Actuarial Association (IAA) is the worldwide association of professional actuarial associations, with a number of special interest sections for individual actuaries. The IAA exists to encourage the development of a global profession, acknowledged as technically competent and professionally reliable, which will ensure that the public interest is served.

17 Robert I. Mehr and Robert A. Hedges, Risk Management in the Business Enterprise (Richard D. Irwin, Inc., Homewood, Illinois, 1963).


risk management, excluding corporate financial risk (Dionne, 2013). Other definitions of the term risk management are:

• Risk management is an on-going process, involving the credit union's Board of Directors, management and other personnel. It is a systematic approach to setting the best course of action to manage uncertainty by identifying, analysing, assessing, responding to, monitoring and communicating risk issues/events that may have an impact on an organisation successfully achieving its business objectives (COSO18, 2004);

• The IAA (2004) refers to risk management as the first line of defence in a company, or as a way to prevent the emergence of situations that could imperil the company.

While the primary focus of this research is on the data governance framework and data quality, the Solvency II regulation within the risk management department of an insurance company is also a topic. Therefore, the IAA definition of risk management is used in this research.

Since this research is performed within Delta Lloyd, an insurance company, it is relevant to take a closer look at the reasons why risk management is considered of such high importance within this industry. In addition, Solvency II requires that risk management in an insurance company be adequate, as it is fundamental to the insurance business. The insurance industry is becoming more important and is a growing segment within the financial markets (Lorent, 2008, p. 3). This means that insurance companies make an enormous contribution to, and have an enormous impact on, the expanding (financial) economy and therefore pose a risk. In return, they face the risks involved in the

18 Committee of Sponsoring Organizations of the Treadway Commission (COSO) Enterprise Risk Management – Integrated Framework, 2004.


insurance policies. Risks such as terrorism, natural catastrophes and bird flu have generated higher and more frequent damages (Lorent, 2008). Insurance companies provide insurance and other financial services; they assume various kinds of actuarial and financial risks (Santomero and Babbel, 1997). To manage those risks, the risk management department in an insurance company formulates scenarios and calculates the impact of risks. To mitigate their risks they diversify those risks across various sectors through a variety of techniques in their risk management systems, involving a combination of reinsurance, pricing and product design (Santomero and Babbel, 1997). As a result, an organisation is prepared for possible scenarios (Caballero and Krishnamurthy, 2008).

2.3 Data management

International business continues to grow rapidly in today's world, and the need for information is increasing with it. Data are needed to function and to communicate; therefore, they should be of good quality. Wende and Otto (2007) described data management as the collection, organisation, storage, processing and presentation of high quality data.

2.3.1 Data quality

The term data is the plural form of the Latin word datum. According to article 86f, data comprise numerical, census or classification information, but not qualitative information. This is also how the term data is used in the context of this thesis.


Harvey and Green (1993) categorized the conceptions of quality into five categories: (1) exceptional, (2) perfection, (3) fitness for purpose, (4) value for money and (5) transformation. The term 'fitness for purpose' refers to quality in terms of fulfilling customers' requirements, needs or desires. Campbell and Rozsnyai (2002) likewise describe quality as 'fitness for purpose', referring to the product or service meeting customers' needs, requirements or desires. ISO 840219 defines quality as 'fitness for purpose': the totality of features and characteristics of a product or service that bear on its ability to satisfy stated or implied needs. According to ISO 8402, this must not be mistaken for 'degree of excellence' or 'fitness for use', which meet only part of the definition.

Klein and Rossin (1999) concluded that there is no single definition of data quality accepted by researchers and those working in the discipline (Klein and Rossin in Kerr et al., 2007). Otto et al. (2007, p. 917) defined data quality as “the dependence of perceived quality of the data consumer needs”. As Pipino et al. (2002, p. 211), Wang et al. (1995, p. 351) and Wang and Strong (1996, p. 7) mentioned, data quality is a multi-dimensional concept, which refers to (1) the dimension hierarchies and (2) the measurements to capture imprecise data based on the data set in question. Likewise, data quality is described as a multifaceted concept, which contains a set of data quality attributes requiring consumer assessment (Weber et al., 2009). The quality attributes referred to by Weber et al. (2009) are the data quality dimensions accuracy, consistency, relevancy and timeliness. Considering the above, the following definition of data quality is used in this thesis: data quality is a multi-dimensional concept, assessed against the data consumers' needs.

19 The International Organization for Standardization, known as ISO, is an international standard-setting body composed of representatives from various national standards organisations. ISO 8402 is the quality management and quality assurance vocabulary.


The quality of the data is assessed by a set of data quality attributes, to fulfil the data consumers' needs. To ensure the quality of the data, they must be complete, appropriate, accurate and applicable for reporting the results.

2.3.2 Data quality dimension

In the literature a data quality dimension is defined as a set of data quality attributes that represent a single aspect or construct of data quality (Wang and Strong, 1996). Using an empirical approach, Wang and Strong (1996) identified more than 100 dimensions which are important to data consumers, referring to the term dimension as a set of data quality attributes that most data consumers react to in a fairly uniform way. The authors assumed that data consumers have different needs. After eliminating and consolidating the selected dimensions, Wang and Strong (1996) proposed a list of dimensions divided into four categories.

Table 2. Overview of the data quality categories

Intrinsic: data have quality in their own right.
Dimensions: accuracy, objectivity, believability and reputation.

Contextual: data quality must be considered within the context of the task at hand.
Dimensions: value-added, relevancy, timeliness, completeness and appropriate amount of data.

Representational: emphasizes the role of the system, which must present data in such a way that they are interpretable and understandable.
Dimensions: interpretability, ease of understanding, representational consistency and concise representation.

Accessibility: emphasizes the role of the system, which must be accessible but secure.
Dimensions: accessibility and access security.

The dimension accuracy belongs to the intrinsic data quality category, and the dimensions timeliness and completeness are listed in the contextual data quality category. In the literature the definitions of the four selected data quality dimensions are described as follows:


Accurateness

Accurateness is one of the key dimensions in quality analyses in data management (Wang et al., 1995). According to Wang et al. (1995), accuracy refers to the recorded value being in conformity with the actual value. Accuracy is also defined as a dimension of data quality in terms of the errors that 'occur because of delays in processing times, lengthy correction times, overly or insufficiently stringent data edits' (Morey (1982) in Wang et al., 1995).

Appropriateness

The term is not used frequently in the literature, and the dimension appropriateness is not mentioned in table 2. Pipino et al. (2002) define appropriateness as a sufficient amount of data; Wang and Strong (1996) refer to the term in the appropriate amount of data dimension.

Completeness

According to Ballou and Pazer (1985), completeness refers to the extent to which a dataset contains records of the necessary values for certain variables. Another definition describes completeness as the extent to which data are of sufficient breadth, depth and scope for the task at hand (Wang and Strong, 1996).

Timeliness

According to Ballou and Pazer (1995) and Ballou et al. (1998), the timeliness dimension is based upon whether the recorded value is up-to-date.

The dimensions appropriateness, completeness and accurateness are fundamental in the context of Solvency II. These Solvency II data quality criteria are strict requirements


for the data used in the calculation of the Solvency Capital Requirement and Technical Provisions. According to the Solvency II Directive, the dimension timeliness is part of the dimension accurateness, which states that the data used in the financial and risk reports need to be accurate and timely, with processes in place to ensure data quality. Thus, the data are considered complete, accurate and appropriate if at least the following conditions are met. The data quality conditions from Solvency II and the literature are described in table 3.

Table 3. Data quality conditions per dimension

Intrinsic: Accurateness
Literature: Data are certified error-free (Wang and Strong, 1996); the recorded value is in conformity with the actual value (Wang et al., 1995).
Solvency II (CP 56 and CP 43): Data must be free from material errors; data from different time periods used for the same estimation are consistent; data are recorded in a timely manner and consistently over time.

Contextual: Completeness
Literature: The ability of an information system to represent every meaningful state of the represented real-world system; the extent to which data are of sufficient breadth, depth and scope for the task (Wand and Wang, 1996); all values for certain variables are recorded (Wang et al., 1995).
Solvency II (CP 56 and CP 43): Data are of sufficient granularity and include sufficient historical information to identify trends and to assess the characteristics of the underlying risk; data satisfying the conditions are available for all relevant model parameters and no such relevant data are excluded from use in the internal model without justification.

Contextual: Appropriateness
Literature: Data must be appropriate to the task at hand (Tejay et al., 2006); the appropriate amount of data: the extent to which the quantity and volume of available data are appropriate (Wang and Strong, 1996); value-added (intention of use) as a dimension that addresses the benefits and advantages of using data (Wang and Strong, 1996).
Solvency II (CP 56 and CP 43): Data are consistent with the purposes for which they will be used; the amount and nature of the data ensure that the estimations made in the internal model on the basis of the data do not include an undue estimation error; data are consistent with the assumptions underlying the actuarial and statistical techniques that are applied to the internal model.

Contextual: Timeliness
Literature: The recorded value is not out of date (Wang et al., 1995); timeliness is the degree to which information is up-to-date (Kahn et al., 2002).
Solvency II (CP 56 and CP 43): Data are considered to be accurate if the recording of information is adequate, performed in a timely manner and kept consistent over time.

2.3.3 Data quality analysis

As Pipino et al. (2002) stated, data quality assessment is a continuous process that requires fundamental principles for developing usable metrics, rather than ad hoc actions solving specific problems such as missing data and inadequately defined data. They stated that organisations need subjective quality measurements from those involved


with the collection and usage of data, as well as objective quality measurements, both based on the selected dataset (Pipino et al., 2002). A subjective data quality assessment reflects the needs and experience of the stakeholders (Wang in Pipino et al., 2002); in such an assessment, questionnaires are useful to understand users' perception of data quality. The objective data quality measurement must be established by metrics, statistical analysis techniques and quality assessment methods. Decisions need to be made as to where to analyse the data quality in the context of the application and business rules. The results from the objective and subjective assessments should be compared, identifying discrepancies and determining their root causes, in order to determine which actions are necessary for improvement (Pipino et al., 2002). Since assessing data quality is a continuous process, each organisation should develop and utilize its own internal metrics, using subjective and objective data quality measurements. The authors presented three functional forms, the simple ratio, min or max operations and the weighted average, which can help organisations develop data quality metrics in practice (Pipino et al., 2002). As the authors suggested in their conclusion, based upon experience, a 'one size fits all' set of metrics is not the solution; note that the data quality analysis methods used may differ for every organisation and even within the same insurance company.
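The three functional forms named above can be sketched directly; the dimension scores and weights below are illustrative, not figures from the thesis.

```python
# The three functional forms of Pipino et al. (2002): simple ratio,
# min (or max) operation, and weighted average. Inputs are illustrative
# dimension scores on a 0..1 scale.

def simple_ratio(undesirable: int, total: int) -> float:
    """One minus the fraction of undesirable outcomes, e.g. erroneous records."""
    return 1 - undesirable / total

def min_operation(*indicator_scores: float) -> float:
    """Conservative aggregate: a composite dimension is only as good as its
    weakest underlying indicator."""
    return min(indicator_scores)

def weighted_average(scores_and_weights) -> float:
    """Weighted aggregate of (score, weight) pairs; weights must sum to 1."""
    assert abs(sum(w for _, w in scores_and_weights) - 1.0) < 1e-9
    return sum(s * w for s, w in scores_and_weights)

print(simple_ratio(undesirable=12, total=400))    # e.g. a completeness ratio
print(min_operation(0.97, 0.88, 0.93))            # e.g. a composite dimension
print(weighted_average([(0.97, 0.5), (0.88, 0.3), (0.93, 0.2)]))
```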

2.4 Data governance

The data governance literature includes several definitions providing different perspectives on the concept of data governance. Khatri and Brown (2010) refer to data governance as the decision rights and the accountability in an organisation regarding the decision-making about its data assets. Other authors describe broader and deeper definitions of data governance. The Master Data Management (MDM)


Institute defines data governance as the formal orchestration of people, process and technology to enable an organisation to leverage data as an enterprise asset20. Thomas (2006) goes a little further by referring to data governance as a system of decision rights and accountabilities for information-related processes, executed according to agreed-upon models which describe who can take what actions with what information, and when, under what circumstances and using what methods21. Loshin (2009, p. 68) states that data governance is expected to ensure that the data meet the expectations of all the business purposes, in the context of data stewardship, ownership, compliance, privacy, security, data risks, data sensitivity, metadata management and Master Data Management (MDM).

Based on the literature, the following data governance definition is formulated in this thesis: a system of roles and responsibilities established within the organisation's (primary) information processes, in which a set of processes, policies, standards and procedures has been drawn up with the aim of assuring (high) data quality, whereby the data should be considered 'fit for purpose'.

2.4.1 Data governance framework

Why a data governance framework? A framework describes what is governed, including related concepts, components and the interrelationships among them (NASCIO22, 2009). For this thesis, the terms model and framework are used based on this definition. Organisations of all types and sizes use a data governance framework to compose the roles, accountabilities and responsibilities for monitoring data quality (Wende, 2007, p. 419; Wende and Otto, 2007, p. 2). The aim of a data governance framework is a set of processes and policies that ensure that important data assets are

20 Source: The Master Data Management Institute website.

21 Source: The Data Governance Framework (DGI), Thomas, G. (2006).
22 National Association of State Chief Information Officers (NASCIO).


formally managed throughout the organisation23. Those organisations that structure their data quality and set roles and accountabilities will reap a significant contribution from data quality (Bitterer and Newman, 2007; Friedman, 2006). With a data governance framework, decision-making and responsibilities encourage desirable behaviour in the use of data and the pursuit of the desired quality (Wende and Otto, 2007). To encourage desirable behaviour, data governance develops and implements corporate-wide data policies, guidelines and standards that are consistent with the organisation's goals and strategy (Wende and Otto, 2007). Other aspects of data governance include compliance monitoring, data quality, data classification, data security and access, and data risk management (NASCIO, 2008).

2.4.2 Roles and responsibilities

Wende (2007) stated that to improve data quality and maintain a high quality of data within the organisation, specific roles (Executive Sponsor, Chief Steward, and Technical and Business Data Stewards) and a committee (Data Quality Board) are needed. The author stated that organisations need to structure their data quality accountabilities to ensure high data quality and to be able to respond to strategic and operational challenges (Wende, 2007). Additionally, Wende (2007) stated that the roles may view the quality of the same data with considerable difference of opinion, particularly where some of the selected roles are not aware of the use of the data or do not have domain knowledge. Data Stewards should manage their data oversight responsibilities from a cross-organisational perspective, having a good understanding of the organisation's vision (Wende, 2007).

23 Source: Wikipedia website.


It is important that the data are managed by the operational business, in collaboration with the IT department (Cheong and Chang, 2007; Weber et al., 2008).

2.4.3 Data ownership

Identifying data ownership is considered crucial for data quality, because ownership helps to determine the roles and responsibilities throughout the data flows (Loshin, 2009). Bitterer and Newman (2007) stated that a data owner does not exist, arguing that the organisation itself is the owner of the data. In their perspective, the Data Steward is the one who makes sure that the data are complete, accurate and consistent. Thus, the organisation, and not a particular person, should be defined as the data owner, because in a large organisation the top management or executives are not actively involved in daily data quality operations (Bitterer and Newman, 2007). In a dynamic environment there are many data providers of one data set, all with different interests (Wende, 2007). That is the reason why Loshin (2009) requires a data ownership policy, in which the stakeholders, data sets, responsibilities and dispute resolution are clearly defined and to which the stakeholders agree to subscribe.


3 DEVELOPMENT DATA GOVERNANCE FRAMEWORK

In this chapter sub-questions one, two and three will be answered. The goal is to analyse which of the selected data governance frameworks and their components could be used and which ones comply with the data quality requirements prescribed by EIOPA.

3.1 Comparison data governance frameworks

A framework describes what is governed, including related concepts, components and the interrelationships among them (NASCIO, 2009). For this thesis, the terms model and framework are used based on this definition. The aim of a data governance framework is a set of processes and policies that ensure that important data assets are formally managed throughout the organisation24. Those organisations that structure their data quality and set roles and accountabilities will reap a significant contribution from data quality (Bitterer and Newman, 2007; Friedman, 2006). Most of the frameworks were developed before the introduction of the Solvency II regulation; hence, none of them have significant similarities with the Solvency II regulatory data quality requirements. Despite this, some components of the frameworks are applicable to specific parts of the Solvency II data quality rules and requirements. The selected data governance frameworks are outlined in table 4; table 5 includes an analysis from a Solvency II perspective in its last column. The following four data governance frameworks were selected from the literature for analysis.

Table 4. Overview selected Data Governance Frameworks

1 Data Governance Framework (Wende, 2007)
2 Data Governance Institute (DGI) Data Governance Framework (Thomas, 2006)
3 IBM Data Governance Unified Process Overview (Soares, 2010)
4 Modified Data Governance Framework (Khatri and Brown, 2010)


Data Governance Framework

In the paper of Wende (2007) a Data Governance Framework is presented which contains three components, built on a matrix comparable to a RACI (Responsible, Accountable, Consulted and Informed) matrix.

Figure 2. Illustration of a draft of a Data Governance Framework

The data governance model is comprised of Data Quality Management (DQM) roles, decision areas and main activities, and responsibilities, i.e. the assignment of roles to decision areas and main activities (Wende, 2007, p. 419). The columns of the matrix indicate the roles; the rows identify the key decision areas and main activities; the cells are filled with the responsibilities, i.e. they specify the degrees of authority between roles and decision areas. A set of four roles and one board are described (Wende, 2007, p. 420): (1) Executive Sponsor, (2) Chief Steward, (3) Business Data Steward and (4) Technical Data Steward, plus a Data Quality Board. As Wende (2007) stated, a Data Governance Framework can assist organisations in structuring the roles and accountabilities with


regard to data quality initiatives. With the use of a matrix, the roles of data governance and the degree of accountability within each role can be represented.
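The matrix idea translates naturally into a small data structure: decision areas as rows, roles as columns and RACI-style codes in the cells. The concrete assignments below are illustrative assumptions, not Wende's (2007) prescriptions.

```python
# Sketch of a Wende-style responsibility matrix: decision areas x roles,
# with RACI codes (Responsible, Accountable, Consulted, Informed) in the
# cells. The assignments shown are illustrative only.
RESPONSIBILITIES = {
    "define data quality strategy": {
        "Executive Sponsor": "A", "Chief Steward": "R",
        "Business Data Steward": "C", "Technical Data Steward": "I",
    },
    "maintain business data dictionary": {
        "Executive Sponsor": "I", "Chief Steward": "A",
        "Business Data Steward": "R", "Technical Data Steward": "C",
    },
}

def roles_with(matrix: dict, authority: str) -> dict:
    """Per decision area, list the roles holding the given degree of authority."""
    return {area: [role for role, code in cells.items() if code == authority]
            for area, cells in matrix.items()}

print(roles_with(RESPONSIBILITIES, "R"))  # who is Responsible for each area
```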

Data Governance Institute (DGI) Data Governance Framework

The Data Governance Institute (DGI) has introduced a data governance framework (Thomas, 2006). This is a logical way of organizing, classifying and communicating intricate decision-making activities surrounding the use of data in organisations (Thomas, 2006, p. 5). The aim of the framework is to enable organisations to make decisions about how to manage data, derive value from them, minimize costs and complexity, manage risk and ensure compliance with ever-growing legal, regulatory and other requirements (Thomas, 2006). The DGI framework is based on the premise that businesses have direct information needs, which guide technology strategies (Thomas, 2006). The DGI data governance framework provides a model of ten universal components (Thomas, 2006, p. 12). It determines rules and rules of engagement (e.g. decision rights, accountabilities), people and organisational bodies (e.g. stakeholders, Data Stewards) and a process for governing the data. A component shows the data-related questions (Why, What, Who, When and How) which are important to answer before starting to design a data governance framework.

Figure 3. The Data Governance Institute (DGI) Data Governance Framework

IBM Data Governance Unified Process Overview

Soares (2010) developed a guide based on IBM data governance best practices, namely the IBM Data Governance Unified Process; the framework is illustrated in figure 4. This intricate data governance process contains three components: (1) decision rights to optimize decisions about data, (2) securing data, and (3) leveraging data. The IBM Data Governance Unified Process consists of fourteen steps (ten required steps and four optional tracks), which address the core issues focused on people and processes. The process is designed as a continuous loop (Soares, 2010).

Figure 4. The IBM Data Governance Unified Process Overview

Modified Data Governance Framework

Kharti and Brown (2010) present a framework for data governance that provides a set of five data decision domains; it is considered a relevant alternative strategy for data governance. With this framework Kharti and Brown (2010) specify data decision domains that are consonant with IT decision domains, providing an overall framework to align IT assets with data assets. The modified data governance framework consists of five decision domains:

1. Data principles establish how all other decisions about data assets will be made.

2. Data quality establishes the requirements of the intended use of data regarding accuracy, credibility, completeness, and timeliness (see the sketch after this list).

3. Metadata are defined as the semantics or "content" of data, so that data are interpretable by users; this domain establishes the rules to interpret data.

4. Data access refers to the standards set by an organisation's access policies and may integrate audit tracking, privacy and availability practices.

5. Data lifecycle refers to all the stages, from creation, usage, and storage to deletion, that a data element undergoes.

In addition, within each decision domain potential roles are assigned for accountability and decision-making purposes.
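
As announced in the second domain above, a minimal Python sketch of how two of these data quality dimensions, completeness and timeliness, could be measured is given here; the record layout, field names and cutoff date are hypothetical illustrations, not prescriptions from Kharti and Brown (2010).

from datetime import date

# Hypothetical policy records; the field names are illustrative only.
records = [
    {"policy_id": "P-001", "premium": 1200.0, "updated": date(2014, 6, 30)},
    {"policy_id": "P-002", "premium": None,   "updated": date(2013, 1, 15)},
]

def completeness(rows, field):
    """Share of records in which a mandatory field is populated."""
    return sum(r[field] is not None for r in rows) / len(rows)

def timeliness(rows, field, cutoff):
    """Share of records updated on or after a given cutoff date."""
    return sum(r[field] >= cutoff for r in rows) / len(rows)

print(f"completeness(premium): {completeness(records, 'premium'):.0%}")
print(f"timeliness(updated):   {timeliness(records, 'updated', date(2014, 1, 1)):.0%}")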


Table 5. Analysis of the selected Data Governance Frameworks

1. Data Governance Framework (Wende, 2007)

Key elements:
• Establish useful and necessary roles.
• Establish possible decision areas and main activities.
• Assign roles to decision areas and main activities.

Description:
• Provides a framework that helps organisations structure their data quality responsibilities.
• Includes roles and decision areas to configure the individual data governance.
• Provides roles for each decision area.

Analysis from a Solvency II perspective:
• The framework describes the roles and accountabilities that are needed to govern specific decision areas.
• The roles and decision areas can be useful within the Solvency II regulation.
• Contains specific roles and a decision area for creating a business data dictionary.

2. The DGI Data Governance Framework (Thomas, 2006)

Key elements:
• Design the data governance organisation (establish stakeholders and Data Stewards).
• Develop data governance policies (establish decision rights and accountabilities).
• Design the governing process (identify data processes).
• Perform data analysis and data monitoring.
• Determine standards on process (re)design and data management.
• Improve privacy, compliance and security aspects.

Description:
• Provides guidelines on how to implement, classify, and communicate decision-making activities surrounding the use of data.
• Uses a framework of ten universal components of a data governance program.
• Includes a maturity model used to assess the maturity stage.
• Focused on architecture.

Analysis from a Solvency II perspective:
• The components could be used within Solvency II.
• No data assessment is performed; however, an investigation or analysis is performed.
• No reporting on the measurement of data quality.
• Accountabilities and controls are established.
• Standardised data definitions are determined.
• Data governance standards are built.
• Data quality is monitored.

3. IBM Data Governance Unified Process Overview (Soares, 2010)

Key elements:
• Design the data governance architecture.
• Design the data governance organisation.
• Establish policies, principles, business rules and procedures.
• Build a data dictionary.
• Define business and technical metrics for data quality.
• Improve privacy, compliance and security.
• Determine the data stewardship.

Description:
• IBM has set up an overall data governance system that starts from an identified business problem.
• The program contains tracks, which can be selected.
• Provides a step-by-step guide based on IBM data governance best practices.
• Optimises decision rights, securing, and data delivery.
• Addresses the core issues focused on people and processes.

Analysis from a Solvency II perspective:
• Many steps and sub-steps are required to make the process effective within an organisation.
• Reporting on the data analysis is part of Solvency II and also of this framework.
• The foundation of the IBM framework is more focused on the information governance of the organisation than on the overall processes.

4. Modified Data Governance Framework (Kharti and Brown, 2010)

Key elements:
• Define data access policies.
• Establish data governance standards on data quality (e.g. accuracy, credibility, completeness, and timeliness).

Description:
• Sets data principles of intended use, metadata, and how data are accessed.
• Focused on defining roles; establishing and standardising policies for the use of the data are important elements to ensure data quality.

Analysis from a Solvency II perspective:
• Periodic data quality monitoring and checks are not defined.
• The framework is focused on aligning specific data issues to IT.


3.2 Conceptual Solvency II Data Governance Framework

In the previous section the four most widely used data governance frameworks were analysed. The selected frameworks are drawn from different starting points; each framework reflects a certain interpretation of what a data governance framework should be (see section 3.1 for the comparison of the selected data governance frameworks).

From the Solvency II perspective the selected frameworks are insufficient in their current structure, but some of their core components were used for the development of a data governance framework that does comply with the Solvency II requirements. The conceptual Solvency II Data Governance Framework was constructed from components of the four data governance frameworks combined with the author's own interpretation. The framework is divided into 14 structured components, each being fundamental to the data governance and data quality process.

Based on the literature it can be said that roles and accountabilities are the key components within data governance. These important components, e.g. executive sponsorship and the data governance structure, were added to the conceptual Solvency II Data Governance Framework. The Data Governance Institute (DGI) Framework (Thomas, 2006) turned out to contain various components that were used as input for the conceptual Framework, such as metrics, controls, data governance processes, and data rules and definitions. However, the DGI framework lacks components for data architecture, measurement of data quality, reporting of data quality results and monitoring of data quality, while the last two are critical. These components turned out to be part of the data governance frameworks of Kharti and Brown (2010) and IBM (Soares, 2010). Therefore, the components 'Report Data Quality Results' and 'Monitor Data Quality' were added to the conceptual Framework. A data directory for data policy is an important component to meet the Solvency II requirements.

Therefore, a component 'Design and Implement Data Directory' was added to the conceptual Framework. This core component is missing from the other frameworks, except for the IBM framework (component 'create meta data repository'). In the framework of Wende (2007) the focus was on the roles and the decision area for creating a business data dictionary. The component 'Design Data Exchange Specifications (DES)' was also added to the conceptual Framework; it is an implicit component, because it reflects the data delivery between two areas of responsibility. This component was not clearly defined in the four selected data governance frameworks. Furthermore, Solvency II does not define requirements on metrics to measure data quality. However, Solvency II does prescribe data quality criteria (accuracy, completeness and appropriateness), and these criteria form an integral part of all the components in the conceptual framework. The conceptual Solvency II Data Governance Framework, based on the literature and the author's own interpretation, is shown in figure 5.
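
To indicate how the components 'Report Data Quality Results' and 'Monitor Data Quality' could operationalise these criteria in practice, a minimal Python sketch follows; the rules, thresholds and field names are hypothetical and would in practice be defined per data set by the responsible data owner or Data Steward.

# Minimal sketch of a periodic data quality check against the Solvency II
# criteria; all rules, thresholds and field names are hypothetical.
policies = [
    {"policy_id": "P-001", "premium": 1200.0, "product": "life"},
    {"policy_id": "P-002", "premium": -50.0,  "product": "life"},
    {"policy_id": "P-003", "premium": None,   "product": "travel"},
]

checks = {
    # Completeness: mandatory fields must be populated.
    "completeness":    lambda r: r["premium"] is not None,
    # Accuracy: values must lie within plausible bounds.
    "accuracy":        lambda r: r["premium"] is not None and r["premium"] > 0,
    # Appropriateness: only products within the scope of the model.
    "appropriateness": lambda r: r["product"] in {"life", "non-life"},
}

THRESHOLD = 0.95  # hypothetical tolerance set by the data governance body

for criterion, rule in checks.items():
    score = sum(rule(r) for r in policies) / len(policies)
    # A score below the threshold is escalated within the data
    # governance structure, e.g. to a data quality board.
    status = "OK" if score >= THRESHOLD else "ESCALATE"
    print(f"{criterion:16s} {score:6.1%}  {status}")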
