
Available online at www.sciencedirect.com

ScienceDirect

Procedia Computer Science 121 (2017) 47–54

1877-0509 © 2017 The Authors. Published by Elsevier B.V.

Peer-review under responsibility of the scientific committee of the CENTERIS - International Conference on ENTERprise Information Systems / ProjMAN - International Conference on Project MANagement / HCist - International Conference on Health and Social Care Information Systems and Technologies.

10.1016/j.procs.2017.11.008


CENTERIS - International Conference on ENTERprise Information Systems / ProjMAN - International Conference on Project MANagement / HCist - International Conference on Health and Social Care Information Systems and Technologies, CENTERIS / ProjMAN / HCist 2017, 8-10 November 2017, Barcelona, Spain

The development of a hard and soft IT governance assessment instrument

Daniël Smits, Jos van Hillegersberg*

University of Twente, Drienerlolaan 5, 7522 NB Enschede, The Netherlands

Abstract

Current IT governance research is largely focused on hard governance; soft governance needs more attention. The MIG model (Maturity IT Governance) was designed because an IT governance maturity model covering both the hard and soft parts of governance did not exist. Using the MIG model, this paper describes the development of an instrument to measure hard and soft governance maturity: the MIG assessment instrument. It summarizes the operationalization of a maturity model for hard and soft IT governance and gives a detailed description of the result of the first development cycle of the MIG assessment instrument. A research contribution to the MIG model was a solution for implementing the focus area informal organization in the instrument: we found a simple solution by using the nine focus areas of the maturity model as a framework for the informal organization. The instrument is intended to be used in combination with interviews, and will be used, validated and improved in several cycles using the results from the case studies.


Keywords: IT governance; Soft governance; Informal organization; Design science.

* Corresponding author. Tel.: +31-6-5232-7548.

E-mail address: d.smits@utwente.nl


1. Introduction

IT Governance (ITG) is a relatively new topic. The first publications appeared in the early 1990s1, with important research on ITG being done by Peterson2, 3, Weill & Ross4, and Van Grembergen & De Haes5. Based on their study of 250 enterprises worldwide, Weill and Ross4 concluded that “firms with superior IT governance have at least 20 percent higher profits than firms with poor governance”. ITG is an ongoing concern for organizations worldwide: a McKinsey global survey in 2014 showed that 35% of IT executives (and 30% of all executives) mentioned “improving governance processes and oversight” as the most important factor for improving IT performance.6

Definitions of ITG in the literature vary greatly.7, 8 An analysis of the ITG literature revealed that six streams of thought can be distinguished in ITG.9 Four streams differ in scope: 'IT Audit', 'Decision-making', 'Part of corporate governance, conformance perspective', and 'Part of corporate governance, performance perspective'. The last two streams differ in the direction in which IT governance works: 'Top down' and 'Bottom up'. More than 50% of the organizations worldwide use frameworks for ITG.10 The frameworks frequently used in practice for ITG are very diverse: ITIL, ISO 17799, ISO 27000, ISO 38500, COBIT, Six Sigma, PMI, Risk IT, CMM or CMMI, and so on.11 There is a substantial correspondence between the streams an organization sees as important and the frameworks used in practice. Such streams deviate from the conventional list of dimensions.12 In the literature, further alternative factors are suggested.13-16

As proposed by several scholars, ITG can be deployed using a trichotomy summarized as structure, processes and relational mechanisms. Most of the frameworks mentioned earlier are largely based on processes and structure. An exception is the ISO 38500 standard for ITG:17 the inclusion of human behavior as one of its principles makes it a positive exception. Implementation of this standard, however, is not yet widespread.11, 18 COBIT and ISO 38500 are specifically focused on ITG, and the use of COBIT 5.0 for ITG by ISACA members (26%) appears to be growing.19

Relational mechanisms can be broken down into several parts covering soft governance and the context.20 Culture can be seen as a factor highly influencing the implementation of ITG.11 People do not act or think in terms of process and structure only.

The split of governance into hard and soft governance has been made before.21-25 Joseph Nye founded soft power theory, in which soft power is related to "intangible power resources such as culture, ideology, and institutions".26 Several studies showed that soft governance needs more attention11, 27-29 and that ITG is situational11, 27, 30-32.

The goal of our research program is to determine how to improve IT governance’s effectiveness and maturity. This study aims to develop an MIG assessment instrument based on the MIG model, which was designed in previous research of Smits and Hillegersberg20, 33. The MIG model is a focus area maturity model used to develop the MIG assessment instrument.

In this paper, we discuss the development of the MIG assessment instrument. This paper is organized as follows. Section 2 covers ITG maturity and the MIG model. Section 3 presents the research methodology. The results of the development of the initial model of the MIG assessment instrument are described in Section 4. A discussion, the limitations, implications for future research, and conclusions are described in Section 5.

2. IT governance maturity

The maturing entities in this study are 'organizational capabilities' based on the resource-based view used in strategic management literature.34, 35 An organizational capability is "the ability of an organization to perform a coordinated set of tasks, utilizing organizational resources, for the purpose of achieving a particular end result".36 Maturity models can be seen as artifacts to determine a company's status quo and as "deriving measures for improvement".37

Does ITG maturity have a significant positive impact on IT performance and firm performance?

Recent studies showed that ITG maturity has a significant positive impact on IT performance and firm performance.4, 38-40 However, some studies did not find a clear positive correlation between the two,41, 42 while others argued that there might be a considerable time delay between the improvement of ITG maturity levels and the perceived benefits43.


We think ITG is a complex phenomenon, and improving ITG maturity depends on improving all relevant aspects. Most research focuses predominantly on hard governance aspects, while soft governance and the situational aspects of the context are not (fully) covered. To assess or improve an organization’s IT governance and to be able to grow in maturity, it is necessary to include both hard and soft governance and the context. We therefore need an IT governance maturity model covering all three elements.

2.1. The MIG model

The MIG model was designed because an ITG maturity model covering both the hard and soft parts of governance did not exist.11, 33, 44 The MIG model is a maturity model covering both hard and soft governance and the context,20, 33 and looks as follows:

Table 1 The MIG model

Governance   Domain          Focus area                 Maturity model used
Soft         Behavior        Continuous improvement     Bessant et al.45
                             Leadership                 Collins46
             Collaboration   Participation              Magdaleno et al.47
                             Understanding and trust    Reich et al.48
Hard         Structure       Functions and roles        CMM49 (used for all five hard focus areas)
                             Formal networks
             Process         IT decision-making
                             Planning
                             Monitoring
Context      Internal        Culture                    Quinn50
                             Informal organization
             External        Sector

The context is essential for delivering information about the situational part of ITG. The MIG model is a focus area maturity model.51 In this type of maturity model, maturity is determined by a set of focus areas. The MIG model20 makes use of the maturity models listed in the last column of Table 1.

For each focus area in hard governance, the same maturity model is used: CMM. For soft governance, each focus area makes use of a different maturity model: Bessant et al. for “continuous improvement”, Collins for “leadership”, Magdaleno et al. for “participation”, and Reich et al. for “understanding and trust”.

No maturity models are used in the context. The context is important because research has shown that IT governance is situational.11, 27, 30 The focus areas in the context are situational elements of the maturity model as proposed by Mettler and Rohner.28 To get insight into the focus area "Culture", the Competing Values Framework of Quinn is used.50

3. Research methodology

The development of the MIG instrument (as described in the next section) and the pilot case study must deliver answers to the following research question:

How can we develop an MIG assessment instrument based on the MIG model?

The MIG assessment instrument is based on the MIG model, which was developed using design science by Smits and Hillegersberg20, 33. There is no widely accepted definition of design science research.52 However, we adopt Hevner’s definition of design science, because it fits our purpose. Hevner et al.53 note that the design science paradigm seeks to extend the boundaries of human and organizational capabilities by creating new and innovative artifacts. For our research, we aim to develop an innovative instrument to determine the current status of an organization’s IT governance. The research approach combines knowledge from literature and experts from practice to achieve both “problem relevance” and “research rigor”.53 For the development of the instrument, we applied the guidelines and three cycles of Hevner: the Relevance cycle, the Design cycle and the Rigor cycle.53, 54

We intend to use the MIG instrument in case studies, combined with interviews. For the design cycle, the MIG model, the MIG instrument and the case study interviews are the relevant inputs.

For each case, we will use a case study protocol as described by Yin:55

1. The protocol starts with preparation for the case study. During this first step, a group of employees from one organization is selected and invited to participate in the study.

2. Each participant is asked to fill out the MIG instrument (part A, participant) before the interview.

3. The researcher creates the result sheet using the MIG instrument (part B, researcher).

4. During the interview, the results for each focus area are discussed.

5. After the interview, the results of the interview are summarized and sent to the participant for verification.

Thus, after each case study two types of results are available:

a. a result from the MIG assessment instrument; and
b. the corrected results based on the interview.

The feedback from the participant on the use of the MIG instrument and the corrected results can be used to improve the instrument. During the design of the MIG instrument, Hevner’s guidelines were applied as follows:53

1. Design as an Artifact: The MIG instrument is the artifact.

2. Problem Relevance: The relevance of the problem is one of the questions in the case study. The feedback is used to improve the instrument in the next cycle (Relevance cycle).

3. Design Evaluation: After each use of the instrument, the results are discussed with the interviewee and used as evaluation input in the design cycle. For example, how did the interviewee experience the use of the instrument?

4. Research Contributions: Some contributions to the MIG model have been added to make it possible to design the instrument (see section 4.2 Choices necessary for the implementation).

5. Research Rigor: After each use of the instrument, the results are discussed with the interviewee and used as input for the rigor cycle. For example, does the interviewee agree with the results of the instrument?

6. Design as a Search Process: The first version of the MIG instrument was created and used with the intention to do at least three cycles improving the instrument.

7. Communication of Research: The results of the use of the instrument and the results of the case study are communicated to the participants in the organization and used as a basis for writing research papers.

We intend to improve the instrument in (at least) three cycles. More specifically, we will define a new version of the instrument using the data collected in several case studies during an academic year.

The next section covers the result of this study: the initial version of the MIG assessment instrument.

4. Results

This paper only describes the first version of the MIG assessment instrument. The initial version was created in the first quarter of 2015. This version was based on the MIG model as designed by Smits and Hillegersberg.20, 33 For the operationalization of the maturity model, some additional choices have been made (see section 4.2).

In this section, all elements and choices will be explained and examples are given from the implementation of each element.

4.1. General scheme of the instrument

The MIG instrument consists of two parts (Excel sheets):

A. The first part (A) is intended for the participant and contains the statements and other questions.
B. The second part (B) is for the researcher and is used to create the result sheet.


Both parts are separated to prevent the participant from being influenced by the results when filling out the assessment.

The MIG assessment instrument (part A) consists of a general form with some demographic information and two questionnaires. The second part for the researcher (part B) is only used to create the result form. To make it easy to import the data into part B of the instrument, a hidden tab is added, which can be used to transfer the data.

4.2. Choices necessary for the implementation

The current publications on the MIG model do not contain a clear description of the model that should be used for the focus area “informal organization”.

Three alternatives have been suggested: the variables outlined by Cobb56, the seven types of lateral relations developed by Galbraith57, and the sociogram as described by Mintzberg58. All three alternatives are laborious and not easy to implement; we tried to do so in some tests, but did not succeed. They also caused much discussion during the design of the MIG model: the practitioners could not reach a consensus and suggested finding out in practice which framework delivered the best results.20

Finally, we found a simple solution without the complexity of adding another framework: we used the nine focus areas of the maturity part of the MIG model. By adding two statements for each of the nine focus areas of hard and soft governance, we expanded the questionnaire to deliver information on the ‘informal organization’. The questionnaire includes formal and informal statements for each focus area, for example: ‘Decision-making on IT investments and projects is formally organized’. Furthermore, we added two control statements to check the consistency of the participant’s answers, which makes 20 statements in total.

By using the same focus areas as in the maturity model, we do not add complexity and expect to collect useful information on the context of each focus area in the FAMM-part of the MIG model.

We did not add a model for “sector”, because the researchers know to which organization and sector the data belong.

4.3. The MIG assessment instrument (part A - participant)

The MIG instrument for the participant consists of two questionnaires.

Questionnaire 1: Based on the focus areas and maturity models as described in the MIG model, without changes, it consists of:

• 84 statements, two for each of the 42 maturity levels in the focus area maturity model part of the MIG model
• 20 statements for the focus area informal organization

The interviewee has to decide for each of the 104 statements whether it applies to the current status of the organization. Participants can add a comment or motivation where needed.

All statements are based on two key elements of the definitions of focus areas and maturity levels in the MIG model. Content analysis was used to determine the key elements in the definitions.59 For each definition, we determined the two (in the opinion of the researchers) most relevant ‘themes’; for example, for the first (or initial) level of CMM we chose ‘Ad hoc’ and ‘Chaotic’. The themes are used to create the statements. To validate these choices, the instrument will only be used in combination with interviews.

In each cycle, the key elements (and statements) will be reconsidered by comparing the results of the instrument and the interview. The statements are presented in a random order within groups per focus area; try-outs of test versions of the questionnaire showed that a fully random order of all statements was experienced as confusing by participants. Randomizing is important to prevent the participant from influencing the results of the assessment before the interview.

For each statement, the participant must decide whether they agree (yes) or do not agree (no). The basic principle is that if the response is positive for both statements, the level is reached. Furthermore, it is only possible to reach a level if all preceding levels have been reached.

Thus, to mark a level as reached, two conditions must be met:

1. Both statements for the level are answered positively.
2. All preceding levels have been reached.
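The two conditions above can be sketched in a few lines of Python; the list-of-pairs data layout is our illustration, not the actual Excel formulas of the instrument:

```python
# Sketch of the scoring rule: a maturity level is marked as reached only
# if both of its statements are answered positively AND every preceding
# level is reached.

def reached_levels(answers):
    """answers: list of (statement1, statement2) booleans, ordered from
    the lowest maturity level upward; returns the reached level numbers."""
    reached = []
    for level, (first, second) in enumerate(answers, start=1):
        if first and second:
            reached.append(level)
        else:
            break  # a failed level blocks all higher levels
    return reached

# Levels 1 and 2 are fulfilled; level 3 has one negative answer, so
# level 4 cannot count even though both of its statements are positive.
print(reached_levels([(True, True), (True, True), (True, False), (True, True)]))  # [1, 2]
```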


Figure 1 The statements for the focus area functions and roles in questionnaire 1

Questionnaire 2: For the second questionnaire, on culture, we used an existing questionnaire: the Organizational Culture Assessment Instrument (OCAI). This instrument was developed by Cameron and Quinn60 as a means for organizations to quantify organizational culture.

It fits our purpose: we need a simple instrument to identify the culture and be able to compare organizations. The questionnaire consists of 24 statements; six times, 100 points have to be divided over four alternative statements.
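A minimal sketch of how OCAI responses can be aggregated, assuming the standard Cameron and Quinn layout of six dimensions with four alternatives each; the labels A-D and the averaging step are our illustration, not part of the MIG instrument:

```python
# OCAI aggregation sketch: each of six dimensions distributes 100 points
# over four alternatives (labeled A-D here, corresponding to the four
# competing-values culture types). The profile per culture type is the
# mean of its points across the six dimensions.

def ocai_profile(responses):
    """responses: six dicts mapping 'A'..'D' to points that sum to 100."""
    assert len(responses) == 6, "OCAI has six dimensions"
    for r in responses:
        assert sum(r.values()) == 100, "each dimension distributes 100 points"
    return {alt: sum(r[alt] for r in responses) / len(responses) for alt in "ABCD"}

# Example with identical answers on all six dimensions:
answers = [{"A": 40, "B": 20, "C": 25, "D": 15}] * 6
print(ocai_profile(answers))  # {'A': 40.0, 'B': 20.0, 'C': 25.0, 'D': 15.0}
```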

4.4. The MIG assessment instrument (part B - researcher)

By importing the results of part A of the instrument into part B, a result chart can be generated. In Figure 2, a simplified example of the result chart is shown (fictitious data).

The first table, “IT Governance Maturity”, shows the maturity levels based on the answers entered by the participant. To mark a level as reached, both statements for that level must be answered positively (first condition) and all preceding levels must be reached (second condition). If both conditions are met, the field is marked green, meaning the level is reached. The focus area “Functions and roles” is an example of an unexpected situation: level B shows a value of 1 and level C a value of 2. This might be an imperfection of the instrument and must be discussed with the participant. Non-existing levels are marked as not available (n.a.).

The second table and the graph on the right show the organization’s positioning in the competing values framework.50 The table shows the resulting numerical values.

The last part of the result sheet shows the score for “Informal organization”: the share of positively answered statements (out of 18 statements), presented as a percentage and as a graph.
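This score is a straightforward proportion; a sketch in which the list-of-booleans representation of the answers is our assumption:

```python
# Share of positively answered "informal organization" statements: 18
# scored statements; the two control statements (used only to check
# answer consistency) are excluded from the percentage.

def informal_org_percentage(scored_answers):
    """scored_answers: 18 booleans, one per scored statement."""
    assert len(scored_answers) == 18, "18 scored statements expected"
    return 100.0 * sum(scored_answers) / len(scored_answers)

print(informal_org_percentage([True] * 9 + [False] * 9))  # 50.0
```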

Figure 2 Results MIG assessment (example; simplified).

[Figure 1 lists the ten statements for the focus area “functions and roles”, e.g. “In the IT organization several functions and roles are defined.” and “Functions and roles are described in standards, procedures, tools and methods and communicated in the organization.” The participant enters ‘y’ or ‘Y’ in the column ‘Fulfilled’ if the statement is fulfilled in the organization, assessing the current situation.]


5. Discussion and conclusion

The MIG instrument is based on the MIG model. The MIG assessment instrument was created by combining design science53 with the approach for developing a focus area maturity model51. This paper focuses on the development of the first version of the MIG assessment instrument. In our next papers, we will discuss the experiences of using this and subsequent versions of the MIG assessment instrument in several case studies.

The answer to the research question (How can we develop an MIG assessment instrument based on the MIG model?, as described in Section 3) can be summarized as follows: the first cycle of the development of the MIG instrument using the MIG model has resulted in the first version of an MIG assessment instrument. Section 4 describes this version in detail. We expect to use, validate and improve the instrument in several cycles using the results from case studies.

A research contribution to the MIG model was a solution for the implementation of the informal organization in the MIG instrument: using the nine focus areas of the maturity model as a framework. Assessing the informal organization using the same focus areas as the maturity model does not add complexity and is expected to deliver useful information on the context of each focus area in the maturity part of the MIG model (the hard and soft governance part).

Limitations: For most maturity levels of the focus areas, there are more than two possible key elements to be used as a basis for the statements, and using different key elements might lead to different results. To validate these choices, the instrument will only be used in combination with interviews. In each cycle, key elements will be reconsidered by comparing the results of the assessment using the instrument with the interviewee’s opinion on the results.

Next steps: We will use the instrument in case studies performed by students and by the researchers. We intend to improve the instrument in at least three yearly cycles. Each academic year, we expect to do between 5 and 10 case studies; the collected data will be used to create the improved version for the next academic year.

References

1. Urbach, N., A. Buchwald, and F. Ahlemann, Understanding IT Governance Success and its Impact: Results from an Interview Study. Proceedings of the 21st European Conference on Information Systems (ECIS 2013), 2013.

2. Peterson, R., Crafting information technology governance. Information Systems Management, 2004. 21(4): p. 7-22.

3. Peterson, R. Configurations and coordination for global information technology governance: Complex designs in a transnational European context. 2001. Maui, HI.

4. Weill, P. and J.W. Ross, IT governance: How top performers manage IT decision rights for superior results. 2004: Harvard Business Press.

5. Van Grembergen, W., S. De Haes, and E. Guldentops, Structures, processes and relational mechanisms for IT governance. Strategies for information technology governance, 2004. 2(004): p. 1-36.

6. Khan, N. and J. Sikes, IT under pressure: McKinsey Global Survey results. 2014, McKinsey.

7. Webb, P., C. Pollard, and G. Ridley. Attempting to define IT governance: wisdom or folly? in Proceedings of the 39th Annual Hawaii International Conference on System Sciences (HICSS'06). 2006. IEEE.

8. Lee, J. and C. Lee, IT Governance-Based IT Strategy and Management: Literature Review and Future Research Directions, in Information Technology Governance and Service Management: Frameworks and Adaptations. 2009, IGI Global. p. 44-62.

9. Smits, D. and J.v. Hillegersberg. The Continuing Mismatch Between IT Governance Theory and Practice: Results from a Delphi Study with CIO’s. in Americas Conference on Information Systems (AMCIS 2013). 2013. AIS Electronic Library.

10. ISACA, Governance of Enterprise IT (GEIT) Survey Global Edition. 2012, ISACA, Meadows, IL, US.

11. ITGI, Global Status Report on the Governance of Enterprise IT (GEIT). 2011, IT Governance Institute, Meadows, IL, US.

12. Novotny, A., E. Bernroider, and S. Koch, Dimensions and operationalisations of IT governance: a literature review and meta-case study. 2012.

13. Chin, P.O., G.A. Brown, and Q. Hu, The impact of mergers & acquisitions on IT governance structures: A case study. Journal of Global Information Management, 2004. 12(4): p. 50-74.

14. Maidin, S.S. and N.H. Arshad. IT governance practices model in IT project approval and implementation in Malaysian public sector. 2010. Kyoto.

15. Nfuka, E.N. and L. Rusu. IT governance maturity in the public sector organizations in a developing country: The case of Tanzania. 2010. Lima.

16. Mohamed, N., J. Kaur, and G. Singh, A conceptual framework for information technology governance effectiveness in private organizations. Information Management and Computer Security, 2012. 20(2): p. 88-106.

17. ISO/IEC 38500, International Standard for Corporate Governance of IT. 2008, International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC).

18. Couchman, P.K., et al. Corporate governance and information technology: findings from an exploratory survey of Australian organizations. in ANZAM 2011: 25th Annual Australian And New Zealand Academy of Management conference. 2011. ANZAM.


20. Smits, D. and J. van Hillegersberg. IT Governance Maturity: Developing a Maturity Model Using the Delphi Method. in 2015 48th Hawaii International Conference on System Sciences (HICSS). 2015. IEEE.

21. Tarmidi, M., A. Abdul Rashid, and R. Abdul Roni. Exploring the Approaches for COBIT Process in Malaysian 100 Top Corporate Governance Companies. in 3rd International Conference on Business and Economic Research (3rd ICBER 2012). 2012.

22. Uehara, K., Soft IT Governance. ISACA Journal online, 2010. 1(1): p. 1-6.

23. Cook, D.M. Mitigating Cyber-Threats Through Public-Private Partnerships: Low Cost Governance With High-Impact Returns. in 2010 International Cyber Resilience Conference. 2010.

24. Tucker, C.M. The Lisbon Strategy and the Open Method of Coordination: a new vision and the revolutionary potential of soft governance in the European Union. in annual meeting of the American Political Science Association. 2003.

25. Moos, L., Hard and Soft Governance: The Journey from Transnational Agencies to School Leadership. European Educational Research Journal, 2009. 8(3): p. 397-406.

26. Nye, J.S., Soft power. Foreign policy, 1990: p. 153-171.

27. Rogers, G.P., The role of maturity models in IT Governance: A Comparison of the major models and their potential benefits to the enterprise, in Information Technology Governance and Service Management: Frameworks and Adaptations. 2009, IGI Global. p. 254-265.

28. Mettler, T. and P. Rohner. Situational maturity models as instrumental artifacts for organizational design. in Proceedings of the 4th International Conference on Design Science Research in Information Systems and Technology. 2009. ACM.

29. Davies, M.A., Best practice in corporate governance: building reputation and sustainable success. 2012: Gower Publishing, Ltd.

30. Sethibe, T., J. Campbell, and C. McDonald. IT governance in public and private sector organizations: Examining the differences and defining future research directions. 2007. Toowoomba, QLD.

31. Denford, J.S., G.S. Dawson, and K.C. Desouza. An argument for centralization of IT governance in the public sector. in System Sciences (HICSS), 2015 48th Hawaii International Conference on. 2015. IEEE.

32. de Souza Bermejo, P.H. and A.O. Tonelli. Planning and implementing IT governance in Brazilian public organizations. in System Sciences (HICSS), 2011 44th Hawaii International Conference on. 2011. IEEE.

33. Smits, D. and J. van Hillegersberg. The Development of an IT Governance Maturity Model for Hard and Soft Governance. in 8th European Conference on IS Management and Evaluation (ECIME 2014). 2014. Academic Conferences and Publishing International Limited.

34. Wernerfelt, B., A resource‐based view of the firm. Strategic management journal, 1984. 5(2): p. 171-180.

35. Ulrich, D. and N. Smallwood, Capitalizing on capabilities. Harvard business review, 2004: p. 119-128.

36. Helfat, C.E. and M.A. Peteraf, The dynamic resource‐based view: Capability lifecycles. Strategic management journal, 2003. 24(10): p. 997-1010.

37. Becker, J., R. Knackstedt, and D.-W.I.J. Pöppelbuß, Developing maturity models for IT management. Business & Information Systems Engineering, 2009. 1(3): p. 213-222.

38. Liang, T.P., et al. The impact of IT governance on organizational performance. 2011. Detroit, MI.

39. Simonsson, M., P. Johnson, and M. Ekstedt, The effect of IT governance maturity on IT governance performance. Information Systems Management, 2010. 27(1): p. 10-24.

40. Dodds, R., Effective IT Governance will Improve Returns to Shareholders. Information Systems Control Journal, 2004. 3(1): p. 17-18.

41. Tugas, F.C., Information Technology Maturity Index And Profitability In The Philippine Food, Beverage And Tobacco Industry. International Journal of Business Research, 2010. 10(1).

42. Tanriverdi, H., Performance effects of information technology synergies in multibusiness firms. MIS Quarterly: Management Information Systems, 2006. 30(1): p. 57-77.

43. Yuwono, B. and A. Vijaya. The impact of Information Technology Governance maturity level on corporate productivity: A case study at an IT services company. in Advanced Computer Science and Information System (ICACSIS), 2011 International Conference on. 2011. IEEE.

44. Pöppelbuß, J., et al., Maturity Models in Information Systems Research: Literature Search and Analysis. Communications of the Association for Information Systems, 2011. 29(1).

45. Bessant, J., S. Caffyn, and M. Gallagher, An evolutionary model of continuous improvement behaviour. Technovation, 2001. 21(2): p. 67-77.

46. Collins, J., Level 5 leadership: The triumph of humility and fierce resolve. Harvard Business Review, 2001. 79(1): p. 66-76.

47. Magdaleno, A.M., R.M. de Araujo, and C.M.L. Werner. A roadmap to the Collaboration Maturity Model (CollabMM) evolution. in Computer Supported Cooperative Work in Design (CSCWD), 2011 15th International Conference on. 2011. IEEE.

48. Reich, B.H. and I. Benbasat, Measuring the Linkage Between Business and Information Technology Objectives. MIS quarterly, 1996. 20(1).

49. Paulk, M.C., et al., Capability Maturity Model for Software. 1991, Carnegie Mellon University, Pittsburgh.

50. Quinn, R.E. and J. Rohrbaugh, A spatial model of effectiveness criteria: towards a competing values approach to organizational analysis. Management science, 1983. 29(3): p. 363-377.

51. van Steenbergen, M., et al., The design of focus area maturity models, in Global Perspectives on Design Science Research. 2010, Springer. p. 317-332.

52. Iivari, J. and J. Venable. Action research and design science research–seemingly similar but decisively dissimilar. in 17th European conference on information systems. 2009.

53. Hevner, A.R., et al., Design science in information systems research. MIS Q., 2004. 28(1): p. 75-105.

54. Hevner, A.R., A three cycle view of design science research. Scandinavian journal of information systems, 2007. 19(2): p. 4.

55. Yin, R.K., Case study research: Design and methods. 2013: Sage publications.

56. Cobb, A.T., Informal influence in the formal organization: Perceived sources or power among work unit peers. Academy of Management Journal, 1980. 23(1): p. 155-161.

57. Galbraith, J.R., Organization design: An information processing view. Interfaces, 1974. 4(3): p. 28-36.

58. Mintzberg, H., The structuring of organizations: A synthesis of the research. University of Illinois at Urbana-Champaign's Academy for Entrepreneurial Leadership Historical Research Reference in Entrepreneurship, 1979.

59. Berg, B.L. and H. Lune, Qualitative research methods for the social sciences. Vol. 5. 2004: Pearson Boston, MA.

60. Cameron, K.S. and R.E. Quinn, Diagnosing and changing organizational culture: Based on the competing values framework. 2005: John Wiley & Sons.
