
A challenge for research projects is to keep them valuable for both science and practice, or, in other words, to find the balance between rigour and relevance. On the one hand, to retain scientific value, the methodology must be well grounded and supported by empirical evidence. On the other hand, practice was lacking a methodology that supported practitioners in applying process mining in their organizations and that shared the best practices of process mining specialists in the field.

The ‘rigour-relevance’ debate is a long-standing issue in management science that has produced several arguments for both sides. Some statements in favour of rigour are: “a respectable objective for academic research is the development of knowledge for knowledge’s sake” [Huff 00], “nothing is so practical as a good theory” [Lewin 45], and “the key quality criterion for knowledge is validity, i.e. it is deemed valid by an informed audience – the relevant scientific community – on the basis of the arguments and empirical proof presented” [Peirce 60].

Statements for relevance are: “if academic research is irrelevant, practitioners will look elsewhere for solutions” [Aken 07], “the sheer complexity of organizations frustrates scientific research of the usual type” [Daft 90], and “a discipline aimed at actively changing reality is incomplete without prescriptive statements, especially if this change involves other people” [Aken 07]. A methodology that describes how people should perform process mining projects in organizations cannot leave out the organizations and the people that should use the methodology to make it successful.

[Shrivastava 87] formulated eight criteria to assess the rigour and practical usefulness of research.

Practical usefulness variables:

1. Meaningfulness: The research is meaningful, understandable and adequately describes problems faced by decision-makers.

2. Goal relevance: It contains performance indicators which are relevant to managers' goals.

3. Operational validity: It has clear action implications which can be implemented.

4. Innovativeness: It transcends 'commonsense' solutions and provides non-obvious insights into practical problems.

5. Cost of implementation: The solutions suggested by the research are feasible in terms of their costs or timeliness.

Rigour variables:

6. Conceptual adequacy: Well grounded, it uses a conceptual framework consistent with existing theories in the field.

7. Methodological rigour: The research program uses analytical methods and objective, quantifiable data.

8. Accumulated empirical evidence: The research program has generated a substantial amount of accumulated empirical evidence supporting it.

Concerning practical usefulness, the methodology seems to meet the criteria of [Shrivastava 87]:

1. First, the methodology certainly describes a problem of practitioners, as was also described at Process Mining Camp 2012, and the description of the methodology is made as understandable as possible for non-experts, e.g. by excluding formulas and technical terms.

2. Secondly, the methodology starts by identifying the business objectives of processes in organizations to formulate project goals, which shows the relevance to managers’ goals.

3. Thirdly, the methodology contains a clear description of all activities that are needed to accomplish a process mining project and how these activities can be accomplished, apart from ‘commonsense’ activities that are standard for organizational projects. Chapter 7 describes an organizational process and how the methodology was used to perform a process mining project.

4. Furthermore, the methodology described is not ‘commonsense’, as process mining practitioners conduct projects differently (appendix B.4). Besides, inexperienced process mining practitioners often do not know the activities that are required for a process mining project. During the case study, having an overview of the life cycle of process mining projects and using a list of main activities as a guide was experienced as very useful. The methodology prevents deviation from the project plan and makes sure that all important activities are performed.

5. Finally, the methodology is realistic as a guiding framework for projects in the sense that it can be used for all projects independent of the available time and budget.

With regard to the rigour criteria, the methodology still needs to gain value:

6. The methodology is developed using the ‘Systems Engineering Process’, a framework that has shown its usefulness in designing and managing complex engineering projects. Since the methodology is validated by only one case study, no framework is used for combining empirical evidence.

7. Several existing methodologies in the data and process mining domains were used as inspiration during development. Besides, the activities that are required for a process mining project are grounded in scientific literature.

8. Unfortunately, the value of the proposed methodology is not (yet) supported by much empirical evidence. A single case study showed that the methodology seemed to be valuable; however, a broader investigation of the value of the methodology is required.

The methodology shows which activities must be completed to perform a process mining project, but it cannot be concluded (yet) that PMPM will be valuable for performing process mining projects more effectively and efficiently. [Rescher 77] describes two types of human knowledge: ‘knowledge how’ (the way of doing things) and ‘knowledge that’ (statements or assertions about the world). For this research project, ‘knowledge how’ has been the primary focus. ‘Knowledge that’, establishing the truth that the methodology is valuable, is only validated by conversations with practitioners and a single case study, although the methodology is built on scientific literature and practical experiences.

[Moody 03] proposes a theoretical model and associated measurement instrument for evaluating IS design methods, the ‘Method Evaluation Model (MEM)’ (figure 8.1). The model is based on the ‘Technology Acceptance Model (TAM)’ from the IS success literature and ‘Methodological Pragmatism’ from the philosophy of science.


Figure 8.1, Method Evaluation Model [Moody 03]

The constructs of the model are:

- Actual Efficiency: the effort required to apply a method.

- Actual Effectiveness: the degree to which a method achieves its objectives.

- Perceived Ease of Use: the degree to which a person believes that using a particular method would be free of effort.

- Perceived Usefulness: the degree to which a person believes that a particular method will be effective in achieving its intended objectives.

- Intention to Use: the extent to which a person intends to use a particular method.

- Actual Usage: the extent to which a method is used in practice.

To generate empirical evidence that the proposed process mining methodology is efficient, effective and has a chance of being adopted in practice, MEM can be used. One possibility to test the methodology is a laboratory experiment, giving one half of the participants the developed methodology and the other half no methodology, or another data or process mining methodology. The results can be evaluated on the degree to which they meet the business objectives and the amount of effort spent to complete the experiment. Furthermore, a survey can be used to evaluate the perceived ease of use and usefulness.
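The analysis of such a laboratory experiment can be sketched as follows. This is a minimal illustration only: the effort measurements (in hours) and the five-point Likert survey answers are hypothetical placeholder data, not results from the actual study, and the comparison uses a simple Welch t statistic as one possible choice of test.

```python
from statistics import mean, stdev
from math import sqrt

# Hypothetical data: effort in hours per participant (Actual Efficiency,
# lower is better) and Likert-scale (1-5) answers to an ease-of-use item.
effort_pmpm = [6.5, 7.0, 5.5, 6.0, 7.5]   # group using the methodology
effort_none = [9.0, 8.5, 10.0, 7.5, 9.5]  # control group without it
ease_of_use = [4, 5, 3, 4, 4]             # survey answers, PMPM group

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    var_a, var_b = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / sqrt(var_a / len(a) + var_b / len(b))

t = welch_t(effort_pmpm, effort_none)
print(f"mean effort with PMPM:      {mean(effort_pmpm):.1f} h")
print(f"mean effort without PMPM:   {mean(effort_none):.1f} h")
print(f"Welch t statistic:          {t:.2f}")
print(f"mean perceived ease of use: {mean(ease_of_use):.1f} / 5")
```

A negative t value indicates that the methodology group spent less effort on average; in a real evaluation the statistic would be combined with its degrees of freedom to obtain a p-value, and the perceived constructs would be measured with a validated multi-item instrument rather than a single question.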

Another possibility is to evaluate the methodology in a field experiment. Practitioners can apply PMPM in a ‘real’ process mining project. At the end of the project they can answer questions about their perceptions of the methodology, for example its understandability, completeness and usefulness. However, this second way to test the methodology does not measure the actual efficiency and effectiveness, and the laboratory experiment does not consider ‘real’ projects. Therefore, a combination of both types of experiments will be more reliable, e.g. a laboratory experiment for pre-testing and a field experiment for post-testing the methodology.
