EVALUATING THE MODULAR SUPPLY IN ELDERLY
CARE FROM MULTIPLE PERSPECTIVES
Master thesis, MSc Supply Chain Management
University of Groningen, Faculty of Economics and Business
July 21, 2013
RUOCHEN CHEN
Student number: S2094231
E-mail: r.chen.2@student.rug.nl
Supervisor/university
Drs. M.R. van der Laan
Table of Contents
ABSTRACT ... 4
1. INTRODUCTION ... 5
2. THEORETICAL FRAMEWORK ... 7
2.1 The Concept of Service Modularity ... 7
2.2 Performance Objectives of Service Modularity ... 7
2.2.1 Service modularity and flexibility ... 7
2.2.2 Service modularity and cost ... 9
2.2.3 Service modularity and quality of care ... 9
2.2.4 Service modularity and responsiveness ... 10
2.3 Performance Indicators of Service Modularity ... 11
2.4 Criteria for Performance Measures Development of Service Modularity ... 11
2.4.1 Involving various perspectives ... 11
2.4.2 Standardizing measurements development ... 12
3. METHODOLOGY ... 14
3.1 Research Design ... 14
3.1.1 Case study setting ... 14
3.1.2 Design research ... 14
3.2 Data Source ... 15
3.2.1 Case study ... 15
3.2.2 Design research ... 15
3.3 Data Analysis ... 16
3.3.1 Case study ... 16
3.3.2 Design research ... 16
4. RESULTS ... 17
4.1 How to Measure the Effect of Modularizing Service Offerings on Flexibility? ... 17
4.1.1 Performance indicator development ... 17
4.1.2 Instrument development ... 18
4.2 How to Measure the Effect of Modularizing Service Offerings on Coordination Costs? ... 20
4.2.1 Performance indicator development ... 20
4.2.2 Instrument development ... 20
4.3 How to Measure the Effect of Modularizing Service Offerings on Quality of Care? ... 22
4.3.1 Performance indicator development ... 22
4.3.2 Instrument development ... 22
4.4 How to Measure the Effect of Modularizing Service Offerings on the Cycle Time of Service delivery? ... 23
4.4.1 Performance indicator development ... 23
4.4.2 Instrument development ... 23
4.5 How to Use Developed Measurement Instruments? ... 24
5. DISCUSSION ... 26
5.1 Relate Measurement Instruments to the Criteria of Performance Measurement Development ... 26
5.2 How to Analyze the Data Collected from Developed Measurement Instruments? ... 26
5.3 How to Gain a Balanced View of Service Modularity Performance? ... 27
6. CONCLUSION AND FURTHER RESEARCH ... 29
6.1 Conclusion ... 29
6.2 Theoretical and Managerial Implications ... 29
6.3 Limitations and Further Research ... 30
REFERENCES ... 31
APPENDIX A: Measurement Instrument for Evaluating the Performance of Service Modularity from Patient Perspective ... 35
APPENDIX B: Measurement Instrument for Evaluating the Performance of Service Modularity from Care Provider Perspective ... 36
APPENDIX C: Measurement Instrument for Evaluating the Performance of Service Modularity from Organizational Perspective ... 38
ABSTRACT
Since modularity has yielded remarkable competitive advantages in manufacturing settings, a growing body of literature has recently introduced the concept into the healthcare sector, and into elderly care in particular. Although the existing literature theoretically highlights the possible effects of service modularity on various performance dimensions, these effects have not yet been systematically measured. This thesis therefore investigates how to evaluate the performance of service modularity in a specific healthcare setting. Its central purpose is to develop different instruments to measure the performance of modular service provision in elderly care in relation to multiple perspectives (patient, care provider and organization).
1. INTRODUCTION
Since product modularity has yielded significant benefits in the manufacturing sector, modularity has recently been gaining more attention in service industries. Notable authors such as Pekkarinen and Ulkuniemi (2008), Bask et al (2010) and Rahikka et al (2011) indicate that service modularity enables organizations to meet heterogeneous customer demand in a cost-efficient manner. Among service industries, healthcare, and care for elderly people in particular, is recognized as one specific service context in which organizations need to align customization with efficiency goals (De Blok et al, 2010). Since the population is aging rapidly in most developed countries, healthcare systems face fierce challenges. On the one hand, demand-driven policies require the healthcare system to place the diverse requirements of individual elderly patients in a central position. On the other hand, healthcare organizations are urged to decrease operating costs (De Blok et al, 2010). Developing a personalized supply for each older patient individually is too costly and time consuming. To balance demand-driven care with costs, a modular service design is strongly encouraged in elderly care provision.
Despite the rising importance of service modularity in elderly care, the effect of modularity on performance has not been systematically evaluated (Jacobs et al, 2007; Dörbecker and Böhmann, 2013). In other service contexts as well, the effects of service modularity on various performance dimensions have not been empirically tested. Most existing studies focus on how to modularize service offerings rather than on actively evaluating them. In a relatively recent study, Dörbecker and Böhmann (2013) summarize and compare the concepts and effects associated with service modularity in the existing literature. For instance, De Blok et al (2010, 2012) discuss the effects of service modularity on both client involvement and personalization in achieving customization, while Lin et al (2010) and Pekkarinen and Ulkuniemi (2008) discuss its effect on cost reduction. Although these studies undeniably offer considerable insight into the effects of service modularity, a substantial part of them treat these effects only at a conceptual level (Dörbecker and Böhmann, 2013). In addition, a more balanced evaluation, covering not only beneficial and adverse effects but also different perspectives, could help to explore the performance of service modularity in practice more deeply (Dörbecker and Böhmann, 2013).
A case study helps to gain empirical insights into modular service in an elderly care setting. Finally, the proposed instruments are finalized based on empirical evidence and on validated measurements from the existing literature. The main research question of this thesis is therefore:
How to evaluate the performance of service modularity in elderly care from multiple perspectives (patient, care provider and organization)?
2. THEORETICAL FRAMEWORK
In this section, the theoretical framework of the present research is elaborated. First, the concept of service modularity is introduced. Second, the performance objectives of service modularity are discussed, leading to several sub questions. Third, theoretical performance indicators are derived from these performance objectives, indicating the research scope and forming the theoretical foundation of this research.
2.1 The Concept of Service Modularity
Modularity, as defined by Schilling (2000), is a continuum describing the degree to which a system's components can be separated and recombined; it refers both to the tightness of coupling between components and to the degree to which the rules of the system architecture enable or prohibit the mixing and matching of components. Notable authors such as Pekkarinen and Ulkuniemi (2008), Voss and Hsuan (2009), Bask et al (2010) and Rahikka et al (2011) have extended this concept to the service sector. Service modularity is closely associated with several related terms: "architecture, components, interfaces, modules" (Dörbecker and Böhmann, 2013). Service architecture is the way the functionalities of the service system are decomposed into individual functional elements to provide the overall services delivered by the system (Voss and Hsuan, 2009). Service components are small parts of a service that fulfill a specific function but depend on other parts to function (Voss and Hsuan, 2009; De Blok et al, 2010). Interfaces are the linkages shared between different modules (De Blok et al, 2010); they are "softer" than those in product modularity because they involve extensive interpersonal behaviors. Service modules are understood as functional, self-contained parts consisting of interdependent components (Schilling, 2000; Voss and Hsuan, 2009). By mixing and matching service modules, final service packages can be customized in various ways to meet customer requirements.
2.2 Performance Objectives of Service Modularity
Performance objectives are crucial for organizations in judging the performance of operations and developing competitive capabilities. Although the perceived benefits of service modularity have been widely discussed, the performance objectives of service modularity are not explicitly defined in the extant literature. Jacobs et al (2007) examined the effects of product modularity on four objectives: cost, quality, flexibility and cycle time. In line with their findings, the following sub sections elaborate the theoretical foundations for investigating the linkages between service modularity and four performance objectives (flexibility, quality of care, cost and responsiveness).
2.2.1 Service modularity and flexibility
Flexibility is widely regarded as a foremost competitive advantage of modularization (Lin and Pekkarinen, 2011; Pekkarinen and Ulkuniemi, 2008; Rahikka et al, 2011). The need to empirically assess the effect of service modularity on flexibility is therefore pressing. However, few instruments are available to measure flexibility in the service sector (Purbey et al, 2007). Slack (1991) identified two dimensions of flexibility for the manufacturing sector: range flexibility and response flexibility. Range flexibility is understood as the extent to which the service can be changed, while response flexibility is understood as the ease (in terms of cost, time, or both) with which the service can be changed (Purbey et al, 2007). Purbey et al (2007) suggest that measures should relate to both dimensions in order to assess flexibility capability comprehensively.
Service modularity is expected to affect flexibility in various ways. First, range flexibility is increased by means of service modularity. As Schilling and Steensma (2001) state, in modularization a tightly integrated hierarchy is replaced by loosely coupled networks that allow components to be flexibly recombined into various configurations. By mixing and matching service modules (De Blok et al, 2010), variation in the service package is created to meet diverse clients' needs. As such, service modularity is associated with an increase in range flexibility through customization of the service offering, which enables organizations to serve a wider range of customers (De Blok et al, 2010). Moreover, De Blok et al (2010) observe that client involvement not only takes place but also intensifies in the late stage of the specification process, advancing the modular service offering from low to high customization. Modularization allows clients to adapt standard modules so that their preferences and requirements are truly integrated (De Blok et al, 2010), which achieves a high level of customization in the late offering-creation stage and subsequently increases range flexibility. De Blok et al (2010) further state that service components and modules are easier to adapt than product components and modules, because they can quickly be changed through extensive interactions between clients and providers. Because they interact closely with clients, service providers use their interpersonal behaviors not only to adapt the content of service modules but also to alter the way services are delivered, achieving customization in the service offering. This personalization, which results from the interactive nature of care, effectuates customization in the modular service delivery process. In this way, personalization is better accommodated by means of service modularity, which consequently increases both range and response flexibility. However, Rahikka et al (2011) argue that if the content and process of a service module change, organizations should carefully balance the incurred costs and extra time against the increase in flexibility. Therefore, an instrument is needed to assess the effect of service modularity on flexibility in terms of incurred costs and time, which leads to the following sub question:

Sub question 1: How to measure the effect of modularizing service offerings on flexibility?
2.2.2 Service modularity and cost
Besides flexibility, lower cost is the second compelling advantage widely discussed in the existing literature on modularity. With the growth of innovative care services, increasing attention is paid to the effectiveness and efficiency of delivery in healthcare settings. This implies that care providers and organization managers require an understanding not only of the efficacy of their interventions and services, but also of their relative value in an economic sense (Zilberberg and Shorr, 2010). An examination of the extant literature shows that most researchers agree that modularization in services leads to lower costs by standardizing interfaces and rationalizing service and process design (De Blok et al, 2010; Pekkarinen and Ulkuniemi, 2008; Bottcher and Klinger, 2011). Bask et al (2010, 2011) theoretically report that service modularity results in standardized interfaces, and Pekkarinen and Ulkuniemi (2008) emphasize modularity as a means to rationalize the work process and thus achieve profitability. In service provision, interfaces can be visualized as the coordination practices among different providers, such as multi-functional meetings or discussions, which aim at managing the interdependencies among service modules. Since service modularity standardizes these interfaces, the coordination of providers' operational activities between different service modules is expected to be reduced. In addition, modularization also allows providers to offer services based on structured instructions and standard work routines, which increases work efficiency and reduces coordination efforts. Along these lines, Bottcher and Klinger (2011) contend that standardization of both interfaces and delivery processes allows better controllability of the human element in service provision, resulting in less coordination effort. Because fewer coordination activities result, the costs of arranging these activities are expected to fall. According to Broekhuis and van Donk (2011), non-standardized ways of coordination are in demand in healthcare settings, due to the increasing specialization around specific patient groups and the functional specialization in care. As elderly patients are a specific patient group with diverse requirements, the scope for standardized coordination activities seems very limited. By means of service modularity, however, less coordination effort might be achieved, which would consequently reduce coordination costs. Based on these points, the present research mainly investigates whether service modularity can reduce coordination efforts and costs in an elderly care setting, which leads to the following sub question:
Sub question 2: How to measure the effect of modularizing service offerings on coordination costs?
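Purely as an illustration outside the thesis instruments themselves, a coordination-cost indicator of the kind discussed above could be operationalized as the summed duration of coordination activities (such as multi-functional meetings) priced at an hourly rate. The activity names, durations and rate in the sketch below are hypothetical:

```python
def coordination_cost(activities, hourly_rate):
    """Total coordination cost: hours spent on coordination activities
    (e.g. multi-functional meetings) summed and priced at an hourly rate."""
    return sum(hours for _, hours in activities) * hourly_rate

# Hypothetical week of coordination activities for one care team
week = [("multi-functional meeting", 1.5), ("case discussion", 0.5)]
print(coordination_cost(week, 40))  # 80.0
```

Comparing this total before and after modularization would indicate whether standardized interfaces actually reduce coordination costs.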
2.2.3 Service modularity and quality of care
This concept has many dimensions, depending on the perspective taken. For patients, the focus is mainly on effectiveness, accessibility of care and the consistency of the information provided. For care providers, the focus is on delivering effective care according to the appropriate standards for their patients. For healthcare organization managers, efficiency and safety of care are the main issues (Kuijpers et al, 2013). Hence, good quality of care can be defined as 'care of a high standard that is effective, efficient, safe and patient oriented' (Kuijpers et al, 2013).
The existing literature does not clearly indicate the effect of service modularity on quality of care. Lin and Pekkarinen (2011) only state that service quality will be ensured and improved by visualizing and standardizing service components and interfaces. This means that modularization enables service providers not only to fulfill customers' needs effectively and efficiently, but also to optimize the content of the service easily. In this way, service quality is expected to be guaranteed to a large extent. Since the present research focuses on evaluating modular service performance in a healthcare setting, service quality can be measured as quality of care (Purbey et al, 2007). According to Sixma et al (2000), quality of care is generally measured from the patients' view by means of patient satisfaction assessment. In principle, patient satisfaction measurement could be one way to evaluate the effect of modular service on the quality of care. However, empirical evidence is needed to clarify how the quality of care should be measured in a content-based modular supply, which leads to the following sub question:
Sub question 3: How to measure the effect of modularizing service offerings on quality of care?
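As an illustration of how patient-satisfaction data from such an instrument might later be summarized, the following sketch averages Likert-scale responses per quality-of-care dimension. The dimension names, the 1-5 scale and the scores are hypothetical, not part of the thesis instrument:

```python
from statistics import mean

def summarize_satisfaction(responses):
    """Average 1-5 Likert scores per quality-of-care dimension.

    `responses` maps a dimension name (e.g. "accessibility") to a list of
    individual patient scores; returns the mean score per dimension.
    """
    return {dimension: round(mean(scores), 2)
            for dimension, scores in responses.items()}

# Hypothetical example data for two dimensions of quality of care
scores = {
    "effectiveness": [4, 5, 3, 4],
    "accessibility": [2, 3, 3, 4],
}
print(summarize_satisfaction(scores))  # {'effectiveness': 4.0, 'accessibility': 3.0}
```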
2.2.4 Service modularity and responsiveness
Responsiveness is regarded as one dimension for assessing delivery performance; it can be visualized through manufacturing cycle time (Jacobs et al, 2007). A substantial body of production and operations literature suggests that cycle time is reduced by applying modular design principles (Jacobs et al, 2007). In service modularity, cycle time can be interpreted as the total time spent in the development and delivery of each service package. Kazemi et al (2011) indicate that service modules can easily be reused in various contexts and composed to fulfill customers' requirements quickly. Pekkarinen and Ulkuniemi (2008) and Lin et al (2010) further state that modularization allows new service designs to be developed without totally destroying old service designs (Ho et al, 2009). In this way, modularization is expected to decrease the time spent on service offering development and delivery, subsequently increasing responsiveness. To systematically measure the effect of service modularity on responsiveness, the following sub question is proposed:
Sub question 4: How to measure the effect of modularizing service offerings on the cycle time of service delivery?
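Since cycle time is defined above as the total time spent in the development and delivery of each service package, a minimal sketch of how it could be computed from recorded timestamps is given below. The timestamps and the choice of hours as the unit are illustrative assumptions, not part of the thesis instrument:

```python
from datetime import datetime

def cycle_time_hours(start: str, end: str) -> float:
    """Elapsed hours between the start of package development and the
    completion of delivery, given ISO 8601 timestamps."""
    t0 = datetime.fromisoformat(start)
    t1 = datetime.fromisoformat(end)
    return (t1 - t0).total_seconds() / 3600

# Hypothetical service packages: (development started, delivery completed)
packages = [
    ("2013-05-01T09:00", "2013-05-03T09:00"),  # 48 hours
    ("2013-05-02T10:00", "2013-05-03T10:00"),  # 24 hours
]
times = [cycle_time_hours(s, e) for s, e in packages]
print(sum(times) / len(times))  # average cycle time: 36.0
```

Comparing such averages before and after modularization would show whether modular design actually shortens the delivery cycle.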
2.3 Performance Indicators of Service Modularity
Performance indicators (PIs) help organizations measure operational performance by comparing actual results with preset objectives (Fortuin, 1988). In general, performance indicators do not constitute a precisely defined category (Berg et al, 2005). As Fortuin (1988) indicated, good performance indicators are derived from the organization's objectives: an organization should define its objectives and then operationalize them into suitable indicators. Accordingly, the theoretical performance indicators of service modularity are derived from the objectives discussed in section 2.2. Conceptually, the performance indicators presented in Table 1 are the means by which organizations can achieve the corresponding objectives; they provide direction for the instrument development in the following sections.
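Fortuin's (1988) idea of comparing actual results with preset objectives can be sketched in code; the indicator names, target values and directions below are purely illustrative and not part of the thesis:

```python
def flag_indicators(actual, targets):
    """Return the indicators whose actual value misses the preset target.

    `targets` maps each indicator to (target_value, direction), where
    direction "min" means the value should be at least the target and
    "max" means it should be at most the target.
    """
    missed = []
    for indicator, (target, direction) in targets.items():
        value = actual[indicator]
        if direction == "min" and value < target:
            missed.append(indicator)
        elif direction == "max" and value > target:
            missed.append(indicator)
    return missed

# Illustrative targets for two of the objectives discussed above
targets = {
    "patient_satisfaction": (4.0, "min"),  # mean score of at least 4.0
    "cycle_time_hours": (48.0, "max"),     # delivery within 48 hours
}
actual = {"patient_satisfaction": 3.6, "cycle_time_hours": 40.0}
print(flag_indicators(actual, targets))  # ['patient_satisfaction']
```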
TABLE 1: An overview of theoretical performance indicators

Performance objective: Theoretical performance indicator
Flexibility: Customization of the service offering (De Blok et al, 2010; Rahikka et al, 2011; Pekkarinen and Ulkuniemi, 2008); intensity of client involvement (De Blok et al, 2010); personalization in the service delivery process (De Blok et al, 2012)
Cost: Coordination costs (Pekkarinen and Ulkuniemi, 2008)
Quality of care: Patient satisfaction (Sixma et al, 2000)
Responsiveness: Cycle time of service delivery

2.4 Criteria for Performance Measures Development of Service Modularity
2.4.1 Involving various perspectives
Campbell et al (2003) indicate that the stakeholder perspectives an indicator is intended to reflect should be taken into account when developing appropriate measures. Loeb (2004) further notes that a significant part of the challenge in measuring healthcare performance derives from the disparate nature and variable perspectives represented among the key stakeholders. However, despite the wide recognition of the importance of
responding to a variety of stakeholder perspectives when measuring performance, few studies have actually involved these perspectives in performance measurement development (Tregunno et al, 2004). To develop comprehensive evaluation instruments for service modularity, multiple stakeholder perspectives are taken into account and reflected in the sets of performance indicators.
In general, the patient, the care provider and the organization are proposed as three important stakeholders in the healthcare system. First, as several proposals on care provision have put patients rather than care suppliers at the center of processes (Broekhuis et al, 2009), patients' experiences are important in measuring the performance of care. In this research, the frequent users of the modular service are elderly people; they are therefore considered experts by virtue of their greater experience in evaluating the performance of the modular service (Sixma et al, 1998). Second, care providers are those who work with service modules on a daily basis. They also act as interfaces that link the various service modules together to finalize the service package to patient specifications. Taking care providers' opinions into account yields a more balanced evaluation of the modular service. Third, besides being responsible for the design and delivery of the modular service, the healthcare organization carefully balances the costs and benefits of the modular care outcomes. Involving the organizational perspective gives an overall understanding of how to develop the measurement instruments. Based on these points, all three perspectives are taken into account when developing the measurement tools for service modularity in the present research.
2.4.2 Standardizing measurements development
A central issue in performance measurement remains the absence of guidelines with respect to what should be measured (Loeb, 2004). One first needs to understand what is to be measured, and then ascertain how these measurements should be developed on the basis of the available internal or external data (Loeb, 2004). As measures such as indicators and instruments have become so numerous in recent years, collaborative efforts are needed to standardize them (Adair et al, 2006). Adair et al (2006) conceptually catalogued criteria for measurement development from the literature; the criteria used in this research are summarized in Table 2.
TABLE 2: Criteria for performance measurements development
Criterion: Description
Evidence based: Measures are valid and reliable operational definitions that have been demonstrated through rigorous research
Strategic: Measures direct attention towards the desired change
Important: Measures address an important or serious health services problem
Actionable: Measures address a service area that can benefit from improvement
Feasible: Data collection, reporting and follow-through are cost effective
Relevant & meaningful: Measures are relevant to key stakeholders
Understandable: Measures are easy to understand
Balanced: Measures are balanced across types of treatment, treatment settings, major problems, age groups and levels of the healthcare system
Non-ambiguous: Measures are clear about which direction of service change is desirable
3. METHODOLOGY
This section presents the research methodology used: the research design, the data sources and, finally, the type of data analysis performed.
3.1 Research Design
A mixed methodology combining design science research and a case study was used. First, this research aims to solve a practical problem by developing evaluation instruments to measure the performance of modular care. In this circumstance, the design method was appropriate because it guides the design of interventions or the development of performance measurement for an existing entity using scientific knowledge (Aken, 2004). Second, because of the exploratory character of the research objective, a subjectivist, qualitative case study was chosen to gain richer empirical understanding (Rahikka et al, 2011). One advantage of the mixed method was that it combined the strengths of the two approaches: the design science method clearly indicated the topics to be investigated, while the case study not only provided a specific research setting in which to apply the design method but also enriched the empirical understanding of evaluating modular service performance in practice.
3.1.1 Case study setting
A qualitative exploratory case study was conducted in a Dutch healthcare setting to gain in-depth observation of the research problem. Three criteria were used to select a suitable organization. First, the nature of the services had to correspond to the understanding of modular services (Rahikka et al, 2011), meaning that modular services in the selected organization had already been designed and put into practice. Second, the case study had to take place in the context of elderly care provision, where a wide variety of heterogeneous services are provided to fulfill elderly clients' needs (De Blok et al, 2010). Third, the selected organization had to have a need for measuring modular care performance. Based on these criteria, a healthcare center in the northern part of the Netherlands was chosen. This center has already developed several diagnostic and treatment modules, which are now being put into practice. It aims to offer various customized services and care packages in order to satisfy elderly clients at low cost. However, in the absence of a measurement instrument, the performance of its modular service has not yet been systematically evaluated.
3.1.2 Design research
Research activities in the present study focused mainly on the first two topics of the design method structure proposed by Wieringa (2007): problem investigation and solution design. The theoretical performance indicators were explored in a case study setting to gain substantial empirical evidence about performance in practice. Measurement instruments were then developed based upon this case study and upon available measurement tools in the literature, in order to answer the main and sub research questions.
3.2 Data Source
Two types of data were used for measurement instrument development in this research. The first comprised valid and reliable measurement instruments from the existing literature relevant to either modularity performance measurement or healthcare performance measurement; the second was the qualitative data collected in the case study. Existing instruments were the foundation of the instrument development in this research: they indicated which items or kinds of questions were appropriate for assessing the effects of service modularity. Empirical evidence gained from the qualitative data was then used to transform these items and questions into more content-based measurement tools.
3.2.1 Case study
The qualitative data was collected mainly through semi-structured interviews. Several open-ended questions about the interviewees' experiences with the modular service were asked during the interviews. Interviewees were selected from different functions, activities and organizational levels in the modular care provision. As a result, three interviewees were selected: an organization manager, a specialist nurse and a planner at the center. All interviews were conducted by at least two researchers to ensure unity in the style and form of interviewing (De Blok et al, 2010) and to reduce research bias. An interview normally lasted up to two hours. All interviews were recorded on digital devices and then transcribed for further analysis. Moreover, relevant documents (e.g. process descriptions, handbooks) obtained directly from the center were translated into English. Together, these recordings and documents enabled the researcher to summarize the relevant findings.
3.2.2 Design research
Valid measurement instruments were obtained from published research via an electronic database (EBSCOhost) and internet (Google Scholar) search. Articles were selected on the following criteria: (1) they were English-language and published between 2000 and 2011; (2) they focused on either product modularity evaluation or healthcare performance evaluation. The database and internet searches were conducted using combinations of the search terms shown in Table 3 to obtain appropriate instruments.
TABLE 3: An overview of search terms with correspondent performance objectives
Performance objective: Search terms
Flexibility: modularity; service; flexibility; customization; customer involvement; elderly care; performance measures
Cost: modularity; service; cost reduction; coordination practice; healthcare
Quality of care: modularity; service; quality of care; patient satisfaction; elderly care
Responsiveness: modularity; service; delivery performance; cycle time; responsiveness
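Purely as an illustration, the "combinations of search terms" described above could be generated mechanically. The sketch below pairs the flexibility-related terms into AND-queries; the `AND` query syntax is an assumption for illustration, not necessarily the databases' actual syntax:

```python
from itertools import combinations

# Search terms for the flexibility objective, taken from Table 3
terms = ["modularity", "service", "flexibility", "customization"]

# All pairwise combinations, joined into simple boolean queries
queries = [" AND ".join(pair) for pair in combinations(terms, 2)]
print(len(queries))  # 6 pairwise queries
print(queries[0])    # modularity AND service
```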
3.3 Data Analysis
3.3.1 Case study
The main aim of analyzing the qualitative data was to explore the concepts of service modularity in a service setting. Specifically, the primary qualitative data in the interview transcripts and translated documents was analyzed and related to the different performance objectives. Key findings and relevant concepts about how service modularity influences these performance objectives were then abstracted from the primary data. The key findings not only created a better understanding of the practical meaning of abstract terms such as modularity, interface and performance, but also helped to carry these terms into the development of more content-specific measurement tools.
3.3.2 Design research
Available measurement instruments were obtained from the studies selected through the search strategy. By comparing the available instruments, a final selection was made on the following criteria: (1) selected instruments must focus on measuring either modularity performance or healthcare performance; (2) selected
4. RESULTS
This section presents how the theoretical performance indicators are translated into a specific research setting and how the measurement instruments are developed for the corresponding performance objectives and stakeholder perspectives. Illustrations of how to use the developed measurement instruments are given at the end.
4.1 How to Measure the Effect of Modularizing Service Offerings on Flexibility?
4.1.1 Performance indicator development
Reviewing Table 1, three theoretical performance indicators were proposed to measure flexibility in a modular supply. The first indicator is customization of the service offering. It enables organizations to adapt the provision of care to the changing needs of the elderly and consequently to achieve flexibility. During the interviews, both the organization manager and the care provider expressed their views on this indicator. The organization manager stated: "...Although it is time consuming and costly to provide personalized care to each individual elderly patient, our center still strives to offer various customized modular services to fulfill heterogeneous requirements. Realizing customization in the service offering facilitates flexibility..." The care provider stated: "...modular supply leaves room for customized configuration of the service and subsequently gains an ability to change and adapt the service offering easily. Specifically, by modularizing the care service, I can choose the most suitable service components from a standard menu and then adapt them to the needs of an individual patient..."
The second indicator is the intensity of client involvement. As the service offering advances, the intensity of this involvement is expected to increase in the later stages. This standpoint was reinforced by the care provider: “…As the elderly people are frail and have special needs, I try to support them and involve them in composing their own service package. In the early stage of the process, the interaction with elderly people is restricted to the needs assessment. Interactions increase when the care service package is customized in the later stage…”
The third indicator is personalization in the service delivery process. During the interaction with elderly patients, care providers modify not only the contents of the service package but also their interpersonal behavior to achieve a better fit with the clients’ needs. In this way, customization is more easily reached and flexibility is subsequently accomplished. The care provider acknowledged this standpoint: “...Modifications of the modular service packages can be observed in my work. Based on the records from the GP and the questionnaire results, I choose several standard questions to begin the conversation. During the conversation, I always fine-tune the service package on the basis of the elderly patient’s needs. In this way, the service package is very flexible and personalized...” In sum, all three theoretical performance indicators were confirmed in the case setting.
4.1.2 Instrument development
The instrument for measuring flexibility was developed on the basis of validated measures obtained from the articles selected through the search strategy. As a result, measures from the works of Worren et al (2002), Duray et al (2000) and Tu et al (2004) were selected. Worren et al (2002) measured the effect of modularity on strategic flexibility and firm performance in the home appliance industry. Duray et al (2000) analyzed how customer involvement in modularity realizes customization, and proposed several constructs in terms of customer involvement and modularity. Tu et al (2004) measured the impact of modularity-based manufacturing practices on customization; their validated instrument for measuring customization was therefore used. However, all three studies measured the effect of product modularity on performance dimensions. The selected measures were therefore fine-tuned to measure service modularity performance. Since the present research aims to measure how modular service affects performance, open-ended measurement questions and items were developed based on the existing measures of Worren et al (2002), Duray et al (2000) and Tu et al (2004). Tables 4 and 5 show the developed questions to assess the capability of being flexible in modular care offerings and the delivery process. These questions should be administered from the organization manager’s and the care providers’ perspectives, respectively. Table 6 shows the questions to measure these three performance indicators from the patient perspective.
TABLE 4: Questions to assess the effect of service modularity on flexibility from the organization perspective
Questions
Customization in service offering (Tu et al, 2004; Worren et al ,2002)
How does modular service affect the capability of customizing the service offering on a large scale?
How does modular service affect the capability of responding to customization requirements quickly?
Do the standardized service modules encourage innovation and flexibility in the final service offering? If so, how?
To what extent is the content of a service module easily adapted to the specific needs of the client?
How does your organization balance a high degree of customization in the modular service offering against the costs and time incurred?
Client involvement (Duray et al, 2000)
How does the intensity of client involvement influence the customization of modular service offering?
TABLE 5: Questions to assess the effect of service modularity on flexibility from the care provider perspective
Questions
Customization in service offering (Tu et al, 2004; Worren et al, 2002)
How is variation in the service offering achieved by means of service modularity?
To what extent are service modules flexible to mix and match in configuring various service offerings?
Personalization in service delivery (Worren et al, 2002; Duray et al, 2000)
How does your interpersonal behavior influence the customization of the service offering in modular supply?
How are choice options added to finalize the service offering in the delivery process?
How do you balance the adaptation of service modules or components against the costs and time incurred when achieving a high degree of customization in the modular service offering?
To what extent are the content and process of a service module easily adapted to the specific needs of the client?
Client involvement (Duray et al, 2000)
When and how does the client involvement take place in modular supply?
How does client specify the features and choices of service module when finalizing his or her service offerings?
To what extent is the client himself or herself actively involved in developing their service offerings?
How does the intensity of client involvement in modular supply affect the variation in service offering?
TABLE 6: Questions to assess the effect of service modularity on flexibility from patient perspective
Questions
Customization in service offering (Tu et al, 2004)
To what extent is your service offering flexible enough to cover all your requirements?
Can your service offering be customized by adding service modules as requested?
Are service modules flexible enough that they can be combined in various ways to suit your needs?
Personalization in service delivery (Worren et al, 2002; Duray et al, 2000)
How is the content of the service offering altered to your specifications during the delivery process?
Client involvement (Duray et al, 2000)
How are the choice options of service modules presented to you in modular supply?
Do you specify the features and choices of service modules when developing your own service offering?
When and how do care providers involve you in the service package configuration process?
4.2 How to Measure the Effect of Modularizing Service Offerings on Coordination Costs?
4.2.1 Performance indicator development
During the interview, the care provider explained that there was barely any coordination between her and other care providers. She also pointed out that modularizing the service enabled her to standardize the work content and process by using standard instructions. In this way, she could work more efficiently in her own domain. Consequently, fewer coordination practices among different care providers, such as multi-functional meetings or unstructured supervision practices, are expected, which theoretically leads to lower coordination costs. Since the present research focuses only on how service modularity influences coordination costs, this is proposed as the appropriate performance indicator.
4.2.2 Instrument development
The instrument for measuring coordination costs was developed on the basis of validated measures obtained from the articles selected through the search strategy. As a result, measures from the works of Broekhuis and Van Donk (2011), Jacobs et al (2007) and Worren et al (2002) were selected. Broekhuis and Van Donk (2011) measured the coordination practices among care providers in a hospital setting in terms of the following aspects: unstructured oral and structured coordination; supervision practices; feedback reporting; and standardized work processes. Part of the measurable items in the work of Worren et al (2002) was used to assess standardization in the modular process. In the research of Jacobs et al (2007), measurable items were used to evaluate the cost performance of product modularity.
TABLE 7: Questions to assess coordination practices from the care provider perspective
Questions
Coordination practice (Broekhuis and Van Donk, 2011)
Within a service module
To what extent does the modular service help care providers to coordinate their work within a service module?
How does modular service affect coordination practices, in terms of unstructured oral consultations, structured coordination meetings and supervision practices, within a service module?
To what extent are work processes standardized within a service module by means of service modularity?
How often do you discuss feedback and advice with other care providers within a service module?
To what extent are service modules standardized?
Between service modules
To what extent does the modular service help care providers to coordinate their work between service modules to finalize service offerings?
How does modular service affect coordination practices, in terms of unstructured oral consultations, structured coordination meetings and supervision practices, between service modules?
How often do you discuss feedback and advice with other care providers between service modules?
TABLE 8: Questions to assess the effect of service modularity on coordination cost from organization perspective
Questions
Coordination practice and costs (Worren et al, 2002)
How does modular service affect coordination practices and costs in your organization? (E.g., coordination costs might be spent on arranging multifunctional meetings, contact costs with different care providers, etc.)
To what extent are service modules standardized?
To what extent are work processes standardized in modular supply? And how does the standardization of work processes affect costs?
Are process steps of different service modules documented in modular supply?
Does your organization have a formal procedure to analyze client needs and define their specifications in the service package development process?
Does your organization have standard procedures for transferring client information across different service modules?
4.3 How to Measure the Effect of Modularizing Service Offerings on Quality of Care?
4.3.1 Performance indicator development
As mentioned earlier, quality of care is a broad and complex concept that contains multiple dimensions and definitions. It is important to understand the meaning of quality of care in modular supply and to investigate from which aspects to measure it. The organization manager gave some clues towards these questions: “…The modular service is supplied to deliver patient-centered care. Elderly patients could be one important perspective for measuring the quality of care in modular service. From the information that I have, the quality measurement could be developed on the basis of the CQI in the Netherlands healthcare system. However, we need to fine-tune the standard CQI measurements to the content of modular service in elderly care provision.”
4.3.2 Instrument development
Based on the findings of the case study, the instrument for measuring the quality of modular care was developed on the basis of the Consumer Quality Index (CQ-index) survey. This is a standardized survey to assess patients’ experiences with healthcare treatment, which involves different stakeholders’ perspectives: (1) consumers, who are increasingly expected to act as informed and critical decision makers; (2) healthcare providers, who monitor the care service they provide; (3) healthcare organizations, which supervise healthcare quality; (4) health insurance companies; and (5) patient organizations, which represent patients’ interests and needs (Damman et al, 2009). Returning to Table 1, the theoretical performance indicator for measuring quality of care was the patient experience.
4.4 How to Measure the Effect of Modularizing Service Offerings on the Cycle Time of Service Delivery?
4.4.1 Performance indicator development
The reusability of service modules allows an organization to develop various service packages in a short time. Consequently, as the service development time decreases, the service offering can be delivered faster than before. Regarding this standpoint, the organization manager commented: “Since elderly patients were very busy, we wanted to provide a ‘one stop’ service to the elderly. This means providing all needed service modules together in a quick response.” Under this circumstance, fast delivery of the service package is relatively important. Therefore, the performance indicator for measuring responsiveness in modular supply was the cycle time of service offering delivery.
4.4.2 Instrument development
The instrument for measuring the effect of service modularity on responsiveness was developed based on the works of Jacobs et al (2007) and Menor et al (2002). Menor et al (2002) mentioned quantitative measurements to evaluate new service development performance. Jacobs et al (2007) proposed several measures to assess the delivery performance of product modularity. Measurable items and questions were developed based on these existing measurements to assess responsiveness in modular service offerings. Tables 9 and 10 present the measurable items and questions to assess the effect of service modularity on the cycle time of service offering delivery. Since the organization and the care providers are responsible for modular service development and delivery respectively, the items in Table 9 should be measured from an organization perspective, while the questions in Table 10 should be administered to care providers. From the patient perspective, the cycle time of service offering delivery can be assessed through their waiting time experiences in modular supply. Table 11 shows the proposed questions to assess patients’ waiting time experiences.
TABLE 9: Items and questions to assess the effect of service modularity on the cycle time of service delivery from organization perspective
Measurable items of service offering development time (Menor et al, 2002)
Average time for individual service offering development
Average client waiting time
The connection time between different modules
The time to adjust or adapt service modules on the basis of client requirements
Questions
Does service modularity shorten the development time for the service offering? If yes, please indicate how this is realized.
TABLE 10: Questions to assess the effect of service modularity on the cycle time of service offering delivery from care provider perspective
Questions
Cycle time of service offering delivery
To what extent do care providers use similar components in different service modules? (E.g., a care provider could switch quickly between different minimal service modules based on similarities of service components.)
To what extent do care providers use similar service modules in different service offerings? (E.g., in the center, a care provider re-uses the basic module to configure various offerings in relation to different elderly patients’ requirements.)
How does the reusability of common components or modules influence the cycle time of service offering delivery?
TABLE 11: Questions to assess the effect of service modularity on the cycle time of service offering delivery from patient perspective
Questions
Cycle time of service offering delivery
To what extent is your service offering delivered on time by means of service modularity?
To what extent does modular supply affect the waiting time between different appointments?
4.5 How to Use Developed Measurement Instruments?
Based on the proposed questions and items, measurement instruments for evaluating the performance of modular service from multiple perspectives, namely patient, care providers and organization, are shown in Appendices A, B and C respectively. The target group for these measurement instruments is the entire field of organizations that want to work with modular care. Given that the evaluation of service modularity performance is at an early and exploratory stage, the developed measurement instruments are intended for case research purposes. When conducting case studies in sample organizations, these instruments can be used as interview protocols to collect rich qualitative data through semi-structured interviews with multiple respondents from three groups: the patient group, the care provider group and the organization representative group. Respondents selected from these three groups should either have experienced modular services or
used in interviews with a large number of potential respondents. Qualitative data collected from the semi-structured interviews should be coded and interpreted to generate relevant findings on the performance of service modularity.
In contrast to the developed measurement instruments in Appendices A, B and C, the pilot CQI instrument in Appendix D is intended for survey purposes and can be employed in follow-up studies. The population of respondents for this instrument comprises all patients who received modular services in a whole year (e.g., in 2013). Respondents should be able to recall their experiences with respect to quality of care in modular supply. The sample comprises the patients who have recently received modular services within a certain period (e.g., within 6 months). Researchers can determine the sample size according to the probability of making a type I error (α), the statistical power (1−β) and the effect size (Karlsson, 2009). According to Karlsson (2009), an α level fixed at 0.05 and a statistical power of 0.8 are common practice in operations management, while the effect size can be assumed to be fixed at some value (e.g., if the effect size is assumed to be fixed at a medium association, then the sample size should be at least 44 respondents). Pilot testing is conducted with a group of potential respondents (e.g., 10 patients) to examine and modify the measurement properties of the CQI questionnaire (Karlsson, 2009). Then, the adapted CQI questionnaire is filled in by the sampled respondents to collect the survey data.
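How sample size follows from α, power and effect size can be sketched with a small calculation. The sketch below uses a generic normal-approximation formula for comparing two group means, which is not the specific test family behind the 44-respondent figure quoted from Karlsson (2009), so the numbers differ; the effect size d = 0.5 is an assumption for illustration only.

```python
from math import ceil
from statistics import NormalDist  # standard-library normal quantiles

def sample_size_per_group(effect_size: float, alpha: float = 0.05,
                          power: float = 0.8) -> int:
    """Normal-approximation sample size per group for a two-sided,
    two-sample comparison of means with standardized effect size d."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ≈ 1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)           # ≈ 0.84 for power = 0.8
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# Medium effect (d = 0.5), alpha = 0.05, power = 0.8:
print(sample_size_per_group(0.5))  # → 63 per group
```

The calculation makes the trade-off in the text concrete: a larger assumed effect size, a higher α, or a lower power target all shrink the required sample.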
5. DISCUSSION
This research investigates how the effects of service modularity on various performance dimensions should be measured from multiple perspectives. To this end, different measurement instruments were developed in the results section. This section first relates these measurement instruments to the criteria mentioned in section 2.4. Second, it discusses how the preliminary data collected through these instruments should be analyzed and interpreted. Third, it discusses how to gain a more balanced evaluation of service modularity on the basis of the existing results.
5.1 Relate Measurement Instruments to the Criteria of Performance Measurement Development
Returning to Table 2.2, the criteria abstracted from the work of Adair et al (2006) are used to assess the developed measurement instruments in this subsection. First, the measurement instruments are developed in relation to multiple stakeholders’ perspectives, namely patient, care provider and organization. All three stakeholders are expected to contribute to supporting modular service provision, and their views may have implications for a thorough evaluation of service modularity on the performance dimensions. In line with this point, the proposed questions and measurable items are relevant and meaningful to key stakeholders. Second, the instruments are developed on the basis of both valid measurable items in the existing literature and qualitative data collected from the case study. The validity and reliability of the proposed measures are therefore supported by rigorous prior research, while the feasibility of these measures is enhanced through the case study. In this way, the measurement instruments are developed in an evidence-based manner. Third, the case study also helped to transform and adapt the abstract items in existing instruments to modular-service-based content, which makes the developed questions and items important and actionable in practice. Further, the present research clearly illustrates how these questions and items were developed and how to use them to obtain preliminary data, which implies they are understandable and unambiguous. However, the proposed questions and items are less sensitive to the desired change results when evaluating the performance of service modularity. Besides, these questions were not developed with a balanced view across different care groups or organizational settings.
5.2 How to Analyze the Data Collected from the Developed Measurement Instruments?
Section 4.5 generally illustrates how to use the developed measurement instruments to collect data, while this subsection further discusses how to analyze the collected data. To
evaluate the performance of service modularity, the measurement instruments developed in the present research are suggested to be used as interview protocols for conducting case studies in modular care organizations. However, these instruments are developed to measure
representatives, 10 care providers and 25 patients respectively within a healthcare organization.
organization. Since one care provider can provide the same modular services to several different patients, care providers and patients are interrelated to each other. Similar, organizational representatives and care providers also interactively cooperate in modular care provision process. Moreover, all three kinds of respondents belong to different hierarchies within an organization. Figure 1 presents an example of how these three kinds of respondents related within an organization.
Figure 1: An example of how the three kinds of respondents are related within an organization
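The nested respondent structure illustrated in Figure 1 can be sketched as a simple hierarchy. The names and counts below are hypothetical placeholders, not data from the case organization:

```python
# Hypothetical nested respondent structure: an organizational representative
# oversees care providers, each of whom serves several patients.
organization = {
    "representative_1": {
        "provider_A": ["patient_1", "patient_2", "patient_3"],
        "provider_B": ["patient_4", "patient_5"],
    },
}

def respondents_by_level(org: dict) -> dict:
    """Group respondents into focus groups by organizational level."""
    levels = {"organization": [], "care_provider": [], "patient": []}
    for representative, providers in org.items():
        levels["organization"].append(representative)
        for provider, patients in providers.items():
            levels["care_provider"].append(provider)
            levels["patient"].extend(patients)
    return levels

groups = respondents_by_level(organization)
print({level: len(names) for level, names in groups.items()})
# → {'organization': 1, 'care_provider': 2, 'patient': 5}
```

Grouping by level in this way mirrors the focus-group division suggested below for analyzing nested interview data.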
When conducting interviews with respondents who are interrelated and belong to multiple levels, the qualitative data will be collected in a nested structure. In order to analyze these nested data, it is suggested that the researcher divide the corresponding respondents into several focus groups. One issue in interpreting these data is how to deal with divergences among key stakeholders’ perspectives on the same performance dimension. For instance, when assessing the effect of service modularity on the flexibility dimension, patients may report that there is hardly any flexibility in the modular service provision, while the care provider and the organization may report that a high degree of flexibility is achieved in their modular services. Under these circumstances, researchers need to carefully trade off the different key stakeholders’ opinions in order to gain a comprehensive evaluation of service modularity performance. If only patients’ opinions are considered when assessing the effect of service modularity on flexibility, cost factors will be omitted. Similarly, a high degree of flexibility will hardly be achieved if only organizational opinions are considered. Therefore, it is recommended that researchers balance the key stakeholders’ opinions when assessing the effects of service modularity on the various performance dimensions. This subsection roughly discusses possible issues that researchers may encounter when analyzing the qualitative data. Even though it provides practical insights into the developed measurement instruments, more research is needed to investigate how to systematically interpret qualitative data presented in a nested structure.
5.3 How to Gain a Balanced View of Service Modularity Performance?
Recalling the main objective of the present research, a balanced view of evaluating the performance of service modularity should not only focus on different stakeholders’
perspectives, but also on both the positive and negative effects of modular supply. However, there is little knowledge about what kinds of negative effects or risks service modularity may bring.
6. CONCLUSION AND FURTHER RESEARCH
6.1 Conclusion
The main research question of this thesis is how to evaluate the performance of service modularity in elderly care from multiple perspectives. To answer this question, different validated measurement instruments were developed in this thesis. Specifically, in line with the findings in the works of notable authors such as De Blok et al (2010), Rahikka et al (2011), Pekkarinen and Ulkuniemi (2008) and Jacobs et al (2007), the possible effects of service modularity were related to four performance objectives: flexibility, quality of care, cost and responsiveness. Subsequently, theoretical performance indicators were derived from these objectives, which in the end led to four sub-questions. To answer the sub-questions, a mixed-method research design was used. The design method gave guidelines on how the instruments should be developed, while the case study not only strengthened the existence of the theoretical performance indicators but also gave rich empirical understanding of modular service in the elderly care setting. Consequently, the instruments were developed in the results section on the basis of qualitative data and existing valid instruments.
6.2 Theoretical and Managerial Implications
The findings of this thesis have several theoretical implications. First, the understanding of the performance objectives of service modularity becomes more explicit. Second, this research contributes several questions and measurable items for evaluating the performance of service modularity. These questions are fine-tuned from existing measures of product modularity to the characteristics of healthcare services. Moreover, combining multiple stakeholders’ perspectives in the development of the measurement instruments gives a balanced view of the evaluation of service modularity performance. Further, the empirical evidence collected through the case study confirms and strengthens, to some extent, the earlier findings in the works of De Blok et al (2010, 2012), Rahikka et al (2011) and Pekkarinen and Ulkuniemi (2008).
6.3 Limitations and Further Research
Several limitations of this research can be addressed. One limiting factor is that the interviews were conducted only with the care provider and the organization manager, and did not involve elderly patients; the richness of the data was thereby largely decreased. Second, since the pilot tests were not conducted in the selected organization, the reliability and validity of the developed instruments were reduced. Third, although different measurement instruments were developed in the present research, the effect of service modularity on performance was not empirically measured. Besides, the negative effects or risks of service modularity have not been explored yet. Further, the interpretation of data presented in a nested structure was not systematically investigated. Finally, the instruments are only appropriate for healthcare organizations that provide modular care to their patients. Hence, the generalizability of these instruments is limited.
This research leaves room for further research. First, pilot studies should be conducted with the focus groups in the healthcare organization. Critical factors could be identified to further fine-tune these measurement tools as well as increase their reliability and validity.
REFERENCES
Adair, C., Simpson, E., Casebeer, A., Birdsell, J., Hayden, K., and Lewis, S. 2006. Performance Measurement in Healthcare: Part II – State of the Science Findings by Stage of the Performance Measurement Process. Healthcare Policy, 2(1): 56–78.
Van Aken, J. E. 2004. Management Research Based on the Paradigm of the Design Sciences: The Quest for Field-Tested and Grounded Technological Rules. Journal of Management Studies, 41(2): 219–245.
Bask, A., Lipponen, M., Rajahonka, M., and Tinnila, M. 2010. The concept of modularity: Diffusion from manufacturing to service production. Journal of Business & Industrial Marketing: 355–375.
Bask, A., Lipponen, M., Rajahonka, M., and Tinnila, M. 2011. Framework for modularity and customization: Service perspective? Journal of Business & Industrial Marketing: 306–319.
Berg, M., Meijerink, Y., Gras, M., Goossensen, A., Schellekens, W., Haeck, J., Kallewaard, M., et al. 2005. Feasibility first: developing public performance indicators on patient safety and clinical effectiveness for Dutch hospitals. Health policy (Amsterdam,
Netherlands), 75(1): 59–73.
Bottcher, M., and Klingner, S. 2011. Providing a method for composing modular B2B services. Journal of Business & Industrial Marketing: 320–331.
Broekhuis, M., De Blok, C., & Meijboom, B. 2009. Improving client-centred care and services: the role of front/back-office configurations. Journal of advanced nursing, 65(5):971–80.
Broekhuis, M., and Van Donk, D. P. 2011. Coordination of physicians’ operational activities: a contingency perspective. International Journal of Operations & Production Management, 31(3): 251–273.
Campbell, S. M., Braspenning, J., Hutchinson, A., and Marshall, M. N. 2003. Research methods used in developing and applying quality indicators in primary care. BMJ, 326(7393): 816–819.
Campmans-Kuijpers, M. J. E., Lemmens, L. C., Baan, C. A., Gorter, K. J., Groothuis, J., Van Vuure, K. H., and Rutten, G. E. H. M. 2013. Defining and improving quality management in Dutch diabetes care groups and outpatient clinics: design of the study. BMC Health Services Research.
Damman, O. C., Hendriks, M., and Sixma, H. J. 2009. Towards more patient centered healthcare: A new Consumer Quality Index instrument to assess patients’ experiences with breast care. European Journal of Cancer, 45(9): 1569–1577.
De Blok, C., Luijkx, K., Meijboom, B., and Schols, J. 2009. Demand-based provision of housing, welfare and care services to elderly clients: from policy to daily practice through operations management. Health Care Analysis, 17: 68–84.
De Blok, C., Luijkx, K., Meijboom, B., and Schols, J. 2010. Modular care and service packages for independently living elderly. International Journal of Operations & Production Management: 75–97.
De Blok, C., Luijkx, K., Meijboom, B., and Schols, J. 2010. Improving long-term care provision: towards demand-based care by means of modularity. BMC Health Services Research, 10: 278.
De Blok, C., Luijkx, K., Meijboom, B., and Schols, J. 2012. The human dimension of modular care provision: opportunities for personalization and customization. International Journal of Operations & Production Management.
Dörbecker, R., and Böhmann, T. 2013. The Concept and Effects of Service Modularity – A Literature Review. University of Hamburg.
Duray, R., Ward, P. T., Milligan, G. W., & Berry, W. L. 2000. Approaches to mass customization: configurations and empirical validation. Journal of Operations
Management, 18(6): 605–625.
Fortuin, L. 1988. Performance indicators — Why, where and how? European Journal of
Operational Research, 34(1):1–9.
Ho, H. C., Huang, C. C., and Yang, H. L. 2009. Development of Modular Services. International Conference on New Trends in Information and Service Science: 1215–1220.
Jacobs, M., Vickery, S. K., and Droge, C. 2007. The effects of product modularity on competitive performance: Do integration strategies mediate the relationship? International Journal of Operations & Production Management, 27(10): 1046–1068.
Karlsson, C. 2009. Researching Operations Management. Taylor & Francis Group, New York and London.
Lau Antonio, K. W., Yam, R. C. M., and Tang, E. 2007. The impacts of product modularity on competitive capabilities and performance: an empirical study. International Journal of Production Economics, 105: 1–20.
Lin, Y., Luo, J., and Zhou, L. 2010. Modular Logistics Service Platform. International
Lin, Y., and Pekkarinen, S. 2011. QFD-based modular logistics service design. Journal of Business & Industrial Marketing: 344–356.
Loeb, J. M. 2004. The current state of performance measurement in health care. International Journal for Quality in Health Care, 16(Suppl 1): i5–i9.
Menor, L. J., Tatikonda, M. V., and Sampson, S. E. 2002. New service development: areas for exploitation and exploration. Journal of Operations Management, 20(2): 135–157.
Pekkarinen, S., and Ulkuniemi, P. 2008. Modularity in developing business services by platform approach. International Journal of Logistics Management: 84–103.
Rahikka, E., Pekkarinen, S., and Ulkuniemi, P. 2011. Developing the value perception of the business customer through service modularity. Journal of Business& Industrial
Marketing, Emerald Group Publishing Limited. 357-367
Schilling, M. A., Steensma, H. K., Sy, W., Henderson, J., Hippie, S., Gray, W., Griliches, Z., et al. 2001. The use of modular organizational forms: an industry- level
analysis.Academy of management Journal. 44(6):1149–1169.
Sixma, H. J., Van Campen, C., Kerssens, J. J., & Peters, L. 2000. Quality of care from the perspective of elderly people: the QUOTE-elderly instrument. Age and ageing, 29(2):173–8.
Sixma, H.J, Van Campen, C.,Kressens J.J, Peters L. and Rasker J.J .1998. Assessing patients’ priorities and perceptions of the quality of health care: The development of the QUOTE-Rheumatic patients’ instrument. British Journal of Rheumatology, 37: 362-368. Slack, N. (1991), The Manufacturing Advantage, Mercury Books, London.
Stubbe, J. H., Brouwer, W., & Delnoij, D. M. J. 2007. Patients’ experiences with quality of hospital care: the Consumer Quality Index Cataract Questionnaire. BMC
ophthalmology, 7:14.
Tregunno, D., Ross Baker, G., Barnsley, J., & Murray, M. 2004. Competing values of emergency department performance: balancing multiple stakeholder perspectives.
Health services research, 39(4): 771–91.
Tu, Q., Vonderembse,M.A., Ragu-Nathan, T.S.,and Ragu-Nathan, B., 2004. Measuring Modularity-Based Manufacturing Practices and Their Impact on Mass Customization Capability: A Customer-Driven Perspective. Design Science. 35(2): 147-168
Van der Geer, E., Van Tuijl, H. F. J. M., & Rutte, C. G. 2009. Performance management in healthcare: performance indicator development, task uncertainty, and types of
34
Voss, C., Tsikriktsis, N. and Frohlich, M. 2002. Case research in operationsmanagement,
International Journal of Operations & Production Management, 22 (2): 195-219.
Voss, C. and Hsuan, J. 2009, Service architecture and modularity, Decision Science Journal, 40 (4):541-61.
Wieringa, R. 2007. Writing a Report about Design Research. University of Twente, the
Netherlands.1–9.
Worren, N., Moore, K., & Cardona, P. 2002. Modularity, strategic flexibility, and firm performance: a study of the home appliance industry. Strategic Management Journal, 23(12): 1123–1140
Yang, L. and Shan, H,. 2009. Process Analysis of Service Modularization Based on Cluster Arithmetic, First International Workshop on Database Technology and Application,
IEEE, 263-266.
Zilberberg, M. D., & Shorr, a F. 2010. Understanding cost-effectiveness. Clinical
APPENDIX A: Measurement Instrument for Evaluating the Performance of Service Modularity from the Patient Perspective
This instrument measures how service modularity influences performance from the patient perspective. It contains several open-ended questions related to the proposed performance indicators. The instrument is preferably used as an interview protocol for case studies with patients who have experienced modular service provision.
A. Questions to assess the effect of service modularity on flexibility

Customization in service offering:
1. To what extent is your service offering flexible enough to cover all your requirements?
2. Can your service offering be customized by adding service modules as requested?
3. Are service modules flexible enough to be combined in various ways to suit your needs?
Personalization in service delivery:
1. How is the content of the service offering altered to your specifications during the delivery process?
Client involvement:
1. How are the choice options of service modules presented to you in the modular supply?
2. Do you specify the features and choices of service modules when developing your own service offering?
3. When and how do care providers involve you in the service package configuration process?
4. Do you think your involvement in the modular supply customizes your service offering based on your own requirements?
B. Questions to assess the effect of service modularity on the cycle time of service delivery
1. To what extent is your service offering delivered on time by means of service modularity?