
Integration requirements of a process digital twin within an existing information system landscape

Master Thesis

Name: H.G. Antonides

Student number: s3001792


Table of Contents

Abstract
1. Introduction
2. Theoretical background
   2.1 Digital twin
      2.1.1 Industry 4.0
      2.1.2 Digital twin - history
      2.1.3 Digital twin - definitions and application
      2.1.4 Parts of the digital twin
   2.2 Information Systems and Integration
      2.2.1 Types of enterprise information systems
3. Methodology
   3.1 Design Science Research
      3.1.1 Selected case
      3.1.2 Data collection and analysis
4. Findings
   4.1 Phase 1: Solution incubation
      4.1.1 Data requirements of the process digital twin
      4.1.2 Type of integrations needed within the process digital twin
      4.1.3 Integration requirements between the process digital twin and legacy systems
   4.2 Phase 2: Solution refinement
      4.2.1 Supplied transactional digital master model
      4.2.2 Documents data in transactional data in the process digital master and process digital shadow
      4.2.3 Expansion of Analytics & Simulation part of the digital twin
      4.2.4 Artifact: model with integration requirements of a process digital twin
5. Discussion
   5.1 Theoretical implications
   5.2 Managerial implications
6. Conclusion
References

Abstract

Purpose: This study aims to discover how a process digital twin can be integrated within the context of a manufacturing company with an existing information system landscape.

Design/methodology/approach: Exploratory problem-solving, Design Science Research.

Findings: A model on integration requirements for a process digital twin.

Research limitations: The findings have been validated by an expert group of software engineers, but have not yet been validated or applied within other manufacturing companies.

Practical implications: The model provides insight into the integration requirements of a process digital twin and can serve as input in the considerations of whether and how a manufacturer wants to start integrating a process digital twin.


1. Introduction

Manufacturing companies are facing a great opportunity that at the same time presents them with a challenge. The opportunity is, by using digital twins, to be able to assess system performance and technical risks while achieving an estimated 10% improvement in system effectiveness (Pettey, 2017). This results from enhanced data reliability, greater decision-making certainty, removal of redundancy in monitoring systems, and better means to detect and remove the reasons behind variations (Al-Najjar, Alsyouf, 2000). The challenge, however, lies in integrating the digital twin within an existing information system landscape. The digital twin needs to be able to communicate with other systems of the company to create a complete virtual representation (Stark, Kind, & Neumeyer, 2017). Hasselbring (2000) states that the more heterogeneous the data in an application and the more autonomous a system (both characteristics of a digital twin; Perino, 2019), the harder the integration with other systems. Yet it is precisely this integration that provides access to the different data sources that allow the digital twin to function (Perino, 2019; Hasselbring, 2000).

Most manufacturers implementing digital twin technology already have an existing information system landscape with a diverse set of applications (Eurostat, 2018). In most cases, for example, a transactional system is in place and, besides that, several separate functionalities are used (Waschull, et al., 2019a). Designing and building a completely new, integrated information system landscape around the digital twin is time-consuming and expensive. To limit this investment, companies may want to keep using the information systems and functionalities they have already paid for and only invest in the new functionalities that support a digital twin (Uhlemann, Lehmann, & Steinhilper, 2017). This raises interesting questions regarding the integration requirements between a digital twin and other relevant information systems. Yet, little is known about how to embed a digital twin in an existing information system landscape in such a way that the digital twin and pre-existing systems can communicate with each other.

In short, a digital twin is the digital representation of a unique asset, which alters its properties, condition and behavior by means of models, information and data (Stark et al., 2017). By making use of the Internet of Things (IoT), digital twins give management real-time information on the current status of the products manufactured and a database that can serve as input for manufacturing process optimization. It also provides an effective test-bed for implementing data fusion, machine learning and artificial intelligence to achieve real-time control, monitoring and optimization (Tao, Zhang, Nee, 2019). A digital twin can be made of an asset, but also of a production process or system (Madni, Madni, & Lucero, 2019). A process digital twin is larger in scope, containing the process architecture and the data of the digital twins in that process (Perino, 2019). According to Shaw (2018), the process digital twin is the digital representation of a manufacturing process in which the benefits of product digital twins are gathered for a process, the factory as a system and possibly the whole supply chain, depending on the chosen scope. The process digital twin gives manufacturers insight into the overall performance of the process as well as into the performance indicators of process steps set by the user (Perino, 2019), enabling advanced production scenarios that are not possible with product digital twins (Shaw, 2018).

Since there is a practical need and, to the best of our knowledge, a lack of knowledge about the integration of digital twins in the context of an existing information system landscape, this research focuses on this topic, leading to the following research question:

What are integration requirements for a process digital twin?

To answer this question, we will focus on a digital twin of a small production process incorporated in one machine. The production process is a process step in multiple product lines.

The following sub-questions have been formulated to answer the main research question:

1. What are the different types of applications found in manufacturing?
2. What application types are found in the process digital twin?
3. What data needs to be collected in the process digital twin?
4. What types of integrations are required within the process digital twin?
5. What integrations are needed between the process digital twin and the remaining information system landscape?


2. Theoretical background

In this chapter, the main concepts of this research will be explained. First, we will address the concept of the digital twin; thereafter, information systems and integration will be explained.

2.1 Digital twin

This section starts with an elaboration on Industry 4.0 and some history, to situate the concept of the digital twin. Thereafter, several definitions from the literature will be discussed.

2.1.1 Industry 4.0

Industry 4.0, also referred to as SMART manufacturing (Lee, 2015), is the name of the fourth industrial revolution and encompasses digitalization, networking technologies, cloud computing, IoT and cyber-physical systems (Osinde, Byiringiro, Gichane & Smajic, 2019). It is expected that this fourth revolution in manufacturing, of which digital twins are part, will bring great benefits such as better insight into operations and optimized operational procedures, which will in turn help companies to save money (van der Ent, 2019). Industry 4.0 embodies the vision for manufacturing companies in which sensors keep track of all actions applied to products, machines and processes, providing real-time decision-making information. In an Industry 4.0 factory, all machines keep track of deterioration levels and deliver maintenance schedules, all products have digital twins in which the designs are digitized, and all process steps are recorded in the digital shadow and compared to models, resulting in valuable management information (Perino, 2019). Process digital twins combine all machine and product digital twins in the chosen scope of the process, delivering an overview of the process and its operation, providing real-time decision information as well as showing options for improvements in the process by displaying different process scenarios (Perino, 2019). This clearly illustrates that process digital twins play a key role in the development of Industry 4.0 (van der Ent, 2019; Osinde et al., 2019). According to Gartner (2017), by 2021 nearly half of the major industrial companies will leverage digital twin technology to assess system performance and technical risks, while achieving approximately a 10% improvement in system effectiveness.

2.1.2 Digital Twin - history

The concept of the 'twin' originated in NASA's Apollo program, in which two identical real space vehicles were built. One of them performed the mission in space, while the other stayed on earth and was used to mirror the conditions of the launched one (Boschert, Rosen, 2016; Tao, Zhang, Nee, 2019). By observing and comparing both products, insight was gained into the product lifecycle. Tao et al. (2019) divided the development history of the digital twin into three stages. In the first stage, the concept proposed by Grieves in 2003 (Grieves, 2003) was defined in three dimensions: a physical entity, a digital counterpart and a connection that ties the two parts together (Madni, et al., 2019). In 2005 he classified the digital twin into three subtypes: digital twin prototype, digital twin instance and digital twin aggregate. Due to cognitive and technical limitations, very few related reports were written in the following five years. During this period, however, new IT emerged and developed, laying the foundations for the future development of the digital twin (Madni, et al., 2019; Tao et al., 2019). In the second stage (2010-2014), digital twin technology was mainly applied in the aerospace industry. In 2014 the whitepaper on the digital twin was published (Grieves, 2014). Since this article was published, digital twin technology has been introduced in other fields, such as automotive (Swedberg, 2018) and oil and gas (Menard, 2017).

2.1.3 Digital twin - definitions and application

…management and was, in manufacturing, mainly applied to machines; nowadays, digital twins of products are common and digital twins of processes are upcoming (Perino, 2019; Andaluz, 2017). A digital twin can be described as a dynamic virtual model of a system, process or service (Madni et al., 2019). According to Farsi et al. (2020), digital twin technologies 'enable us to create a virtual duplicate of our real system and therefore provide us with a platform to review activities, interactions and consequences of different decisions within the real system'. Such a digital twin can be used in industry to improve productivity, efficiency and availability and, by doing so, can result in improved quality of an asset or a service (Farsi et al., 2020). Boschert and Rosen (2016) describe the digital twin as a comprehensive physical and functional description of a component, product or system, which includes more or less all information that could be useful in all current and subsequent lifecycle phases. In contrast to others, they subordinate the importance of the existence of a physical twin and focus mainly on the advantages of the analytical capabilities of the digital twin. Stark et al. (2017) add a new aspect: that the digital twin alters the properties, condition and behavior of the physical asset by means of models, information and data. However, others contradict that a digital twin should alter properties and conditions, or do not mention the need for two-way information exchange (Boschert, Rosen, 2016). Stecken, Ebel, Bartelt, Poeppelbuss and Kuhlenkötter (2019) refer to a digital twin as a virtual image of a real component or plant, which can be used to perform offline simulations like virtual commissioning or optimization. In the digital twin, the different models needed for the simulations are linked to each other and contain meta-data (Rosen, von Wichert, Lo, Bettenhausen, 2015; Stecken et al., 2019). In the description of Stecken et al. (2019), the digital twin seems an advanced (offline) analytics tool, whilst Farsi et al. (2020), Perino (2019) and Andaluz (2017) specifically mention the provision of automated feedback to the tool or process.

To conclude, there is still some ambiguity about the concept in the literature, and no general definition is broadly accepted yet. In this research, we define the process digital twin as a virtual duplicate of a real process providing a platform to review activities, interactions and consequences of different decisions within the real process, besides providing feedback to the real process (adjusted from Farsi et al., 2020).

2.1.4 Parts of the digital twin

…models to the order structure of the entire production system, may be kept up-to-date by the information in the digital shadows (Stecken et al., 2019). According to Stark et al. (2017), the third part of the digital twin is the digital master, which provides a universal model of the asset. For example, for a digital twin of a product, a digital master might consist of a CAD model and other specifications. The digital master provides the analytics and simulation part of the digital twin with the basis for algorithms, simulation models, correlations, etc. (Stark et al., 2017). Figure 1 graphically displays the relation between a digital shadow, a digital master and the analytics and simulation part of the digital twin. The dotted lines refer to information streams. The digital twin can be of any physical asset, product to be manufactured, machine or tool.

Figure 1: Digital twin of a physical asset

Andaluz (2017) states that the process digital twin also gives the ability to create new customer and enterprise value through personalized service, to drive product quality and advancements, to prevent breakdowns before they occur, and to reinvent knowledge-sharing in a people-machine environment. To be able to create this value, the process digital twin needs support from mixed reality technology, in which real and virtual worlds are merged and interact in real time (Andaluz, 2017). Figure 2 graphically shows the process digital twin. The digital shadows of products produced in the actual process and the digital shadows of machines (or other tools) are linked as add-ons to the digital shadow of the process. In a similar manner, the digital masters of the product and the machine are linked to the digital master of the process. The machine digital shadow and digital master refer to machines and/or other tools.

Figure 2: Process digital twin

Having gained some background from the literature on the digital twin, the process digital twin and the parts it consists of, the remainder of this chapter focuses on information systems and integration.

2.2 Information Systems and Integration

Companies typically want to keep their existing information systems (legacy systems) to preserve the financial investments made (Hasselbring, 2000). Legacy systems often cannot support newly required functions, which makes evolution and migration of legacy and new information systems necessary. This means the new information system must be integrated with other information systems, existing applications and data sources (Hasselbring, 2000). Since information systems need to understand data provided by other applications to collaborate effectively, Hasselbring (2000) recommends standardization of message formats and message content. Systems Integration (SI) focuses on building applications that are adaptable to business and technology changes while retaining legacy systems and technology as far as reasonable (Hasselbring, 2000). The degree of autonomy of a system and the degree of heterogeneity of the data within a system are factors that increase the complexity of the integration process (Hasselbring, 2000). A well-integrated information system can improve the management quality of a company, leading to increased productivity and decreased cost (Lari, 2002; Budiarto, Prabowo, Herawan, 2017).

2.2.1 Types of enterprise information systems

To better understand the integration requirements between legacy systems and digital twins, a distinction of information systems based on the type of work they support will be made. This way of looking at information systems helps to understand the integration requirements and problems that arise when trying to integrate different types of applications.

According to Wortmann (2019) there are different information system types with distinctive ways to process data, linked to the types of work they support. The types of work supported are business process management, transaction processing work, real time systems, logistical planning work, analytical work and document management. Enterprise information systems support these types of work. We will elaborate on the type of systems that will need to interoperate with the process digital twin, namely, transaction processing systems, real time systems, professional systems (with special attention to analytics) and document management systems. For the sake of completeness, we list the application types which are not relevant for the digital twin, viz. business process management and logistical planning work in Appendix I.

2.2.1.1 Transaction processing systems

Transaction processing systems need a transactional (which is a type of relational) database to store their data in. An example of such a system is the Enterprise Resource Planning (ERP) system.

Regarding the process digital twin, as graphically displayed in Figure 2, none of the three parts is a transaction processing system. Yet the process digital shadow, focused on the collection of all operational and condition data, has transactional features. All data points gathered in the actual process need to be linked to context variables to have meaning. For example, a temperature measurement is meaningless without linkages to the product being produced and the 'when' and 'where' of the measurement. In other words, every recorded instance of the process should be linked to an object to have meaning, and only data stored in this way allows meaningful analysis to take place. Storing data in the manner just described constitutes transactions, and it is thus likely that the digital shadow will be a transactional database.

2.2.1.2 Professional systems – Analytical systems

Analytical systems provide for the need for managerial information and process data in a different way in order to do this reliably. Since the database of a transaction processing system cannot hold historical data accurately and does not hold derived objects (like profit or total costs), analytical systems work with a data warehouse (Wortmann, 2019). The historical data in the data warehouse is often used for business intelligence purposes.

For an overview of differences between data warehouses and transactional databases, see Table 1.

Transactional databases                                    | Data warehouses
Normalized                                                 | Not normalized
Many object types                                          | Concentrate on a single object type
Subject to change                                          | Expand; stored data does not change anymore
Users want to perform transactions                         | Users want to analyze data
Users are working on just a few objects                    | Users analyze many objects
Derived data are computed each time these data are needed  | Derived data are computed when the object is stored and remain unchanged
Provide only actual information                            | Provide information over time

Table 1: Differences between transactional databases and data warehouses (adapted from Wortmann, 2019)

Figure 3: Data warehouse and tri-dimensional data cubes (retrieved from Jabbar, 2015)

Analytical systems are based on statistical and Artificial Intelligence (AI) technologies (Wortmann, 2019). Some of the analytical activities of the analytics and simulation part of the process digital twin, as shown in Figure 2, correspond with an analytical system. For example, analytical activities such as what-if analysis, modeling and machine learning need reliable, historical and structured data as a basis (Jabbar, 2015). This implies the analytics and simulation part of the process digital twin might use OLAP tools, data cubes and a data warehouse to perform these tasks. The process digital shadow, which collects operational data, will be a data source for the data warehouse.
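As an illustration of the kind of multidimensional analysis OLAP enables, the sketch below builds a small data cube over process records using pandas. The dimension and measure names are invented for illustration; a real implementation would run against a data warehouse rather than an in-memory frame.

```python
import pandas as pd

# Toy extract from the data warehouse: one row per recorded process instance.
facts = pd.DataFrame({
    "product_type":   ["A", "A", "B", "B", "A", "B"],
    "process_step":   ["layup", "curing", "layup", "curing", "curing", "layup"],
    "week":           [1, 1, 1, 2, 2, 2],
    "cycle_time_min": [42.0, 120.0, 45.5, 118.0, 123.5, 44.0],
})

# A tri-dimensional 'cube': product type x process step x week, with aggregated cycle time.
cube = pd.pivot_table(
    facts,
    values="cycle_time_min",
    index=["product_type", "process_step"],
    columns="week",
    aggfunc="mean",
)
print(cube)
```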

2.2.1.3 Document management systems


2.2.1.4 Real-time systems

Real-time systems are a class of applications based on real-time monitoring and control technology. Real-time monitoring and control refers to the interaction of a real-time system with its environment using sensors and actuators in a hardware domain (Douglass, 2014). The sensors in the actual process generate real-time process data. Programmable Logic Controllers (PLCs) are small operating systems on a machine applying control mechanisms, in which the real-time output of a sensor might trigger an actuator (Douglass, 2014). Examples of real-time systems range from control of vehicles (e.g. train control, traffic light control) to machine control in factories and warehouses (Wortmann, 2019). In these systems in manufacturing, the Internet of Things (IoT) plays a key role in extending network connectivity and computing capability to objects, devices, sensors and items not ordinarily considered to be computers (Boyes, Hallaq, Cunningham, Watson, 2018). According to Singh and Gilbreath (2002), a real-time system can be viewed as a conventional system with temporal bounds, the violation of which may invalidate operational consistency requirements. As compared to conventional processes like transaction processing, real-time systems require more speed, interrupt scheduling, and prioritization (Singh, Gilbreath, 2002). Regarding the digital twin, the actual process, containing PLCs and possibly other sensors, can be viewed as a real-time system and will be referred to as such in the following figures.
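As a minimal sketch of the sensor-actuator control pattern described above (not the PLC logic of the case company; the set point, sensor and actuator are placeholders), a simplified control loop could look as follows:

```python
import random
import time

TEMP_SETPOINT_C = 180.0   # hypothetical set point
TOLERANCE_C = 2.0         # allowed deviation before the actuator reacts

def read_temperature_sensor() -> float:
    """Stand-in for a PLC input; returns a simulated temperature reading."""
    return TEMP_SETPOINT_C + random.uniform(-4.0, 4.0)

def set_heater(on: bool) -> None:
    """Stand-in for a PLC output driving an actuator."""
    print(f"heater {'ON' if on else 'OFF'}")

# Simplified scan cycle: read inputs, apply control logic, write outputs.
for _ in range(5):
    temperature = read_temperature_sensor()
    if temperature < TEMP_SETPOINT_C - TOLERANCE_C:
        set_heater(True)             # too cold: switch heating on
    elif temperature > TEMP_SETPOINT_C + TOLERANCE_C:
        set_heater(False)            # too hot: switch heating off
    time.sleep(0.1)                  # temporal bound of the scan cycle
```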


3. Methodology

3.1 Design Science Research

Since the answer to our research question will be an artifact, a Design Science Research (DSR) approach is most suitable for conducting this research. According to Venable and Baskerville (2012), the result of a DSR project is always a purposeful artefact which 'can be a product or a process, a technology, a tool, a methodology, a technique, a procedure, a combination of any of these, or any other means for achieving some purpose'. The artifact to be designed is a model displaying the required integrations for a process digital twin. This fits seamlessly with DSR's immediate relevance to practice (Holmström, Ketokivi, Hameri, 2009). Design science is used to design and validate models, while also contributing to the literature (Van Aken, Chandrasekaran, Halman, Ketokivi, 2016). According to Holmström et al. (2009), design science is 'basic research' that is immediately relevant to practice by virtue of its focus on developing means-ends propositions that solve real problems. They advocate the use of this research method, especially in operations management, to decrease the gap between academic research and managerial applicability. Holmström and Romme (2012) mention that a science-for-design approach, in which design principles are grounded in scholarly findings and tested in practice, can serve to connect operations management practice and academia.

Holmström et al. (2009) state that the challenge of DSR lies in its ability to lead to novel theoretical insight. Their four-phased DSR method consists of the following phases: 1. Solution incubation, 2. Solution refinement, 3. Explanation I - Substantive theory, 4. Explanation II - Formal theory (Holmström et al., 2009).

Figure 4: Phases of design science according to Holmström et al. (2009)

The solution incubation phase is the phase in which a framework is materialized that properly represents the problem under investigation. From this framework, possible solutions should be suggested by the researcher. When these suggestions are formalized, they can be input for implementation at a pilot level (Holmström et al., 2009). During the solution refinement phase, the previously developed solution is tested in a real environment in order to verify whether the solution proposed by the researcher meets the criteria for a proper solution to the problem (Holmström et al., 2009). In the third phase, a substantive or mid-range theory is developed. Relevance of the knowledge developed in the first two phases is sought not only from a practical point of view, but also from an academic point of view. In this phase, the artifact is evaluated from the perspective of theory rather than practice (Holmström et al., 2009). The substantive theory depends on the context in which it has been developed and is not yet to be generalized to all contexts, but serves to generalize theoretical concepts that can contribute to the topic of interest (Holmström et al., 2009). In the fourth phase, formal theories that can be used regardless of context are developed. In this phase, the scientific contribution in the form of generalizable theory is more important than practical relevance (Holmström et al., 2009).

Because of the limited time allotted to this research, the first two phases of the DSR will be completed, and an effort will be made to start with the substantive theory phase. The validation of the artifact will be done by a review of the artefact by an expert group of software engineers. The fourth phase will be left as a recommendation for further research.

3.1.1 Selected case

The case selected is a manufacturer and global supplier of light-weight composite components for the aerospace industry. This company has made an effort to digitize its production processes. Currently, the company participates in a project with a technological innovator, an aerospace research center and an educational institute. This project concerns the implementation of a new technology to create composite components, while at the same time exploring the role of the digital twin in these business processes. One of the goals of this project (which started in September 2019) is to develop a digital twin for the new process. This manufacturing process is intended to become part of multiple production lines. In this research, the scope of the process digital twin will be limited to this specific process. The process itself is incorporated in one machine, and multiple product types undergo this process as a step in their manufacturing. The possibility to participate in the orienting interviews of this project, around use cases of the digital twin at the machine builder's side and at the side of the aerostructures manufacturer, provides a unique insight into the current view on the digital twin, its possible applications, and different stakeholder perspectives. Participation in this project provides interesting connections and opportunities to interview people at the aerostructures manufacturer, the aerospace research center and other connections of the parties involved.

3.1.2 Data collection and analysis

The data has been collected between November 2019 and January 2020 at the aerostructures manufacturer in the northern part of the Netherlands. For this research, multiple data collection methods have been used. Below, the data collection methods per DSR phase are elaborated.

3.1.2.1 Phase 1: Solution incubation

In this phase, the main focus has been on a general understanding of the concepts from different perspectives. The combination of input from the literature and from experts in different fields, as elaborated below, has been taken into account to generate a general model of integration requirements for a process digital twin.

Literature review

Data has been gathered by literature review on (process) digital twins and information systems integration, on data types and on data integration. Search terms have been digital twin, process digital twin, smart manufacturing, Industry 4.0, cyber physical systems (CP(P)S), Industrial Internet of Things (IIoT), datatypes, integration methods and information systems integration.

Interviews

Semi-structured explorative interviews have been held with enterprise and IT architects (2), a data warehousing specialist, application managers (2), a software engineer from the aerospace research center, a SMART manufacturing expert from the University of Groningen, two engineering companies specialized in modelling and interfacing, and an associate lector of the educational institute leading the investigation on use cases of the digital twin, all providing relevant data. With permission of the interviewees, the interviews have been recorded, and reports have been made and analyzed. When necessary, extra clarifying questions have been asked via email.

Direct observations

Direct observations have been made during demos from the aerostructures manufacturer and its information system. Also, participation in project team meetings provided relevant information.

Company documents

Several documents have been investigated. For example, presentations on an earlier project focusing on digitization within the aerostructures manufacturer and on information system architecture have been part of this investigation, as well as specific procedures (e.g. quality procedure) within the case company.

Design

Based on the information gathered, the first draft of the artifact has been created (see chapter 4, figure 7).

3.1.2.2 Phase 2: Solution refinement

In the second phase, the general model as developed in the first phase has been reviewed by an expert group and refined, resulting in several changes and extensions of the model.

Literature review

More specified literature has been found on integrations and types of information systems, which has been used to refine the artifact.

Interviews

In-depth interviews have been held with some of the interviewees of the first round (software engineer, IT architect, PLC programmer and data warehousing expert) to get deeper insight into integration methods. A report has been made of these interviews.

Review

The artifact has been evaluated in this phase, and information has been gathered to refine it. The reviewing expert group consisted of a software engineer, an enterprise architect and a data warehouse expert associated with the aerostructures manufacturer, and two software engineers associated with the aerospace research center, who will be responsible for the generation of the digital twin within the earlier mentioned project.

Direct observations

A demo of the quality information system has been given at the aerostructures manufacturer. Besides, two presentations have been attended. The first presentation has been provided by a software supplier with experience in implementing a digital twin interface. The second presentation has been provided by the aerospace research center on the topic of the digital twin, including a small demo. Both presentations contributed to further understanding on the topic.

Design

Based on the reviews of the first draft of the artifact, the artifact has been refined (see chapter 4, Figure 11).

3.1.2.3 Phase 3: Explanation I - Substantive Theory


4. Findings

In this chapter, the first and second phase of the DSR methodology are applied. The results of the evaluation by the expert group, which resulted in a refined artifact, can be found in the paragraph Validation. The contributions from a theoretical perspective (third DSR phase) are explicated in the discussion (theoretical implications).

4.1 Phase 1: Solution incubation

In this phase of the DSR methodology, the literature review has been executed with a focus on the problem presented in the introduction. Thereafter, the third, fourth and fifth sub-questions will be answered, providing a framework and a possible solution to the question of how to integrate a process digital twin.

4.1.1. Data requirements of the process digital twin

4.1.1.1 Digital master

In terms of the information processing system types presented in chapter 2, the digital master of the process digital twin is a document-oriented system. In this research, the focus is on a digital twin of a small process, which will be part of multiple product lines producing multiple product types. Since in this situation the performance of the process is intertwined with the performance of the machine or tool and of the product to be produced, it has been chosen to provide the process digital master with the possibility of having the digital masters of the machine and the product linked via add-ons (as shown earlier in Figure 2). These software components can be programmed to link several different product digital shadows, in sync with the product that is being produced at that moment. The product digital twin and the machine or tool digital twin are both asset digital twins (as shown in Figure 1). The following table gives an impression of the data requirements for the digital master. The content of the digital masters may be adjusted based on the chosen purpose of the process digital twin. In this research, it has been chosen to let the process digital twin support the operator who is in charge of this process as well as management (what-if analysis and analysis focused on process optimization).

Data requirements of the digital master

Digital master of the machine or tool (add-on): Documents
- CAE models of the machine or tool
- Numeric Control machine programs
- Machine performance specifications
- Maintenance guides
- Operating procedures
- Compliance documents

Product digital master of the product to be manufactured (add-on): Documents
- CAE models of the product
- Required time and resources (materials/man hours, authorizations, tools, etc.)
- Specifications (e.g. range of conformity in size, weight, etc.)
- Detailed manufacturing instructions, including (quality) checks during manufacturing and measurements
- Standard repair routing/activities
- Requirements for environmental conditions

Digital master of the manufacturing process: Documents
- Pre-specified production time per process step
- Specified parameters for machinery
- Checks and measurements to be taken during process execution (focused on process knowledge improvement)

4.1.1.2 Digital shadow

In terms of information processing system types, the digital shadow has both a document and a transactional part. The document part of the digital shadow is derived from the digital master, but now populated with the specific data of the specific asset involved. The transactional part describes the status changes of the physical asset.

Data requirements of the digital shadow

Digital shadow of the machine or tool (add-on)
  As-built documents (only needed if the machine or tool deviates from the general documentation in the digital master):
  - 3D models
  - Maintenance guides
  - Operating procedures
  - Further 'as built' documentation
  Transaction processing:
  - Status of machine/tools
  - Hours in operation
  - Need for cleaning
  - Need for small or large maintenance
  - Change-over times
  - Set-up times
  - Overall Equipment Effectiveness (OEE) values

Digital shadow of the product to be manufactured (add-on)
  Documents:
  - Non-conformance criteria
  - Non-conformance reports
  - Pictures
  Transaction processing:
  - Status of the product being manufactured
  - Actions performed
  - Measurement values
  - Time and resources used (materials/man hours, operators involved, tools, etc.)
  - Environmental conditions

Digital shadow of the manufacturing process
  Transaction processing:
  - Values of measurements during the manufacturing process (useful for improving knowledge of the process)

4.1.2. Type of integrations needed within the process digital twin

Figure 5 zooms in on the process digital twin and discerns two parts within the digital shadow, namely a documents data part and a transactional part. The digital master consists of documents data only. Within the process digital twin, two types of integrations can be distinguished, which will be elaborated in this paragraph:

1. Integration between the process digital shadow and the process digital master
2. Integration between the process digital shadow and the analytics and simulation software

Figure 5 displays the above-mentioned integrations.

Figure 5: Process digital twin – integrations within the digital twin (before refinement)

4.1.2.1 Integration between the process digital shadow and the process digital master

The integration between the digital shadow and the digital master is a document-to-document integration, which is relatively simple. The documents of the digital master have to be accessible in the digital shadow in order to capture 'as-built' information, which may deviate from 'as-designed' information. Documentation of both states of the process contributes to the digital representation of the process. The integration from process digital master to process digital shadow is a one-way integration, meaning the documents data in the process digital master is accessible to the process digital shadow, but not the other way around. Documents data can be stored in an object-oriented database (Waschull, et al., 2019a).

A situation can exist in which one digital master serves many digital shadows. This could be the case when a process is validated and, for example, used at different production sites. This, however, does not influence the necessity or nature of the integration between the process digital master and the process digital shadows. Every separate process digital shadow should be granted access to the documents of the process digital master, for which a viewing function would be sufficient to give end users insight into the process and support in evaluating its performance.

The documents in the process digital master should be properly controlled by PDM/EDM functionality, in order to ensure that changes in these documents are correctly propagated to the digital master. This functionality should be extended to the copies in the digital shadow, in the sense that any changes in the master which have to be copied in the shadow are identified, reviewed and used for updating when appropriate.

4.1.2.2 Integration between the process digital shadow and analytics and simulation software

This integration runs via Extract, Transform, Load (ETL). The transaction processing data in the process digital shadow that is of interest to the analytics software is uploaded into a data warehouse on which OLAP can be applied. This is a one-way integration. Since the analytics software initiates which data will be loaded, it is not necessary to harmonize business logic in both systems. Also, there is no need for real-time data uploading; the upload can be executed at a convenient moment.
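A minimal sketch of this one-way ETL movement is given below; the table and column names are assumptions for illustration, not the case company's schema.

```python
import sqlite3

# Source: transactional database of the process digital shadow (in-memory stand-in).
shadow = sqlite3.connect(":memory:")
shadow.execute("CREATE TABLE measurement (product_serial TEXT, quantity TEXT, value REAL, recorded_at TEXT)")
shadow.executemany(
    "INSERT INTO measurement VALUES (?, ?, ?, ?)",
    [("PRD-0001", "temperature_C", 182.4, "2020-01-10T09:00:00"),
     ("PRD-0001", "temperature_C", 184.1, "2020-01-10T09:05:00")],
)

# Target: data warehouse used by the analytics and simulation software.
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE fact_measurement (product_serial TEXT, quantity TEXT, value REAL, recorded_at TEXT)")

def run_etl():
    """Extract the relevant rows from the shadow, transform (nothing needed here), load into the warehouse."""
    rows = shadow.execute(
        "SELECT product_serial, quantity, value, recorded_at FROM measurement WHERE quantity = 'temperature_C'"
    ).fetchall()
    warehouse.executemany("INSERT INTO fact_measurement VALUES (?, ?, ?, ?)", rows)
    warehouse.commit()

run_etl()  # executed at a convenient moment, e.g. by a nightly scheduler
```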

4.1.3. Integration requirements between the process digital twin and legacy systems

Based on the understanding of how the components of a process digital twin are integrated, the following subsection focuses on the integration requirements between the process digital twin and legacy systems. In Figure 6, the linkages between the different systems are displayed, with a short description of the content of the data transferred (dotted lines).

Figure 6: Data movement within and around process digital twin (before refinement)

The integrations that will be elaborated below are:

1. Integration from the real-time systems to the process digital shadow
2. Integration from the analytics software to the real-time systems
3. Integration from transaction processing systems to the process digital shadow
4. Integration from the process digital shadow to the transaction processing systems
5. Integration from document management systems to the process digital master

4.1.3.1 Integration from the real-time systems to the process digital shadow

The physical assets that are part of the actual process, either a tool/machine or a product that is being manufactured, belong to the real-time world. As mentioned earlier, the digital shadow consists of two parts: a transaction processing part and a documents part. In the actual process, the status of the physical asset, either in its use as a tool or machine or in its development from raw materials to a physical product, is monitored via sensors, resulting in high-frequency streaming data. To be able to use this data for analytical purposes without a large quantity of noise (i.e. meaningless information), this real-time streaming data first has to be transformed into transactional data. This is possible via some sort of transformation application, which only stores useful data (which in most cases deviates from set points or ranges) and translates it into the form of transactional data. After the transformation of the real-time data into transactional data, the data must be stored in the transaction processing part of the process digital shadow. This will be a transactional or relational database.

The function of the transformation application is crucial, since an analytics tool can only simulate, optimize, etcetera on the basis of structured data. Besides, the selective storage of streaming data by the transformation software is important because streaming data grows very rapidly and can cause overload in databases.
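A minimal sketch of such a transformation application is shown below; the allowed range and record structure are illustrative assumptions. Only readings outside the range are turned into transactional records, the rest is discarded as noise.

```python
from datetime import datetime, timezone

LOWER_C, UPPER_C = 178.0, 182.0   # hypothetical allowed range for the monitored value
transactions = []                 # stand-in for the transactional part of the digital shadow

def transform(stream):
    """Filter a high-frequency sensor stream and store only meaningful deviations as transactions."""
    for product_serial, value in stream:
        if value < LOWER_C or value > UPPER_C:     # only deviations carry information
            transactions.append({
                "product_serial": product_serial,
                "quantity": "temperature_C",
                "value": value,
                "recorded_at": datetime.now(timezone.utc).isoformat(),
            })

# Simulated burst of streaming data: most readings are discarded.
transform([("PRD-0001", 180.2), ("PRD-0001", 183.7), ("PRD-0001", 179.9), ("PRD-0001", 176.4)])
print(len(transactions))  # -> 2 stored transactions
```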


4.1.3.2 Integration from the analytics software to the real-time system

Via the dashboard of the analytics software, feedback can be given to the operator at the machine, e.g. an alert to an operator or technical service when the analytics software predicts a breakdown based on (transactional records of) sensor data. Another possibility is to provide feedback to a machine in the process by changing its settings. Keeping the first example in mind, the integration of the analytics software with the actual process (which is a one-way integration) runs via a dashboard and a human. The second case (analytics software directly changing settings of a machine) is not possible in every situation and heavily depends on the digitization of the process steps and the machines. When a machine is capable, the analytics application would hold the set points or ranges and trigger above or below the set point, resulting in a predefined message being pushed to the operating system of the machine concerned (event-driven data movement). The machine would need to be programmed in such a way that it understands the message and is able to change its settings automatically based on it. This solution would require customization, since different machines are often based on different technology and are controlled by different control systems. Also, contractual and legal restrictions should be taken into account when deciding whether this is an option.
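The event-driven pattern described here could be sketched as follows; the message format, topic name and threshold are hypothetical, and no specific machine protocol is assumed (a real setup might use OPC UA, MQTT or a vendor-specific interface).

```python
import json

TEMP_UPPER_C = 182.0   # hypothetical trigger threshold held by the analytics application

def publish(topic: str, payload: dict) -> None:
    """Stand-in for a message broker or machine interface."""
    print(f"[{topic}] {json.dumps(payload)}")

def on_new_prediction(machine_id: str, predicted_temperature: float) -> None:
    """Event handler: push a predefined message when the predicted value crosses the set point."""
    if predicted_temperature > TEMP_UPPER_C:
        publish(
            f"machines/{machine_id}/commands",
            {"command": "reduce_heating", "reason": "predicted_overtemperature",
             "predicted_value": predicted_temperature},
        )

on_new_prediction("MCH-01", 184.3)   # triggers a message to the machine's control system
```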

4.1.3.3 Integration from transaction processing systems to the process digital shadow

Data from legacy transaction processing systems should be downloaded to the digital shadow. These data are, on the one hand, stored in the ERP system, such as Bills-of-Material (BoMs), operations (routings) and orders. On the other hand, they include data from quality management systems, specifying the measurements and controls necessary during manufacturing. This is a one-way data integration effort between transaction processing systems. For the transfer of BoMs and data on routings, the data can be transferred via event-driven data movement, using messages. In this case, a change in the BoM or routings would have to be set as a trigger. For the data sharing on orders, event-driven data movement could also be used. The event would have to be pre-defined to trigger an update, for example every week.

4.1.3.4 Integration from the process digital shadow to the transaction processing systems

Since it is usually required that legacy transaction processing systems are informed about actual progress as recorded in the process digital shadow, this integration is needed. In particular, an ERP system has to be informed about materials used, hours spent on operations, order progress and rejects. In a similar vein, quality control systems need data on measurements obtained, non-conformance reports, rejects and repairs. This again is a one-way integration between two transaction processing databases and thus works via the same route described in the previous paragraph. Although information is exchanged between transaction processing systems and the process digital shadow in both directions, we do not speak of two-way integration, since the content of the information (and thus the place of the data within the transactional database) differs. The type of integration could again be event-driven data movement. Events would have to be predefined in the process digital shadow, for example the occurrence of a reject.
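To make this event-driven reporting towards legacy systems concrete, the sketch below builds an illustrative message when a reject occurs; the field names and the 'reject' event are assumptions for illustration, not the case company's actual interface.

```python
import json
from datetime import datetime, timezone

def send_to_erp(message: dict) -> None:
    """Stand-in for the messaging channel towards the ERP system."""
    print("to ERP:", json.dumps(message))

def on_reject(order_id: str, product_serial: str, material_used_kg: float, hours_spent: float) -> None:
    """Predefined event in the process digital shadow: a reject triggers progress and consumption updates."""
    send_to_erp({
        "event": "reject",
        "order_id": order_id,
        "product_serial": product_serial,
        "material_used_kg": material_used_kg,
        "hours_spent": hours_spent,
        "reported_at": datetime.now(timezone.utc).isoformat(),
    })

on_reject("ORD-2020-017", "PRD-0001", material_used_kg=3.2, hours_spent=1.5)
```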

4.1.3.5 Integration from document management systems to the process digital master


4.2 Phase 2: Solution refinement

After the formalization of the suggested solution, referred to as the artifact or the model Integration requirements of the process digital twin (Figure 7), this model has been reviewed by an expert group which resulted in some changes and in a refined solution/artifact (Figure 11). Although the necessity for the databases and the data warehouse (included in Figure 7) did not change as a result of the reviews, we did not include them in Figure 11 for reasons of clarity.

In the following paragraphs the changes and the influence thereof on the integration requirements of the process digital twin will be elaborated.

Summarizing, the following changes have been proposed and taken into account in the development of the refined artifact:

1. Supplied transactional digital master model

2. Documents data in transactional data in the process digital master and process digital shadow

3. Expansion of the Analytics & Simulation part of the digital twin

In section 4.2.4 the final artifact will be presented.

Figure 8 gives an overview of the components of the refined model and the data movements the proposed integrations in Figure 9 will support.

4.2.1 Supplied transactional digital master model

Two of the experts stated that, in some cases, an external supplier (a machine builder, or a software provider in the case of common machines) can provide or sell a digital master model. This would be a model in transaction processing data and could possibly be based on the machine supplier's last test simulation. Although the model presented mainly focuses on the external integrations of a process digital twin, asset digital twins can provide input, as elaborated earlier and shown in Figure 2.

The machine or the product manufactured can provide relevant data to the process digital master and/or shadow via the add-on construction. In the graph below, the external supplier has been added, supplying a transaction processing master model of a machine to the digital master of the machine. Via the add-on construction (which is not shown in Figure 11 for clarity reasons) the data can be added to the process digital master. A transactional digital master model of a machine is based on a data model, also called a schema. After collecting real-time sensor data, this data should be transformed into transactional data and stored according to a data schema. To be able to compare the digital shadow with the digital master, the Analytics & Simulation software needs to be able to recognize the data. Here, the data model plays an important role. For a machine for which a transactional digital master model is available, it is thus important that the process digital master and the process digital shadow share the data model. Within the transactional database of the digital shadow, several separate data models can co-exist.
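A minimal sketch of what sharing such a data model could look like is given below; the schema fields are invented for illustration. The same schema definition describes the supplied master model and validates records stored in the shadow, so that the Analytics & Simulation software can recognize both.

```python
from dataclasses import dataclass, fields

@dataclass
class MachineCycleRecord:
    """Shared data model (schema) for one machine cycle, used by master, shadow and analytics."""
    machine_id: str
    cycle_number: int
    temperature_c: float
    pressure_bar: float
    cycle_time_s: float

def conforms_to_schema(record: dict) -> bool:
    """Check whether an incoming shadow record matches the shared schema before it is stored."""
    expected_fields = {f.name for f in fields(MachineCycleRecord)}
    return set(record) == expected_fields

incoming = {"machine_id": "MCH-01", "cycle_number": 118,
            "temperature_c": 181.3, "pressure_bar": 6.1, "cycle_time_s": 244.0}
if conforms_to_schema(incoming):
    stored = MachineCycleRecord(**incoming)   # the shadow stores data in the master's structure
```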

Figure 9: Changes 4.2.1 and 4.2.2

Besides, the supply of a transactional digital master model results in a dichotomy of the digital master: there is now a documents part and a transaction processing part of the process digital master. A supplied digital master model results in some additional integrations, namely:

1. Integration from the process digital master to the transformation application
2. Integration from the process digital master to the process digital shadow


4.2.1.1 Integration from the process digital master to the transformation application

This integration concerns the transfer of the data model from the machine digital master (from the supplier) to the transformation application. In the process digital master, a trigger could be defined so that every time a new machine digital master is added on, its data model is sent to the transformation application. The transformation application can use this in the selection of which parameters to measure and how to order them.

4.2.1.2 Integration from the process digital master to the process digital shadow

The process digital master transfers the digital master data model to the process digital shadow in order to provide the structure in which the data should be saved. It is possible to organize this via event-driven data movement. The triggering event is the same as in the previous integration, namely the arrival of a new transactional master model; the content of the message is the data model that is exchanged.

4.2.2 Documents data in transactional data in the process digital master and process digital shadow

The data requirements of the process digital master and shadow, as described in 4.1.1, are often prepared in different formats. A query cannot find specifications in a document when it is not organized according to a fixed structure. A route for extracting specific values from documents will be elaborated below.

4.2.2.1 Conditional integration of documents data with the transactional part of the digital master and shadow

When a company decides to commit to a certain standard format (e.g. a table with a given name in which specification X is always displayed in the third row, second column), it becomes possible to read specific data out automatically. To the best of our knowledge, such a general documents standard does not exist in the industry. Yet, when a company creates its own standard format, it is also possible to extract values from that format. Via a programmed function, relevant values (e.g. upper bounds, lower bounds and performance specifications) can be extracted and saved as transactions in the transaction processing part of the digital twin or shadow. When a document is not standardized, a different function would have to be written for every document to read the values from the documents and save them as transactions. Obviously, this would be very time-consuming, making manual entry of the values the better solution. The transactions extracted from the digital shadow and digital master documents, together with the operational data of the digital shadow, will be communicated to the Analytics & Simulation part. This way, the simulation can be expanded with norm values which place the real-time operational data in context.
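As an illustration of such a programmed extraction function, the sketch below reads one specification out of a company-standard document; the file layout (a CSV with the specification name in the first column and the value in the second) is an assumed convention, not an industry standard.

```python
import csv
import io

# Example of a document in an assumed company-standard format.
SPEC_DOCUMENT = """name,value,unit
upper_bound_temperature,182.0,C
lower_bound_temperature,178.0,C
max_cycle_time,300,s
"""

def extract_specification(document_text: str, spec_name: str) -> float:
    """Extract one specification value from a standardized document so it can be saved as a transaction."""
    reader = csv.DictReader(io.StringIO(document_text))
    for row in reader:
        if row["name"] == spec_name:
            return float(row["value"])
    raise KeyError(f"specification {spec_name!r} not found")

upper_bound = extract_specification(SPEC_DOCUMENT, "upper_bound_temperature")
print(upper_bound)  # 182.0, ready to be stored in the transaction processing part
```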

4.2.2.2 Integration of documents management system to Analytics & Simulation

Also, for every product/machine, a transaction could be created with a link to the product/machine documentation. Based on the viewing-only authorization of the process digital master and/or shadow, specific links could be created which can be extracted by the Analytics & Simulation part of the process digital twin and used in the configuration of the user interface.


4.2.3. Expansion of Analytics & Simulation part of the digital twin

Based on the need for specific functions within the analytics and simulation software, the model of integration requirements of the process digital twin has been adapted. Figure 10 displays the refined version of the analytics and simulation software and its integrations with the process digital shadow. Two distinct types of integrations between the process digital shadow and the analytics and simulation software are necessary:

1. Integration between the process digital shadow and the Monitoring & Simulation component of the Analytics & Simulation part

2. Integration of the process digital shadow and the functions Data analytics and Modelling (components of the Analytics & Simulation software)

Figure 10: Expansion of Analytics & Simulation part of the process digital twin

4.2.3.1 Integration of the process digital shadow and Monitoring & Simulation

For the function of monitoring and simulation, a direct integration via protocols can be applied between the transactional data in the process digital shadow and the monitoring and simulation part (as displayed in Figure 10). The nature of the data to be uploaded is largely determined by the simulation models employed and will likely be linked to the process performance metrics (i.e. key performance indicators) and the product specifications. It should be noted that real-time in this context means near real-time, since processing, transferring and displaying the data will cause some time-loss. This is a one-way integration.

4.2.3.2. Integration of the process digital shadow and the functions Data analytics and Modelling (components of the Analytics & Simulation software)

The functions data analytics and modelling require structured historical data. A data warehouse, in which OLAP technology can be applied, can provide this. With such structured data, what-if analysis can take place, and artificial intelligence, machine learning and modeling techniques can be applied. To update the data from the process digital shadow for these purposes, the Extract, Transform, Load (ETL) technique can be applied. The transactional data in the process digital shadow that is of interest to the analytics software is uploaded into a data warehouse and structured in data cubes by OLAP technology. Since there is no need for real-time data uploading in this case, ETL can be executed at a convenient moment, or a periodic update of changed and new values in the digital shadow can be set. Since the end user of the analytics software initiates which data will be loaded into the analytics software, it is not necessary to harmonize business logic in both systems.

Both these integrations are one-way integrations as well.

4.2.4 Artifact: model with integration requirements of a process digital twin

The refined model with integration requirements of a process digital twin is displayed in Figure 11.

Compared to the model before refinement (Figure 7), this model is more complete: it separates different functionalities in the Analytics & Simulation part of the digital twin, adds the possibility of obtaining a machine digital master model from a supplier, and adds the possibility of integrating documents data within the transaction processing database of the digital twin, which is integrated with the Analytics & Simulation part of the process digital twin. The last two changes, however, are both conditional.

However, being aware of this conditionality provides the opportunity to prepare for and create a new situation.

According to one of the experts, for future configurations it is advisable to let data streams follow a similar pattern. In this model, all data streams for the Analytics & Simulation part are (consciously) designed to run via the process digital shadow. However, it is also possible to let the data streams run another route, for example via the Analytics & Simulation part of the digital twin. In that case, the Analytics & Simulation part receives all the data of the physical process in real time and should filter, normalize and store the data in a database so it can be used for further analysis.


5. Discussion

In this chapter, the artifact developed will be discussed, discerning theoretical and managerial implications. The artifact, as presented in Figure 11, provides a model for companies orienting themselves on the implementation of a process digital twin in a manufacturing company. The model, which answers the main research question, provides the integration requirements for a process digital twin, in which four data processing system types have been distinguished. However, it is likely that the content of a process digital twin will change with its set purpose. With the purpose of the process digital twin, its data requirements can change, and possibly the data source systems as well. In the model presented, only four of the information processing system types have been taken into account. Yet, use cases for process digital twins are possible in which integrations with other system types are necessary (e.g. based on process data from a workflow management system). Besides, for some of the integrations displayed in the model with the integration requirements of a process digital twin, different options might be possible. 'There is no one best way' also applies in this case. Existing integrations and business logic (with their constraints) will influence the final choice of integrations. The integration of a process digital twin requires customization. Yet, when orienting on the implementation of a process digital twin, this research, including the answers to the sub-questions throughout the document, provides a concrete image.

5.1 Theoretical implications

This research contributes by giving insight into how digital twins of assets (machine, product) might relate to a process digital twin and by giving an overview of the integration requirements of a process digital twin with regard to legacy systems. This has not yet been described in the literature, although digitization and the costs of badly integrated systems are recognized. Besides, this research contributes by analyzing the different parts of the process digital twin and their data requirements from the perspective of applications with distinctly different ways of processing data.

5.2 Managerial implications

This research contributes by presenting a model that provides insight into which integrations are needed, both within the process digital twin and with the remaining information system landscape. The model provides input to a company orienting itself on the implementation of a process digital twin, giving an overview of what a digital twin is and how it can be integrated within an existing information system landscape. This can be useful to general management and enterprise architects, as it graphically displays what a digital twin is and what it would entail in terms of integrations.


6 Conclusion

The last chapter gives a conclusion to this research, including limitations and recommendations for future research.

This study considered the integration of a process digital twin within an existing information system landscape, in order to profit from the advantages of using digital twins. To protect earlier investments in legacy systems and ensure that existing information systems do not have to be replaced, a model has been developed that shows how to integrate a process digital twin within an existing landscape. This model is intended to provide the process digital twin access to all data needed for a complete digital representation of the process, while taking the different natures of the information systems into account. To arrive at this model, we first conducted a literature review on digital twins and on the different types of information systems within manufacturing companies that might be of interest to the process digital twin. Thereafter, the data needs of the process digital twin were explicated, discerning the needs of the digital master, the digital shadow and the analytics part of the digital twin. Then, the integrations needed within the digital twin were elaborated in order to construct the model on how to integrate a digital twin within an existing landscape.

In validating the model, another possibility arose, which resulted in a second option of the model that is in fact an extension of the first.

From this research, it can be concluded that the integration of a process digital twin is possible, but that it requires attention to the different data processing behaviors of the applications involved. Building these integrations will require a considerable investment of time and effort. The potential benefits, however, in terms of enhanced data reliability, decision-making certainty, removal of redundancy in monitoring systems and better means to detect and remove the reasons behind variations (Al-Najjar, Alsyouf, 2000), seem promising.


References

Al-Najjar, B., Alsyouf, I. (2000) Improving effectiveness of manufacturing systems using total quality maintenance. Integrated Manufacturing Systems, 11(4), 267-276.

https://doi-org.proxy-ub.rug.nl/10.1108/09576060010326393.

Andaluz, E. (2017) The Process digital twin: A step toward operational excellence. Retrieved from

https://cloudblogs.microsoft.com/industry-blog/manufacturing/2017/10/23/the-process-digital-twin-a-step-toward-operational-excellence/ on 08-10-2019.

Boyes, H., Hallaq, B., Cunningham, J., Watson, T. (2018) The industrial internet of things (IIoT): An analysis framework. Computers in Industry. 101(10), 1-12

Boschert, S., Rosen, R. (2016) Digital Twin – the Simulation Aspect, Mechatronic Futures, Springer,

https://link.springer.com/chapter/10.1007%2F978-3-319-32156-1_5.

Budiarto, D.S., Prabowo, M.A., Herawan, T. (2017) An integrated information system to support supply chain management & performance in SMEs. Journal of Industrial Engineering and Management. 373-387. DOI: 10.3926/jiem.2180

Caridi, M., Sianesi, A. (2000) SCM-ERP integration: organisational, managerial and technological issues. Proc. 1st Int. Conf. on Systems Thinking in Management, 124-129.

Eurostat, 2018. E-business integration. Retrieved from:

https://ec.europa.eu/eurostat/statistics-explained/index.php/E-business_integration on 9-10-2019.

Douglass, B.P. (2014) Real-Time UML Workshop for Embedded Systems. Second edition. Oxford: Newnes.

Errattahi, R., Fakir, M., Salmam, F.Z. (2014) Explanation in OLAP Data Cubes. Journal of Information Technology Research, 7(4), 63-78.

Farsi, M., Daneshkah, A., Hosseinian-Far, A., Jahankhani, H. (2020) Digital Twin Technologies and Smart Cities (eBook), Switzerland: Springer Nature Switzerland.

Garfinkel, J. (2018) Gartner Identifies the Top 10 Strategic Technology Trends for 2019. Retrieved from:

https://www.gartner.com/en/newsroom/press-releases/2018-10-15-gartner-identifies-the-top-10-strategic-technology-trends-for-2019 on 9-10-2019.

Grieves, M. (2003) PLM – Beyond lean manufacturing, Whitepaper. Soc.manufacturing engineers, 23-23.

Grieves, M., (2014) Digital twin: manufacturing excellence through virtual factory replication, Whitepaper.

Hasselbring, W. (2000) Information system integration. Communications of the ACM, Vol. 43 (6), 33-38.

Hohpe, G., Woolf, B. (2004) Enterprise integration styles. Inform IT. Retrieved from:

http://www.informit.com/articles/article.aspx?p=169483&seqNum=4 on 09-01-2019.

Holmström, J., Ketokivi, M., Hameri, A.P. (2009) Bridging practice and theory: A design science approach, Decision Sciences. 40 (1), 65-87.


Jablonski, S., Bussler, C. (1996) Workflow management: modelling concepts, architecture and implementation. London: International Thomson Computer Press.

Jasperneite, J., Schumacher, M., Weber, K. (2007) Limits of increasing the performance of industrial Ethernet protocols. 2007 IEEE Conference on Emerging Technologies and Factory Automation (EFTA 2007), 17-24.

Kunath, M., Winkler, H. (2018) Integrating the Digital Twin of the manufacturing system into a decision support system for improving the order management process. Procedia CIRP 72, 225–231.

Lari, A. (2002) An integrated information system for quality management. Business Process Management Journal, 8(2), 169-182.

Lee, J. (2015) Smart Manufacturing Systems. Informatik Spektrum. 38(3), 230-231.

Madni, A., Madni, C., Lucero, S. (2019) Leveraging Digital Twin Technology in Model-Based Systems Engineering. Systems. 1-13.

Madni, A.M.; Sievers, M. (2014a) System of Systems Integration: Key Considerations and Challenges. Syst. Eng. 17, 330–347.

Madni, A.M.; Sievers, M. (2014b) Systems Integration: Key Perspectives, Experiences, and Challenges. Syst. Eng. 17, 37–51.

Menard, S. (2017) 3 ways digital twins are going to help improve oil and gas maintenance and operations. Retrieved from:

https://www.linkedin.com/pulse/3-ways-digital-twins-going-help-improve-oil-gas-sophie-menard on 19-09-2019.

Mohamed, N., Mahadi, B., Miskon, S., Haghshenas, H., Muhd Adnan, H. (2013) Information System Integration: A Review of Literature and a Case Analysis. Chapter of: Mathematics and Computers in Contemporary Science. Publisher: World Scientific and Engineering Academy and Society. 68-77.

Osinde, O., Byiringiro, J.B., Gichane, M.M., Smajic, H. (2019) Process Modelling of Geothermal Drilling System Using Digital Twin for Real-Time Monitoring and Control. Designs, 3(3), 45.

Perino, J. (2019) The process manufacturing digital twin. LNS Research. Retrieved from:

https://cdn2.hubspot.net/hubfs/136847/IC%202019%20MAR%20APM%204.0_DIGITAL%20TWIN%20PROCESS/2019_DigitalTwinProcessIndustries_LNS.pdf on 9-10-2019.

Pettey, C. (2017) Prepare for the Impact of Digital Twins; Gartner: Stamford, CT, USA.

Rodič, B. (2017) Industry 4.0 and the New Simulation Modelling Paradigm. Organizacija.

Rosen, R., Wichert, G. von, Lo, G., Bettenhausen, K.D. (2015) About The Importance of Autonomy and Digital Twins for the Future of Manufacturing. IFAC-PapersOnLine, 48(3), 567-572.

Shaw, K. (2018) Wat is een digital twin en wat is het nut? Retrieved from:

https://computerworld.nl/markttrends/105432-wat-is-een-digital-twin-en-wat-is-het-nut on 10-10-2019.


Stark, R., Kind, S., & Neumeyer, S. (2017) Innovations in digital modelling for next generation manufacturing system design. CIRP Annals - Manufacturing Technology.

Stecken, J., Ebel, M., Bartelt, M., Poeppelbuss, J., Kuhlenkötter, B. (2019) Digital Shadow Platform as an Innovative Business Model. Elsevier. Procedia CIRP 83. 204-209.

Swedberg, C. (2018) Digital twins bring value to big RFIC and IoT data, Retrieved from:

https://www.rfidjournal.com/articles/view?17421 on 19-09-2019.

Tao, F., Zhang, M., Nee, A.Y.C. (2019) Digital Twin Driven Smart Manufacturing. eBook: Elsevier. Retrieved from:

https://books.google.nl/books?id=PvKGDwAAQBAJ&printsec=frontcover&hl=nl&source=gbs_ViewAP I&redir_esc=y#v=onepage&q&f=false on 04-09-2019.

Tharma, R., Winter, R., Eigner, M. (2018) An approach for the implementation of the digital twin in the automotive wiring harness field. System Engineering and Design, 3023-3032.

Uhlemann, T. H. J., Lehmann, C., Steinhilper, R. (2017) The Digital Twin: Realizing the Cyber-Physical Production System for Industry 4.0. In Procedia CIRP.

Van Aken, J., Chandrasekaran, A., Halman, J., Ketokivi, M. (2016) Conducting and publishing design science research: Inaugural essay of the design science department of the Journal of Operations Management. https://doi.org/10.1016/j.jom.2016.06.004

Van der Ent, L. (2019) Snellere productontwikkeling en procesoptimalisatie met digital twin. Retrieved from ptindustrieelmanagement.nl on 04-09-2019.

Venable, J., Baskerville, R. (2012) Eating our own Cooking: Toward a More Rigorous Design Science of Research Methods. Electronic Journal of Business Research Methods, 10 (2), pp. 141–153.

Waschull, S., Wortmann, J.C., Bokhorst, J.A.C. (2019a) Manufacturing Execution Systems: The Next Level of Automated Control or of Shop-Floor Support?

Waschull, S., Wortmann, J.C., Bokhorst, J.A.C. (2019b) Identifying the role of Manufacturing Execution Systems in the IS landscape. International Federation of Information Processing. 502-510.

Wortmann, J.C. (2019) Logistics Information Systems. Syllabus Fall 2019.


Appendices

Appendix I:

For the sake of completeness, this appendix describes the two remaining application types, linked to the types of work they support (Wortmann, 2019).

Business process management - Workflow Management System

In a Workflow Management System (WfMS), work from a business process is automatically allocated by a computer system to resources (humans and applications) according to a predefined schema of the process, the available resources and their dependencies (Jablonski, Bussler, 1996). According to Wortmann (2019), the WfMS facilitates collaboration and coordination among employees, assists them, and allows a certain degree of automation of the process. When a human activity in a business process becomes enabled, the WfMS is triggered to initiate the activity by sending a message (or messages) to the employee(s) involved. When an automated activity becomes enabled, the WfMS starts the applicable software, or sends it a message, which initiates the activity in due time. Besides, the WfMS monitors the progress of business process instances, gathers statistics of a business process (e.g. lead times), and provides support (e.g. alerting when activities are delayed). In general, a WfMS interprets business process models and provides automated support (Wortmann, 2019).
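
To make this triggering behavior concrete, a minimal sketch is given below; the process schema, activity names and notification mechanism are hypothetical and only illustrate how a WfMS routes an enabled activity either to an employee or to an application.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Activity:
    name: str
    kind: str          # "human" or "automated"
    performer: str     # employee (role) or application that performs it

# Hypothetical predefined process schema: activities in their execution order.
PROCESS_SCHEMA: List[Activity] = [
    Activity("enter order", "human", "sales employee"),
    Activity("plan production", "automated", "planning application"),
    Activity("confirm order", "human", "sales employee"),
]

def notify_employee(activity: Activity) -> None:
    # A real WfMS would place a work item in the employee's worklist.
    print(f"Work item for {activity.performer}: perform '{activity.name}'")

def start_application(activity: Activity) -> None:
    # A real WfMS would invoke the application or send it a message.
    print(f"Message to {activity.performer}: start '{activity.name}'")

def run_instance(schema: List[Activity]) -> None:
    """Route each activity to the right resource as it becomes enabled."""
    handlers: Dict[str, Callable[[Activity], None]] = {
        "human": notify_employee,
        "automated": start_application,
    }
    for activity in schema:   # activities become enabled one after the other
        handlers[activity.kind](activity)

run_instance(PROCESS_SCHEMA)
```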

Logistical planning work - Advanced Planning Systems
