Collaboration in the Physical Internet:
An exploration of third party data sharing enabled by Enterprise Information Systems

M.J. Kreijkes

MSc Thesis presented for the degree:
DDM Technology & Operations Management (2018-2020)
9 December 2019

Lead supervisor: Dr. N.B. Szirbik
Second supervisor: Dr. A. Small


Acknowledgements


Abstract

Purpose: The logistics industry is too inefficient and inherently unsustainable due to its polluting nature. A proposed solution is the Physical Internet, but there are major challenges on how the envisaged increase in collaboration and data sharing can be ensured. To find a solution, data sharing systems that are currently in place have been examined and reverse engineered, revealing the critical requirements for data sharing and future Enterprise Information Systems. Based on these requirements, preliminary functional architectures of the Third Party Model are designed.

Methodology: Design Science Research is carried out to design a solution to the problem of collaboration and data sharing. Data is collected by conducting multiple case studies including observations and semi-structured interviews with industry experts. A validation workshop is conducted to evaluate the critical requirements.

Findings: The findings indicate that data sharing could potentially be enhanced by using a third party mediator. In such a system, stakeholders require transparency, while power and outputs are evenly distributed and not centrally located. In addition, the system should be open to access and easy to implement, but restricted by specific protocols that determine data and technology standards.

Originality/contribution: This is one of the first studies in PI that establishes critical requirements and develops a Third Party Model for collaboration and data sharing. The findings can be used in practice to develop new data sharing systems, and by academics for future research towards data sharing in a PI context.

Keywords: Collaboration, Data Sharing, Design Science Research, Digital Supply Chain, Enterprise Information Systems (EIS), Information Sharing, Information Technology (IT), Logistics, Physical Internet, Third Party Model.


Table of Contents

1 Introduction
2 Theoretical background
2.1 Logistics
2.2 Physical Internet
2.3 Enterprise Information Systems
3 Research questions
4 Methodology
4.1 Research method
4.2 Data collection
4.3 Validation method
4.4 Ethics and Risks
5 Analysis of case studies
5.1 Methods of collaboration and data sharing
5.2 Types of data shared
5.3 Reasons for data sharing
5.4 Challenges in data sharing
5.5 Critical requirements
6 Design
6.1 Third party model
6.2 Operational concept
6.3 Functional architectures
7 Validation
8 Discussion
8.1 Discussion of findings
8.2 Implications for theory
8.3 Implications for practice
8.4 Limitations and future research directions
9 Conclusion
10 References
11 Appendices
Appendix A - PESTLE analysis logistics industry
Appendix B - Data collection


List of tables and figures

Table 1 - Strategic research directions in logistics (Speranza, 2018)
Table 2 - Unsustainability symptoms and their impact (Montreuil, 2011)
Table 3 - Four challenge areas for future EIS (El Kadiri et al., 2016)
Table 4 - DSR artifact levels (Gregor & Hevner, 2013)
Table 5 - Strategies for enhancing research rigor (Devers, 1999)
Table 6 - Methods of data sharing used by case companies
Table 7 - Types of sensitive data
Table 8 - Types of non-sensitive data
Table 9 - Characteristics of sound requirements (Buede & Miller, 2016)

Figure 1 - Overview of information flows in the envisaged PI system
Figure 2 - Interconnected supply network (Ballot et al., 2012)
Figure 3 - Container modularity (Montreuil, 2011)
Figure 4 - Example of efficient transport planning with collaboration
Figure 5 - DSR cycle (based on Wieringa, 2009)
Figure 6 - International Data Space architecture (Otto et al., 2016)
Figure 7 - Potential design of a Third Party Model (TPM)
Figure 8 - Functional Architecture A-2
Figure 9 - Interaction diagram A-2
Figure 10 - Functional Architecture A-1
Figure 11 - Interaction diagram A-1
Figure 12 - Functional Architecture A.0 (alternative 1)
Figure 13 - Interaction diagram A.0 (alternative 1)
Figure 14 - Functional Architecture A.0 (alternative 2)
Figure 15 - Interaction diagram A.0 (alternative 2)
Figure 16 - Functional Architecture A.0 (alternative 3)


List of abbreviations

ADOM - Automation, Decentralization, Openness, Modularity
AI - Artificial Intelligence
ALICE - Alliance for Logistics Innovation through Collaboration in Europe
API - Application Programming Interface
ASN - Advanced Shipping Notification
CRM - Customer Relationship Management
CSCMP - Council of Supply Chain Management Professionals
EDI - Electronic Data Interchange
EIS - Enterprise Information System
ERP - Enterprise Resource Planning
IoT - Internet of Things
IS - Information System
KPI - Key Performance Indicator
LSP - Logistics Service Provider
MAS - Multi-Agent System
OLSCM - Operations, Logistics and Supply Chain Management
PI - Physical Internet
RFID - Radio-Frequency Identification


1 Introduction

Every year in Europe, around €160 billion is lost due to low capacity utilization of trucks and containers alone, with tremendous unnecessary amounts of CO2 emissions as a result (CSCMP, 2015). As sustainability plays an increasingly important role in the world, it is essential to enhance logistics processes. The Physical Internet (hereafter PI) is a relatively new but promising concept, aiming to improve efficiency and sustainability in logistics. It provides organizations with the potential to radically decrease triple bottom line (TBL) issues such as transport costs, CO2 emissions and truck driver turnover. This can be achieved by combining logistics networks, using smart modular containers, and an increased sharing of information. Building on the four PI pillars of Automation, Decentralization, Openness and Modularity (Montreuil et al., 2013), PI could be the missing link in transforming towards a sustainable supply chain (Montreuil, 2011). The founding father of the PI concept is Benoit Montreuil, who first described the concept in 2009 in his so-called Physical Internet Manifesto. In one of his seminal papers, PI is defined as "an open global logistics system founded on physical, digital, and operational interconnectivity, through encapsulation, interfaces and protocols" (Montreuil et al., 2013). In combination with more recent research by several academics (Ballot et al., 2014; Crainic & Montreuil, 2016; Treiblmaier, 2019), a group of papers can be identified as being of paramount importance in this field. These will be discussed thoroughly in the theoretical background section.


For the concept to become fully operational, which is envisioned to take place between 2030 and 2040, developments in Enterprise Information Systems (EIS) based on the four PI pillars will play a paramount role. What these developments and challenges are remains undetermined in the literature. What is certain is that the tremendous increase in data and information creates additional challenges. This thesis therefore addresses the question: how can data be shared effectively in the Physical Internet, enabled by Enterprise Information Systems? A set of critical requirements will be created to assist researchers and practitioners. Based on these findings, a preliminary functional architecture that improves data sharing through a third party mediator will be designed and evaluated. In conclusion, this thesis will 1) provide exploratory research in the field of Enterprise Information Systems and data sharing; 2) attempt to contribute to PI research by identifying critical requirements and challenges for data sharing and future EIS; and 3) provide openings for future research in PI and EIS.

The remainder of this paper will contain a review of current (non-)academic literature on logistics, PI and EIS, an explanation of the methodology used, the findings that were obtained, and a thorough evaluation and discussion of these results. Finally, the conclusion will summarize the main findings of this research.


2 Theoretical background

To describe the setting of the research, this chapter first examines recent developments and challenges in logistics through a review of (academic) literature and interviews with industry experts for practical insights. Subsequently, although PI is a rather new concept, several papers provide great value in this field and will be reviewed to build a greater understanding of what PI entails. After this, the focus will be on the area of Enterprise Information Systems and data sharing, which are firmly grounded in the literature but will also be complemented with expert interviews. Following the review of these fields, gaps in the literature are identified, based on which the research question and sub-questions are constructed.

2.1 Logistics

As PI is expected to exist within a logistics context, the present situation in logistics should be examined first through an industry analysis. Based on this analysis, challenges are identified, followed by the proposed solution to tackle these challenges.


2.1.1. Industry analysis

Nowadays, logistics (sometimes called 'transportation & logistics') is one of the largest global industries, with a total worth of over USD 4,730 billion (€4,251 billion) (IMARC Group, 2018). It comprises almost all types of industries and is often subdivided into segments such as land, water and air. With the increase in globalization, the market has become increasingly dominated by global logistics corporations like FedEx, DHL, UPS, DSV and Maersk. More recently, large technology companies like Amazon, Google and Alibaba are increasingly investing in this industry, making it exceptionally competitive. Due to societal and technological developments, the logistics industry is rapidly changing (Zijm et al., 2019). Under the term 'Industry 4.0', developments are being made in multiple industries to increase efficiency, sustainability and customer satisfaction. New technologies like Internet of Things (IoT) and Artificial Intelligence (AI) are adopted by more and more companies (EFT & JDA, 2019). Furthermore, different modes of transport like ships, planes, rail and road are combined to a greater extent for intermodality advantages. Next to these technological developments, there are also strategic developments. Speranza (2018) discusses three strategic research directions: systemic, collaborative and dynamic (see table 1 below). Academics are increasingly focusing on integrating technological and strategic developments. The Physical Internet addresses all three directions.

Systemic: "Better solutions to problems can be identified when broader parts of the supply chain are jointly modeled and optimized."
Collaborative: "A tool that enables integration and global optimization of a supply chain."
Dynamic: "Systems should be more reactive to changes and provide more effective responses."

Table 1 - Strategic research directions in logistics (based on Speranza, 2018)


2.1.2. Challenges

Despite the increasing focus on achieving higher efficiency and sustainability, there remains huge potential to further develop this industry. In 2008, the efficiency rate was estimated to be less than 10% (Ballot & Fontane, 2008). There are many other wastes in the current logistics industry, as summarized by Montreuil in table 2. This table also indicates what type of waste each symptom entails, in relation to the Triple Bottom Line (TBL) of people, planet and profit. These wastes are exacerbated by the amount of uncertainty in the current logistics industry, caused by many factors, e.g. new regulations, technologies, public pressure regarding climate change, depletion of natural resources and, as mentioned by several interviewees, the rise of e-commerce.

Symptoms | People | Planet | Profit
1. We are shipping air and packaging X X
2. Empty travel is the norm rather than the exception X X
3. Truckers have become the modern cowboys X X
4. Products mostly sit idle, stored where unneeded, still often unavailable X X
5. Production and storage facilities are poorly used X X
6. So many products are never sold, never used X X X
7. Products do not reach those who need them the most X X
8. Products unnecessarily move X X
9. Fast & reliable intermodal transport is still a dream X X X
10. Getting products in/out of cities is a nightmare X X X
11. Networks are neither secure nor robust X X
12. Smart automation & technology are hard to justify X X
13. Innovation is strangled X X X

Table 2 - Unsustainability symptoms and their impact (Montreuil, 2011)


2.2 Physical Internet

Apart from a description of the concept in the Physical Internet Manifesto in 2009, PI first appeared in academic literature in 2010 (Montreuil et al., 2010). Therefore, no exhaustive body of papers is available to consult yet compared to other research areas in logistics, supply chain and operations management. As a result, most of the research in this field (44%) is conceptual/theoretical (Treiblmaier et al., 2016).

The Physical Internet is an analogy to the Digital Internet, which faced similar goals, challenges and requirements several decades ago. PI is described as "an open global logistics system founded on physical, digital, and operational interconnectivity, through encapsulation, interfaces and protocols" (Montreuil et al., 2013) and primarily focuses on shipments larger than standard parcel sizes (UPS, DHL, etc.) up to standard container sizes (20/40 ft). It is expected to bring radical (digital) innovations to the current logistics network which improve efficiency and sustainability. This can be achieved by "transforming the way physical objects are handled, moved, stored, realized, supplied and used" (Montreuil, 2013). In other words, with the use of digital innovations and new technologies, as well as smart modular containers (PI-containers) and an increased sharing of information, PI can come into existence. This requires warehouse sharing, intermodal transport and increasing collaboration between stakeholders to create an open supply web (sharing of supply networks).


Figure 2 - Interconnected supply network (Ballot et al., 2012)

2.2.1 Physical Internet pillars

It is evident that PI is a global, overarching and integrating framework, which consists of many elements that should interconnect with each other. Most importantly, the PI is expected to be built on four foundational pillars that were briefly mentioned in the introduction: Automation, Decentralization, Openness and Modularity, often abbreviated as ADOM. Combining these primary requirements is what makes PI different from current best practices in logistics. Based on these pillars, the requirements for future EIS and data sharing will have to be constructed.

Automation concerns performing a task with minimal human support. In the ideal


Decentralization indicates that there should not be a central authority governing the system. This is of paramount importance to provide autonomy to the containers and protect data in the PI network. Despite the advantages, complete decentralization could also create challenges when all stakeholders pursue their own goals, potentially leading to congestion at specific important hubs or areas. Recent papers discuss whether a certain degree of centralization might be more beneficial to the envisaged PI system (Ambra et al., 2019). Having multiple third parties with an advisory role could help mitigate this risk.

The PI network is transparent and open for everyone to join: from suppliers, container owners, transportation companies and hubs to customers, independent of their size and without any limits. Openness requires transparency and sharing of information. With more data available, supply chains can plan more efficiently, creating advantages for all stakeholders. Developments have to be made to ensure that only the required stakeholders have access to the information/data they need. Another efficiency related advantage of PI is modularity: shipping and combining multiple (smaller) containers that are able to interlock with each other and with the mode of transport by latching and the use of intelligent technology. This has the potential to radically improve the container space utilization rate, which is currently too low as described by Montreuil, whether measured by weight, volume or value. Figure 3 below provides a visual representation of how multiple containers of varying sizes can be combined into one shipping unit. A challenge in this area can be to determine the right size of the required container for a specific shipment.
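To illustrate the sizing challenge mentioned above, the following minimal sketch greedily covers a shipment volume with a set of hypothetical modular container sizes. The sizes and the greedy rule are assumptions for illustration only, not part of the PI specification.

```python
# Minimal sketch (illustrative only): greedily covering a shipment volume
# with hypothetical modular PI-container sizes, largest first.
MODULAR_SIZES_M3 = [33.0, 16.0, 8.0, 4.0, 2.0, 1.0]  # assumed sizes, for illustration

def pick_containers(shipment_volume_m3: float) -> list[float]:
    """Return a list of container sizes whose total capacity covers the shipment."""
    chosen: list[float] = []
    remaining = shipment_volume_m3
    for size in MODULAR_SIZES_M3:
        while remaining >= size:
            chosen.append(size)
            remaining -= size
    if remaining > 0:                       # top up with the smallest module
        chosen.append(MODULAR_SIZES_M3[-1])
    return chosen

print(pick_containers(27.5))   # e.g. [16.0, 8.0, 2.0, 1.0, 1.0]
```

A greedy rule like this does not guarantee minimal unused space; a real PI system would likely use a more elaborate packing optimization.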


2.2.2 Main elements in the Physical Internet

The Physical Internet consists of several important elements, which will all be described below. All elements have physical (actors) as well as virtual entities (agents), which communicate with each other through software (EIS). In terms of software, the Physical Internet can be described as a decentralized multi-agent system (MAS). In the field of Computer Science, MAS is defined as “a system composed of multiple interacting computing elements, known as agents. Agents are computer systems that have two important capabilities: autonomous actions and interacting with others.” (Wooldridge, 2009). These agents act towards specific objectives, while observing their environment using sensors and intelligence (AI) (Weiss, 2013). In the PI, these agents communicate with each other and with the EIS of the main actors. However, data sharing issues like privacy, confidentiality and security become even more apparent in a decentralized multi-agent system, as there is a lack of control. Companies prefer to have a certain degree of sovereignty over their data: the ability to keep control over who has access to their data and when. This creates a trade-off: while the advantages of data sharing are evident, many companies still decide not to share because of the uncertainties regarding security.
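As a minimal illustration of the agent idea described above, the sketch below shows a container agent that perceives capacity offers and autonomously books the cheapest one. The message topics and the agent's objective are assumptions for illustration, not taken from the cited MAS literature.

```python
# Minimal sketch (illustrative only) of a virtual agent in a decentralized MAS:
# it observes incoming messages and acts autonomously towards its own objective.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Message:
    sender: str
    topic: str
    payload: dict

@dataclass
class ContainerAgent:
    name: str
    objective: str = "minimize transport cost"   # assumed objective, for illustration
    inbox: list = field(default_factory=list)

    def perceive(self, message: Message) -> None:
        self.inbox.append(message)

    def act(self) -> Optional[Message]:
        """Autonomous action: answer the cheapest capacity offer received so far, if any."""
        offers = [m for m in self.inbox if m.topic == "capacity_offer"]
        if not offers:
            return None
        best = min(offers, key=lambda m: m.payload["price"])
        return Message(self.name, "booking_request", {"offer_from": best.sender})

# Usage: two hub agents offer capacity, the container agent books the cheapest.
container = ContainerAgent("pi-container-001")
container.perceive(Message("hub-A", "capacity_offer", {"price": 120}))
container.perceive(Message("hub-B", "capacity_offer", {"price": 95}))
print(container.act())   # booking_request addressed to hub-B
```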

PI Container


Suppliers

Nowadays, suppliers are mostly responsible for making sure that all goods of the required quality are correctly made ready for shipment. An important task is to provide the LSPs and customers with a large amount of information about units, dimensions, type of goods shipped and more, possibly by providing an Advanced Shipping Notification (ASN) to the LSP and customer. In the PI system, it is important that the supplier still provides this information, which is expected to be communicated through the smart PI container to the other agents that require it. Suppliers are seen as the driving force behind PI developments, as they can indirectly force their clients (LSPs) to change (Meyer & Hartmann, 2019). The main advantage to suppliers is that the PI opens additional markets, as they can offer their products to a whole new range of customers.

Customers

On the receiving end of the shipments are the customers. Currently, customers activate the shipment process by placing an order. Beyond that, they have few tasks other than checking whether the container arrives on time and planning delivery with transporters. This is not expected to be different in a PI system. The main advantages that PI can offer are more automation and a larger market, as customers should be able to order everything, from everywhere, at every moment.

Logistics Service Providers (LSPs)


Intermodal hubs

Hubs are the places where goods are changed from one transportation unit to the other. This can be done at seaports, airports, railway stations or warehouses. When a shipment uses more than one type of transport, it is called intermodal; for example, goods may arrive at a seaport and be transported to the hinterland by train or truck. Intermodal transport is expected to increase customer service, while decreasing lead times and shipment costs (Cambra-Fierro & Ruiz-Benitez, 2009). Currently, these hubs already play a large role, and in the future PI network their role will be even more important due to the expected increase in containers and the use of intermodality. The main difference is that several hubs, such as warehouses, should become hubs shared with other stakeholders, instead of being privately used. In this way, the benefits of a shared PI network as described in figure 2 can become operational. This requires a radical change in the business model for warehouse owners.

Container owners

Container owners play a fairly passive role in the current logistics system. Shipping containers can be owned by LSPs, but also by other companies that lease or rent out these containers to make a profit but are not involved in any transport related activities. In the PI, the primary objective will not be different (making profit), but it is expected that the number of container owners will increase. This is caused by the increase in smaller containers of various sizes, which are easier to purchase and own, as they operate themselves guided by basic instructions from their owners. Furthermore, owners will be able to specify certain tasks for their containers. Thus, one of the main functions of the PI container is to maximize profit (or any other target) for its owner by operating as efficiently as possible.

Third party mediator


distribute the required information to customers that need it, but only with approval of the supplier. In this way, suppliers keep sovereignty over their data. Furthermore, the mediator can perform an optimization of the acquired data and provide this optimized advice to the stakeholders. Chapter 6 describes the functions of the mediator in more detail.
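A minimal sketch of the approval-gated sharing described above is given below; the class and method names are hypothetical and only illustrate the idea that the mediator releases a supplier's data solely to parties the supplier has explicitly approved.

```python
# Minimal sketch (an assumption, not the thesis' specification) of approval-gated
# distribution by a third party mediator: data is forwarded to a customer only if
# the supplier has approved that customer for that data category (data sovereignty).
class Mediator:
    def __init__(self):
        # approvals[supplier] = set of (customer, data_category) pairs
        self.approvals: dict[str, set[tuple[str, str]]] = {}
        self.store: dict[tuple[str, str], dict] = {}   # (supplier, category) -> data

    def register_approval(self, supplier: str, customer: str, category: str) -> None:
        self.approvals.setdefault(supplier, set()).add((customer, category))

    def submit(self, supplier: str, category: str, data: dict) -> None:
        self.store[(supplier, category)] = data

    def request(self, customer: str, supplier: str, category: str):
        """Data is released only with the supplier's prior approval."""
        if (customer, category) in self.approvals.get(supplier, set()):
            return self.store.get((supplier, category))
        return None   # no approval -> nothing is shared

mediator = Mediator()
mediator.submit("supplier-X", "shipment_contents", {"goods": "electronics", "units": 40})
mediator.register_approval("supplier-X", "customer-Y", "shipment_contents")
print(mediator.request("customer-Y", "supplier-X", "shipment_contents"))  # released
print(mediator.request("customer-Z", "supplier-X", "shipment_contents"))  # None
```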

2.2.3 Research avenues for the Physical Internet

The previous sections indicated several challenges for logistics and PI, as well as the most important elements. To speed up research and development, the EU set up a European Technology Platform (ETP) for PI: ALICE (Alliance for Logistics Innovation through Collaboration in Europe), as part of the Innovation Union initiative (ETP-ALICE, 2019). The ALICE ETP identified five roadmaps for future PI research: information systems; sustainable, safe and secure supply chains; corridors, hubs and synchromodality; global supply network coordination and collaboration; and urban logistics. All roadmaps were first envisaged to be completed around 2050; more recently, however, the target for PI was set to 2030-2040, while aiming for entirely carbon neutral, zero emission supply chains around 2050 (Ballot, 2018). As supply chain development in recent years is mainly driven by innovations in information systems (Treiblmaier, 2019), there is an increasing need to advance research in this area.


Collaboration

Next to interoperability of EIS, collaboration between stakeholders is regarded as an essential requirement to establish a shared network. There is extensive research on current collaboration networks, and several examples might clarify what the advantages are. In general, the main reason for collaboration is to improve performance in an increasingly dynamic market (Cao & Zhang, 2011). An example could be a retailer that collaborates with its suppliers, where the retailer provides its demand forecast to the suppliers. With this forecast, suppliers can plan their production and deployment of staff more effectively, in order to provide better service to the retailer. They can offer shorter delivery times, which provides an opportunity to decrease warehouse costs, as both parties need to keep fewer items in stock. Another potential improvement is the quality of products. In many industries, stakeholders collaborate by combining R&D resources to create innovative solutions, becoming more competitive in their market. Large global logistics service providers also often collaborate with smaller local companies to improve their coverage and increase flexibility. They can use more of the capacity of their local partner during periods when demand is higher, and use their own capacity when demand is lower.

The general benefits of collaboration in and across current supply chains, often expressed in KPIs, are evident and well researched. But why is collaboration expected to be even more advantageous in the future PI system? It remains difficult to validate as the system is not in place yet, but there are several future visions that describe the importance of collaboration.


transport to get there. By using the available information, the PI can see whether a truck will be able to take the containers from C to B. In this way, one of the goals of PI can be achieved: optimizing capacity utilization. This can only be done when collaboration between different actors is in place.

Figure 4 – Example of efficient transport planning with collaboration

Next to capacity optimization, collaboration is also required to effectively share resources. Sharing resources is expected to make the PI a thriving system (Montreuil et al., 2013) and to radically improve flexibility. Furtado & Frayret (2014) identify that infrastructures such as transportation hubs, warehouses, hauling capacity and trailers should be shared to accommodate the flow of goods in the future PI system. This has several advantages. Distances between hubs become smaller, goods can be stored in warehouses closer to where they are needed, or orders for the same customer can be bundled more effectively, all reducing lead times.
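The capacity-matching idea behind Figure 4 can be sketched as a simple first-fit assignment of waiting containers to trucks with spare capacity on the same leg. The data model below is an assumption for illustration only and deliberately ignores timing, routing and pricing.

```python
# Minimal sketch (illustrative assumption) of the capacity check described above:
# given trucks already driving a leg and containers waiting on that leg, the PI
# matches containers to the remaining truck capacity.
from typing import NamedTuple

class Truck(NamedTuple):
    name: str
    leg: tuple          # (origin, destination)
    free_capacity: int  # in container modules

class Container(NamedTuple):
    name: str
    leg: tuple
    size: int           # in container modules

def assign(trucks: list, containers: list) -> dict:
    """Greedy first-fit assignment of waiting containers to trucks on the same leg."""
    plan = {t.name: [] for t in trucks}
    free = {t.name: t.free_capacity for t in trucks}
    for c in containers:
        for t in trucks:
            if t.leg == c.leg and free[t.name] >= c.size:
                plan[t.name].append(c.name)
                free[t.name] -= c.size
                break
    return plan

trucks = [Truck("truck-1", ("C", "B"), 3)]
containers = [Container("cnt-1", ("C", "B"), 2), Container("cnt-2", ("C", "B"), 2)]
print(assign(trucks, containers))   # {'truck-1': ['cnt-1']} -- cnt-2 does not fit
```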


2.3 Enterprise Information Systems

Before exploring the third party data sharing model, it is necessary to first define the concept of EIS and why these systems are preferred for sharing information. Enterprise Information Systems can be defined as "software systems for business management, encompassing modules supporting organizational functional areas such as planning, manufacturing, sales, marketing, distribution, accounting, financial, human resources management, project management, inventory management, service and maintenance, transportation and e-business" (Rashid et al., 2002). These help to control the performance of business processes (O'Brien, 2003). EIS generally comprise six particular types of systems: Enterprise Resource Planning (ERP), Supply Chain Management (SCM), Manufacturing Execution Systems (MES), Customer Relationship Management (CRM), Product Lifecycle Management (PLM) and Business Intelligence (BI) (Romero & Vernadat, 2016). These commonly used modules can help organizations to achieve high operational efficiency, significant cost savings, and thus maximization of profit (Olson & Kesharwani, 2009). Furthermore, they can help to increase the product value by offering better client service (Zacharewicz et al., 2017). ERP systems are often seen as the backbone system, bringing many crucial improvements to the company (El Kadiri et al., 2016), and are therefore one of the fastest growing, most profitable areas in the software industry (Da Xu, 2011). The €75 billion industry is largely dominated by large software companies that offer complete packages, such as SAP, Microsoft and Oracle (Statista, 2019).

2.3.1 Data sharing


challenging due to potential misuse of company sensitive data. This creates a dichotomy for many businesses: sharing sensitive information securely can prove to be very profitable, but data leakage might cause disastrous consequences. Next to data sharing, the concept of Big Data has gained a lot of attention in research and practice over the past decade. Through data mining and analysis, companies can radically improve their performance, often measured in KPIs (McAfee et al., 2012). Because of these advantages, data is nowadays often regarded as the most valuable resource to possess. Despite this shift towards a data economy, much can still be done to make data sharing more effective. Due to the lack of international data standards, many systems exist that are not interoperable with each other. Ultimately, the goal for EIS is to provide an open and interoperable system that can be used easily by all stakeholders, accessed anywhere, and is able to communicate with other types of EIS (Panetto et al., 2016), while companies maintain sovereignty over their data. This would align with the general requirements of a system for the Physical Internet.

2.3.2 Best practices

To achieve this ultimate goal, companies, practitioners and academics are continuously trying to develop and refine new functionalities. Some recent trends are for instance cloud-based systems, advanced analytics, and Internet of All Things (IoAT), which is an extension to the better-known Internet of Things (IoT) (Bughin et al., 2013). Most of the developments are driven by the increasing requirements for more data, collaboration and flexibility, to create so-called Next Generation EIS. These next-gen EIS are expected to have several innovative characteristics that differ from monolithic legacy systems (Panetto et al., 2016):

- Use of IoT technology

- Software-as-a-Service (SaaS)

- Use of blockchain (Distributed Ledger Technology) (Badzar, 2016)

- Interoperable with other EIS

- Able to process larger amounts of data (e.g. via 5G)


programming interfaces (API) or data hubs. Especially data hubs are an important development, as data from multiple sources can be collected and analyzed collectively. This could be data from multiple departments, but also from multiple companies, which is regarded as an important requirement for PI. Recent technological developments make it possible to consolidate data from different types of EIS.
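As a minimal illustration of the data-hub idea, the sketch below maps records from two hypothetical company systems onto one shared format so that they can be analyzed together. The endpoint names, field names and mapping are assumptions, not an existing standard or product API.

```python
# Minimal sketch (illustrative only) of a data hub: records pulled from several
# company systems are mapped onto one shared record format so that they can be
# collected and analyzed collectively.
import json

def to_shared_format(source: str, record: dict) -> dict:
    """Map a company-specific record onto a (hypothetical) common schema."""
    mapping = {
        "erp_company_a": {"ord": "order_id", "qty": "quantity", "dest": "destination"},
        "wms_company_b": {"order": "order_id", "pieces": "quantity", "ship_to": "destination"},
    }
    return {shared: record[local] for local, shared in mapping[source].items()}

hub: list[dict] = []
hub.append(to_shared_format("erp_company_a", {"ord": "A-17", "qty": 12, "dest": "Rotterdam"}))
hub.append(to_shared_format("wms_company_b", {"order": "B-03", "pieces": 4, "ship_to": "Hamburg"}))
print(json.dumps(hub, indent=2))   # one consolidated, queryable dataset
```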

2.3.3 Challenges for EIS

Despite the rapid developments in the past decades, academics still identify substantial challenges for the next-gen EIS, indicating four areas described in table 3: (1) data value chain management; (2) context awareness; (3) interaction and visualization; and (4) human learning (El Kadiri et al., 2016). This thesis mainly addresses the first area, which is about allowing integration, sharing and security of data through interoperability of systems.

Data value chain management: How to allow data/information analysis, mining, integration, sharing and security through interoperability?
Context awareness: How to offer scalability and integration capabilities between business processes within EIS?
Interaction and visualization: How to deliver new and intuitive ways for interacting with EIS?
Human learning: How to support development of professional competences triggered by new scientific/technological advances?

Table 3 - Four challenge areas for future EIS (El Kadiri et al., 2016)


3 Research questions

The review of the theoretical background above indicated several research gaps. One of the most apparent gaps relates to Enterprise Information Systems in a PI context: it is clear that EIS will play a paramount role, but in what way remains ambiguous. From this, a second gap can be identified: bridging the difference between current best practice EIS and future EIS that are able to align with the PI requirements. It is evident that developments in EIS should be made before PI can work properly. Most importantly, due to the expected data explosion at a time when privacy and security concerns are becoming more prevalent, issues of collaboration and data sharing are fundamental and have to be addressed first. Based on these gaps, this thesis aims to answer the following research question:

How can data be shared effectively in the Physical Internet, enabled by Enterprise Information Systems?

To answer this question, several sub-questions will be investigated first, through multiple case studies and interviews. These will be examined and discussed in the following chapters.

1. What are the current systems used for sharing data?
2. What types of data are considered to be sensitive?
3. What are the critical requirements for sharing data (or not)?
4. How can a preliminary functional architecture of the Third Party Model be designed?


4 Methodology

This chapter describes the research method that fits best to answer the research question(s). Subsequently, the methods of data collection and validation are described, including their advantages and limitations. The final section explains the potential ethics and risk issues involved in this research, and how these are addressed.

4.1 Research method

A first step after reviewing the existing literature is to determine what type of research is most capable of producing knowledge to answer the research question, based on its distinct characteristics. The characteristics of this research topic are qualitative and exploratory, which indicates a direction towards conducting case studies or Design Science Research (DSR). As the goal is to solve a practical problem (data sharing and collaboration issues in/across supply chains), PI is an envisaged concept, and the research is strongly interconnected with the field of (Enterprise) Information Systems, the choice for DSR was made. This is conducted following the guidelines of Hevner et al. (2004), Van Aken et al. (2016) and multiple articles by Wieringa & Heerkens (2007; 2009). Several other academics (Holmstrom et al., 2009; Gregor & Hevner, 2013) discuss additional ways to conduct DSR and how to produce valuable knowledge.


Artifact characteristic | Type of contribution | Artifact examples
Abstract / mature | Level 3: well-developed design theory | General design theories (mid-range and grand theories)
- | Level 2: nascent design theory / knowledge as operational principles | Constructs, methods, models, design principles, technological rules
Specific / less mature | Level 1: situated implementation of artifact | Instantiations (products or implemented processes)

Table 4 - DSR artifact levels (Gregor & Hevner, 2013)

This research aims to create a level 2 artifact: design principles (called critical requirements in this research) for a novel business model. It is not uncommon for DSR research to develop design principles. Reputable papers that create these as an artifact include Markus et al. (2002) and Lindgren et al. (2004), and more recent work (Seidel et al., 2018) can also be viewed as an example. In these papers, design principles such as transparency and flexibility are discussed, among others.


Figure 5 - DSR cycle (based on Wieringa, 2009)

The first phase analyzes the systems that are currently in place, which might be replaced by the PI; here, the challenge or problem can often be identified. This was done in section 2.1, where the state of the art in logistics was discussed and challenges were identified. Phase two examines literature about the envisaged system, in this case the PI, which was done in section 2.2. In the next phase, the artifact is designed; for this research, novel critical requirements will be created. The final phase evaluates and validates the designed artifact to ensure that the created knowledge is valuable and the problem is relevant. The primary advantages of conducting DSR are that generating academic knowledge can be combined with practical, relevant problem solving, while allowing for exploratory research. Furthermore, there are no strict guidelines on how to conduct or analyze the research, which improves flexibility. However, taking such an approach also has several limitations: DSR is not yet accepted as a valid research method in some areas, and results can be abstract, making it difficult to test their validity and generalizability. Furthermore, not following certain guidelines might cause researchers to lose their focus on the real problem and the system under analysis (Kotzé et al., 2015).


4.2 Data collection

In DSR, there are multiple ways to collect data for the research. For qualitative studies, common methods are case studies, literature reviews, questionnaires, interviews, action research (observation) or focus groups. Combining several methods is possible and can enhance the validity of the study by means of data triangulation (Denzin & Lincoln, 2011; Carter et al., 2014).

For this research, both primary and secondary data will be used. Information is generated by a literature review complemented with multiple semi-structured interviews and observations during 7 case studies. It is generally agreed that the optimal number of case studies lies between four and ten, depending on the information quality (Eisenhardt, 1989). Mason (2010) argues that information becomes saturated after six case studies in qualitative research due to the concept of diminishing returns; this implies that more data does not necessarily mean more information. Interviewees consist of logistics, PI and EIS experts and other relevant stakeholders, to create a holistic overview. Cases were selected based on several criteria, summarized in Appendix B, table 1. An overview of interviewees/cases can also be found in Appendix B, table 2.


4.3 Validation method

To enhance the rigor of this research, several strategies of Devers (1999) will be used in the collection of data. These strategies in table 5 improve the main research criteria of validity (internal and external), reliability and objectivity.

Criteria Strategies

Internal validity Triangulation; search for disconfirming evidence; subject review

External validity Have a detailed description of the study context

Reliability Data archiving; skeptical peer reviews

Objectivity Journal keeping; or a combination of strategies above

Table 5 - Strategies for enhancing research rigor (based on Devers, 1999)

To further assure the rigor and validity of the research, the created artifact in DSR should be evaluated (Venable et al., 2016). The artifacts to be validated are the critical requirements. This validation stage can be conducted in multiple ways. The preferred option is to implement the design, which is impossible as PI is yet to come into existence. For envisioned systems, expert panels, focus groups, workshops or additional interviews can be used to evaluate the design principles. In this research, experts in data sharing and/or PI are consulted to assess the relevance of the problem and quality of the findings. If the validation stage provides additional important insights, adjustments can be made to the critical requirements and functional architecture.

4.4 Ethics and Risks


5 Analysis of case studies

To validate the literature, create critical requirements and design preliminary functional architectures, current systems have to be analyzed first. This can be done by observing the systems that stakeholders have in place and reverse engineering them, revealing the critical requirements that make these systems work (Szirbik, 2019). The current systems were analyzed by conducting interviews and observations. All case studies examined general logistics processes (related to PI) and at least four aspects of data sharing: methods, types of data, reasons, and challenges. These are introduced below and provide a basis for developing critical requirements (coding trees can be found in appendix B). In conclusion, this chapter answers sub-questions 1 and 2.

5.1 Methods of collaboration and data sharing

This section describes the contemporary methods that are used for sharing data and information by case study companies. Table 6 provides an overview of the methods that were discovered per case company. Interviewees from the consultancy firm described contemporary methods in general based on their thorough experience.

Nr | Company | Category | Methods used
A | Global B2B distributor luxury goods | Other stakeholders | Custom built EIS; EDI; Data warehouse
B | Leading consultancy firm | Logistics + EIS + other | EIS; EDI/API; Platforms; Cloud
C | Leading global port | Logistics + PI | Legacy
D | Global LSP | Logistics | Integrated EIS; EDI/API
E | Regional port | Logistics + PI | Legacy; EIS; Data hub; Platforms
F | Leading consultancy firm | Logistics + EIS + other | EIS; Platforms / cloud
I | Global producer in nutrition and health | Other stakeholders | EIS; Data lakes

Table 6 - Methods of data sharing used by case companies

What is most apparent is that many similar methods are used, but only the hubs still report using legacy systems. This is because they act more as facilitators, so they have to cater to the wishes of their customers. Their customers range from small enterprises that only use legacy methods like telephone/email to large multinationals that integrate their systems via EDI/API connections or platforms. Furthermore, the terms data warehouse, data hub and data lake were used. While they seem similar, there are several differences. As it was not possible to investigate these concepts in detail, it was decided to combine them under the term 'platforms'. It is expected that more of these platforms will be used in the future. This was confirmed by interviewee B:

“A general trend you see in EIS is that platforms are becoming increasingly important, providing an opportunity to make collaboration easier. Compared to the past, when many point-to-point integrations had to be made, you only need to integrate once with a platform, which saves a lot of time and costs. From a PI
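The contrast in the quote above, many point-to-point integrations versus a single connection per stakeholder to a platform, can be illustrated with a short back-of-the-envelope sketch; the stakeholder counts are purely illustrative.

```python
# Back-of-the-envelope sketch: pairwise point-to-point integrations grow
# quadratically with the number of stakeholders, while a shared platform
# needs only one connection per stakeholder.
def integration_count(n_stakeholders: int) -> tuple[int, int]:
    point_to_point = n_stakeholders * (n_stakeholders - 1) // 2
    via_platform = n_stakeholders
    return point_to_point, via_platform

for n in (5, 20, 100):
    p2p, platform = integration_count(n)
    print(f"{n:>3} stakeholders: {p2p:>5} point-to-point links vs {platform:>3} platform links")
# 5 -> 10 vs 5, 20 -> 190 vs 20, 100 -> 4950 vs 100
```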


5.2 Types of data shared

Next to understanding the methods used for data sharing, it is essential to know what types of data stakeholders currently share, and might share in the future PI. The first aspect that became apparent is that companies often classify their data. Many used a distinction between strategic and tactical data, or between potentially sensitive and non-sensitive data. Non-sensitive data is considered to be widely or publicly available, whereas sensitive data is better secured and more private. As discussed, sharing sensitive data can bring the largest benefits, but it can also be commercially risky because of data leakage. However, companies are sometimes not sure whether data is potentially sensitive or not, so they play it safe and do not share much. This is one of the challenges that will be discussed more in depth in section 5.4.

Sensitive data

Findings from case studies indicate that most data that is shared, is potentially sensitive. Therefore, this data is often shared only with trusted suppliers, customers or other supply chain partners. To create a more concise overview, sensitive data types were grouped into five categories: transport information, customer/supplier information, sales data, planning data and financial data. Table 7 below indicates what types of data belong to which category.

Category Data types

Transport info Name of LSP / carrier; container number; truck number plate; delivery destination; delivery times (ETA/ETD); contents of shipment

Customer/supplier info Customer names; supplier names; preferred supplier; quality; origin; payment terms

Sales data Order info (confirmation; picking list); customer orders; historical demand; quantities sold

Planning data Strategy; capacity; stock; demand forecast; Research & Developments (R&D); marketing plans


Non-sensitive data

This type of data includes data that everybody can have access to. For instance, the dimensions and weights of a product can easily be found on the internet or when the product is purchased, so it makes little sense to keep this data private. Without the addition of sensitive data types, non-sensitive data is not valuable to others.

Category | Data types
Administrative data | Public records; data required by governments; number of employees; vessel/container GPS
Product info | Dimensions; weights; goods classification (customs)
Annual report data | Income statements; cash flows; sustainable practices; balance sheets; dividends

Table 8 - Types of non-sensitive data
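The distinction between Tables 7 and 8 can be thought of as a simple lookup that tags each data type and decides where it may be shared (the mediator for sensitive data, the open PI network for non-sensitive data, as elaborated in chapter 6). The sketch below is a minimal illustration; the field names and the rule that unknown fields default to sensitive are assumptions, not findings from the cases.

```python
# Minimal sketch (illustrative only) of the sensitivity classification behind
# Tables 7 and 8, used to decide where a data field may be shared.
SENSITIVITY = {
    "delivery_times": "sensitive",           # transport info (Table 7)
    "customer_names": "sensitive",           # customer/supplier info (Table 7)
    "demand_forecast": "sensitive",          # planning data (Table 7)
    "product_dimensions": "non-sensitive",   # product info (Table 8)
    "goods_classification": "non-sensitive", # product info (Table 8)
}

def route(field: str) -> str:
    """Unknown fields are treated as sensitive (playing it safe, as the cases describe)."""
    return "mediator only" if SENSITIVITY.get(field, "sensitive") == "sensitive" else "open PI network"

for field in ("demand_forecast", "product_dimensions", "unknown_field"):
    print(field, "->", route(field))
```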

5.3 Reasons for data sharing

Several reasons why companies should collaborate and share data were already discussed in the theoretical background. However, case studies provided additional insights about the reasons for data sharing.

Reasons for sharing data

Case companies described many reasons why they share their data with suppliers, customers, LSPs or others: it increases their market share, they can optimize their planning, they can offer better service by, for instance, improving quality or delivery speed, and there are other financial benefits. These advantages all relate to the concept of efficiency: increasing output while decreasing input. Interviewee A gave the following example:

“We could share more data with for instance LSPs, to align our demand with their capacity. Currently, LSPs always come at a specific time with a specific capacity that was agreed upon beforehand. If we would give them real-time insight in our data, they could keep track of the amount of packages ready for


Another scenario was explained by interviewee E from the regional port:

"This year, we did an analysis of arrival times of ships in our port. The research concluded that almost a third of the ships arrived more than 2 hours too early, or more than 2 hours too late. This has a huge impact on all stakeholders involved. Currently, we are building a data platform to share this arrival data more effectively through the chain."

However, data is not only shared on companies' own initiative to improve efficiency. Some case companies discussed that many laws and regulations from governments require companies to share certain types of data. This mostly relates to customs, which needs information on the value and type of products that are shipped into or out of a country, but also to passenger data that is required for crew and others located on a vessel, to prevent stowaways and other illegal activities.

Reasons for not sharing data

The main reasons for not sharing (sensitive) data are related to technological inabilities and trust. Especially when it comes to horizontal data sharing, case companies do not trust their (potential) competitors to handle their data in a way that creates benefits for both. They are afraid their data will be misused, so potential benefits are lost. Furthermore, companies fear losing control over their data: once their data becomes available to others or via a platform, they have concerns about its security and about not being able to decide who has access to it and when. This concept is called data sovereignty. This relates to the next reason identified for not sharing data: most companies are resistant to change. They feel that changing the way they conduct business can be harmful to future profitability, and are therefore not willing to change. This can be caused by the culture of a company, something that can prove very difficult to change. Interviewee E identified resistance as one of the most challenging barriers towards PI adoption:

“I agree that we should go into the direction of PI, but the main hurdles lie in the change process, the willingness to change. We should create tangible


5.4 Challenges in data sharing

Next to the methods, types of data and reasons, it is paramount to discuss the challenges that companies currently experience. These are often closely related to the reasons for not sharing data, and should be taken into account when designing critical requirements and a functional architecture. Challenges are categorized into two groups: business and technology.

Business related challenges

These types of challenges relate to the challenges experienced by people in a company, specifically the people responsible for decision-making. As mentioned above, there is a lot of uncertainty regarding trust, data security and control, but also about ownership: once data is shared (e.g. with a platform), who owns it, and who makes sure everybody benefits? Additionally, effective data sharing requires substantial investments in IT, costing a considerable amount of time for testing, but also money, as expert knowledge often has to be brought in from outside. As a result, short term benefits are not clearly visible. For several companies, it was sometimes unclear what types of data they could or could not share. Interviewee A mentioned the following:

“It remains unclear what additional types of data we should share compared to what we do now. We do not see the short term benefits of sharing more data.”

Technology related challenges


5.5 Critical requirements

Based on the input from literature and case study interviews, critical requirements can be established. These critical requirements, sometimes called stakeholder requirements, determine what is required for the unit of analysis to function. The unit of analysis in this study is a system for effective data sharing. Critical requirements can be categorized into four segments: input/output requirements; technology requirements; trade-off requirements; and system qualification requirements. This partitioning provides a coherent structure and helps to establish more rigorous and sound critical requirements (Buede & Miller, 2016). The critical requirements will be designed using the guidelines from Buede & Miller (2016). Table 9 describes their characteristics of sound requirements; these characteristics will be tested on the critical requirements during the validation phase.

Unambiguous: Each requirement has only one interpretation
Understandable: Each requirement is clear to those who review it
Correct: The requirement states something required of the system
Concise: No unnecessary information is included
Traced: Each requirement can be traced to a document/statement
Traceable: Derived requirements must be traceable to a higher level
Design independent: Each requirement does not specify a particular solution
Verifiable: Each requirement can be verified cost-effectively


5.5.1 Input/output/control requirements

The input, output and control requirements are derived from analyzing the inputs, outputs and controls from the external environment that should make the system work. Control requirements are a special type of input: they do not provide data, but rather information related to policies and guidelines, bringing certain restrictions. Based on literature and findings from the case study interviews, the following high-level requirements could be established:

- Input: The system shall be open to receive data from all sources, according to specific data standards and data quality requirements.

As the PI is expected to be an open system, everybody should be able to contribute their data. Many stakeholders expressed concerns about the quality of data from other stakeholders, which is currently one of the main difficulties; a small sketch at the end of this subsection illustrates how such standards and quality checks could be enforced.

- Output: The system shall produce a transparent, fair and equal output based on the relative input from each stakeholder.

Most of the interviewees mentioned concerns about a fair output from the system. If this cannot be guaranteed, stakeholders would be reluctant to use the system.

- Output: The system shall produce visible, short term benefits for all stakeholders.

Next to a fair output, interviewees required that outputs should create visible short term benefits; otherwise the investment plan will likely be rejected.

- Control: The system shall be restricted by specific protocols and business rules that were agreed upon by all stakeholders.
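To make the input and control requirements above more concrete, the following minimal sketch shows how incoming records could be checked against an agreed data standard and a simple business rule before being accepted. The field names and the weight limit are illustrative assumptions only, not protocols defined in this research.

```python
# Minimal sketch (illustrative assumption) of enforcing the input and control
# requirements: incoming records are accepted only if they follow an agreed
# data standard and pass basic quality/protocol checks.
REQUIRED_FIELDS = {"container_id": str, "origin": str, "destination": str, "weight_kg": float}
BUSINESS_RULES = {"max_weight_kg": 30480.0}   # assumed protocol value, for illustration

def accept(record: dict) -> tuple[bool, str]:
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in record:
            return False, f"missing field: {field}"
        if not isinstance(record[field], expected_type):
            return False, f"wrong type for {field}"
    if record["weight_kg"] > BUSINESS_RULES["max_weight_kg"]:
        return False, "violates agreed weight protocol"
    return True, "accepted"

print(accept({"container_id": "PI-1", "origin": "C", "destination": "B", "weight_kg": 800.0}))
print(accept({"container_id": "PI-2", "origin": "C", "destination": "B"}))   # rejected
```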


5.5.2 System-wide technology requirements

These requirements relate to the technical capabilities of the system as a whole, not to the inputs, outputs or controls separately. These are closely related to the technological challenges in current data sharing systems. A system to share data effectively requires the following:

- The system shall provide interoperability.

Interoperability indicates that different stakeholders with different EIS should be able to make use of the system. Interviewees expressed that it would be impossible for one single EIS to be used by all stakeholders. Therefore, there should be no restriction on the type of EIS used by stakeholders, although these should have certain ISO certified functionalities, which can be included in the protocols or business rules described above. How this could be provided is not within the scope of this research.

- The system shall be easy and inexpensive to test, implement and use.

As the Physical Internet should be open to everybody, this is an important requirement, especially for smaller stakeholders. Systems that are very complex and expensive to test, implement and use often run out of time and budget, causing entire projects to fail. A well-known example from the UK is the failed NHS Connecting for Health IT system, which was abandoned after almost ten years of investment at an estimated cost of more than £10 (€12) billion (Syal, 2013). As the terms 'easy' and 'inexpensive' are subject to interpretation, they are hard to define specifically; therefore, it is critical for each stakeholder to assess their own situation and capabilities.

- The system shall provide a secure and fast connection with stakeholders' EIS.

Data should be available securely and in real time to provide the most optimal advice.

- The system shall always be accessible.

The technology used could provide cloud access, which should be available without any disruptions.


5.5.3 Trade-off requirements

While designing a system, there are typically trade-offs to be considered. Traditionally, the trade-off is between performance and costs, and this is no different for the system under analysis. However, another trade-off was also identified, concerning the level of centralization versus decentralization that the system applies. Most case companies stated that no single party should have control over the system, indicating a preference towards a certain level of decentralization. However, as Ambra et al. (2019) discussed, a certain degree of centralization could bring more benefits to all stakeholders; therefore, the right balance should be found. The trade-offs are summarized below:

- The system shall not have too much power and control, yet all stakeholders should use the advice provided by the system, as this brings the largest benefits. Here, the trade-off is the level of decentralization vs. centralization.

The issue mentioned most by interviewees was the power structure of the system. As it would contain a lot of sensitive data from many stakeholders, the controlling party potentially has too much power when the system is centralized. Therefore, decentralization might be preferred, but trust should then be established to make sure everybody uses the advice provided by the system.

- The system shall not be too expensive to implement, but it should still offer several functionalities. If the system is easy and cheap to implement, quality and functionality are often lower.


5.5.4 System qualification requirements

System qualification requirements are needed for testing and validating the system, and should be used before and during integration. Several requirements should be established and tested in advance:

- Companies shall first assess their own readiness for sharing data with the system, as input data is needed for the system to function.

Readiness relates to the willingness to share data, as well as being able to adhere to the data requirements set by the protocols and business rules (data standards and quality). Several interviewees mentioned that they expect most stakeholders in their supply chains to have issues regarding the willingness to share data, as well as the technical capabilities to actually share this data. Many still use outdated systems or EIS and are reluctant to make the IT investments that would make this system possible. Regarding willingness, the following was expressed:

"Even though most stakeholders know it would be beneficial to the entire economy to share more data, they are just not willing to share as it helps their competitors as well. This is something deeply embedded in some companies' cultures and could prove very difficult to change."

- Trust shall be established with the system or with the controlling party of the system (mediator).
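As a technical counterpart to this readiness assessment, the sketch below checks whether shipment records satisfy a set of protocol-required fields and formats. The field names and rules are assumptions chosen for illustration; actual PI protocols would define their own standards.

```python
# Hypothetical readiness check: do a company's shipment records meet the
# data standards a PI protocol might impose? Field names and rules are
# illustrative assumptions, not an agreed standard.
from datetime import datetime

REQUIRED_FIELDS = {"shipment_id", "origin", "destination", "volume_m3", "pickup_time"}

def validate_record(record: dict) -> list:
    """Return the data-quality issues found in a single shipment record."""
    issues = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - record.keys())]
    if "volume_m3" in record and not isinstance(record["volume_m3"], (int, float)):
        issues.append("volume_m3 must be numeric")
    if "pickup_time" in record:
        try:
            datetime.fromisoformat(str(record["pickup_time"]))
        except ValueError:
            issues.append("pickup_time must be an ISO 8601 timestamp")
    return issues

def readiness_score(records: list) -> float:
    """Share of records passing all checks; a crude indicator of data readiness."""
    return sum(1 for r in records if not validate_record(r)) / len(records) if records else 0.0
```

A company scoring low on such a check would first need to invest in its EIS and data quality before the system could use its data, which reflects the technical side of readiness discussed above.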


6 Design

This section examines the use of a third party company in the PI network (hereafter named ‘mediator’), where data sharing is driven by Enterprise Information Systems. First, a general description of the proposed third party model is provided. Second, building upon the literature and the critical requirements established in the previous chapter, a preliminary functional architecture is designed using CORE, a systems engineering software tool. This architecture describes the functions of the proposed model and the interactions between them.

6.1 Third party model

A novel business model architecture for logistics systems using a third party company, sometimes also called ‘trustee’, ‘mediator’ (used hereafter) or ‘orchestrator’, was proposed by several academics such as Dalmolen et al. (2018; 2019a) and Lee and Whang (2000). Dalmolen et al. identify the International Data Space (IDS) architecture as one with potential for improving collaboration and data sharing. The International Data Space (previously called Industrial Data Space) is “a virtual data space using standards and common governance models to facilitate the secure exchange and easy linkage of data in business ecosystems” (Otto et al., 2016). It utilizes so-called broker service providers, clearing houses and identity providers for data exchange and is best described by figure 6 below:


Furthermore, the EU also recommends the use of a neutral trustee to enhance horizontal collaboration in logistics, “to whom different stakeholders give data to be held and analyzed, preventing the transfer of potentially sensitive commercial data such as volumes, delivery addresses, costs, product characteristics etc.” (Bogens and Verstrepen, 2013). However, these proposed architectures were not designed for the specific PI context, but aimed at data sharing in traditional supply chains. Furthermore, they only address one-to-one collaboration between one data provider and one data consumer, whereas the PI needs many more stakeholders to collaborate. Additionally, it remains unclear who should share data, what data, and when. Therefore, academics encourage additional research on this type of model.
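To make these roles more tangible, the sketch below loosely models a data exchange involving an identity provider, a broker service provider and a clearing house. It is an illustrative simplification of the IDS concepts cited above, not an implementation of the IDS reference architecture, and all class and method names are assumptions.

```python
# Loose illustration of IDS-style roles: an identity provider authenticates
# participants, a broker matches data offers to requests, and a clearing
# house logs the transaction. Names are assumptions, not the IDS standard.
class IdentityProvider:
    def __init__(self, registered):
        self.registered = set(registered)  # known participant IDs

    def authenticate(self, participant_id: str) -> bool:
        return participant_id in self.registered

class ClearingHouse:
    def __init__(self):
        self.log = []

    def record(self, provider: str, consumer: str, dataset: str) -> None:
        self.log.append((provider, consumer, dataset))  # auditable transaction log

class Broker:
    def __init__(self, identity: IdentityProvider, clearing: ClearingHouse):
        self.identity, self.clearing = identity, clearing
        self.offers = {}  # dataset name -> provider ID

    def publish(self, provider: str, dataset: str) -> None:
        if self.identity.authenticate(provider):
            self.offers[dataset] = provider

    def request(self, consumer: str, dataset: str):
        if not self.identity.authenticate(consumer) or dataset not in self.offers:
            return None
        provider = self.offers[dataset]
        self.clearing.record(provider, consumer, dataset)
        return provider  # the consumer then exchanges data directly with the provider

# Example: an LSP requests lane volumes offered by a shipper.
idp = IdentityProvider({"shipper_a", "lsp_b"})
broker = Broker(idp, ClearingHouse())
broker.publish("shipper_a", "lane_volumes_2019")
print(broker.request("lsp_b", "lane_volumes_2019"))  # -> "shipper_a"
```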


6.2 Operational concept

The model under discussion could look like figure 7 below. In short, the system should work as follows on a day-to-day basis: multiple data providers (only two are displayed in this model for readability, but there can be any number) share their standard, non-sensitive data through the PI. These data providers can be the stakeholders in the future envisaged PI, as discussed in section 2.2.2. However, the PI is unable to operate effectively with only this data. It requires additional, potentially company-sensitive data to reap the benefits of a shared network. Therefore, companies could provide their potentially sensitive data to a trusted mediator. This mediator then anonymizes, analyzes and optimizes all the data combined, and provides its optimal advice to all stakeholders that provided data. Furthermore, it would provide this data to the PI so it can be taken into account when optimizing shipment decisions. Conversely, the PI provides its data to the mediator if the mediator needs additional data to optimize its advice to stakeholders. Ultimately, the functioning of the system depends on whether the stakeholders use the optimal advice provided to them.

Figure 7 – Potential design of a Third Party Model (TPM)
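A minimal sketch of this day-to-day flow is given below. The pseudonymization and the bundling heuristic are placeholders for the mediator’s real anonymization and optimization logic, and all names are assumptions rather than a prescribed PI interface.

```python
# Minimal sketch of the third party model's daily flow: providers submit
# potentially sensitive data, the mediator pseudonymizes and pools it, and
# advice is returned. Anonymization/optimization steps are placeholders.
import hashlib

class Mediator:
    def __init__(self):
        self.pool = []  # pooled, pseudonymized records

    def submit(self, provider_id: str, records: list) -> None:
        """Accept potentially sensitive data from one data provider."""
        token = hashlib.sha256(provider_id.encode()).hexdigest()[:8]
        for r in records:
            self.pool.append({**r, "provider": token})  # hide the source identity

    def advise(self) -> dict:
        """Placeholder optimization: bundle shipments that share a lane."""
        lanes = {}
        for r in self.pool:
            lanes.setdefault((r["origin"], r["destination"]), []).append(r)
        return {lane: f"bundle {len(rs)} shipments" for lane, rs in lanes.items() if len(rs) > 1}

mediator = Mediator()
mediator.submit("shipper_a", [{"origin": "Groningen", "destination": "Rotterdam", "volume_m3": 12}])
mediator.submit("shipper_b", [{"origin": "Groningen", "destination": "Rotterdam", "volume_m3": 8}])
print(mediator.advise())  # advice returned to the data providers and to the PI
```

In the full model, the resulting advice would also be fed back to the PI, and stakeholders remain free to decide whether to follow it.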


The mediator could operate much like a recommendation system: Netflix, for example, provides recommendations on what to watch next based on one’s previous views, whereas Facebook uses such a system to recommend potential connections based on current connections and/or interests. To ensure an adequate degree of decentralization, the mediator is expected to have only an advisory role, leaving the final decision to the stakeholders. Several exceptions should be considered in designing such a system, since there are many different stakeholders with different strategies, objectives and requirements.

One-off shipments

For shipments that are expected to occur only once, there is no need for the supplier, customer, LSP or other stakeholders to share sensitive data with each other. They can share standard shipment data simply through the PI, which should be sufficient. Only with repeated shipments does collaboration become beneficial, due to collaborative planning and the other reasons discussed earlier.

Logistics Service Providers

For LSPs the situation is slightly different, as they have to share data and collaborate regardless of the frequency or type of shipment in order to establish and operate a shared PI network. They also need to share assets (such as warehouses). Therefore, sharing company-sensitive data is often required. LSPs are expected to become the main users of the mediator’s services.
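A hypothetical rule of thumb capturing these two exceptions could look as follows; the function and its parameters are illustrative assumptions only.

```python
def sharing_route(repeated_shipment: bool, is_lsp: bool) -> str:
    """Hypothetical rule of thumb derived from the exceptions described above."""
    if is_lsp or repeated_shipment:
        return "share potentially sensitive data via the mediator"
    return "share only standard shipment data through the PI"
```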

Antitrust / competition laws


6.3 Functional architectures

To describe the operational concept in a more systematic way, functional architectures should be developed. In addition to the functional architectures, interaction diagrams are provided to highlight the interactions between the functions, connecting their inputs, outputs and controls. The design of these architectures is guided by the literature, as well as by the critical requirements and other findings from the case study interviews and observations. The central function under analysis is A.0 ‘perform mediator functions’. The wider context of function A.0 is described first, after which three alternatives are provided for its first-level decomposition.
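To give a flavour of what such a decomposition captures, the sketch below represents a function together with its inputs, outputs and controls outside the CORE tool. The attribute values are drawn from the descriptions in this chapter, but the representation itself is a simplified assumption; the actual models were built in CORE.

```python
# Simplified stand-in for what the CORE models capture: each function has
# inputs, outputs and controls, and interaction diagrams connect them.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Function:
    identifier: str
    name: str
    inputs: List[str] = field(default_factory=list)
    outputs: List[str] = field(default_factory=list)
    controls: List[str] = field(default_factory=list)

a0 = Function(
    identifier="A.0",
    name="Perform mediator functions",
    inputs=["stakeholder data", "PI data"],
    outputs=["optimized advice"],
    controls=["PI protocols", "business rules"],
)
print(f"{a0.identifier} {a0.name}: {a0.inputs} -> {a0.outputs}, controlled by {a0.controls}")
```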

6.3.1 Function A-2: Perform PI context functions

This first functional architecture describes the widest external context of function A.0 in the PI. As discussed before, the PI can be decomposed into three types of flows, which here provide the first decomposition of the PI. Function A.0 is included under function A-1: perform information flows.


Figure 9 - Interaction diagram A-2

6.3.2 Function A-1: Perform information flows


Figure 10 - Functional Architecture A-1


6.3.3 Function A.0: Perform mediator functions

As the mediator functions can be performed in many different ways, three alternative options are presented below. Each alternative is characterized by a different way in which the mediator operates.

Alternative 1

The first alternative is characterized by a more reactive approach by the mediator, after trust has already been established. To perform its functions, the mediator expects that stakeholders will provide data without the need to ‘ask’ for it. Furthermore, this alternative takes a decentralized approach to the PI, as the mediator is only allowed to provide optimized advice without actually implementing it. Most of the system is controlled by PI protocols.


Figure 13 - Interaction diagram A.0 (alternative 1)

Alternative 2


Figure 14 - Functional Architecture A.0 (alternative 2)

Figure 15 - Interaction diagram A.0 (alternative 2)

Alternative 3


reactive approach. However, as this mediator has more power, it may be able to require certain types of data from all stakeholders, so establishing trust is less important than in the first two alternatives.

Figure 16 - Functional Architecture A.0 (alternative 3)
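The distinctions between the alternatives can be summarized as a small set of design parameters, as sketched below. The parameter names are assumptions, and the values reflect only the characteristics described in the text for alternatives 1 and 3.

```python
# Hypothetical design parameters distinguishing the mediator alternatives.
# Parameter names are assumptions; values follow the textual descriptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class MediatorPolicy:
    requests_data_itself: bool  # may the mediator require data from stakeholders?
    trust_prerequisite: bool    # must trust be established before it can operate?
    advisory_only: bool = True  # per section 6.2, the mediator only advises

ALTERNATIVE_1 = MediatorPolicy(requests_data_itself=False, trust_prerequisite=True)
ALTERNATIVE_3 = MediatorPolicy(requests_data_itself=True, trust_prerequisite=False)
```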


7 Validation

The critical requirements are evaluated in two expert workshops: one from an academic (PI) perspective (S. Dalmolen) and one from a business perspective (at a leading consultancy firm); both participants are experienced in the area of data sharing. This validation was conducted to examine the relevance of the problem, the characteristics of the critical requirements (using table 9 on p. 37) and the feasibility of the proposed solution (model) to the problem, and to gain additional expert insights. The functional architectures were not discussed, as these are only preliminary designs that need to be developed further in future research. In each workshop, the research context and problem statement were first presented and briefly explained. Once these were clear, the critical requirements were presented per category, followed by an open discussion. This discussion was steered by several questions where needed and provided valuable feedback on the requirements in both workshops.

7.1 Validation of critical requirements

Most importantly, both participants found the critical requirements to be understandable and correct for a data sharing system. No extra clarifications were needed to explain the critical requirements, although several of them could be made slightly more specific; according to both participants, this also depends on the scope of the research. Furthermore, the problem statement was regarded as clear and highly relevant, and the requirements follow a coherent partitioning. Nevertheless, both participants had concerns about the practical applicability of a third party data sharing system and how it would align with currently used networks, which are mostly closed. The first participant summarized it as follows:

“What you propose is an open infrastructure, where everybody can share data securely and in a controlled manner. The problem is relevant and the requirements
