
Determining the Infrastructure Developments for emerging Computing Typologies: A study to the possible Operationalization of Edge Computing




Determining the Infrastructure Developments for emerging Computing Typologies: A study to the possible Operationalization of Edge Computing

Written by Iris Maijer Student Number: 10757112

University of Amsterdam, Amsterdam, Faculty of Science & Avanade Netherlands BV, Utrecht

Thesis Master Information Studies: Business Information Systems Final version: 30-06-2018

Supervisor: ir. A.M. Stolwijk Examiner: drs. A.W. Abcouwer

Abstract

This research was executed with the goal of explaining the evolution of IT from a technological and organizational perspective. The study is based on theories of Moore and Ceruzzi, and attempts to forecast Edge Computing as a new computing typology. To do so, it was determined which infrastructure developments would lead to the emergence and operationalization of Edge Computing.

Literature was compared with expert interviews in the field of IT development. The change and growth of hardware, software and network components was studied to identify the technological and organizational drivers of developments in IT. Drivers were determined by developing Critical Success Factors. These were minimizing latency, reducing bandwidth, lowering cost, being scalable, reducing threats, avoiding duplication, improving reliability, and maintaining compliance.

From a historical perspective there have been various waves in the development of IT. Instead of focusing only on IT or on business, it was chosen to align those two perspectives. Based on the power shifts between vendors and clients on the business side and on the technological developments, the infrastructures changed from centralized to decentralized and back again. For Edge Computing this trend continues. However, new business models made it possible to have more fragmented architectures to comply with the CSFs.

Keywords

Information System Configurations, Computing Typologies, Cloud, Edge, Internet of Things, Business and IT Alignment.

Index

1. Introduction
2. Theory
3. Research methods
4. Results
5. Conclusions
6. Discussion
7. References

Appendix A – Conceptual model IT development

Appendix B – Interview Framework

Appendix C – Interview Transcripts


1. Introduction

Information Technology (IT) has proven to be a crucial tool in industrialization. In the Information Age, knowledge is the new power source, instead of capital. Organizations are exploiting it to develop and expand their business models (Humbert, 2007). IT enables progression in software, hardware and networks.

1.1 Research motive

The research is done in cooperation with the University of Amsterdam (UvA) and Avanade. Avanade is an IT Consultancy company owned by Accenture and Microsoft. Microsoft is currently developing products aiming for the Edge market. From a business perspective, this development will require new ways of programming, security, data storage and will possibly affect machine learning (Miller, 2017).

At present, enterprises are mostly concerned with their strategies considering Cloud compliance; they are exploiting centralization to lower marginal costs. Nevertheless, external innovations, like the emergence of the Internet of Things, force organizations to expand their thinking beyond centralization and toward more distributed processing options.

According to IT and research consultancy firm Gartner (2017): 'Cloud seems to be too distant, response time is too slow and the bandwidth is too narrow; network systems that work in the Edge collect and process data much closer to their source' (Panetta, 2017). They predict that one of the IT development trends will be the shift to the Edge. Edge refers to enabling technologies that allow computation to be performed at the network border, near data sources. Miller (2017) stated: 'Processing data at the Edge is the next step in the development cycle of Computing and the development of IT. In the past decades there have been other shifts from centralization to distribution and the other way around' (Miller, 2017). Therefore, after a centralized era it seems only logical to expect a development towards decentralization.

1.2 Research goal

This paper explores the IT literature from a historical perspective, interpreting that knowledge to predict future Information Systems Configurations and Computing Typologies. The main purpose of this research is to study the expansion of components such as hardware, software and networks to identify the technological and organizational drivers of developments in IT. The literature has helped to reach key concepts of the as-is scenario. These concepts are the technology waves, hardware and software generations, networking typologies and Critical Success Factors (CSFs). A CSF is a critical factor required to ensure the success of an organization. The term was initially used in the world of data and business analysis: 'CSFs include issues vital to an organization's current operating activities and to its future success' (Boynton & Zmud, 1984).

This research is an attempt to explain the evolution of IT. Current literature is mostly written to predict the future based on current experiences with technology. With this research an attempt is made to combine the technological perspective with the organizational perspective and to define a trend based on the theory of technology waves.

The theory is used with the aim of contributing to previous research through the creation of a conceptual model that could predict new IT milestones based on previous indicators. For this research the case is the forecast of Edge Computing as a new computing typology. The conceptual model is reviewed in expert interviews with specialists in IT history, Cloud, Networking, Software and Infrastructure.

1.3 Research question

The research topic emerges from technological development, namely the shift in components of Information Systems configurations. The research is set up to provide theoretical and practical underpinning for these and other historical developments in IT, in this case with a focus on Edge Computing.


A conceptual model was created based on developments from the past. The model contains the components and variables that will lead to changes in the infrastructure and gives measurements to set out trends to predict the future. From this base the following research question (RQ) emerges:

Which infrastructure developments lead to the emergence and operationalization of Edge Computing?

Henceforth, sub questions were formulated to support answering the RQ. It was taken into consideration to make a part descriptive and a part explanatory. The questions devised were:

- How is emergence being determined?
- What could be defined as infrastructure developments?
- How is operationalization being determined?
- What could be defined as Edge Computing?

1.4 Research structure

In the next section the sub questions are explained by reviewing the theory of waves in IT development and important hardware and software concepts. Section 3 describes the research method that led to the comparison of papers and the validation of the factors in the model with semi-structured interviews. This is followed by the results in section 4 and the conclusions in section 5. After the research period, a reflection was made and some topics were identified as remaining for future research. These are described in the discussion in section 6.

2. Theory

Before looking forward, it was chosen to look backward to define a trend based on developments in the past and to understand the technological and business drivers in the industrial environment. This section describes computing history with the aim of providing a chronological story of events all the way to the current state of IT. Via this path the sub questions will be answered to provide a background and rationale for the area under discussion.

2.1 Technology waves

According to Alvin Toffler, an American writer and futurist known for his discussion of modern technologies and their effects on cultures worldwide: 'Society always seeks solutions for the increasingly complex technological developments' (Alvin Toffler: Biography, 2014). To determine emergence, the theory of technology waves is used.

Toffler described three waves of development. The 1st was the Agricultural Revolution, which was followed by the 2nd wave, known as the Industrial Revolution. This created a new social system based on mass production, mass distribution, mass consumption, mass education, mass media, etc. In an interview he explained that, if combined with standardization, centralization, concentration, and synchronization, it causes a bureaucracy (Toffler, Life Matters, 1998). Lastly, he predicted a 3rd wave which he called the Information Age, in which information becomes abundant (Toffler, 1980).

According to experts from industry and the majority of relevant research papers, the 3rd wave is triggered by the Internet. Brettel et al. declare that: 'Internet allows communication between humans as well as machines in Cyber-Physical-Systems (CPS) throughout large networks' (Brettel, Friederichsen, Keller, & Rosenberg, 2014). The Internet led to the emergence of a new architecture in organizations.


Figure 1: Waves of innovation of the first and the next industrial revolution (Smith, 2013)

Figure 1 shows the waves of innovation that have an influence on the industrial revolutions. The research will focus on this Information Age (1950-2020), with the emphasis on the history of computing, in order to understand what is meant by infrastructure developments and computing typologies.

The trend starts in 1951. In this year the first business computer was developed by Lyons Electronic Office (Martin, 2008). According to Boogaard and Alberts: 'Computer history is about the devices, about programming, about connecting the devices, and about the practices of application and automation. Every phase has its own pioneers' (Boogaard & Alberts, 2008).

Paul Ceruzzi defines three phases in computer history that will be used in this paper. He makes a periodization of mainframes and the rise of the minicomputer (1945-1975), the rise of microelectronics and the personal computer (1965-1995), and the rise of the Internet/World Wide Web (1981-2000). These historical periods describe foremost the development of computers in America. Besides, the emphasis lies on the development of electronics instead of the development of programs. These facts will be considered the scope of this research. They will be used to develop a timeframe and to make an inventory of infrastructure developments.

Another factor in these historical waves is Moore's law. Gordon Moore predicted that the number of transistors in an integrated circuit would double approximately every two years. This law started as an observation and projection of a historical trend. Still, the law is used in the semiconductor industry with the purpose of reaching it as a target. A critical note to this law is the prediction that it will reach saturation in the coming years. It is worthwhile mentioning this critical note on Moore's law, since it did describe a driving force of technological and social change, productivity, and economic growth (Schaller, 1997). These drivers are used in the results to make a division in CSF categories and as an example of a trend analysis.
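The doubling trend behind Moore's law can be illustrated with a small back-of-the-envelope calculation. The sketch below is not part of the thesis; the baseline (the Intel 4004 of 1971 with roughly 2,300 transistors) is a commonly cited figure used here only for illustration:

```python
def moores_law_estimate(base_count: int, base_year: int, target_year: int,
                        doubling_period: float = 2.0) -> int:
    """Project a transistor count forward, assuming one doubling
    every `doubling_period` years (Moore's observation)."""
    doublings = (target_year - base_year) / doubling_period
    return round(base_count * 2 ** doublings)

# Projecting from the Intel 4004 (1971, ~2,300 transistors):
for year in (1981, 1991, 2001):
    print(year, moores_law_estimate(2300, 1971, year))
```

Ten years equals five doublings, so the count grows 32-fold per decade; this exponential growth is exactly why a saturation point is predicted once physical limits are reached.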

2.2 Hardware developments

The concepts of architecture and infrastructure are sometimes confused with each other. When the architecture is addressed in this paper, the conceptual model of the components of an IT system is meant. The IT system is a cohesion of processes, functionalities, applications and infrastructure (Business Dictionary, 2018). In this section the sub question on developments in infrastructure is answered. The term infrastructure is used to denote the IT hardware components that, when combined, shape the IT environment.

When describing the history of hardware and software, the term generation is often used. In the history of computing hardware, computers using vacuum tubes were called the first generation. The second generation contained transistors, which were smaller, required less power and generated less heat. This research starts at the third generation (in the information era), where multiple developments occurred in a short amount of time, as depicted in figure 2.

In the third generation the supercomputer came into use. These computers were generally costly and owned by large institutes. The computers were used by experienced specialists. They did not usually interact with the machine itself; instead, they prepared tasks for the computer on off-line equipment, such as card punches. A more interactive form of computer use developed commercially by the mid-1960s. In a time-sharing system, multiple teleprinter terminals let many people share the use of one mainframe computer processor. This was still a centralized form of using IT (Ceruzzi, 1998).

In the early 1970s the microprocessor was introduced and, after this, the minicomputer could be connected to a hybrid circuit where solid state devices were interconnected on a substrate with discrete wires. Minicomputers were much smaller, less expensive, and generally simpler to operate than mainframe computers. Minicomputers freed organizations from the batch processing and bureaucracy of a computing center and provided a client-server model. In addition, minicomputers were more interactive than mainframes, which drove the development of operating systems (Reilly, 1993).

Hereafter the Personal Computer (PC) emerged. In the 1980s, networking was revolutionary. Soon the Internet was introduced, and in 1990 the World Wide Web became a concept that is currently indispensable. Local networks and mobile devices were other distributed computing typologies, introduced in this decentralized era.

To conclude, the above hardware developments could be defined as the most important infrastructural developments in Computing history.

Figure 2: Timeline IT Milestones

2.3 Software developments

Parallel to the infrastructure developments, their operationalization evolved as well. Software is used to operationalize the hardware. Therefore, it could be regarded as the operationalization factor for IT developments.

The first generation of software is typically understood as machine language. It was followed by the second generation, which was the start of low-level programming languages to read machine instructions. In the third generation, domains began to occur, which led to higher-level programming languages. Lastly, the fourth generation built on these bases of domain-specific programming languages. The languages became more specific to a certain purpose and showed a higher level of abstraction.


Software in this paradigm is focused on system software like operating systems, computer programs and their corresponding data as applications for mainframes, PCs and mobile devices. In recent years, televisions, cars and machines ('things') have also been embedded with software. The software timeframe should be briefly mentioned, because it is a supporting factor in the development of the computing typologies. In present and future developments, software is more and more emphasized as an independent program that can be decoupled from hardware.

Software was originally written by Ada Lovelace; however, her program was never executed (Hally, 2005). The first theory about software was proposed by Alan Turing, who was also one of the first to reason about Artificial Intelligence (AI). These events laid the foundation that eventually led to the creation of computer science and software engineering.

Boogaard and Alberts refer to the sixties as 'the pioneering age of software' (Boogaard & Alberts, 2008). In this age software became bundled with hardware and gave rise to the third generation. The scope of this paper starts in this generation; after the emergence of PCs, programs started to grow, distribution became easier and software came down in price. Software could be categorized in seven utilities, namely programming languages, operating systems, computer networks, computer graphics, word processors, spreadsheets and Computer Aided Design. Especially in computer networks a fundamental change emerged; as depicted in figure 2, there was the rise of the World Wide Web (Kurose & Ross, 2013). Communication networks evolved from Local Area Networks (LAN) and Wide Area Networks (WAN) to Internet Area Networks (IAN). This network replaces telephone networks and laid the base for Cloud platforms. However, connecting network nodes dates back to the mainframe era, with the addition of wireless networks.

In short, the operationalization could be defined by the software and network developments which led to the usage of IT on a broader scale.

2.4 Current situation

The role of software has changed over time and should be considered an important part of future developments in algorithms in AI subjects like Machine Learning (ML) and Internet of Things (IoT).

IoT was first introduced to the community in 1999 for supply chain management (Ashton, 2009), and contains the concept of making a computer sense information without human intervention. According to Shi et al., IoT is arriving in the post-Cloud era, where there will be a large quantity of data generated by things that are immersed in our daily life. Many applications will also be deployed at the Edge to consume this data (Shi, Cao, Zhang, Li, & Xu, 2016). According to the Cisco Internet Business Solutions Group (IBSG), 'the Internet of Things was born in between 2008 and 2009 at the point in time when more "things" were connected to the Internet than people' (Evans, 2011).

In addition, the amount of data produced by people, machines and devices is growing as well. 'The prediction is that 50 billion things will be connected to the Internet by 2020', according to IBSG (Evans, 2011). Some IoT applications might require very short response times, some might involve private data, and some might produce a large quantity of data, which could be a heavy load for networks. Current research argues that 'Cloud Computing is not efficient enough to support these applications' (Shi et al., 2016).

Cloud Computing is the IT architecture of the current era. The National Institute of Standards and Technology (NIST) defines Cloud Computing as 'a model for enabling convenient, on-demand network access to a shared pool of configurable Computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction' (P. Mell, 2011). Cloud could be seen as the 'core' of computing.

However, a possible new era is emerging. Gartner predicts that 'Edge Computing' is the new configuration of 2020, which could provide more flexibility and a competitive advantage because of its distributed model. In figure 3 the place of the Edge is visualized. There is still a core in the form of Cloud; however, the services are running closer to the devices.

Figure 3: Places of Network Intelligence

Research on Edge Computing to reach Edge Intelligence is in its infancy. However, a few researchers have proposed some definitions.

Firstly, according to Shi et al. (2016): 'Edge Computing refers to the enabling technologies that allow computation to be performed at the network Edge so that computing happens near data sources. It works on both downstream data on behalf of Cloud services and upstream data on behalf of IoT services' (Shi & Dustdar, 2016). Technologies such as Cloud and IoT are important push and pull factors in the emergence of IT development, considering their new hardware and software components.

Secondly, the perspective of Buyukkoc (2016) is more focused on network components. He defines Edge as: 'An open Cloud platform that uses some end-user clients and is located at the "mobile Edge" to carry out a substantial amount of storage (rather than stored primarily in Cloud data centers), computation in real time, communication (rather than routed over backbone networks), and control, policy and management (rather than controlled primarily by network gateways such as those in the LTE core)' (Buyukkoc, 2016). Network is also an important driver for this new IT development, especially as devices are changing from being only data consumers to being data producers as well.

This research is built on the following definition of Edge Computing:

Technology that allows data from devices to be analyzed, formatted and translated at the edge of the network before being sent to a server, creating intelligence closer to the data source.

This technology could be used for two purposes. According to Hewlett Packard Enterprise (HPE), these are 'Operational Technology (OT) Edges and IoT Edges'. OT Edges are likely to be found in factories, where machines contain intelligence and controls but have traditionally been limited in connectivity and compute power. The IoT Edge development provided the demand for this research into the infrastructure and operationalization of Edge Intelligence; in this case there is a need for security and filtering of data (Hewlett Packard Enterprise, 2017).

The distinction between OT and IoT is made because it has an influence on the way Edge Computing is used and how the infrastructure develops. The size of an Edge device could vary from a microcomputer to the size of a server stack. OT Edges are often more powerful and larger in size, whereas the IoT Edge takes on a smaller form, comparable to a node.
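As a minimal illustration of the definition above (a sketch, not part of the thesis): an edge node can analyze and filter raw sensor readings locally and forward only a compact summary to the server, which is how Edge Computing reduces bandwidth and latency. The function name, threshold, and payload fields here are invented for the example:

```python
def summarize_at_edge(readings: list[float], alert_threshold: float) -> dict:
    """Aggregate raw sensor readings at the edge so that only a small
    summary payload, not the full data stream, is sent upstream."""
    alerts = [r for r in readings if r > alert_threshold]
    return {
        "count": len(readings),
        "mean": sum(readings) / len(readings) if readings else None,
        "max": max(readings, default=None),
        "alerts": alerts,  # only out-of-range values are forwarded in full
    }

# Many raw readings reduce to one small payload for the network:
payload = summarize_at_edge([20.1, 20.3, 87.5, 20.2], alert_threshold=80.0)
print(payload["count"], payload["alerts"])
```

The design choice mirrors the thesis definition: data is 'analyzed, formatted and translated at the edge of the network before being sent to a server', so intelligence sits closer to the data source.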


3. Research methods

The thesis project is deductive empirical research. As written in the theory section, the assumption is made that Edge Intelligence is the next step or generation after the centralized era of Cloud. This section describes the type of research that was conducted for this paper and the methods that were used to answer the RQ and the sub questions from the theory section.

3.1 Literature review

To provide evidence, a content analysis of existing theories was made. The literature was studied using the methods of Kitchenham (2004). Similar studies on Critical Success Factors used this method to gather data. Kitchenham describes a systematic literature review in which the process of searching, evaluating and interpreting is explained in three stages: review planning, review execution and result reporting (Kitchenham, 2004).

Firstly, the review planning was made. This planning was based on the RQ. It also provided the basis for the search terms with which the papers were searched and selected. The papers were categorized based on their perspective on past, present and future information technologies. The past perspective consisted of the search terms 'Technology wave', 'Industry 4.0', 'History of Computing – Internet – IT', and 'Moore's Law'. The second, current category of literature was searched on Cloud Computing, with extra terms such as 'Cloud Storage' and 'Cloud Manufacturing'. The third category was a search for visions on Edge: 'Edge Intelligence', 'Edge Computing', 'Edge Cloud' and 'Mobile Edge'. The focus lay on academic and practical publications about visions and trends in IT.

Step two was the review execution. The literature was gathered from the electronic databases of the UvA, Business Source Premier and Google Scholar. When reviewing the papers, the focus was on the citations and the journal in which the paper was published. The journals were Elsevier and IEEE, with publication years 1999-2018. This timeframe was chosen since IoT emerged during this period; however, to frame the past criteria, some relevant research papers from previous years were included (see table 1).

Because of the newness of the research topic, part of the papers was selected based on the snowball method. Initially, a set of papers was selected on the design types historical control or cross-sectional studies, with the search terms Technology Waves, Cloud and Edge. During the review, new terms emerged that were not considered in the first draft; however, they were significantly present in multiple papers. Thus, references were used to select relevant and already cited literature about these topics. The snowball analysis was an iterative process to gain the most relevant papers on the topic. Table 1 shows the 22 papers eventually used for the systematic literature review (SLR).

Lastly, the result was a literature-based comparison between the paper groups and Critical Success Factors. These were marked with the aim of providing a framework for the conceptual model (appendix A). The advantage of this review is that it provides information about a phenomenon across a wide range of studies exploiting different methods. Hence, the consistency of results could provide evidence that the phenomenon is valid and reliable.

Table 1: Papers used for systematic literature review

Author and year | Title | Journal | Method

Past
J.E. Smith (1990) | Future General-Purpose Supercomputer Architectures | IEEE | SLR
D.D. Clark et al. (2005) | Tussle in Cyberspace: Defining Tomorrow's Internet | IEEE | Snowball method
B.M. Leiner et al. (2009) | A Brief History of the Internet | IEEE | SLR
M. Brettel (2014) | How Virtualization, Decentralization and Network Building Change the Manufacturing Landscape: An Industry 4.0 Perspective | International Journal of ICE | Snowball method
R. Schaller (1997) | Moore's Law: Past, Present and Future | IEEE | SLR
R. Al-Zaidi et al. (2018) | Building Novel VHF-Based Wireless Sensor Networks for the Internet of Marine Things | IEEE | SLR
N. Chapin et al. (2001) | Types of Software Evolution and Software Maintenance | J. Softw. Maint. Evol. | Snowball method

Present
M. Armbrust et al. (2010) | A View of Cloud Computing: Clearing the Clouds Away from the True Potential and Obstacles Posed by This Computing Capability | ACM | SLR
M. Wang et al. (2012) | Cloud Manufacturing: Needs, Concept and Architecture | IEEE | SLR
J. Morgan et al. (2015) | Enabling a Ubiquitous and Cloud Manufacturing Foundation with Field-Level Service-Oriented Architecture | International Journal of CIM | Snowball method
A. Ahmed et al. (2017) | Mobile Edge Computing: Opportunities, Solutions, and Challenges | Elsevier | SLR
W. Shi et al. (2016) | Edge Computing: Vision and Challenges | IEEE | SLR
H. Chang et al. (2014) | Bringing the Cloud to the Edge | IEEE | SLR
R. Roman (2016) | Mobile Edge Computing, Fog et al.: A Survey and Analysis of Security Threats and Challenges | Elsevier | Snowball method
D. Liao (2018) | Energy-Efficient Virtual Content Distribution Network Provisioning in Cloud-Based Data Centers | Elsevier | Snowball method

Future
K. Wang et al. (2016) | Green Industrial Internet of Things Architecture: An Energy-Efficient Perspective | IEEE | Snowball method
M. Hermann et al. (2016) | Design Principles for Industry 4.0 Scenarios | IEEE | SLR
X.F. Liu et al. (2017) | Cyber-Physical Manufacturing Cloud: Architecture, Virtualization, Communication and Testbed | Elsevier | Snowball method
A. Meloni et al. (2018) | Cloud-Based IoT Solution for State Estimation in Smart Grids: Exploiting Virtualization and Edge-Intelligence Technologies | Elsevier | SLR
S. Islam et al. (2010) | Network Edge Intelligence for the Emerging Next-Generation Internet | Future Internet | Snowball method
M. Satyanarayanan (2017) | The Emergence of Edge Computing | IEEE | SLR
K. Bilal et al. (2017) | Potentials, Trends, and Prospects in Edge Technologies: Fog, Cloudlet, Mobile Edge and Micro Datacenters | Elsevier | SLR
K. Kaur (2018) | Edge Computing in the Industrial Internet of Things Environment: Software-Defined-Networks-Based Edge-Cloud Interplay | IEEE | Snowball method
F. Bonomi (2012) | Fog Computing and Its Role in the Internet of Things | Cisco Whitepaper | Snowball method


3.2 Interviews

The research is qualitative in nature. The theories derived from the literature were evaluated in 8 semi-structured interviews with experts in the field of IT development. This led to a total of 30 sources. The interviews served to test the criteria, characteristics and components of the concepts.

The research was conducted in cooperation with Avanade, whose consultants provide advice and services for software development with a focus on Microsoft Cloud and Mobile technology. The cooperation with Avanade provided the opportunity to perform expert interviews inside the organization. The interviews were held to gather information that could support the model in the direction of infrastructure development.

The choice was made to perform semi-structured interviews with people who have knowledge about IoT infrastructure, IoT networks, IoT software, Cloud, and Edge Intelligence. The interviews were deliberately held with people who have different areas of expertise, since their perspectives and the interview structure create a possibility for new ideas to be brought up. The interviewees were partly from technological departments (Infrastructure, Innovation and Analytics) and partly focused on business and external roles (Sales, Business Development and Consultancy).

The CSFs and challenges were presented to the interviewees with the aim of getting solutions based on their scientific or practical knowledge. Two interviews were held with external parties (Accenture and HPE) in an attempt to put the research in perspective. Saldaña declares that every qualitative study is unique and that every approach is different. Most of the coding methods he profiled overlap and can be used together. However, the process is generally the same (Saldaña, 2009). The interviews were transcribed and, in the first cycle, deductively coded. The framework and transcripts of these interviews can be found in appendices B and C. In the coding process, CSFs found during the SLR were searched for in the interviewees' answers. These factors provided a coding framework for structural coding within the descriptive codes (technological and organizational).

In the second cycle, evaluation coding was used, following from the evaluative nature of the interviews; this type of coding was supplemented by magnitude coding and values coding to explore the participants' values and beliefs about the topics discussed.

4. Results

During the evaluation of the literature it became clear that most of the research papers were dedicated to predicting the future. However, to analyze these predictions it was important to make a forecast based on data from the industry's past. With this data a trend could be defined. Most of the papers divide these trends into hardware-defined eras. These eras are labeled: 'the mainframe era, the PC era, the Internet era, and the post-PC era' (O'Regan, 2008). There has been discussion about the term post-PC. However, the prediction of MIT scientist David Clark is quite close to reality, so for the sake of consistency this term is used for the Cloud network architecture and ubiquitous computing (Clark, Sollins, Wroclawski, & Braden, 2005).

The eras together could be called the evolution of IT. During this evolution, different Critical Success Factors were mentioned in the analyzed papers. In this section these are summed up and compared with the interview results to determine which infrastructure developments lead to the emergence and operationalization of Edge Computing.

4.1 Critical Success Factors from the literature

The papers used to define Critical Success Factors were categorized into three segments: past, current, and future. Throughout these papers, the most common drivers for emerging IT developments were extracted with the purpose of identifying and analyzing CSFs. In Table 2 the CSFs are summed up and ranked by the number of times they are mentioned.

Table 2: Critical Success Factors from literature

| Technological CSFs | Occurrences | Organizational CSFs | Occurrences |
|---|---|---|---|
| Bandwidth | 14 | Communication | 9 |
| Connectivity | 13 | Scalability | 9 |
| Latency | 12 | Privacy | 8 |
| IoT compliance | 11 | Security | 8 |
| Mobile compliance | 8 | Cost | 7 |
| Realtime | 7 | Energy efficient | 7 |
| Response time | 7 | User experience (expectation) | 5 |
| Storage | 6 | Geographical distribution | 4 |
| Virtualization | 5 | Resource sharing | 3 |
| Network access | 5 | Growth in number of devices | 3 |
| Data growth (quantity of data) | 4 | Policy | 3 |
| Heterogeneity | 4 | Performance | 3 |
| Computing power | 4 | Continuity | 2 |
| Machine-machine collaboration | 3 | Trust | 2 |
| Hardware development | 1 | Flexibility | 2 |
| Power distribution and processing | 1 | Reliability | 2 |
| Resiliency | 1 | Efficiency | 1 |

According to Ahmed et al.: 'Non-functional requirements resembling Organization and Technology could be used as a basis for the architecture and to define progress direction' (Ahmed & Rehmani, 2016). Therefore, these requirements were the main concepts used to classify the CSFs. This division is shown in Table 2.

No CSF was mentioned in all the papers, nor did any have a convincing majority. A reason could be that during the SLR the papers were categorized within a scoped timespan. It also became apparent that most of the research focused on work in the fields of manufacturing, energy and utilities, transportation, defense, navy, and telecommunications. Organizations are willing to use new IT configurations like Edge for various purposes, for example to create smarter buildings, cities, workspaces, retail experiences, factory floors, and more. However, in different sectors, different factors may be considered important for success.
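The ranking in Table 2 boils down to a frequency tally over coded sources: each paper contributes at most one count per CSF, and the CSFs are then sorted by occurrence. A minimal sketch of that counting step is shown below; the example codes are illustrative, not the actual SLR data.

```python
from collections import Counter

# Hypothetical coded sources: each paper is tagged with the set of CSFs it
# mentions. These example tags are illustrative, not the thesis data.
coded_papers = [
    {"Bandwidth", "Latency", "Cost"},
    {"Bandwidth", "Connectivity", "Scalability"},
    {"Latency", "Bandwidth", "Privacy"},
]

# A CSF is counted once per paper that mentions it, then ranked by occurrences.
occurrences = Counter()
for tags in coded_papers:
    occurrences.update(tags)

for csf, count in occurrences.most_common():
    print(f"{csf}: {count}")
```

Using sets per paper ensures a CSF mentioned several times in one source still counts once, which matches an occurrence-per-paper ranking.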

4.2 Results from interviews

The interviews aimed to validate the theory with experts; they were not intended to be generalizable. Initially, all paradigms were covered. After the first set of interviews, some references were given that led to the opportunity to reach all relevant people and acquire information saturation on the subject. Of the 30 sources, 8 were interviews and 22 were literature. To compare them, both data sources needed to be brought into equilibrium. Organizational CSFs and Technological CSFs were compared separately, since Moore stressed the difference between those factors.

Figure 4 shows that, according to the interviews, the most important Organizational factors for Edge and other IT developments were cost, growth in devices, and scalability. These were causal factors that led to a demand from the business.

In the literature, the important factors were communication, scalability, and privacy. Articles about IT developments focused on the challenges from the business that Edge Computing would overcome. The exact percentages per factor are depicted in appendix D.

In the second cycle analysis of the interviews, magnitude codes were applied. Participants used 'often' and 'a lot' when talking about trust, data gathering, and money. According to these codes, they value these factors more highly or see them as having a larger influence.


Figure 4: Comparison Organizational CSFs

Figure 5 depicts the Technological factors for Edge. It is striking that the differences between interviews and literature are greater than for the Organizational factors. According to the literature, the important factors in Figure 5 were bandwidth, connectivity, and latency. In the interviews these factors were often brought up as well; however, data growth, hardware development, and IoT compliance were mentioned the most. When magnitude codes were applied, 'none' and 'little' were used more in this context. It was emphasized that there is little knowledge in organizations, and that the most important factor was network access, to prevent connectivity loss.

Figure 5: Comparison Technological CSFs

4.3 Possible applications

Both in the literature and in the interviews, the term 'Edge' is used in many ways. Still, the common paradigm was that Edge is where 'the things are' in the IoT, or an intelligent buffer for heavy machinery. Common use cases were the manufacturing floor, smart buildings/cities, power plants, oil rigs, cars, and ships.


The importance of cost and energy efficiency, real-time behavior, and response time were frequently mentioned for both applications. Figure 6 depicts the architectural variables necessary for the operationalization of Edge Computing in any form, though some may be more important than others, depending on the use case. The architecture covers the CSFs with the security and protocol layers, virtualization, computing power, and storage.

Figure 6: Edge Architecture

To connect the applications with the most important architectural elements, a division is once again made between OT and IoT.

IoT Edge: To collect and process data from IoT devices, Edge Computing is often better suited than the Cloud. Edge Computing can be accomplished using gateway networking devices. Meola pointed out some key benefits, including 'near real-time analysis of data, lower costs related to operations and data management, reduced data sent back to the Cloud, and the assurance that other IT assets will remain operational even when one device malfunctions' (Meola, 2016). This view was very similar to the understanding of the interviewees. In addition, they mentioned security and protocol translation as a supplement of Edge for IoT. The architectural elements that are most important are the protocol and security layers and the network.
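As an illustration of this gateway role, the sketch below pre-processes sensor readings locally and forwards only a compact summary instead of the raw stream, so that bandwidth is reduced and simple alerts are decided at the Edge. The function name, payload fields, and threshold are hypothetical assumptions for illustration, not part of any vendor gateway API.

```python
# Minimal sketch of an IoT edge gateway: pre-process locally, forward a summary.
# All names and thresholds here are illustrative assumptions.

def summarize_readings(readings, alert_threshold=80.0):
    """Aggregate raw sensor readings into one compact payload for the cloud."""
    valid = [r for r in readings if r is not None]  # drop failed reads
    if not valid:
        return None                                 # nothing worth sending
    return {
        "count": len(valid),
        "mean": sum(valid) / len(valid),
        "max": max(valid),
        "alert": max(valid) > alert_threshold,      # decided at the edge
    }

batch = [72.5, None, 75.0, 91.2]                    # e.g. temperature samples
payload = summarize_readings(batch)
print(payload)
```

In this sketch the gateway sends one small message per batch rather than every sample, which mirrors the 'reduced data sent back to the Cloud' benefit Meola describes.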

OT Edge: Another application is more focused on Operational Technology. Chang et al. propose a development that integrates Edge and Cloud, something Microsoft could also implement, according to the interviewees at Avanade. The Edge Cloud is a hybrid Cloud architecture with micro datacenters, designed to deliver low-latency, bandwidth-efficient, and resilient end-user services.

Chang et al. pose that: 'The Edge Cloud is designed to extend the data Cloud all the way to the end user by leveraging user and operator contributed compute nodes at the Edge' (Chang, Hari, Mukherjee, & Lakshman, 2014). The OT Edge facilitates a split Cloud application by interconnecting Edge networks from the micro datacenter with networks of a public Cloud. This enables latency-sensitive computation and user-interaction components close to end users, while additional heavyweight processing and database components are buffered and later deployed in the Cloud.

This phenomenon occurs often with customers who are dubious of new technologies and do not want to be the first to take the leap. These users often suffer from the law discovered by Jan Romein, freely translated as the 'dialectics of lead' (Romein, 1937). This law explains that organizations are stuck with monolithic legacy systems that need to be integrated with state-of-the-art developments; however, this move also means a change in the whole organizational structure and is not easily implemented. According to the interviewees, this occurs a lot in practice as well. For this application, the hardware platform and virtualization need to be solid, as do computing power and storage.

4.4 Challenges

The concept of the distributed Edge raises issues concerning both IoT and OT. The challenges that generally need to be overcome for IT developments are to:

1. Minimize latency,
2. Reduce bandwidth,
3. Lower cost,
4. Be scalable,
5. Reduce threats,
6. Avoid duplication,
7. Improve reliability, and
8. Maintain compliance.

These are the CSFs with which a new technology needs to comply to make it a success. These factors are demands from the business. Although some articles pose that Edge is a disruptive development, interviewees argued that IT develops incrementally: the need from the business ensures the breakthrough in the hype cycle, and 'the chasm' is crossed (Moore, 2014).

Credibility and trust from the business are drivers for the central and decentral movement over time as well. It is a surge of power between the users and vendors of technology. Figure 7 illustrates the trend curve as explained in the theory section, where each era has its dominant infrastructure. The figure also depicts the current place of IT developments.

Infrastructure developments involve software, hardware, and network developments, as they complement one another and move from centralized to decentralized. However, another factor influences IT developments: the open source way of developing and the decoupling of software from hardware. These were not mentioned in the reviewed literature; nonetheless, they pose a challenge for the predictability of developments because of their fragmented form.


5. Conclusions

In the beginning of the computing revolution, devices were solely used for mathematical calculations. Every decade has brought different issues and challenges, and at the same time organizations find uses for new technologies. As seen in the past, computers began to require more storage and power, and infrastructure changed from centralized to decentralized and vice versa. Table 3 summarizes the infrastructural developments throughout history in hardware, software, network, and business that led to the emergence and operationalization of Edge Computing.

Table 3: Summary of IT developments per era

| Era | Hardware | Software | Network | Business |
|---|---|---|---|---|
| 1950-1960 | First business computers | Machine language | Centralized | Acceleration in mathematical calculations |
| 1960-1970 | Supercomputer and mainframe | Low-level machine language | Centralized | Administration processes |
| 1970-1980 | Minicomputers | High-level programming language | Decentralized | Process control in industry |
| 1980-1990 | Personal computer | Domain-specific programming language | Decentralized | Commercialization |
| 1990-2000 | Mobile devices | WWW | Decentralized | Digitalization |
| 2000-2010 | Smart devices / public data centers | Cloud | Centralized | Internet economy |
| 2010-2020 | Virtual machines | Open source | Centralized | Network economy |
| 2020-2030 | Sensors | IoT | Decentralized | Knowledge economy |

Edge Computing is a logical continuation of the moving trend towards decentralized architectures. It meets all criteria from the business and therefore could, in theory, be the new computing typology for the next era. These criteria (latency, bandwidth, cost, scalability, security, simplicity, reliability, and compliance) are variables that predict success based on historical events. When a technology no longer meets one of the 8 dimensions, the need for a new IT development arises.

The foremost infrastructural development that led to the emergence and operationalization of Edge Computing was the growth in the number of devices. Digitalization led to the need for gateways, routers, and micro data centers closer to the mobile devices or, for example, in factories, depending on the use. This is explicitly mentioned because there are two directions for Edge, namely IoT and OT.

However, Edge will not replace the Cloud architecture, since the CSFs 'computing power' and 'storage' could not be reached with Edge Computing alone. This brings up a second infrastructural development: the decoupling of software. This paradigm contradicts the trend in such a way that, instead of choosing one development strategy, the decoupling of software and hardware leads to a more fragmented model of Cloud, servers on premise, and Edge. A division could be made between critical and non-critical implementations, whereby applications that demand real-time decisions or smart filtering are processed at the Edge, while storage and heavy computing are provided by the Cloud. In the end, all computing typologies need to comply with the 8 dimensions because of their common drive to reach maximal customer experience.
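The critical/non-critical split described above can be sketched as a simple dispatch rule: latency-sensitive or filtering work stays at the Edge, while storage-heavy work goes to the Cloud. The task attributes and function below are hypothetical illustrations, not from the thesis or any framework.

```python
# Sketch of the critical vs. non-critical split: real-time decisions and smart
# filtering run at the edge; storage and heavy computing run in the cloud.
# Task attributes are illustrative assumptions.

def route(task):
    """Decide where a task runs based on its latency demand."""
    if task.get("real_time") or task.get("smart_filter"):
        return "edge"   # critical: real-time decisions, smart filtering
    return "cloud"      # non-critical: storage, heavyweight analytics

tasks = [
    {"name": "emergency-stop", "real_time": True},
    {"name": "monthly-report"},
    {"name": "sensor-noise-filter", "smart_filter": True},
]
for task in tasks:
    print(task["name"], "->", route(task))
```

Even this toy rule shows why the architecture stays fragmented: both destinations remain necessary, and only the placement of each workload changes.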

Cloud benefitted from this development because it paved the way for open source. Open source provides a standardized infrastructure that meets demands in computational and lifecycle management at lower cost. For Edge, the need for standardization and open source is even greater: at the Edge, vendor-specific solutions need to interoperate. This becomes more and more important considering the growing number of devices and the fragmentation.


6. Discussion

This research was largely based on predictive literature. An attempt was made to use historical sources during the SLR. This led to a varied palette of Critical Success Factors. The interviews were used to validate these factors, not to find new ones from practice. All interviewees from Avanade were Microsoft oriented, which could pose a risk of bias. To attenuate this bias, two employees of other IT consultancy organizations (HPE and Accenture) were interviewed; however, the focus in all organizations lay on the IoT driver for Edge. The other stream was the Operational Technology Edge. It may be interesting to explore this stream in future research, and to compare whether one sector grows or develops faster than the other. In addition, interviews with employees of other companies in the sector and a larger sample size could give a more representative result, which would make comparison easier.

Previous research posed the implication that Edge would replace Cloud as the dominant computing typology, although in the interviews all participants agreed that Edge could not work without the Cloud. Edge can function as a buffer; however, data still needs to 'land' on a scalable server (Cloud) and requires the functionality and capacity of the Cloud in terms of computing power. Further research on the replacement of Cloud could be conducted once Edge is in its majority phase.

During the research it also became apparent that the fluctuation from centralized to decentralized is still distinguishable; however, there is a movement towards a fragmented structure alongside this main trend. By using the processing power of IoT devices, Edge Computing applications can pre-process, filter, score, and aggregate IoT data, while also using the flexibility of Cloud services to run complicated analyses on that data. How this works in practice is another interesting research topic that may be further explored.

Lastly, the key architectural goal for Edge is to have functionality in isolation, in the form of services that can be deployed anywhere. This isolation can be achieved through self-contained microservices. Such deployment requires a runtime environment that is the same for every location. To achieve this, interviewees explained the concept of containerization (an operating-system abstraction), whereas in the literature the use of virtual machines (a hardware abstraction) was preferred. Both concepts could be topics for future research.


7. References

Ahmed, E., & Rehmani, M. H. (2016). Mobile Edge Computing: Opportunities, Solutions, and Challenges. Elsevier, 59-63.

Armbrust, M., Fox, A., Griffith, R., Joseph, A., Katz, R., Konwinski, A., . . . Zaharia, M. (2010). A View of Cloud Computing: Clearing the clouds away from the true potential and obstacles posed by this computing capability. Communications of the ACM, 50-58.

Ashton, K. (2009). That Internet of Things thing. RFID Journal, 97-114.

Barik, R., Dubey, H., & Mankodiya, K. (2017). SoA-Fog: Secure Service-Orientated Edge Computing Architecture for Smart Health Big Data Analytics. IEEE, 15.

Boogaard, A. v., & Alberts, G. (2008). De Geschiedschrijving van computers en computergebruik in Nederland. Studium, 89-100.

Boynton, A., & Zmud, R. (1984). An Assessment of Critical Success Factors. Sloan Management Review, 17-27.

Brettel, M., Friederichsen, N., Keller, M., & Rosenberg, M. (2014). Virtualization, Decentralization and Network Building Change the Manufacturing Landscape: An Industry 4.0 Perspective. International Journal of Information and Communication Engineering, 37-44.

Business Dictionary. (2018, May 15). Information System. Retrieved from Business Dictionary: http://www.businessdictionary.com/definition/information-system.html

Buyukkoc, C. (2016, March). Edge Definition and how it fits with 5G era networks. Retrieved from IEEE Software Defined Networks: https://sdn.ieee.org/newsletter/march-2016/edge-definition-and-how-it-fits-with-5g-era-networks

Ceruzzi, P. (1998). A History of Modern Computing. Massachusetts: MIT Press.

Chang, H., Hari, A., Mukherjee, A., & Lakshman, T. (2014). Bringing the Cloud to the Edge. IEEE, 346-351.

Clark, D., Sollins, K., Wroclawski, J., & Braden, R. (2005). Tussle in Cyberspace: Defining Tomorrow's Internet. IEEE, 462-475.

Evans, D. (2011). The Internet of Things: How the Next Evolution of the Internet is Changing Everything. Cisco White Paper, 1-11.

Press, G. (2013, April 8). A Very Short History of Information Technology. Retrieved from Forbes: https://www.forbes.com/sites/gilpress/2013/04/08/a-very-short-history-of-information-technology-it/#5bd63802440b

Hally, M. (2005). Electronic brains: Stories from the dawn of the Computer Age. London: BBC and Granta Books.


Hewlett Packard Enterprise. (2017, May 23). The Intelligent Edge: What it is, what it's not and why it's useful. Retrieved from Enterprise.nxt: https://www.hpe.com/us/en/insights/articles/the-intelligent-edge-what-it-is-what-its-not-and-why-its-useful-1704.html

Humbert, M. (2007). Technology and Workforce: Comparison between the Information Revolution and the Industrial Revolution. Berkeley: University of California, School of Information.

Kitchenham, B. (2004). Procedures for Performing Systematic Reviews. Keele: Keele University, Department of Computer Science.

Kurose, J., & Ross, K. (2013). Computer Networking: A Top-Down Approach. Brooklyn: Pearson.

Langston, M. (2009, October 25). DoD can't get to wave 3 using wave 2 processes. Retrieved from Smart Future: http://smart-future.org/wave-3-dilemma/

Martin, D. (2008, June 29). David Caminer, 92, Dies; A Pioneer in Computers. New York Times, p. 24.

Meola, A. (2016, December 20). The roles of cloud computing and fog computing in the Internet of Things revolution. Retrieved from Business Insider: http://www.businessinsider.com/internet-of-things-cloud-computing-2016-10?international=true&r=US&IR=T

Miller, R. (2017, August 3). Push the cloud to the fringe. Retrieved from TechCrunch: https://techcrunch.com/2017/08/03/edge-computing-could-push-the-cloud-to-the-fringe

Moore, G. A. (2014). Crossing the Chasm. New York: Harper Business.

Northbridge Venture Partners. (2016, December 13). Industry's largest cloud survey reveals cloud momentum driving enterprise to 're-orchestrate' strategy. Retrieved from North Bridge: http://www.northbridge.com/

O'Regan, G. (2008). A Brief History of Computing. London: Springer-Verlag.

Mell, P., & Grance, T. (2011). The NIST working definition of cloud computing. NIST Special Publication, 145-153.

Pallis, G. (2010). Cloud computing: the new frontier of internet computing. IEEE Internet Computing, 70-73.

Panetta, K. (2017, October 3). Gartner Top 10 Strategic Technology Trends for 2018. Retrieved from Gartner: https://www.gartner.com/smarterwithgartner/gartner-top-10-strategic-technology-trends-for-2018/


Romein, J. (1937). De Dialektiek van de Vooruitgang. Het onvoltooid verleden: Kultuurhistorische studies, 9-64.

Saldaña, J. (2009). The Coding Manual for Qualitative Researchers. London: SAGE Publications Ltd.

Satyanarayanan, M. (2017). The Emergence of Edge Computing. Computer, 30-39.

Schaller, R. (1997). Moore's Law: past, present and future. IEEE, 52-59.

Shi, W., & Dustdar, S. (2016). The Promise of Edge Computing. IEEE, 78-81.

Shi, W., Cao, J., Zhang, Q., Li, Y., & Xu, L. (2016). Edge Computing: Vision and Challenges. IEEE, 637-646.

Smith, M. (2013). The Natural Advantage of Nations: Business Opportunities, Innovation and Governance in the 21st Century. London: Earthscan.

The European Graduate School. (2014, January 7). Alvin Toffler: Biography. Retrieved from The European Graduate School: https://web.archive.org/web/20140107225236/http://www.egs.edu/library/alvin-toffler/biography/

Toffler, A. (1980). The Third Wave. New York: Bantam Books.

Toffler, A. (1998, March 5). Life Matters. (N. Swan, Interviewer)


Appendix A – Conceptual model IT development

The research assumes that the historical trend provides new technologies. These influence hardware, software, and network developments. However, there is another driver for these developments, coming from the business. A breakthrough occurs when the technology meets the Critical Success Factors posed by the business.


Appendix B – Interview Framework

Name:
Function:
Date:

It was chosen to perform semi-structured interviews. This framework consists of the themes to be explored with the interviewee. New ideas may be brought up during the interview. The specific topic of focus is the development of IT towards Edge Computing. The CSFs and the challenges they bring are presented to the interviewee with the aim of getting some solutions based on their scientific or practical knowledge. The interviews are also used to put the research in perspective.

General questions:

To frame the paradigm of the interviewee, could you describe your function?

What do you think, from your perspective, could be the new IT developments in 2020 and 2030?

What would be a good technological and business approach to deal with this emerging IT development?

What would the organizational architecture look like in 2020 and beyond? Do you think IT components will be centralized or distributed in the future?

Introduce Edge:

Are you familiar with Edge Intelligence? How did you find out about it?

(Edge is seen in literature as the new computing typology. Instead of running configurations in the Cloud, it is decided to build the components as applications, platform and infrastructure on the Edge of the network, closer to the devices which are considered data sources. There are three broad categories of Intelligent Edge: 'operational technology (OT) Edges, IoT Edges, and information technology (IT) Edges'.)

Could Edge Intelligence be considered a new development, or is another IT development emerging?

What is needed for (Edge) deployment? What would an (Edge) infrastructure look like?

Introduce CSFs:

Could you tell me some challenges for the upcoming IT developments? How reasonable are these challenges and are these easy to solve?

(Purpose of this question is to compare challenges from literature and practice. CSFs from literature: minimize latency, reduce bandwidth, lower cost, reduce threats, avoid duplication, improve reliability, and maintain compliance.)

What do you think will be CSFs for Edge Intelligence or another architecture? How difficult is this to realize?


Appendix C – Interview Transcripts

The interviews were conducted in Dutch. During the interviews, examples and customers were mentioned; these were left out for privacy reasons. The interviews were coded based on the CSFs. These were translated into Dutch as well and color coded: Technological factors are blue and Organizational factors are orange.

Table 4: Dutch translations of the CSFs

| Technological CSFs (Dutch) | Organizational CSFs (Dutch) |
|---|---|
| Bandbreedte | Communicatie |
| Connectiviteit | Schaalbaarheid |
| Wachttijd | Privacy |
| IoT | Veiligheid |
| Mobile | Kosten |
| Realtime | Energy efficientie |
| Response time | Klantverwachting en gebruikservaring |
| Opslag | Geografische verdeling |
| Virtualisatie | Resource Sharing/Kennis delen |
| Netwerk | Hoeveelheid devices |
| Data | Beleid |
| Heterogeniteit | Prestatie |
| Rekenkracht | Continuiteit |
| Machine-machine samenwerking | Vertrouwen |
| Hardware ontwikkeling | Flexibiliteit |
| Krachtverdeling en verplaatsing | Betrouwbaarheid |
| Veerkracht | Efficientie |

Interview I

Name: L. van Veen

Function: Pre-Sales Hardware, HPE
Date: 8-5-2018

I: How can I best describe your function?

L: I work in a pre-sales organization that is attached to the sales organization. A sales-oriented role in which I talk with customers to map out their needs clearly and identify which HPE solution fits. Focused on technology, but with a commercial element as well.

I: Is that mainly hardware?

L: Yes, nowadays it is. We used to offer software solutions as well, but that has been split off. But hardware and software go together. With a focus on SAP customers, it is not enough to just sell servers; you have to know something about the software landscape and how they will manage it. You need knowledge of software to sell hardware; the other way around is less common.

I: From your perspective, what do you think the new IT developments could be in 2020 and 2030?

L: That is a very broad question. In terms of storage, data processing, and so on, where does the demand lie?

Broadly speaking, what organizations want to do can be summarized as gaining insight. Software companies such as SAP and Microsoft are focused on gaining insight into the business (data-driven enterprise, computational-driven enterprise).

Computational: calculating with data. You do not hear the term much, but it builds on data collection.

I: So would you say the current situation is collecting as much data as possible and making everything that can be made transparent transparent, including with IT solutions?

L: Yes, a lot is being invested in that. Also many investments in applications such as Tableau and SAS. Many investments in ML and AI, but that is mainly marketing driven. Companies really struggle with this, though. They do not have their old IT in order yet.

I: So 2030 is still too far away?

L: Hard to say. Cloud in particular is moving very fast. Billions in investments, also from Microsoft, Amazon, Google, and smaller parties focusing on the software ecosystem.

L: The problem with organizations is that they often still do all the hardware and software themselves. They have different guidelines. In the Cloud there is a framework, and services are offered that are fairly easy to obtain, instead of doing all the management yourself. Many deals now take a year and a half, whereas this process used to take less than three months. There are so many options: Cloud, dedicated servers, hosts for specific programs. Each option also has its own specializations.

I: Why is it so difficult to choose a standard package?

L: A standard package does exist, but it is mainly interesting for new companies. They buy a package and adapt the company to it. But existing companies (15 years and older) often still have a traditional way of operating and many different stakeholders and suppliers. Connecting to the real world is difficult; just when you think you have IT in order, everything changes again, interfaces for example. Millions (in costs) are tied up in this. It causes hesitation in making decisions about applications. Rapid changes in code also cause obsolescence.

SAP once introduced a programming language, so customers based interfaces on it, but if the vendor suddenly stops supporting it, all those devices stop working as well. The complexity lies in not being flexible or universal.

L: Cloud can enable things in terms of IT management of applications and their relations with one another. You do not have to understand the entire process. Customers find it attractive to outsource this, but it costs money as well.

I: Is there a technological or organizational approach that you could adopt as a customer or as a provider to respond to the future?

L: I do not think so, because there are too many. And every choice has consequences. Open source is on the rise, though. Many companies are now looking at possibilities in the open source world. If you want to deploy an infrastructure base, there is OpenStack, a software environment for managing infrastructure. No domain knowledge needed. It is fairly safe to invest in.

I: So more is being shared?

L: Yes.

I: In theory, the Cloud is often described as being too far removed from the customer, so now the intelligent Edge is promoted as a decentralized model. What do you think of that?

L: The distance is an infrastructural problem.

I: Are there ever questions about this from customers?

L: Certainly, this is a very topical issue. This term is also being hyped. Two years ago we started with IoT, and that was very difficult. IoT resides at the Edge. Put bluntly, the Edge is everything that is not in the Cloud. The mobile phone, for example. Collecting data is relatively simple because a lot is written to servers; think of backups and the like.

When something becomes more real-time (processes with machines, deliveries), speed becomes a more critical factor. When a photo has to be uploaded to the Cloud to be analyzed, it takes far too long. For machine learning in particular, the Edge is programmed by means of the infrastructure and data centers. You provide a blueprint of the real world: what to pay attention to (coded neural networks) is computed at the Edge.

I: Is it a buffer?

L: Yes, a buffer for computations.

I: Where was it picked up first? By the customer or through technological development?

L: Hard to say. If you look at OT, many factories have multiple sensors in their machines. They measure voltage, time, and energy consumption. The customer's need to know what is happening in the factory has existed for a long time. But the technological developments in storage and computing power make this question topical again. Machine learning has existed since the 1960s, but computers were not powerful enough to perform these intensive computations. Computers are now much faster, thanks in part to graphics cards, and so old ideas resurface to see whether they are feasible now. Mainly predictive maintenance. The demand was there earlier, and now the technology is starting to be ready for it.

I: Is het een incrementele ontikkeling of disruptief? L: Dit is incrementeel.

I: Is Edge een nieuwe architectuur?

L: Ja, maar Edge is voornamelijk een nieuw vraagstuk bij veel bedrijven. Het borgen van deze infrastructuur is heel complex. Klanten gaan ook naar meerdere consultants. Je hebt de technologie om inzicht te verkrijgen. Maar je hebt te maken met verschillende partijen en privacy. Ook aansluitingen zijn er wel maar kunnen niet meer integreren in de huidige tijd. Flexibele integratie van IoT is opkomend.

I: Wat zou er nodig zijn om Edge architectuur uit te voeren voor de klant?

L: Het grootste struikelblok is de manier waarop het bedrijf zelf is ingericht. Er is niet vaak iemand verantwoordelijk vanuit het bedrijf. Om daar een project van te maken met buy in vanuit de business.

I: Is dit etisch of is er geen kennis?

L: Vooral het gebrek aan kennis. Omdat het nieuw is is het lastig om het beginpunt vast te stellen. En nog een heel belangrijk punt is het vraagstuk over wat het oplevert. Dit is nog niet altijd duidelijk.

I: In de literatuur wordt vaak beschreven dat bedrijven nog steeds in de transitiefase zitten naar de Cloud. Is dit in praktijk ook het geval?

L: Vooral kleine bedrijven hebben alles in de Cloud. Veel bedrijven zijn inderdaad nog niet echt Cloud ready, veel nog on premise. Het overzicht van de kosten en de locaties waar data

opgeslagen staat is vaak onduidelijk. Hiervoor komen wel nieuwe applicaties op om inzicht te krijgen in deze kosten. Maar dit is ook weer een extra product wat erbij komt.

I: IoT is veel meer gedecentraliseerd. Zijn bedrijven nog een centralisatiefase?

L: Het wordt nog niet breed aangepakt. Veel bedrijven kopen bij machines vaak wel meteen sensoren erbij om onderhoudszaken te kunnen voorspellen. Businessmodellen moeten ook worden aangepast, er worden geen producten verkocht maar een dienst.

I: In the Cloud you also pay for usage, right?

L: Yes, a pay-per-use model.

I: Can you skip the centralization phase? Do you need the Cloud for Edge?

L: You always need something to store data. Edge sends data, and it has to land somewhere; there has to be an infrastructure. This does not have to be in the Cloud. What matters is network connectivity and sufficient bandwidth. The advantage of the Cloud is that it offers a facility with many software applications to process this data. This can also be done on premise. It is a trade-off between cost and time: do you want to build it yourself, or buy a ready-made package?

I: To summarize, what are the challenges for these IT developments?

L: Integration is the biggest challenge from the technical side.

The data streams are there. The difficult part is getting them into one place, and that will remain the case for some time.

I: Isn't there a point at which there is too much data?

L: There are technologies to check whether data is duplicated. You can have a lot of data, but the more data, the more noise. From a technical perspective, too much data is also a problem. It can cost a great deal to process all the data.

I: How are the costs of processing data calculated?

L: Never really discussed with the client.

I: What are the factors?

L: For a data warehouse, for example, the total cost of ownership had to be calculated.

If you were to make a list, you would have Infrastructure (servers, storage, depreciation, maintenance, etc.), Storage (data center, power, access), People (administrators, analysts, IT experts), and Software costs (licenses and maintenance); those are the biggest factors. You can include Training if you do it properly. With that, the total adds up quickly. With the Cloud some of these items drop away (pay-per-use) and come together in one invoice. But the Cloud also has to be reachable 24/7. Edge can collect and buffer everything and upload it at a certain point in time. But is that what you want?
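The cost categories listed above can be sketched as a simple total-cost-of-ownership calculation. The category names and amounts below are illustrative assumptions, not figures from the interview:

```python
# Hypothetical total-cost-of-ownership (TCO) sketch for a data warehouse.
# All category names and yearly amounts are illustrative assumptions.
annual_costs = {
    "infrastructure": 120_000,   # servers, storage, depreciation, maintenance
    "storage_facility": 40_000,  # data center, power, physical access
    "people": 200_000,           # administrators, analysts, IT experts
    "software": 60_000,          # licenses and maintenance
    "training": 15_000,          # optional, but part of a complete picture
}

def tco(costs: dict[str, int], years: int) -> int:
    """Total cost of ownership over a number of years."""
    return sum(costs.values()) * years

print(tco(annual_costs, years=3))  # → 1305000
```

Under a pay-per-use Cloud model, several of these fixed categories would instead appear as a single variable invoice, which is exactly the trade-off described in the answer above.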

L: Edge means you do not have to share everything with the Cloud (the brain). IoT devices can make decisions autonomously, and this makes Edge efficient. Eventually a correlation with the bigger picture is still needed, so you cannot avoid data collection. The more you can do at the Edge, the more you save on bandwidth costs, and you can make real-time decisions. In addition, there always has to be communication, even if only for updates.
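The pattern described here, deciding locally and only uploading a buffered aggregate, can be sketched as follows. The class, thresholds, and decision rule are hypothetical illustrations, not part of any real edge platform API:

```python
# Minimal sketch of an edge device that decides locally and uploads
# only a buffered summary. All names and thresholds are assumptions.
from statistics import mean

class EdgeNode:
    def __init__(self, alarm_threshold: float, batch_size: int):
        self.alarm_threshold = alarm_threshold
        self.batch_size = batch_size
        self.buffer: list[float] = []

    def ingest(self, reading: float) -> str:
        """Decide locally; buffer data instead of streaming it upstream."""
        self.buffer.append(reading)
        if reading > self.alarm_threshold:
            return "act-now"      # real-time local decision, no Cloud round-trip
        if len(self.buffer) >= self.batch_size:
            self.upload_summary()
            return "uploaded"
        return "buffered"

    def upload_summary(self) -> None:
        # Send only an aggregate upstream, saving bandwidth.
        summary = {"count": len(self.buffer), "mean": mean(self.buffer)}
        print("upload:", summary)  # placeholder for a real network call
        self.buffer.clear()

node = EdgeNode(alarm_threshold=80.0, batch_size=3)
print([node.ingest(v) for v in [20.0, 95.0, 30.0, 40.0]])
```

Only the alarm triggers an immediate local action; everything else is aggregated before it is sent, which matches the bandwidth argument made in the interview.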

I: Are efficiency, low costs, and high bandwidth then critical success factors for the use of Edge for IoT devices? How can you measure that it works?

L: From a technology standpoint you can build almost anything. The Netherlands has very good network coverage. Costs are still a constraint; a project can fail on this, and costs determine acceptance. Privacy is also very much on the rise; it is mentioned often, but people cannot really do much with it yet. We often think we have to look at things from many different perspectives, but usually there is one narrow path that matters and the rest does not.

I: Could you say that companies have an outdated strategy that does not fit the IT developments?

L: They are mainly stuck with their legacy; they have to stay connected to their old systems.

I: Why?

L: Suppose you want to do something with sensors, and you have all kinds of machines that are connected and processes that are in place. This is based on old technology, and the new technology has to connect to it. You could throw everything away and start over, but then the cost question arises.

I: How long does an IoT implementation take for a company?

L: That depends on the integration, the business case, and the lack of knowledge. A great many IoT projects are mainly pilots. A lot is first developed standalone, so that it does not have to integrate; that is one possible strategy. Most projects fail on money and resources.

Interview II

Name: R. Moerman

Function: Cloud transformation and Sales lead, Avanade

Date: 17-5-2018

I: What is the description of your role?

R: Cloud transformation lead. That means I co-develop, promote, and help sell the solution in the area of Cloud. Cloud transformation covers everything that has to do with bringing applications and workloads to the Cloud, in whatever form; the Cloud can also be Azure Stack. Besides selling, it also involves initiating projects and holding the first architecture discussions. A lot of contact with clients.

I: From your perspective, what could the new IT developments be in 2020 and beyond?
