

Big (crisis) Data and Volunteers

Exploring OCHAs Crowdsourcing Advancements since Haiti 2010

Name: Emil Berlin

ID: s1752952

Date: 08/06/2017

Word Count: 23,756

Course: Master Thesis CSM

Supervised by: Dr. S.L. Kuipers


Table of Contents

Abstract .......... 5
Foreword .......... 6
List of Abbreviations .......... 7
1. Introduction .......... 8
1.1. Academic Problem .......... 8
1.2. Research Objective .......... 10
1.3. Research Question .......... 12
1.4. Academic and Societal Relevance .......... 12

2. Theoretical Framework……….………..14

2.1. Definitions and Concepts Used………...14

2.2. Diffusion factors by Rogers ……….…..16

2.3. Hypotheses ………...22

3. Methodology……….23

3.1. Research Design………..23

3.2. Why Single Case Study………...24

3.3. Data Collection………25

3.4. Operationalization………...25

3.5. Methodology: Process Tracing ………...26

3.6. Limitations………..27

4. Analysis……….29

4.1. Exploring OCHA’s Crowdsourcing Advancements Since Haiti 2010………...29

4.1.1. Background of Big Data in Humanitarian Interventions (case examples)…….29

4.1.2. Characteristics and Overview of various V&TCs………...36

4.1.3. Recap of Research Puzzle………...39

4.2. Factors Applied………...39


4.2.2. Communications Channels Used………54

4.2.3. Effects of Timing………55

4.2.4. Social System Characteristics……….56

5. General Conclusions………...62

5.1. Answer to Research Question………62

5.2. Reflections: Limits and Implications of V&TCs beyond the Case Study………...64

5.3. Avenues for Further Research………....66


Abstract

When organisations find that their coordination efforts are insufficient for current goals, they tend to align themselves with actors with different yet compatible missions. In an early stage of this cooperation, it is common to align through collaboration instead of coordination, since the latter implies shared authority and responsibility. When OCHA needs further information about an ongoing sudden disaster onset, it looks towards Volunteer and Technical Communities (V&TCs). This has led to an outsourcing relationship emerging between the two, with V&TCs producing crowdsourced material on behalf of OCHA.

This exploratory study outlines OCHA's development in incorporating crowdsourced information in sudden disaster onsets. The thesis also analyses, with the help of the diffusion of innovations theory, which factors played the most significant role in OCHA's decision to outsource certain information management products to outside actors.

Key words: Disaster Management, Information Management, Big Data, OCHA, and Volunteer & Technical Communities


Foreword

This study has been conducted as part of the final assessment of the MSc in Crisis and Security Management at Leiden University in The Hague. The research was part of a larger project via the Centre for Innovation at Leiden University: I was a research assistant on a project called SeventyTwo, in which a web-scraping tool was developed for UNICEF in cooperation with Humanity X and the Dutch Coalition for Humanitarian Innovation. The interviews were conducted, transcribed, and coded by me and my research colleagues and later applied to the research part of the project. The research project aimed at mapping the current information management landscape of humanitarian relief organisations in sudden disaster onsets. Some of those interviews were later used in the analytical section of this thesis.

A great deal of inspiration for the study came from John Sabou, who was one of the researchers who got me on board the project and helped me develop the research idea. I am therefore very grateful for all the help he provided in the earliest stages of this study.

Furthermore, I am incredibly thankful to my supervisor Dr. S.L. Kuipers for taking over as my supervisor after John. Your input has been incredibly valuable throughout the end stages of this thesis, and I cannot thank you enough for having so much patience with me and with some of my (sometimes) unrealistically grand plans for this thesis.

With those words, I want to end by saying that I am very proud to present this project as my final work at Leiden University.


List of Abbreviations

U.N. - United Nations

V&TCs - Volunteer and Technical Communities

HOT - Humanitarian OpenStreetMap Team

OCHA - Office for the Coordination of Humanitarian Affairs

SBTF - Standby Task Force

WFP - World Food Programme

DHN - Digital Humanitarian Network

NGO - Non-Governmental Organisation


1. Introduction

“Powered by cloud-, crowd-, and SMS-based technologies, individuals can now engage in disaster response at an unprecedented level. Traditional relief organizations, volunteers, and affected communities alike can, when working together, provide, aggregate and analyse information that speeds, targets and improves humanitarian relief” (Weinandy, 2016).

1.1. Academic Problem

During an ongoing disaster or crisis, every actor involved, ranging from the public to the media and relief organisations, aims to gain as much situational awareness as possible at an early stage. This is arguably a very complex process, which demands a great deal of understanding and comprehension of various sources of information (Castillo, 2016). Gaining situational awareness is a combination of various sources of information and various actors interacting with each other. For humanitarian workers, the lack of verified and well-processed information on which to base decisions has long been considered one of the greatest challenges they face during deployments. In the early stages of a sudden disaster deployment, information about events moves rapidly, with new patterns emerging regularly. As a consequence, information related to the disaster is scattered across decentralised reports from a range of different organisations.

Since the earthquake in Haiti in 2010, new information management tools and practices have evolved, particularly regarding the acquisition of information gathered from big data. This has drastically changed the way in which information is collected and analysed during disasters. Mining such data, translating it into an understandable output, and finally putting that information in the hands of decision makers is tremendously labour intensive. At the same time, the number of people in need of humanitarian aid has almost doubled in the past decade and is expected to continue rising. Additionally, the number of field workers (those staffing resources explicitly aimed at crisis response and rehabilitation activities) has increased by an average of 6 percent a year since the turn of the century, while the costs of international humanitarian aid have more than trebled (Harvey, 2010). On a positive note, there is today an abundance of actors and volunteers who are willing to lend their time and services to humanitarian organisations or NGOs during disasters, especially with regard to various information management tasks. However, this demands a great deal of coordination and expertise which humanitarian aid organisations traditionally lack.

One can thus understand the impact that an increased need for relief efforts, coupled with a more complex flow of information and communication, may have had on the information management tasks of the United Nations (U.N.) during sudden disaster onsets. This has led to an “Information gap [which] is now more pronounced than ever, not because of a lack of human effort, but because communications are growing more complex at a faster rate than current tools and practices can handle. The [U.N.'s] cluster system has been heavily criticised for not being structured or given the resources necessary to deal with the information dynamics they face” (Harvard Humanitarian Initiative, 2011, p. 18). The stakeholders in the U.N.'s cluster system are therefore dedicating time and money to developing better ways to collect and use digital data to improve their information management products. This is where digital networks come into the picture.

Research by Bennett and Segerberg (2012) has shown how digital technology, in particular social media, can rapidly form networks around shared goals at a previously unprecedented speed. Many have tried to harness this development by launching various platforms and developing new methods aimed at increasing situational awareness for humanitarians on the ground (such as mapping services). The actors providing such services have been named Volunteer & Technical Communities (V&TCs) and are usually set up by volunteer computer scientists from across the globe. These services are primarily based on information gathered from big data, collected from a variety of sources such as social media platforms, phone records, satellite images, and so forth. The primary benefits of such information in disaster management can be observed in the earliest stages of a disaster, when information is scarce and situational awareness low (Plotnick et al., 2015). However, big data can also have more long-term benefits, such as analysing epidemic trends. Scholars (Meier, 2015; Weinandy, 2016) have repeatedly argued that validated and well-processed big data information can improve the effectiveness and efficiency of disaster management.

By utilising such services, information managers can build on new insights from the information collected, adapt plans according to real-time information, and measure the impact of their work more readily. For humanitarians, the disconnected aspect (i.e. a lack of interaction with headquarters in e.g. New York or Geneva) is no longer as important as before; in recent years the emphasis has shifted from basic connectivity to information management (Harvard Humanitarian Initiative, 2011). This thesis therefore aims at understanding and locating the factors which have influenced and facilitated this diffusion of outsourcing certain products to actors who have traditionally been located outside of the humanitarian aid domain.

1.2. Research Objective

When organisations find that their coordination efforts are insufficient for current goals, they tend to align themselves with actors with different yet compatible missions. In an early stage of this cooperation, it is more common for these actors to align through collaboration instead of coordination, since the latter implies shared authority and responsibility (Sabou & Klein, 2016). Since much of the technical knowledge related to big data has been absent within the U.N.'s cluster system, and the humanitarian community in general, the way has been paved for V&TCs to enter the arena. Since the earthquake in Haiti, OCHA has started to outsource tasks which demand a certain degree of technical expertise to V&TCs. These V&TCs collect 3W ("Who does What, Where") data and later coordinate their findings with technical developers. This data is subsequently synthesised into products such as crisis maps and situation reports, which are then delivered to OCHA as information management products. Understandably, this progress has led to a variety of fundamental changes for information managers active in the humanitarian community.

The V&TCs, which are voluntary in nature, utilise new technologies such as media tracking, geo-location, geo-mapping, data cleaning, and social network monitoring to produce information accessible to aid workers on the ground during a disaster. In the aftermath of the earthquake in Haiti, V&TCs collected, verified, and analysed more data than the entire humanitarian field had the capacity to process (Capelo et al., 2012). These networks are thus able to perform otherwise very time-consuming tasks for humanitarian organisations quickly and cost-efficiently, but this also demands a certain level of training and professionalism in order not to produce false or inaccurate results (Guay, 2016).

It is important to bear in mind the critique the U.N. has received after being described as an actor which has increasingly lagged behind in terms of technological advancement, due to rigid organisational practices firmly rooted in a work model of documents and databases passing through hierarchies (Harvard Humanitarian Initiative, 2011). Humanitarians working in the field have therefore argued that “OCHA itself may find itself in danger of becoming irrelevant if it does not take meaningful action to augment its existing capabilities with technical and social contributions afforded by the 21st century” (Sabou & Klein, 2016, p. 14).

What makes this rapid adoption unique is that the U.N. is well known for being an extraordinarily bureaucratic and slow organisation when it comes to implementing new innovations (McGreal, 2015). Furthermore, many within the U.N.'s organisational structure have expressed their dismay towards V&TCs and shared their belief that crowdsourced data is something which should be controlled rather than adopted. Based on previous implementations of innovations, it would therefore not be far-fetched to expect that incorporating V&TCs and crowdsourced data into the standard procedures of a sudden disaster onset would be a very slow process, and that the implementation would occur top-down. However, such has not been the case with this innovation. Instead, the adoption has been very rapid and entered organisational routines via a bottom-up approach: it was adopted voluntarily by humanitarians operating on the ground rather than trickling down through the organisation as a finished product (Sabou & Klein, 2016). This should thus be viewed as a remarkable process which seems, so far, to be unprecedented within the humanitarian aid domain, and it therefore lends itself well to research. We can here observe an actor that has deviated from its traditional practices by implementing new alignments and work protocols at unprecedented speed. It is thus important to conduct exploratory research on the topic in order to build a picture of why and how such rapid adoption has been made possible, with the help of innovation theory.

This thesis will hence analyse the development from 2010 up until 2016 by the U.N. regarding the outsourcing of products to actors outside of the cluster system during sudden disaster onsets. This means that the thesis will trace how the main information management actor within the U.N., namely OCHA, has utilised V&TCs to fill information gaps through various web 2.0 and social media scraping tools, and identify how such progress has been made possible. It will do so by identifying the factors that have facilitated or impeded the spread of such technology, with the help of the diffusion of innovations theory by Rogers (2010).

1.3. Research Question

This leads us to the research question:

RQ: Which factors facilitated OCHA's decision to diffuse the outsourcing of certain information management products (such as crisis maps and situation reports) during sudden disaster onsets to Volunteer and Technical Communities located outside of the traditional humanitarian aid domain?

1.4. Academic and Societal Relevance

This subject is rather under-researched due to the relative newness of information gathered from big data being utilised in sudden disaster onsets. Most studies thus far have mainly focused on the technological aspects of the innovation, such as data mining and NoSQL (e.g. Fadiya, Sadayam & Zira, 2014; Elgendy & Elragal, 2014), whereas in-depth research on which factors influenced organisations' (and in particular OCHA's) decisions to utilise new ways of cooperating with non-traditional humanitarian actors has been scarce. It is therefore relevant for academics and practitioners to gain a deeper understanding of how the involvement of outside actors in a historically closed domain, which traditionally involved only a handful of big actors, is suddenly shaping the future of deployments for sudden disaster onsets. It is also relevant to see how actors who practically lack any humanitarian background (but have a rather data-oriented technical background) are able to assist in and shape the information management landscape of an organisation such as the U.N.

By understanding the factors leading to the decisions to outsource products to outside actors, one can gain a deeper and more thorough understanding of the landscape within which the change is taking place. Furthermore, this thesis aims to fill the gap in understanding how a traditionally slow and bureaucratic organisation such as the U.N. has been able to implement new practices in its sudden disaster protocols at such speed. Hence this work aims to assist future researchers by functioning as a starting point for future research.


2. Theoretical Framework

2.1. Definitions and Concepts Used

In this section, concepts that will be used throughout the study will be defined and elaborated upon.

Situational awareness

The first is Situational Awareness, which is the “state of knowing what is happening in your immediate environment and understanding what that information means for a particular situation - including perception of the elements in the environment and how those elements relate to each other” (Vieweg, 2012, p. 20). In order to attain situational awareness, one is required to understand “the objects in the region of interest,” together with “knowing the relations among the objects that are relevant to the current operation” (Matheus et al., 2003).

Big Data

The second concept is ‘Big Data’, which refers to extremely large datasets from which one may analyse trends, patterns, and associations. The definition of the concept usually involves what are called the three V’s: Velocity, Volume, and Variety. Big data can be collected from any source, ranging from social media to blogs, sensor networks, image data, and other forms of data, and can come in any size and format (Ward and Barker, 2013).

OCHA

This thesis will focus on the United Nations, and in particular the Office for the Coordination of Humanitarian Affairs (OCHA), since it is the primary information management actor within the U.N.'s cluster approach. Hence it is the actor in charge of facilitating efficient coordination of information among all U.N. actors involved (UNOCHA, n.d.).

Outsourcing

When this thesis refers to outsourcing, it means the practice of contracting out certain tasks (in this case information management products) to outside suppliers (in this case V&TCs) in order to obtain better or more cost-efficient services than the in-house capacity of an organisation or company could produce alone (Dictionary, 2002).

Information management

Information management is at the core of this thesis. Throughout the thesis, it refers to the practice of acquiring information from one or more sources and distributing it to those who need it (Boaden and Lockett, 1991). In this case, those in need are decision makers within the U.N.

Sudden disaster onsets

Within disaster management, actors may face a wide array of disaster types. In this thesis, the primary focus lies on sudden onset disasters, meaning disasters that happen without prior warning. The most common are natural disasters, usually earthquakes and tsunamis. By contrast, slow onset disasters are easier to predict and develop relatively slowly, but can still be equally disastrous for those affected (DWF, n.d.).

Crowdsourcing

This rather new concept refers to the practice of utilising a crowd of people to collectively participate in initiatives, whether performing tasks, funding, or evaluations, in order to supply a good or a service (Estellés and González, 2012). It can thus be viewed as a collective workforce that leverages its large numbers to work collectively towards a shared goal.

Volunteer and technical communities (V&TCs)

V&TCs are ad-hoc groups launched for the purpose of utilising crowdsourced material to reach their goal. They are the networks of volunteers which gather data on behalf of humanitarian organisations and subsequently deliver it to those organisations in forms such as mapping products and reports. The V&TCs covered by this study are the most prominent ones that have actively worked together with OCHA in at least one deployment. These are all part of the Digital Humanitarian Network3 (DHN) and include actors such as MapAction, the Standby Task Force (SBTF), the Humanitarian OpenStreetMap Team (HOT), and CrisisMappers.

3 A network of networks of V&TCs which aim to provide an interface between professional humanitarian

Humanitarian aid domain

The humanitarian aid domain refers to the sector within which all humanitarian organisations operate. It includes actors such as the United Nations, the Red Cross, Doctors Without Borders, and other similar organisations deployed in sudden disaster onsets.

Disaster management

Disaster management is the management of a disaster which has already struck, carried out in order to limit the risks and hazards to those affected by it. Disaster management cannot avert or prevent a disaster from happening; it deals solely with limiting its effects once it has happened.

Digital humanitarianism

Digital humanitarianism is a term coined by Patrick Meier, and the subject of a book he wrote, for people who in one way or another volunteer digitally for V&TCs. It is therefore used as a name for people investing their free time to assist with crisis mapping or similar activities online (Meier, 2015).

Web 2.0

Web 2.0 denotes an internet with more input from users compared to its predecessor, Web 1.0. Previously, the publisher-viewer relationship meant that someone uploaded material online which could later be viewed by users. Web 2.0 refers to an internet with far greater collaboration and contribution by users (e.g. social media), leading to far more data being available.

2.2. Diffusion Factors by Rogers

This thesis will apply the theoretical lens of ‘Diffusion of Innovations’ by Everett Rogers, which aims at explaining how, why, and at what rate innovations such as new ideas and technology spread. The essence of the theory, and of diffusion itself, is encapsulated in four main elements and is defined as “the process in which an (1) innovation is (2) communicated through certain channels (3) over time (4) among the members of a social system” (Rogers, 2010, p. 35). These four elements are the cornerstones of the entire theory and are identifiable in every diffusion research study, campaign, or programme that exists. The diffusion of innovations theory provides an ideal ground on which to base and analyse this work. In a later chapter, this theory will be applied to the diffusion of outsourcing technical expertise by OCHA, analysing which of the following factors contributed to either the facilitation or impediment of that diffusion. The following section will thoroughly elaborate on the four main elements mentioned above and lay out a detailed description of the theory.

Characteristics of the Innovation

The first element is the characteristics of the innovation, which is arguably the most essential of the four. Innovation is defined as an “idea, practice or object that is perceived as new by an individual or other unit of adoption” (Rogers, 2010, p. 40). With regard to time, it matters rather little whether the idea is objectively new (measured from its time of discovery). Instead, more attention is focused on its ‘perceived newness’, i.e. how new it feels to the organisation or individual (if the idea is new to the individual, then it simply is an innovation). Newness can also be understood and expressed in terms of knowledge, degree of persuasion, or whether a decision is taken to adopt or not. All these factors play a role in defining an innovation's newness. In some social systems, however, new innovations are not desirable and can in some cases even be considered harmful. This also depends on whether the innovation is considered desirable by the adopters in their situations. Although Rogers published his first work in the 1960s, it has continuously been updated throughout the years, and more emphasis has been placed on the almost synonymous terms ‘diffusion’ and ‘technology’ in each edition.

Rogers defines technology as “a design for instrumental action that reduces the uncertainty in the cause-effect relationships involved in achieving a desired outcome with two components: (1) a hardware aspect, consisting of the tool that embodies the technology as a material or physical object; and a (2) software aspect consisting of the information base for the tool” (Rogers, 2010, p. 41). For this thesis, the latter of the two is the most important, since the innovation dealt with is almost entirely composed of information. However, it is important to note that both aspects are always present in combination, i.e. one cannot have software without hardware (e.g. a computer). Rogers (2010) further argues that technological advantages may seldom be clear cut to adopters, and that a new technology is rarely viewed as an obviously superior alternative to already existing practices. The author also elaborates on the perceived attributes of innovations in his work, and stresses that one should not view all innovations as equivalent units of analysis (e.g. cell phones took only a few years to diffuse, whereas seat belts took decades to become standard in every car). Rogers (2010) identifies five characteristics that explain different rates of adoption:

1) Relative advantage

The first is relative advantage, defined as the degree to which the new innovation supersedes the idea already in place. The focus here does not lie on objective advantage (e.g. convenience or economic terms), but rather on whether users consider it to be advantageous.

2) Needs and value compatibility

The second characteristic deals with the innovation's compatibility with the existing needs and values of the adopters. An innovation which is incompatible with the adopters' norms and values in a social system will be adopted slowly or not at all, compared to one which is fully compatible with both.

3) Hard to understand

The third characteristic deals with whether the new innovation is perceived as hard to understand and use, namely its degree of complexity. In some social systems, innovations are complicated and take time for users to understand, which can slow down the adoption process.

4) Trialability

The fourth characteristic is the innovation's trialability, namely the degree to which the new innovation can be tried and tested before being implemented. It is argued that if an innovation can be tried and tested in instalments before being fully adopted, this reduces uncertainty and shortens the time it takes to be fully adopted.

5) Observability

The last characteristic concerns the innovation's observability. Here it is argued that the easier it is for an individual to see the positive results of a new innovation, the more likely they are to adopt it. Based on past research, these five characteristics are the most important in explaining the rate of an innovation's adoption (Rogers, 2010).


Communication on the innovation

The second element of the theory is communication, which is defined “as the process in which participants create and share information with one another in order to reach a mutual understanding” (Rogers, 2010, p. 45). In this theory, diffusion is considered a special type of communication, since the message communicated is solely concerned with a new idea. Communication is very important for the diffusion of an innovation, since it assists in spreading the idea of a new innovation to other members of the same (or a similar) social system who could perhaps also benefit from it. Various communication channels are able to assist in spreading an idea:

1) Mass Media

This channel is arguably the most rapid and efficient way of spreading an innovation. It is most vital and effective at an early stage, when the idea of an innovation can be spread as rapidly as possible to as many potential users as possible.

2) Interpersonal Channels

Interpersonal channels, however, are considered more effective when it comes to persuading an individual or organisation to adopt a new innovation, which is especially salient when both are very similar. Interpersonal channels are more important at the stage in which an adopter is being persuaded to adopt an innovation, whereas the mass media mentioned above play a more vital role in the knowledge-sharing stage.

3) Internet

Increasingly, the internet is assisting in spreading new innovations. This is because many people increasingly rely on subjective evaluations by those who have already adopted the innovation.

Effects of Timing

The third element is time, which is according to Rogers (2010) one of the strengths of diffusion research. In the diffusion of innovations, time is involved in: 1) the innovation-decision process, which measures the time from when an individual becomes aware of a new innovation to its subsequent adoption or rejection; 2) how innovative (i.e. relatively early or late) an individual is compared to the other members of a system; and 3) the innovation's rate of adoption, measured as the number of members of a specific system who adopt it within a given time frame.

Although time is one of the four main pillars of the theory, it is not applied here as a factor to explain what has facilitated or impeded the diffusion of the innovation. This is because it mainly focuses on individual adoption, rather than the organisational adoption on which this study focuses. Furthermore, the time pillar is mainly useful when multiple cases are studied over an extended period of time, which is not the case in this study. However, this section will still elaborate on the various sub-categories of the pillar.

1) Innovation-decision process

Another important concept within the time element of the theory is the innovation-decision process, which measures the process by which “an individual or other decision-making unit passes from first knowledge of an innovation, to the formation of an attitude, to a decision to adopt or reject, to implementation and use of the new idea, and to confirmation of this decision” (Rogers, 2010, p. 46). Within this process there are five main steps: (1) Knowledge; (2) Persuasion; (3) Decision; (4) Implementation; and (5) Confirmation, which are all sequential and aim at decreasing the level of uncertainty one may feel about an innovation.

2) Relative speed

Another dimension in which time is involved is the rate of adoption, defined as the relative speed with which an innovation is adopted by the members of a social system. As soon as a new innovation enters a social system and a few members adopt it, the diffusion curve starts to climb faster and faster as new adopters join, but eventually it flattens out as fewer and fewer members remain who have not yet adopted, as can be seen in Figure 1.

Figure 1: S-Curve (Rogers, 2010)


However, it is once again important to bear in mind that the rate of adoption can vary greatly between different innovations, since some ideas diffuse rapidly, whereas others take a great amount of time to reach the top of the S-curve.
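Rogers describes the S-curve only qualitatively. As an illustrative sketch (an assumption on my part, not a formula given in Rogers, 2010), the cumulative share of adopters over time is often modelled by a logistic function:

```latex
% Cumulative proportion of adopters at time t:
%   K   = ceiling (share of eventual adopters),
%   r   = growth rate of adoption,
%   t_0 = inflection point, where adoption is fastest.
A(t) = \frac{K}{1 + e^{-r\,(t - t_0)}}
```

For $t \ll t_0$ adoption grows slowly, accelerates around $t_0$, and flattens as $A(t)$ approaches $K$, which reproduces the S-shape of the diffusion curve in Figure 1.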

Social system and its influence on adoption

The fourth and last element is that of a social system, which is defined "as a set of interrelated units that are engaged in joint problem solving to accomplish a common goal" (Rogers, 2010, p. 49). The members of a social system (individuals or organisations) are, in one way or another, jointly seeking to reach a goal shared among all members of the system. Further, a social system acts as a boundary within which innovations diffuse, and within it numerous factors may affect the rate of adoption. These four factors are:

1) Social structure and Interpersonal Networks

A crucial factor within any social system is its structure, i.e. the patterned arrangements of the units in a system, which provide stability and consistency and allow one to predict behaviour with a degree of accuracy. This predictability is best illustrated by a hierarchical structure in which individuals of lower rank are expected to follow orders from those above. It is believed that the structure within which individuals and organisations communicate with each other in a social system can either facilitate or impede the rate of adoption.

Further, a social system is also constituted by various informal systems, such as interpersonal networks that link the members of the social system with each other. It is believed that with a stronger interpersonal network at one's disposal, it is easier both to spread an innovation and to be reached by an emerging one. But social systems can both facilitate and hinder innovations from diffusing, and the innovativeness of an individual is also influenced by the social system within which one operates.

2) Opinion leaders

Within social systems, one is also able to locate so-called opinion leaders: members who are keen on innovations and try to stay at the forefront of all members of the system to the utmost degree. But they also have their counterpart, namely the people who are opposed to innovations. Leaders, however, tend to be more cosmopolitan, more innovative, and of higher socio-economic status. They may also have a tendency to appear like professional change agents who constantly promote change; if that is the case, the leader may lose credibility with the other members of the social system.

3) Types of innovation-decisions

Another sub-category within this element is that of types of innovation-decisions. Rogers (2010) elaborates on three of these: (1) optional innovation-decisions, where the decision to adopt is made by an individual member independently of the decisions of others in the system; (2) collective innovation-decisions, where the decision to reject or implement an innovation is based on consensus among all members of the system (generally speaking, this decision is slower than the former); and (3) authority innovation-decisions, where the decision to adopt or reject is made by a handful of individuals who possess power, status, or technical expertise. This last type of decision is generally the fastest of the three.

4) Consequences of Implementation

It is also important to take into account the various consequences that implementation can bring to the social system. Rogers (2010) lists three of these: (1) desirable vs. undesirable consequences, which concern whether the effects of an innovation are functional or not; (2) direct vs. indirect consequences, which are changes that occur within the social system as a direct consequence of the innovation, or as second-order results; and (3) anticipated vs. unanticipated consequences, which depend on whether changes due to the innovation are recognized and/or intended by the members of the system.

2.3. Hypotheses

H1: The needs and value incompatibilities between OCHA and V&TCs negatively impacted the diffusion rate.

H2: The length of the trialability period of V&TCs positively impacted the success of the adoption for OCHA.

H3: The relative advantage over OCHA's former practices positively influenced the diffusion of outsourcing to V&TCs.


3. Methodology

3.1. Research Design

This thesis takes an explorative, qualitative single-n case study approach, and explores and examines in depth a specific actor's development and role in the humanitarian information management landscape. A case study is defined as "an empirical inquiry that investigates a contemporary phenomenon within its real-life context; when the boundaries between phenomenon and context are not clearly evident; and in which multiple sources of evidence are used" (Yin, 1984, p. 23). It is thus a way of observing a natural phenomenon that exists within a set of data (ibid.). The aim of the study is to conduct holistic research on the U.N.'s, and in particular OCHA's, information management developments in disasters over the past years, and to understand what factors have contributed to or impeded the trend of outsourcing mapping tasks to outside actors during sudden disaster onsets. It is also essential to study OCHA holistically due to the uniqueness of the U.N. as an actor, and because outsourcing is a complex new phenomenon whose underlying factors the study aims to understand. Since the study is explorative in nature, it aims at opening the door for further examination of the phenomenon observed. To this end, the study analyses documents and interviews with actors relevant to OCHA's information management unit. The sampling technique applied to the interviews was purposive, meaning that the interviewees were selected based on their characteristics and not at random. This is a well-suited technique for exploratory research in which a small number of cases (especially a single-n case study) are researched.

The rationale behind focusing solely on OCHA as a single-n case study is its prominent position within the humanitarian information management landscape. Furthermore, the case is also chosen on the basis that OCHA was the first major humanitarian actor to implement an outsourcing cooperation with outside actors. Since OCHA is the main information management actor within the U.N.'s cluster system, and since the U.N. is arguably the largest humanitarian actor within the entire domain, it is of academic interest to have a thorough understanding of the factors that either facilitated or impeded decisions to outsource certain information management products. This understanding can subsequently lead to new discoveries which may enable other humanitarian organisations to determine whether similar cooperation can be beneficial for their specific organisation.

As mentioned above, the U.N. cluster is a unique set of actors in terms of disaster management and has held a leading and prominent role for decades. With the introduction of new actors competing for funding and expertise, the field is prone to change. The introduction of outsourcing certain tasks, which at the time were outside the competences of the organisation, has changed the humanitarian information management landscape in numerous ways since Haiti 2010.

The study is an instrumental case study, meaning it provides a further and more in-depth understanding of a specific actor with the intention of drawing generalizations from it in the final section (Stake, 2003). As the research method is qualitative, it makes use of interviews conducted with actors in the U.N.'s information management field who have detailed insight into current structures and trends. Further, it also makes use of primary sources from individuals in V&TCs who are involved in crisis mapping, as well as secondary sources from scholars relevant in the field. The study uses the diffusion of innovations theory, which focuses on understanding why, how, and at what rate new innovations spread. The theory is applied to understand how and why OCHA has adopted new tools and practices at such an extraordinarily fast pace despite being located in a traditionally bureaucratic system. By applying the theory, the study aims to understand and map out how and why the outsourcing of certain information management tasks to actors outside the established disaster field came about. The theory will be used as a framework to understand how the diffusion of such innovations was made possible, by outlining key factors which explain the diffusion of the technology and concluding with which factors played the most and least significant roles in the diffusion process.

3.2. Why Single Case Study

The rationale behind focusing solely on one case in this study has a lot to do with the newness of the topic. Humanitarian organisations have only been utilizing cooperation between themselves and outside actors for information management tasks since the devastating earthquake in Haiti 2010. There are therefore not many actors in the humanitarian aid domain to choose from on which to base the research, since few of them have developed such cooperation with other actors. Furthermore, since OCHA is arguably the largest information management actor within the entire humanitarian field, it appears natural to focus on it.

It is furthermore important to base the study on a unique actor such as OCHA in order to enable a deeper and more thorough process tracing over time in the humanitarian aid domain. If this study is able to locate certain factors that either facilitate the diffusion of outsourcing products or impede such decisions, that information could be of interest to decision makers in various humanitarian aid organisations.

3.3. Data Collection

For this study, the data is mainly collected from three types of sources: (1) primary sources from the U.N. and in particular OCHA; (2) interviews, conducted via Skype during November and December 2016 with personnel working either for OCHA or for one of the V&TCs covered by the study; and (3) secondary literature published by scholars who are active and prominent within the field of big data and digital humanitarianism. The study started with an extensive review of the existing literature, with a main focus on publications from 2013 onwards (due to the newness and ever-changing nature of the environment that the study covers). The focus lay throughout on locating explanatory factors which could assist in explaining the diffusion of outsourcing the products. Meanwhile, interview questions were developed with the literature in mind. With the interviews conducted and transcribed, and substantial knowledge of the topic in the hands of the researcher, a research puzzle started to emerge and hypotheses were developed. After that, the actual writing process started.

The unit of analysis, and therefore the subject of this study, is OCHA's information management working practices in sudden disaster onsets. The unit of observation is the U.N.'s cluster system, and in particular OCHA, which plays the most central role in the cluster as the U.N.'s main information management facilitator during sudden disaster onsets. Furthermore, other units of observation are the V&TCs involved in particular in mapping disasters based on big data during sudden onsets.


3.4. Operationalization

With the theory outlined in chapter two and the research design elaborated upon in the previous section, it is now time to explain how the data and concepts in this study were operationalized.

This study looks into the various factors outlined in the diffusion of innovations theory and links them with the case. It does so by first identifying the elements (such as the characteristics of the innovation and the social system the actors operate within, among others) and then, within those elements, identifying which factors either assisted in facilitating the adoption of the innovation or acted as a hindrance, thus impeding it. Furthermore, it also identifies whether any of the factors outlined in the theory played no role, or were of low significance, for the diffusion of outsourcing certain information management products by OCHA.

This is done by thoroughly analysing the data gathered for this study and identifying the factors presented in the theoretical chapter. After that, it is determined whether each identified factor facilitated the diffusion and adoption of the innovation, or whether it was a hindrance and thus resulted in a slower adoption process. The findings are then tested against the three hypotheses to see whether support can be found for them or whether they can be disproved.

The data on which this is assessed consists of the primary and secondary literature together with the interviews gathered for this study. On this basis, the study subsequently concludes which factors have been the greatest facilitators of the rapid adoption, and also presents any factors that have impeded such adoption.

3.5. Methodology: Process Tracing

Process tracing is a method which essentially is used for identifying the intervening causal process (that is, the causal chain and mechanism) between the independent and dependent variables. The method is a useful tool for a researcher trying to trace a process that has led to an outcome by narrowing down a list of potential causes. According to Falleti (2006), process tracing "require[s] the researcher to tap into her sociological or political imagination in order to identify the theories relevant to the problems and puzzles she seeks to explain and to be able to derive feasible causal mechanisms" (p. 6). What makes process tracing unique is not only that it can proceed both forwards (from potential causes to effects) and backwards (from effects to possible causes), but also that it takes into account equifinality: the principle that an end state can be reached by various potential means. The method therefore invites the researcher to consider one and the same outcome being reached through various causal paths consistent with a single case. If multiple other cases are added, the researcher has the possibility of mapping out a collection of causal paths which have led to the same outcome, and can therefore outline the conditions under which the end results were achieved. The method is fundamentally different from other methods, which instead look at covariance and comparison across cases (Falleti, 2006).

When utilizing process tracing, all intervening steps within a case must be taken into account and predicted by a hypothesis, or else that hypothesis must be amended to explain the case; it is not sufficient that a hypothesis is consistent with merely a statistically significant number of the intervening steps. It is important to note that process tracing shares a few similarities with historical explanation. However, process tracing may take a multiplicity of different forms, of which not all are visible in historical studies, and it also has a greater variety of uses which historical studies lack. This is because process tracing focuses more on theory development and theory testing (of which the latter is most applicable for this thesis). When a case study employing process tracing is not able to test a theory, the method can instead be utilized in the development of theories (George & Bennett, 2005).

The most simplistic way of utilizing this methodology is by tracing a narrative or story chronologically, shedding light on how a specific event came about by mapping out the causality in a linear way. Such a narrative is highly specific and hence makes no use of theory-related variables. However, it is important to note that such a narrative is not useless: an atheoretical narrative may contain crucial information for the development of a more theory-oriented process tracing. Thus, a comprehensive narrative may provide sufficient information for a researcher to explain the causal processes within a case, so that the most suitable process tracing method may be determined to explore a more theoretically oriented explanation. When a more analytical form of process tracing is applied, however, the narratives are accompanied (at least partially) by hypotheses which are highly specific to the case.

Thus, process tracing is a great method for making causal inferences when this is not possible solely through the controlled comparison method (and process tracing can even make up for certain limitations prevalent in that method). For this thesis, the method is applied, in combination with the theory of diffusion of innovations, to trace how the trend of outsourcing certain information management products to actors originally located outside of the U.N.'s cluster domain came about in sudden disaster deployments (George & Bennett, 2005).

3.6. Limitations

One limitation is that the U.N. can be a rather secretive organisation and, as was noticed during some of the interviews, is not always keen on sharing plans for the future of the organisation. This is believed to be because of the current restructuring ongoing within the organisation, which, for understandable reasons, is not often shared with outside actors in order for the organisation to remain in a good position for the future.

In regard to external validity, a single-n case study is prone to produce results which are not necessarily generalizable, and such a risk is present in this study. Ray (1998) has argued that a causal explanation necessitates comparisons between cases. However, this mainly applies to the use of process tracing as a methodology for theory development. Further, others (King et al., 1994) have argued that single case studies are not particularly useful for testing theories or hypotheses unless they are viewed in a bigger picture and compared to other cases. This is clearly a limitation which has to be addressed, but King et al. (1994) only view a single case as providing one observation on the dependent variable, and since one case may in fact contain numerous potential observations, the criticism is considered rather unjustified. Another possible limitation of the methodology is that two or more hypotheses can be consistent with the evidence produced. If that is the case, the researcher is left to assess whether an alternative or complementary explanation is visible.


4. Analysis

4.1. Exploring OCHA’s Crowdsourcing Advancements since Haiti 2010

4.1.1. Background of Big Data in Humanitarian Interventions (case examples)

This section outlines the background of big data in humanitarian interventions since its introduction in Haiti 2010. The focus is not only on the collaboration between OCHA and V&TCs, but more generally on the background of the usage of big data technology, in order to elaborate on its multitude of usage areas. This introduces the reader to particular milestones in information management developments involving big data throughout the past decade. Noteworthy is that the Ebola outbreak was not a sudden disaster onset, and that the U.N. had no official role in the search for the missing flight MH370. Furthermore, examples from the Red Cross are included to show the challenges that other humanitarian organisations have also faced, and that not every cooperation was successful (especially in the early activations). The cases are therefore mainly presented to inform the reader about the various developments which V&TCs have taken part in. This will make it easier for anyone not familiar with the background of big data in humanitarian interventions to better understand how it emerged and what the added benefits have been.

In 2005, the U.N. went through a major reform effort which introduced the cluster system approach, representing a new way of coordinating aid efforts with the aim of enhancing predictability and partnership among various humanitarian organisations. A core objective of this reform process was the decentralisation of information management nodes within the organisation. It was predicted that by decentralising those nodes, the U.N. could become a more collaborative and efficient humanitarian actor (Sabou & Klein, 2016). However, the tools required to support this cluster reform were lacking, and humanitarians realised early that "one of the main issues with the cluster coordination system is that it causes information fragmentation. This approach is very prone to produce data silos due to a lack of instruments that can share information inter-organisationally" (Harvard Humanitarian Initiative, 2011, p. 22). Furthermore, it is relatively common within OCHA that different offices lack any sort of internal coordination because of the various bureaucratic barriers and protocols salient within the organisation (Ngamassi, Maitland & Tapia, 2014).


Haiti 2010

The history of digital humanitarianism, and therefore of big data being utilized in relief operations, dates back to 2010, when a 7.0 magnitude earthquake shook Haiti. The damage caused by the earthquake was catastrophic, and with an official estimate of 222,000 people killed, it ranks as the second deadliest earthquake ever recorded (Holzer and Savage, 2013). Relief operations were promptly launched by a variety of intergovernmental organisations (in particular the U.N. and the European Union) and various international and local NGOs. However, upon arrival in Haiti, many aid workers from relief groups were dissatisfied with the huge information gaps that existed in regard to the basic street-level maps needed to dispatch search and rescue teams, lists of health facilities, information on upcoming shipments of medical supplies, and routing for moving aid between locations.

In interviews conducted in the aftermath of the deployment, staff members expressed dissatisfaction with the U.N.'s cluster lead agency, which was responsible for an effective and coordinated cluster response. In their view, the clusters were unable to carry out work beyond their own analysis, and hence tended to manage information in a way that primarily suited their own immediate needs, but not the overall system. This lack of incentives for the clusters to invest in overall coordination led to the core information management challenge in Haiti: information fragmentation. This created silos where divisions of data resources and analysis were so complex that it was deemed practically impossible to fuse or reintegrate information into composite pictures that could be shared inter-organizationally (Harvard Humanitarian Initiative, 2011).

Just hours after the earthquake, an ad hoc group of online V&TCs initiated by Patrick Meier started developing an online crisis map on which the members pinpointed the areas most affected by the earthquake. These volunteers' main tasks were to collect, organise, filter, translate, and verify the information they gathered from various sources (mainly social media) (Weinandy, 2016). Gradually this dotted map started to portray an almost real-time assessment of the impact and needs of Haiti (see Figure 2), and because the web page was so easy to use and navigate, the team behind it quickly received a lot of attention from both the media and organisations involved in the relief operations (Meier, 2015).

Figure 2: Screenshot of the Haiti 2010 crisis map (Meier, 2015)

In the meantime, the U.N.'s cluster system was overwhelmed by the amount of complex data it had to deal with, combined with the lack of tools at its disposal. As a result, the volunteers quickly began to advocate the use of their map to humanitarian aid organisations operating in Haiti (Harvard Humanitarian Initiative, 2011). Increasingly, the humanitarians on the ground in Haiti started to utilize the map created by Meier and his team. This group was able to show for the first time that actors outside of the established humanitarian sector could contribute and play a vital role in coordinating information during a deployment (Weinandy, 2016). This view is shared by OCHA, which in its evaluation report on the response to the earthquake concluded that the cooperation with outside mapping actors had been successful and had benefited its work (Bjattacharjee & Lossio, 2011). In the aftermath, the V&TCs started pushing for organisations to outsource parts of the information management workflow to V&TCs.

Typhoon Haiyan 2013

Post-Haiti, the introduction of new information actors in the humanitarian field spilled over into a debate on the information management work practices of humanitarian organisations. By the time typhoon Haiyan hit the Philippines in November 2013, the information management sector of the "humanitarian community was in a state of evolution" (Weinandy, 2016, p. 1). This evolution and tighter cooperation with outside actors was made evident when OCHA, the day before the landfall of Haiyan, asked ten different volunteer organisations to assist not only with mapping the disaster, but also with other tasks such as verifying and gathering information on its behalf. Among the more prominent of those organisations was the Humanitarian OpenStreetMap Team6 (HOT), which was tasked not only by OCHA, but also by the Philippine Red Cross.

HOT was launched in 2009 and likewise made its first international footprint in Haiti, where it produced some of the best maps of the devastation the earthquake had caused on the island nation7. During Haiyan, the majority of the satellite data and maps analysed by these organisations came from a variety of national space agencies and commercial satellite image providers. Most of the data these organisations processed during Haiyan was shared at no cost under the International Charter on Space and Major Disasters8 by a variety of space agencies (Butler, 2013). But as the Red Cross noted in an official report published in the aftermath of the disaster, the cooperation with HOT left room for improvement. The report was produced with the intention of assessing the ability of the Red Cross to cooperate with crowdsourced platforms and of defining the extent and shape of such cooperation for future deployments. Overall, the assessment showed that the data HOT produced was not reliable enough for damage analysis and recovery planning. According to the report, HOT did a decent job of identifying affected buildings, but heavily overestimated the number of buildings believed to be completely destroyed, while at the same time underestimating the buildings that were majorly damaged. The reason for this was said to be the low resolution that the overhead satellite imagery could deliver. The report nevertheless concluded that HOT was not to blame for the errors produced, and that the Red Cross could benefit greatly from utilising such cooperation in the future, with more high-resolution satellite images at its disposal (Red Cross, 2014).

However, criticism towards the maps produced by digital humanitarians during the typhoon was also expressed by various aid workers, who argued that up to 35% of the maps were outdated and hence that plotted points were often inaccurate (Parker, 2015). One can thus see the challenges the newly established V&TCs faced during their earliest deployments. The criticism towards the usage of the maps was strong, and many questioned the reliability of using these services for future deployments.

6 More than 1,000 volunteers from 82 countries worked for HOT during Haiyan.

7 Another well-used map was made by the Standby Task Force, which is available here: http://tinyurl.com/zmusb6a

8 A non-binding charter including a wide range of space agencies and satellites, which provide satellite data free of charge in the event of major disasters.

Ebola Outbreak 2014

Another very meaningful contribution big data has made to the humanitarian field concerns disease tracking. When the Ebola virus outbreak started in West Africa in 2014, it represented a brand-new sort of humanitarian intervention for which neither V&TCs nor big data had ever been utilised. The World Health Organisation (WHO) together with OCHA tasked SBTF, in collaboration with Net Hope9, with mapping the health facilities mainly in Liberia, but also in Guinea and Sierra Leone. Up until that time, information sharing between the different organisations operating there had been far from sufficient. Save the Children, for example, had no or very little information about the facilities of Doctors Without Borders, who in turn had almost no information about the Red Cross facilities, and so on. It took approximately three weeks to map out all available facilities, together with the location and number of people affected by the virus (Guay, 2016).

The work was divided into two deployment phases. In the first phase, SBTF collected all baseline data for the affected countries in cooperation with Net Hope. Later, the two worked closely with OCHA in developing a common classification language in order to streamline the usage of the data between the various organisations and platforms involved in the deployment. Lastly, the data was classified and mapped on a platform available to the public. SBTF/Net Hope was one of the few organisations working directly on data mining during the Ebola crisis, and their work was widely considered the most comprehensive. Other V&TCs either collaborated through the DHN or produced their own maps (SBTF, 2014).

Flight MH 370 2014

By the time of the disappearance of flight MH370, the various V&TCs had gained a variety of experience from different deployments. However, when Malaysia Airlines flight MH370 disappeared from radar in March 2014, en route to Beijing with 239 people on board, a new type of challenge awaited them. Two days after the airplane went missing, the US-based satellite imagery company DigitalGlobe launched a campaign in which it collected 3,200 square kilometres of satellite imagery and published it on a platform named Tomnod. From there, millions of volunteers were able to analyse millions of sliced and diced images of the area where the flight was believed to have gone missing. Each of these images was then analysed for traces of debris, oil leakage, life rafts, and so on (Meier, 2015). Within days of its launch, crowdsearching individuals had analysed, processed, and tagged an estimated 15 million features of interest derived from approximately 1 billion satellite images. Further, Tomnod utilized a crowd-rank algorithm to decide which of the tags would be forwarded to the Australian rescue teams active in the search for the plane (Qadir et al., 2016). This remains to this day the biggest crowdsourcing campaign to have taken place, based on the sheer number of individuals involved in the search as well as the magnitude of the data input analysed.
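The precise consensus mechanism behind Tomnod's crowd-rank algorithm is proprietary and not described in the sources used here. Purely as an illustration of the general idea, a minimal sketch might snap volunteer tags to grid cells and score each cell by the number of distinct volunteers who tagged it, so that widely agreed-upon locations rise to the top. All names, the grid size, and the vote threshold below are hypothetical, not Tomnod's implementation:

```python
from collections import defaultdict

def rank_tags(tag_events, min_votes=2, cell=0.01):
    """Rank crowd-tagged locations by independent volunteer agreement.

    tag_events: iterable of (volunteer_id, lat, lon) tuples.
    Nearby tags are snapped to a grid cell of `cell` degrees, and each
    cell is scored by the number of *distinct* volunteers tagging it,
    so one person re-tagging the same spot does not inflate the score.
    """
    votes = defaultdict(set)
    for volunteer, lat, lon in tag_events:
        key = (round(lat / cell), round(lon / cell))
        votes[key].add(volunteer)
    ranked = [
        (len(v), (k[0] * cell, k[1] * cell))
        for k, v in votes.items()
        if len(v) >= min_votes
    ]
    ranked.sort(reverse=True)
    return ranked  # highest-consensus cells first

# Hypothetical tag stream: three volunteers agree on one spot,
# one duplicate tag, and one lone tag that falls below the threshold.
events = [
    ("ana", -30.001, 95.002), ("ben", -30.002, 95.001),
    ("cho", -30.001, 95.003), ("ana", -30.001, 95.002),
    ("ben", -28.500, 96.700),
]
print(rank_tags(events))
```

Counting distinct volunteers rather than raw tags is one simple way to make such a ranking robust against a single enthusiastic (or mistaken) participant; real systems would likely also weight volunteers by past accuracy.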

Nepal earthquake 2015

In April 2015, Nepal was shaken by a 7.8 magnitude earthquake which left close to 9,000 people dead and 3.5 million homeless due to the damage to buildings (Goda et al., 2015). At this point in time, the outsourcing practices by OCHA were well established, with almost five years' experience of the new alignments with outside actors. Once again OCHA requested support from MicroMappers in the immediate aftermath of the natural disaster (Corcoran, 2015). During the most intense first days of the relief work, the U.N benefitted from crowdsourced material and tools from HOT, which provided real-time situational awareness with the help of thousands of volunteers from across the globe (Sabou, 2016). The crowdsourced community mappers involved in mapping the earthquake were: (1) MicroMappers, who solely focused on the analysis of images and text from Twitter; (2) MapAction, who focused on maps of the affected population; and (3) Humanity Road, which focused on situation reports on communities in need (UN-Spider, 2015).

There were various questions the V&TCs were confronted with in the sudden disaster onset in Nepal. In a country with very weak infrastructure, which roads were still usable in the aftermath of an earthquake? Which patches of land in the most remote areas were most suitable for helicopters to land on in order to deliver necessary aid? Where would aid workers be able to be housed and work from as soon as they arrived? These were just a few of the questions they were faced with, and it took only approximately 48 hours for HOT to produce a complete overview of the situation to be handed over to OCHA. In total, 21,241 kilometres of roads and 110,681 buildings were processed and analysed with the help of tens of thousands of high-resolution satellite images. Furthermore, approximately 20,000 tweets were analysed and mapped in less than 10 hours, as well as 55,000 text items and 234,727 images (Corcoran, 2015). With detailed information on possible routes into affected villages and on damaged bridges, the volunteers helped speed up the humanitarian aid work of the U.N substantially (Parker, 2015). This deployment also marked a milestone in terms of technical advancement with the introduction of drones for mapping. Traditional satellite images, according to Patrick Meier, “give you a vertical image, looking straight down. That’s not helpful in making a disaster damage assessment … with drones, we get oblique imagery, taken at an angle, which gives you a much better idea of what the damage is” (Parker, 2015, p. 1). Drones have since been increasingly utilized to carry out dozens of humanitarian robotic missions and are believed to become a central tool for information managers in future deployments10.

In Nepal, a new actor also gained prominence, namely the Kathmandu Living Lab, which teamed up with the OpenStreetMap community. The non-profit organisation, launched three years prior to the quake by a group of Nepali computer scientists, had mapped all health facilities in the Kathmandu Valley even before the earthquake struck. The group behind the initiative had been certain that an earthquake would happen within the foreseeable future, since Nepal is one of the countries most vulnerable to seismic activity, and they had hence prepared for such an event (Sinha, 2015). The Living Lab had additionally been working extensively in the period prior to the quake to map other points of interest in Kathmandu, such as roads and bridges, so upon the launch by the OpenStreetMap community a lot of preparatory work had already been done by the local mapping community. Another unique feature of the mapping of the earthquake was that about half of the volunteers lacked any prior experience of similar work and had been given only a short online tutorial course organised by the OpenStreetMap community. However, a validation system was introduced here as a second pair of eyes, with the more experienced mappers confirming and cross-checking the data that was mapped by the less experienced volunteers (Clark, 2015).
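The second-pair-of-eyes validation described above amounts to a simple two-tier review queue: novice edits are held back until an experienced mapper signs off on them. A minimal sketch, with illustrative function names and data shapes rather than the actual OpenStreetMap tooling:

```python
def triage(edits, experienced):
    """Split incoming map edits into those that can be published
    directly (from experienced mappers) and those that must wait
    for validation (from novices)."""
    published, review_queue = [], []
    for edit in edits:
        if edit["mapper"] in experienced:
            published.append(edit)
        else:
            review_queue.append(edit)
    return published, review_queue

def confirm(edit, reviewer, experienced):
    """An experienced reviewer cross-checks and signs off on a
    novice edit before it is published."""
    if reviewer not in experienced:
        raise ValueError("only experienced mappers may validate")
    return {**edit, "validated_by": reviewer}

experienced = {"vet1", "vet2"}
edits = [
    {"mapper": "vet1", "feature": "road"},       # published directly
    {"mapper": "novice7", "feature": "bridge"},  # needs a second pair of eyes
]
published, queue = triage(edits, experienced)
checked = [confirm(e, "vet2", experienced) for e in queue]
```

The trade-off is the same one the Nepal deployment faced: the queue slows novice contributions slightly, but lets an untrained crowd contribute at scale without degrading data quality.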

A further unique development during the Nepali onset concerned a new set of mobile data antennas developed by WFP, Ericsson, Nethope, and the Luxembourgian government: a device small enough to be brought on a commercial flight. These antennas provided internet connections to humanitarians working in remote areas which either had lacked such connections prior to the earthquake or whose connections had been damaged by it. Another new initiative, also employed for the first time, was the Facebook Safety Check. Users could simply click a button on Facebook to indicate that they were safe after the earthquake and thus inform all the friends and family connected to them via the social media platform (Corcoran, 2015).

10 Patrick Meier is also one of the initiators behind utilizing drones in humanitarian interventions. For
