
Decentralized ledger technology: assessing its link to institutional trust.


Academic year: 2021






Decentralized ledger technology: assessing its link to institutional trust.

By

Jonas Klemm

4th June 2018

University of Groningen – Newcastle University Business School

Faculty of Business and Economics

Double Degree MSc International Business and Management

Advanced Marketing

Supervisors: Professor Kees van Veen and Professor Iain Munro

j.klemm@student.rug.nl | J.Klemm2@newcastle.ac.uk

Beim Wasserturm 21, 76228 Karlsruhe, Germany


Abstract

Decentralized ledger technology (DLT) has long moved beyond its initial use case as a digital currency. Current understandings perceive it as an institutional technology. As such, it ought to be linked to the institutional environment, the institutions within it and, generally, institutional trust. These relations are assessed in this study, treating the institutional environment as a moderating variable on the link between institution-based and algorithm-based trust. The concept of algorithm-based trust builds on research about open-source software and is fundamentally linked to DLT. The respective relations are examined with questionnaire data from 287 respondents. An additional research focus is set on obtaining a better understanding of people owning and using cryptoassets. The results show no distinct influence of the institutional environment on either institution-based or algorithm-based trust. However, the theory suggesting algorithm-based trust as an alternative to the incumbent institution-based trust can be supported: respondents with high levels of institutional trust showed statistically significantly lower levels of algorithm-based trust, and vice versa.


Table of contents

Abstract
Table of contents
List of figures
List of tables

1. Introduction
2. Theoretical framework
2.1. Blockchaining the world?
2.2. What is trust?
2.3. Trusting the system?
2.4. In code we trust?
2.5. Connecting the dots?
3. Methodology
3.1. Research strategy and design
3.2. Operationalizing the hypotheses
3.2.1. Institutional trust
3.2.2. Algorithm-based trust
3.2.3. Connecting the two
3.2.4. Proxy measurements of algorithm-based trust
3.2.5. Population
3.3. Questionnaire design
4. Empirical results
4.1. Quantifying institutional trust
4.2. Measuring algorithm-based trust
4.3. Quantifying algorithm-based trust
4.4. Quantifying the connection
4.5. Comparing proxy measurements
4.6. Expanding the horizon


List of figures

Figure 1: Bitcoin price and market cap development in 2017
Figure 2: Google search terms
Figure 3: Theoretical fundament of algorithm-based trust
Figure 4: Conceptual model
Figure 5: Measurement model of hypothesis 1
Figure 6: Proxy measurements for algorithm-based trust
Figure 7: Measurement model of hypothesis 2
Figure 8: Measurement model of hypothesis 3
Figure 9: Gender distribution of survey respondents
Figure 10: Education distribution of survey respondents
Figure 11: Income distribution of survey respondents
Figure 12: Country breakdown of respondents
Figure 13: Histogram of the mean of institutional trust
Figure 14: Histogram of the mean of institutional quality
Figure 15: Descriptive statistics for the mean distribution of ‘institutional trust’ and ‘institutional quality’
Figure 16: Descriptive statistics for the proxy measurements of algorithm-based trust
Figure 17: Scree plot of the FA of the institutional dimension of algorithm-based trust
Figure 18: Scree plot of the FA of the technical dimension of algorithm-based trust
Figure 19: Factor loadings of the EFA with the technical dimension of algorithm-based trust
Figure 20: Factor loadings of the EFA with the institutional dimension of algorithm-based trust
Figure 21: Loadings of factor 6
Figure 22: Descriptive statistics for the mean values of algorithm-based trust and the institutional ICT focus
Figure 23: Normal Q-Q plot of the mean value of algorithm-based trust for participants from a country with a ‘low’ institutional focus on ICT
Figure 24: Normal Q-Q plot of the mean value of algorithm-based trust for participants from a country with a ‘high’ institutional focus on ICT
Figure 25: Scatter plot of algorithm-based trust and institutional trust with the regression equation and confidence intervals
Figure 26: Overview of hypotheses
Figure 27: Proxy questions to assess institutional trust
Figure 28: Procedure protocol for testing the assumptions to conduct a factor analysis
Figure 29: Procedure protocol for the Mann-Whitney U test assessing the institutional trust scores for crypto-users and crypto non-users

Figure 31: Scatter plot of the linear regression of algorithm-based trust and crypto ideology
Figure 32: Normal P-P plot of regression standardized residual of algorithm-based trust and crypto ideology
Figure 33: Scatter plot of the linear regression of algorithm-based trust and usage
Figure 34: Normal P-P plot of regression standardized residual of algorithm-based trust and usage
Figure 35: Scatter plot of the linear regression of algorithm-based trust and ‘general predisposition’
Figure 36: Normal P-P plot of regression standardized residual of algorithm-based trust and general predisposition
Figure 37: Scatter plot of the linear regression of algorithm-based trust and financial involvement
Figure 38: Normal P-P plot of regression standardized residual of algorithm-based trust and financial involvement
Figure 39: Scatter plot of the linear regression of algorithm-based trust and crypto interest
Figure 40: Normal P-P plot of regression standardized residual of algorithm-based trust and crypto interest
Figure 41: Procedure protocol for the Welch t-test for the tendency for ‘gold rush’ among crypto users and crypto non-users
Figure 42: Boxplot of scores for ‘gold rush’ for cryptoasset users and non-users
Figure 43: Procedure protocol for the assessment of gender, education, age and income on algorithm-based trust
Figure 44: Model summary and correlation coefficients of a multiple regression between gender, age, education, income and algorithm-based trust
Figure 45: Multiple linear regression to assess whether gender, education, age and income can predict the level of institutional trust
Figure 46: Procedure protocol of a linear regression between the comfort when dealing with cryptoassets and algorithm-based trust
Figure 47: Independent-samples Mann-Whitney U test for the levels of institutional trust of crypto-users and non-users
Figure 48: Independent-samples Mann-Whitney U test for the anxiety of low- and high-skilled crypto-users
Figure 49: Independent-samples Mann-Whitney U test for the technical algorithm-based trust of respondents scoring ‘low’ and ‘high’ for the perceived potential of

Figure 54: Box plot to test the applicability of an independent-samples t-test for hypothesis 2
Figure 55: Scatter plot of the regression of algorithm-based trust and institutional trust
Figure 56: Normal P-P plot of regression standardized residual of algorithm-based trust and institutional trust
Figure 57: Scatter plot of crypto comfort and algorithm-based trust
Figure 58: Normal P-P plot of regression standardized residual of crypto comfort and algorithm-based trust

List of tables

Table 1: Measuring the proxy measurement ‘crypto ideology’
Table 2: Measuring the proxy measurement ‘usage’
Table 3: Measuring the ‘IT knowledge’ component for the proxy measurement ‘general predisposition’
Table 4: Measuring the proxy measurement ‘financial involvement’
Table 5: Measuring the proxy measurement ‘crypto interest’
Table 6: Control questions for the ‘gold rush’ metric
Table 7: Comparison of the intercepts, slopes and adjusted R²s of the linear regression of ‘algorithm-based trust’ and the proxy measurements


1. Introduction

The lead programmer of Bitcoin, Gavin Andresen, once said that he would rather trust ‘the wisdom of the crowds’ than rely on the fact that ‘politicians or central bankers won’t screw it up’ (Goldstein and Kestenbaum, 2011). This statement is symptomatic of proponents of a new technology (Nakamoto, 2008) that originated in the cypherpunk movement, which aspires to radically decentralize institutions (Hughes, 1996; Lopp, 2016). Propelled by the financial crisis of 2008, this technology started to gain traction as being ‘trust-free’ (Beck et al., 2016) and thus an alternative to ‘our established, centralized financial and political systems, [which] are far from being invulnerable to trust issues and systemic risks’ (Helbing, 2012; Glaser and Bezzenberger, 2015). This currently emerging technology (Panetta, 2017) is finding an increasing number of potential areas of application (Frey, Wörner and Ilic, 2016) as it enables secure and trust-free transactions (Beck et al., 2016). In fact, the blockchain’s most famous use case, Bitcoin, is an anonymously invented and published digital currency (Nakamoto, 2008) that was the first solution to the old (Medvinsky and Neuman, 1993) problem of double-spending digital currencies. It combines

peer-to-peer networks with cryptography (e.g. public/private keys and hash functions) to create an immutable time-stamped public ledger (Pilkington, 2015; Swan, 2015; Davidson, De Filippi and Potts, 2018). This made it possible to agree on the true state of a ledger ‘without needing


to rely on any centralized or intermediating party – such as an auditor, a corporation, a market exchange or a government’ (Davidson, De Filippi and Potts, 2018, p. 1). In this sense, it is referred to as a ‘distributed ledger technology’ or, more floridly, a ‘trustless consensus engine’ (Swanson, 2015). The blockchain is therefore extremely pervasive (Glaser and Bezzenberger, 2015) and deemed a potent driver of digitalization (Risius and Spohrer, 2017), which is itself expected to reshape even knowledge-intensive industries and services (Loebbecke and Picot, 2015). Distributed ledger technologies such as the blockchain, however, split academia and business into two factions. Supporters expect the blockchain to disrupt every intermediary service (Tapscott and Tapscott, 2016) and to create a new governance form described by the term ‘cryptoeconomics’ (Zamfir, 2015; Davidson, De Filippi and Potts, 2016). Critical voices, however, judge the technology as overloaded with expectations and still searching for its use case (Glaser, 2017). As of now, business models built upon the blockchain have yet to prove their value and often-claimed game-changing potential. Meanwhile, high valuations of ‘cryptocurrencies’ supported the critics by feeding the fear of cryptocurrencies being a bubble (Avital et al., 2016) or a Ponzi scheme (Arnold, 2018).
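The hash-linked, time-stamped ledger described above can be sketched in a few lines of Python. This is an illustration only, not Bitcoin’s actual data structures or consensus logic; the block fields and transaction strings are hypothetical. The point it demonstrates is tamper-evidence: because each block commits to its predecessor’s hash, retroactively editing any earlier entry invalidates every later link.

```python
import hashlib
import json
import time

def block_hash(block):
    """Hash of the block's canonical serialization (the block's identity)."""
    payload = {k: block[k] for k in ("timestamp", "transactions", "prev_hash")}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def make_block(transactions, prev_hash):
    """Bundle transactions with a timestamp and the previous block's hash."""
    block = {"timestamp": time.time(),
             "transactions": list(transactions),
             "prev_hash": prev_hash}
    block["hash"] = block_hash(block)
    return block

def verify_chain(chain):
    """Valid iff every block's contents match its recorded hash and every
    block points at the hash of its predecessor."""
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

genesis = make_block(["genesis"], prev_hash="0" * 64)
block1 = make_block(["alice -> bob: 5"], prev_hash=genesis["hash"])
chain = [genesis, block1]
assert verify_chain(chain)

# Retroactively editing an earlier entry breaks the chain of hashes.
genesis["transactions"].append("mallory -> mallory: 1000")
assert not verify_chain(chain)
```

In a real network this structure is combined with a consensus mechanism so that thousands of independent validators agree on one such chain; the sketch only shows why the agreed-upon history is hard to rewrite.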

These ‘cryptocurrencies’ should rather be called cryptoassets (Burniske and Tatar, 2018), as there are currently many different blockchain applications and most of them do not even provide payment settlement. ‘Cryptoasset’ is a general term for cryptographically


secured distributed ledger technologies, providing services verified and operated by a network instead of a trusted third party (Peters, Panayi and Chapelley, 2015; Yli-Huumo et al., 2016). Replacing the incumbent model of trusted third parties thus directly relates to the core properties of the blockchain, namely ‘security, anonymity and data integrity without any third-party organization in control’ (Yli-Huumo et al., 2016, p. 2, emphasis added). Coherently, the essence of the blockchain is to replace centralized monitoring and control with openly accessible, networked computation and consensus (Leon Zhao, Fan and Yan, 2017, p. 6f). To eliminate the need for trusted third parties, it was necessary to design a system that is transparent, secure and decentralized (Tapscott and Tapscott, 2016). Some authors coherently proclaim that ‘not the anonymity or the cryptography or the economics is at the core of it [the blockchain] being disruptive, but the fact that trust in code deals as a substitute for the credibility of persons, institutions, and governments’ (Maurer, Nelms and Swartz, 2013, p. 263). As such, the blockchain was celebrated as being ‘trustless’ (Vasek, 2015) until this notion was corrected by multiple authors (The Economist, 2015; Christopher et al., 2016; Berg, Davidson and Potts, 2017) laying out that the blockchain should rather be considered a ‘trust machine’ (Swanson, 2015).

Instead of researching this apparent relation between the blockchain and trust, or the associated costs and benefits of blockchain systems on an individual level, current blockchain research predominantly focuses on conceptual improvements and challenges of the


underlying code of cryptoassets (Risius and Spohrer, 2017). This is especially unfortunate since ‘little has been demonstrated regarding the individual level costs and benefits of blockchain technology’ (Risius and Spohrer, 2017) beyond initial use-case research (e.g. Beck et al., 2016). For these reasons, it is crucial to understand this new technology by looking into the benefits and costs perceived by its users. Such research could be especially interesting considering the contemporary ‘systemic loss of faith in the system’ (Edelman Intelligence, 2017, p. 23). However, only with empirical research can the claims made about the potential societal benefits of the blockchain be substantiated (Risius and Spohrer, 2017, p. 395). Consequently, a need emerges to conduct empirical research on an individual level that contributes to clarifying the connection between blockchain and trust and relates it to existing institutions.


2. Theoretical framework

The blockchain, ergo, appears to have an inherent relation to trust. How this relation impacts the present constellation of institution-based trust will be examined in the following. Before jumping to this assessment, however, the relevant concepts will be introduced. In particular, insights on trust, the blockchain and the institutional environment will be outlined. From this knowledge, the concepts of algorithm-based trust and institution-based trust will be derived, which subsequently inform the research hypotheses.

2.1. Blockchaining the world?


billion USD (Coinmarketcap, 2018). With this dramatic rise of Bitcoin (c.f. Figure 1) came, however, a lot of greed and fear of missing out (‘FOMO’).

Figure 1: Bitcoin price and market cap development in 2017


efficient (Davidson, De Filippi and Potts, 2018, p. 4). A publicly highly trusted ledger is thus a requirement for economic efficiency and prosperity (c.f. North, 1992; Ring and Nooteboom, 2003; Davidson, De Filippi and Potts, 2018). Historically, this ‘trust is highest when the ledger is centralized and strong’ (Davidson, De Filippi and Potts, 2018, p. 4). Examples would be the higher trust associated with the Euro than with the Egyptian pound, or the lower business risk of operating in Germany compared with China. Since trust is a ‘means of overcoming the absence of evidence’ (Barbalet, 2009, p. 367), it can be deemed a vulnerability (c.f. Rousseau et al., 1998) that needs to be managed. The act of managing this vulnerability creates transaction costs that raise the initial price of a business transaction and make it expensive to ‘manufacture trust’ (Davidson, De Filippi and Potts, 2018, p. 4). Moreover, centralized parties maintaining the ledger create additional costs in the form of overhead costs and distorted incentives (c.f. Krueger, 1974; Buchanan, Tullock and Tollison, 1983), and can furthermore abuse their power to exclude parts of the population from their services (e.g. Love and Bruhn, 2009).


tremendously reduced transaction costs (c.f. Williamson, 1981; Dyer and Chu, 2003) in an open, decentralized, yet secure environment. The result of these conditions would be an empowerment of business in general and the creation of a multitude of new business opportunities (Davidson, De Filippi and Potts, 2018).

This view represents the first of three lenses usually adopted to classify the blockchain technology. The blockchain is, accordingly, described either as a means to realize true peer-to-peer markets, as a general-purpose technology, or as a new form of market coordination, thus ‘an institutional technology’ (c.f. Atzori, 2017; Davidson, De Filippi and Potts, 2018). By viewing the blockchain as a means to lower transaction costs, it essentially moves markets towards a peer-to-peer ideal (Coase, 1960; c.f. Catalini and Gans, 2016). Currently, transaction cost economics (c.f. Coase, 1937; Williamson, 1981) describes two forms of governance, with the market and spot contracting on one side and hierarchical coordination and employment relationships on the other. Depending on the characteristics of the transaction, the mode (or hybrid mode; c.f. Sydow, 1992) that is most efficient, i.e. has the lowest transaction costs, should be chosen. The blockchain, however, distinctively changes two of the three main preconditions of transaction cost economics. Programmable contracts theoretically allow for complete contracts, which contradicts the concept of bounded rationality (Davidson, De Filippi and Potts, 2016, p. 8), while cryptography renders it possible to avoid


opportunism. This shows that the technology may offer a new way to structure business. Instead of ensuring the success of a transaction with lengthy contracts, independent auditors and sophisticated technology, business parties transacting on the blockchain can, in this example, simply take the conditions of a self-executing contract (c.f. Buterin, 2014) for granted. This showcases how transaction costs aimed at validating and managing information can be minimized with the blockchain.
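The idea of a self-executing contract can be illustrated with a deliberately simplified toy model. This is a hypothetical Python sketch, not how contracts are actually deployed: on a real blockchain the code and its state live on-chain (e.g. in Ethereum’s virtual machine) and the delivery signal would come from a transaction or an oracle, not a method call. The sketch only shows the core idea that the release rule is fixed in code, so neither party needs a third party to enforce it.

```python
class EscrowContract:
    """Toy escrow: holds a payment and releases it automatically
    once delivery is confirmed. All names here are illustrative."""

    def __init__(self, buyer, seller, amount):
        self.buyer, self.seller, self.amount = buyer, seller, amount
        self.delivered = False
        # Simplified account balances tracked inside the contract.
        self.balances = {buyer: 0, seller: 0}

    def confirm_delivery(self):
        # In a real deployment this signal would come from an oracle
        # or a cryptographic proof, not a plain method call.
        self.delivered = True
        self.execute()

    def execute(self):
        # The release rule is fixed in code: neither party can
        # renegotiate or block it once the condition is met.
        if self.delivered:
            self.balances[self.seller] += self.amount

contract = EscrowContract("alice", "bob", 10)
contract.confirm_delivery()
assert contract.balances["bob"] == 10
```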


Walport, 2015) the lucid logic of Davidson, De Filippi and Potts (2018, p. 4) is adopted, which describes the blockchain as an institutional technology with the potential to disrupt the ‘economic institutions of capitalism’ described by Hayek, Williamson, Buchanan, North and Ostrom. Blockchain applications do not ‘require third party verification (i.e. trust)’ (Davidson, De Filippi and Potts, 2016), but rely on carefully designed economic incentives utilizing the blockchain’s unique properties. Therefore, transaction costs deriving from a lack of trust between economic agents can be completely mitigated or at least substantially reduced. Maintaining a ledger – an agreement about facts (e.g. property ownership), decisions (e.g. contract fulfilment) and economic incentives (e.g. rewards for mining/staking) – is fundamentally the coordination of economic activity. Davidson, De Filippi and Potts (2018) showcase the pervasiveness of this coordination accordingly: ‘property rights (ledger entry and private keys), exchange mechanisms (public keys and peer-to-peer networks), native money (cryptotokens), law (code), and finance (initial coin offerings)’.

Throughout this thesis, this general view of the blockchain will be adopted, with a focus on how people perceive the blockchain technology. Even though the blockchain is deemed an institutional technology, its current adoption is driven by economic advantages. These advantages usually relate to economic efficiency gains by


‘stripping out layers of activity no longer needed’ (Davidson, De Filippi and Potts, 2018, p. 11). The utilities deriving from blockchains usually relate to permissionless, or open, blockchains, as opposed to permissioned, or closed, blockchains. Only open blockchain applications are not only architecturally but also politically decentralized (Buterin, 2017). Only in this case do they have a close and meaningful interaction with their community (Allaby, 2016). For this reason, the term blockchain will always refer to open blockchains in this thesis. Open blockchains furthermore constitute the clear majority of blockchain applications (Allaby, 2016) and can be considered a decentralized diffusion system according to Rogers’ (1995, p. 335) classification and description of centralized and decentralized diffusion systems. As such, power and control are shared among the members, while innovation is driven via horizontal networks. These innovations stem from iterative, problem-focused local experimentation by non-experts (Rogers, 1995, p. 335). The strong involvement of the community in the development and adoption of blockchain solutions further supports the classification of the blockchain as a decentralized diffusion system. Consistently, Buterin (2014) argues that the adoption of blockchain will not be driven by one ‘killer app’, just as there was none for open source. Instead, a huge number of marginal use cases will be developed, whose sum will be of significant impact. Consistent with this theoretical backdrop, the existing community’s focus continued to move from Bitcoin to alternative coins and use cases.


Figure 2: Google search terms

Coherently, many projects are emerging. They usually derive their added value by facilitating payment services (Beck et al., 2016), providing information on product backgrounds (Finley, 2016) or by acting as an inexpensive intermediary service (Tapscott and Tapscott, 2016). State-of-the-art applications cover a wide spectrum, ranging from payment providers (e.g. OmiseGo) over supply chain management (e.g. VeChain), identity management (e.g. TheKey), platform development (e.g. Ethereum) and decentralized news (e.g. Steemit) to charity management (e.g. Alice) and marketplaces for artificial-intelligence-related services (e.g. Effect.ai). Many of these applications are, however, merely conceptual and without a working product. Several of these alternative coins (altcoins) also emerged as a response to the challenges Bitcoin was facing. The trend towards altcoins is evident in


the development of the Google search terms ‘blockchain’, ‘bitcoin’ or ‘btc’ and the various names of the 20 biggest altcoins, depicted in figure 2 (Google Trends, 2018).

Bitcoin makes use of the first consensus mechanism developed for the blockchain – the so-called proof of work – which poses considerable difficulties to scaling due to increasingly high costs and slow transactions between network participants. Proof of work requires network validators to solve a ‘computational puzzle’ and reach consensus on the solution. To maintain a steady degree of difficulty, the ‘computational puzzle’ is adapted according to the total computing power of all network validators. As network validators try to outcompete one another by increasing their accessible computing power, an arms race has been created (Malone and O’Dwyer, 2014). Rising electricity costs and the longer times needed to solve and spread the correct solution make Bitcoin transactions significantly slower and more expensive than, for example, Visa transactions (Davidson, De Filippi and Potts, 2018). Nevertheless, multiple other consensus mechanisms have emerged from cryptoeconomic design theory (c.f. Davidson, De Filippi and Potts, 2016; Conley, 2017), such as (delegated) proof of stake, proof of importance or the zero-Byzantine-fault-tolerance mechanism. These consensus mechanisms have their own advantages and disadvantages, as they always need to balance availability, consistency and the degree of centralization (Tschorsch and Scheuermann, 2016). Bitcoin, for example, sacrifices transaction speed and volume for security, but does so in a very unfavourable proportion (Gervais et al., 2016).
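The ‘computational puzzle’ described above can be sketched minimally in Python. This is a simplification of Bitcoin’s actual target arithmetic: here difficulty is just a count of leading zero hex digits, and the header string is hypothetical. The sketch shows the asymmetry that makes proof of work useful: finding a valid nonce requires brute-force search whose expected cost grows exponentially with the difficulty, while verifying a solution takes a single hash.

```python
import hashlib

def proof_of_work(header: str, difficulty: int):
    """Brute-force a nonce whose SHA-256 hash starts with
    `difficulty` zero hex digits (the mining step)."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{header}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

def verify(header: str, nonce: int, difficulty: int) -> bool:
    """Verification is cheap: one hash, regardless of mining effort."""
    digest = hashlib.sha256(f"{header}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

nonce, digest = proof_of_work("block-header", difficulty=4)
assert verify("block-header", nonce, 4)
```

Raising `difficulty` by one multiplies the expected number of hash attempts by sixteen (one hex digit), which is, roughly, how a network can retune the puzzle as the validators’ total computing power grows.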

8 The biggest altcoins have been identified with historical data from Coinmarketcap (2017). Therein the 19


Other consensus mechanisms sometimes offer a significantly better trade-off (Gervais et al., 2016). The consensus mechanism is one of the things specified in a cryptoasset’s protocol. Tschorsch and Scheuermann (2016) offer more insights on this topic by introducing a framework in which they assess and classify consensus mechanisms.

In summary, decentralized ledger technologies, and in particular the blockchain, are special since they rely on no third-party authority but ideally operate ‘trust-free by completing transactions on [the] basis of self-enforcing rules’ (Beck et al., 2016). They are thus transparent, secure, individual and come with potentially minimal transaction costs (c.f. Pilkington, 2015). A decentralized, impermeable ledger has therefore been called ‘an instance of institutional evolution’, as it offers a new way to coordinate and govern behaviour that is distinct from the current system of firms and governments or markets and hierarchies (Davidson, De Filippi and Potts, 2018). The founder of Ethereum, Vitalik Buterin (2014), subsequently describes the blockchain as follows:

“A blockchain can upload programs and leave the programs to self-execute, where the current and all previous states of every program are always publically visible, and which carries a very strong cryptoeconomically secured guarantee that programs running on the chain will continue to execute in exactly the way that the blockchain protocol specifies. ... Blockchains are not about bringing to the world any one particular ruleset, they’re about creating the freedom to create a new mechanism with a new ruleset extremely quickly and pushing it out.”

By combining distributed ledgers with computationally embedded features (e.g. programmable money, programmable contracts, virtual organizations), blockchains are thus platforms on which to build bespoke economic coordination systems (Buterin, 2014; Risius and Spohrer, 2017; Davidson, De Filippi and Potts, 2018). However, such a technology requires mainstream adoption to reach its full potential. And to be adopted, the technology needs to – amongst other things – provide a benefit over the established system (Kahneman, 2003) and needs to be reliable – thus trusted (Pavlou, 2003).

2.2. What is trust?

In order to research the relationship and dynamics between the blockchain and trust, as well as the relation between the blockchain and established institutions, it is important to properly introduce the concept of trust by clarifying this terminology. When attempting to define trust, however, a multitude of different definitions emerges. Even when restricting the search to trust in computer science, multiple, often contradictory definitions are found (c.f. Artz and Gil, 2007). In general, the only thing most authors can agree on is the relevance of the concept of uncertainty (Lewis and Weigert, 1985; Malhotra,


2.3. Trusting the system?

As algorithm-based trust is deemed to replace the incumbent forms of institution-based trust (Tapscott and Tapscott, 2016), these two governance forms will be compared against each other. Pavlou and Gefen (2004) researched online marketplaces and therein introduced the concept of third-party-based trust – another term for institution-based trust (Shapiro, 1987; Zucker, 1987) – as the adoption of institutional mechanisms by third-party service providers. These mechanisms are considered a means to ‘increase the scope for regulating human behaviour’ (Fehr and Fischbacher, 2004) and originate in the institutional environment. The institutional environment is a concept of new institutional economics (NIE) (North, 1991) and describes the sum of all institutions within which organizations and institutional actors exist. Institutions themselves are defined as ‘humanly devised constraints’ structuring ‘political, economic and social interaction’ by providing ‘the incentive structure of economics’ (North, 1991, p. 97). Institutions thus define social interactions by presenting a system of ‘established and prevalent social rules’ (Hodgson, 2006).

To consistently assess institution-based trust, a new institutional economics perspective (North, 1992) is adopted that pays special attention to transaction cost economics (TCE)

11 Pavlou & Gefen (2004) do not differentiate between third-party based trust and institution-based trust in


(Williamson, 1981). Classical TCE (Coase, 1937, 1960) relies on (in)complete contracts of varying degrees (Chen, 2000) to overcome the opportunism of people (Williamson, 1975, 1985). Contracts thus depend on institutions to enforce the conditions stated therein, resulting in an inherent connection between TCE and NIE. The ability and willingness of institutions to define and enforce these ‘rules of the game’ (North, 1991, p. 98) determine the degree of uncertainty and subsequently the amount of transaction costs required for business transactions via contracts and the like (Williamson, 1981; Oxley, 1997). Uncertainty thus stems from the environment and can be actively managed by investing in transaction ‘insurances’ such as contracts or third-party mediators. This describes the first aspect of institution-based trust. Secondly, uncertainty arising from the institutional environment is managed by institutional pressures homogenizing institutional actors and organizations. In other words, institutional pressures emerge from the institutional environment and force institutional actors and organizations to align with it (c.f. DiMaggio and Powell, 1983, 1991). Usually, three different ‘pillars’ or ‘pressures’ are identified that conceptualize the institutional environment in terms of regulatory, cognitive and normative dimensions (Scott, 1995).


Credit card guarantees relate to the underlying institutional environment, as the national government implements and enforces a set of standards that ensure recourse by financial institutions (Pavlou and Gefen, 2004, p. 38). The last mechanism identified by Pavlou and Gefen (2004), trust in the marketplace intermediary, is defined by the transference of trust. The authors herein build on Stewart (2003), who showed that trust in online contexts can be derived from a generalized impression formed by relations to a more trusted organization. Favouring institution-based trust over dyadic trust building conforms with Pavlou and Gefen (2004), who further argue that industry cross-contamination leads to a generalized trust in others (Rotter, 1967; Dasgupta, 2000) in ‘one-to-many’ relations (Pavlou and Gefen, 2004, p. 40). An example of industry cross-contamination would be the collapse of a textile production facility in a developing country creating a backlash against the entire industry rather than only against the company sourcing from or operating the collapsed facility. Pavlou and Gefen (2004) continue this line of argument by proposing that ‘a generalized trust belief’ is thus more important than any dyadic trust relationship.

In combination with the first two aspects of institution-based trust – managing the uncertainty of the institutional environment with transaction ‘insurances’, and the homogenizing institutional pressures and mechanisms – this results in a generalized belief in institutional trust. This institutional trust provides the foundation for other forms of trust, such as

12 Pavlou & Gefen (2004, p. 38) identify eBay, Amazon and others as examples for these trusted organizations


knowledge and conflicting interests (Lamport, Shostak and Pease, 1982). It represents the relative absence of uncertainty (Christopher et al., 2016) and is thus a means to decrease transaction costs (c.f. Doney, Cannon and Mullen, 1998; Oxley, 1999; Woolthuis, Hillebrand and Nooteboom, 2005; Kaufmann, Kraay and Mastruzzi, 2011; Christopher et al., 2016). Institution-based trust is hence a precondition for functioning markets (Lane and Bachmann, 1996) and is consequently deemed the ‘most important mode of trust creation’ (Zucker, 1987). This ability of the institutional environment to create trust is usually described by relating to the three institutional ‘pillars’ of Scott (1995) (e.g. Hofstede, 1980; Hall and Soskice, 2001; Meyer and Peng, 2005; Wright et al., 2005). By measuring these pillars with proxy criteria, the institutional ‘quality’ can be assessed (e.g. Levchenko, 2007; Sobel, 2008). Prior research has shown that unique institutional environments produce differing levels of institution-based trust (c.f. Hall and Soskice, 2001). This trust can stem both from third-party service providers adopting institutional mechanisms and from institutions themselves (Pavlou and Gefen, 2004). Subsequently, the first hypothesis emerges:

Hypothesis 1

H1: A high-quality institutional environment leads to high levels of institution-based trust. DV: Institution-based trust | IV: Institutional quality

2.4.

In code we trust?


them to risks? The concept of ‘algorithm-based trust’ is intended to describe trust in open-source blockchain solutions. When attempting to define this concept, it becomes obvious that no established definition is available from prior research. This is, however, not surprising considering the early stage of blockchain research.


can be ‘expressed and implemented’ in a simultaneously moral and technical demeanor (Kelty, 2008, p. 8 f.). The leading scholar in this research domain, Kelty, builds his thoughts on Taylor, who in turn is strongly influenced by Habermas and Warner. Accordingly, there is a partial ideological congruence between some of the most prominent contemporary philosophers and the motivation of a community from a deeply technical area, such as open-source software development. Linking back to Sas & Khairuddin (2015), the authors transferred Zucker’s triadic conception of trust to human-computer interactions, e.g. a user trusting a cryptoasset or protocol. Even though this is highly relevant to describe trust in the blockchain, it fails to describe the unique interplay between the technological and social aspects in open-source environments. Therefore, even the HCI conception of trust needs to be extended to properly define the term ‘algorithm-based trust’. To render this possible, this thesis will draw strongly on Kelty’s main research contribution, the development of the concept of a ‘recursive public’. A recursive public is constituted by a group of persons having a shared understanding and identity, or in the words of Kelty a ‘moral imaginary’ (Kelty, 2008, p. 9), about a technical infrastructure being a means of communication and an expression of their intent at the same time. The group derives their ‘everyday practical commitments and identities’ from this infrastructure (Kelty, 2008, p. 27 ff.). It is recursive13 and not entirely iterative, since the underlying technology or protocol sets the ultimate boundaries. Within these boundaries, the medium of communication and coordination, which equals the underlying protocol defined by the shared understanding, is ‘relentlessly questioned’ and ideally independent (c.f. Kelty, 2008, p. 8). The goal is the creation of a playing field with ‘a certain kind of agency, effected through the agency of many different humans, but checked by its technical and legal structure and openness.’ Ultimately, therefore, a community of users identifies with an imaginary that is represented by and created in the protocol (Kelty, 2008). Thus, it is of utmost importance for this community to have the ‘ability to build, control, modify and maintain’ this infrastructure or protocol.

13 The Cambridge Dictionary (2017) defines recursive as: involving doing or saying the same thing several times

Research of Lakhani and Wolf (2003) supports this notion as they conclude in their paper titled ‘Understanding motivation and effort in free/open source software projects’ that enjoyment-based intrinsic motivation was the ‘strongest and most pervasive driver’ for participation.


Figure 3: Theoretical fundament of algorithm-based trust


Hypothesis 2

H2: A strong institutional focus on ICT leads to high levels of algorithm-based trust. DV: Algorithm-based trust | IV: Institutional ICT focus

2.5.

Connecting the dots?


Gefen (2004) who showed that new ecosystems, such as the internet around the turn of the century, lack a transferability of established institutional mechanisms (c.f. chapter 2.3). While institutional regulation caught up and complemented and amplified the new mechanisms developed within this ecosystem, the respective mechanisms still exist and are valid on their own. When Amazon awards its ‘Premium retailer’ label, that alone is justification and insurance enough to safely purchase products from this retailer. In this example it does not matter in which institutional environment this ‘Premium retailer’ resides. It remains to be seen whether algorithm-based trust will represent an alternative to or an extension of institutional trust. Either way, history has shown that new mechanisms to reduce uncertainty can be developed. It will be argued that the contemporary mechanism, institutional trust, currently opposes algorithm-based trust. This notion is further backed by anecdotal evidence. A brief investigation into the backgrounds of the parties demanding more institutional regulation of cryptoassets showed that most of these parties have both vested interests and a ‘high-quality’ institutional environment (e.g. FleishmanHillard Fishburn, 2018; Partington, 2018). Summarizing, an impact of institutional trust on algorithm-based trust can be expected. Hence, the last hypothesis emerges:

Hypothesis 3

H3: High levels of institutional trust lead to low levels of algorithm-based trust.


By combining these hypotheses the following conceptual model emerges. Hypothesis 1 expects a positive correlation between a high quality of the institutional environment and high levels of institution-based trust. Hypothesis 2 depicts a positive correlation between the institutional focus on ICT and the levels of algorithm-based trust. Hypothesis 3 connects the concepts of institution- and algorithm-based trust. A correlation between high levels of institution-based trust and low levels of algorithm-based trust is expected.

Figure 4: Conceptual model

(Figure content: Institutional environment -> Institution-based trust (H1); Institution-based trust -> Algorithm-based trust (H3))


3. Methodology

3.1.

Research strategy and design


3.2. Operationalizing the hypotheses.

3.2.1.

Institutional trust

To be able to measure the theoretically deduced correlations, it is crucial to operationalize the hypotheses. Accordingly, the dependent variable, institutional trust, as well as the independent variable, institutional quality, need to be operationalized. To measure the institutional environment, a strategy from Berg (2017) will be adopted that defines the institutional environment with the help of proxy data from the World Bank (World Bank, 2017). By matching the country of residence of each survey participant to this data, an assessment of the quality of the institutional environment is possible. In combination, these variables can be expected to properly depict the institutional quality of a country. The variables are ‘Control of Corruption’, ‘Government Effectiveness’, ‘Political Stability and Absence of Violence/Terrorism’, ‘Rule of Law’, ‘Regulatory Quality’ and ‘Voice and Accountability’.
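The aggregation of the six World Bank indicators into a single institutional-quality score can be sketched as follows. This is a minimal illustration: the function name and the example percentile ranks are made up and are not actual World Bank figures (the sketch assumes the indicators are reported as percentile ranks on a 0-100 scale).

```python
# Sketch: institutional quality as the mean of the six World Bank
# Worldwide Governance Indicators for one country (hypothetical values).
WGI_DIMENSIONS = [
    "Control of Corruption",
    "Government Effectiveness",
    "Political Stability and Absence of Violence/Terrorism",
    "Rule of Law",
    "Regulatory Quality",
    "Voice and Accountability",
]

def institutional_quality(scores: dict) -> float:
    """Average the six WGI percentile ranks for one country."""
    missing = [d for d in WGI_DIMENSIONS if d not in scores]
    if missing:
        raise ValueError(f"Missing WGI dimensions: {missing}")
    return sum(scores[d] for d in WGI_DIMENSIONS) / len(WGI_DIMENSIONS)

# Hypothetical example country (placeholder percentile ranks):
example = dict(zip(WGI_DIMENSIONS, [80.0, 90.0, 70.0, 85.0, 88.0, 92.0]))
print(round(institutional_quality(example), 2))  # 84.17
```

Matching each respondent's country of residence to such a score yields the independent variable for Hypothesis 1.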


assess institution-based trust. The formulation of these questions will follow the respective guidelines of the OECD (2018). The questions are asked on a 7-point Likert scale where one means ‘do not trust them at all’ and seven means that the survey respondent ‘trust[s] them a great deal’. More information about the scale being used is given in the chapter ‘Questionnaire design’.

Figure 5: Measurement model of hypothesis 1

(Figure content: IV: Institutional quality, secondary data: Corruption, Government effectiveness, Political stability and absence of violence/terrorism, Regulatory quality. DV: Institution-based trust, primary data (questionnaire): Government, Police, Court system, Large companies, Banks, Media, NGOs, Trade unions.)


3.2.2.

Algorithm-based trust

To measure hypothesis 2 it is necessary to operationalize two more variables. The dependent variable, algorithm-based trust, will be measured with a set of four questions. The first pair aims to measure algorithm-based trust for users of cryptocurrencies, while the second pair measures this concept for non-users. Questions 8 and 42 reflect the technical dimension, while questions 9 and 43 reflect the governance dimension of algorithm-based trust.

Q8 Even if there is no institutional regulation, I trust that I will receive the utility advertised by cryptoassets.

Q9 I would like to have more institutional regulations for cryptocurrencies to feel safe about using them.

Q42 I would never use cryptocurrencies before completely understanding how they are working and governed

Q43 I would never use cryptocurrencies before there is a strong set of institutional regulations in place.


will be derived and added to the questionnaire. The items of the proxies will be further used to conduct an exploratory factor analysis.

Figure 6: Proxy measurements for algorithm-based trust

Crypto ideology
Theoretical background: Distrust in the institutional system can be considered the absence of institutional trust. The items used to measure distrust (c.f. Edelman Intelligence, 2017) can thus be inverted, adapted and used to measure algorithm-based trust. Not only is there a direct connection between distrust and trust, but algorithm-based trust is also deemed an alternative to contemporary forms of trust (c.f. Davidson, De Filippi and Potts, 2018). Measuring whether a person perceives a blockchain-based governance to be better may thus be a strong proxy measurement for algorithm-based trust.
Assessment: 7-point Likert-type scale. Items: sense of justice, hope for the future, confidence, wish for adoption.

Usage
Theoretical background: A technology will generally only be used if the vulnerability going along with the usage is deemed appropriate (c.f. Pavlou, 2003). As trust and vulnerability are closely interlinked (c.f. Rousseau et al., 1998; Christopher et al., 2016), it seems sensible to assess the concept of algorithm-based trust, which furthermore strongly builds upon the blockchain technology, with a technology acceptance model (Pavlou, 2003).

General predisposition
Theoretical background: Generally, it can be assumed that people with algorithm-based trust have a strong affinity for and knowledge about information technology and would want the current system to reflect respective structures. Accordingly, a combined proxy of ‘IT knowledge’ and ‘Wish for reforms’ is deemed to represent a general predisposition towards algorithm-based trust.
Assessment: 7-point Likert-type scale. Items: IT knowledge; wish for reforms.

Financial involvement
Theoretical background: Another potential proxy for algorithm-based trust is the relative amount of money that is invested into cryptoassets. If a person invests a lot of his disposable income into cryptoassets and has lower than average ratings for the ‘gold rush metric’, it may be appropriate to deduce that the respective person has high levels of algorithm-based trust.
Assessment: 7-point Likert-type scale. Items: relative investment in cryptoassets; gold rush metric.

Crypto interest
Theoretical background: A very high and continuous interest in actively seeking more information about cryptoassets might stem from a desire to use or invest in them, or from a fascination with the technology. Since the technology is deeply interlinked with the ecosystem around cryptoassets, and the ‘gold rush metric’ allows filtering for a purely speculative interest, it can be followed that a high interest in cryptoassets may translate to high levels of algorithm-based trust.
Assessment: 7-point Likert-type scale. Items: total invested time; focus on crypto-related news.


trust. To assess the former, the institutional environment, the institutional focus on information communication technology will be assessed. To measure this, two equally weighted reports are merged. These reports are ‘Measuring the Information Society Report 2017’ (ITU, 2017a, p. 49) from the International Telecommunication Union and ‘The Networked Readiness Index’, created in a collaborative effort of Cornell University, INSEAD and the World Economic Forum (c.f. Baller, Dutta and Lanvin, 2016). The first report focuses on assessing the IT infrastructure and measures ICT access, ICT use and ICT skills with eleven sub-categories. ICT access and ICT use are calculated from eight items14, while the category ‘ICT skills’ is measured by three additional items15 (ITU, 2017a, p. 29). The second report, ‘The Networked Readiness Index’, particularly reflects the utility derived from this infrastructure (c.f. Baller, Dutta and Lanvin, 2016). It draws from a wide range of information sources such as the ITU, the UNESCO, the World Bank and other UN organizations16. Subsequently, a combination of these elements seems to be a sensible choice that matches the research purpose of this thesis. It is proposed to see higher

14 These items are the (1) fixed telephone subscriptions per 100 inhabitants, (2) mobile-cellular telephone subscriptions per 100 inhabitants, (3) international internet bandwidth (bit/s) per internet user, (4) percentage of households with a computer and (5) percentage of households with internet access. The ICT use is derived by merging information about the (6) percentage of individuals using the internet, (7) fixed-broadband subscriptions per 100 inhabitants and (8) active mobile-broadband subscriptions per 100 inhabitants.

15 These items are the (9) mean years of schooling, (10) secondary gross enrolment ratio and the (11) tertiary gross enrolment ratio.

16 The drivers of the networked readiness are differentiated in ‘Readiness’ and ‘Usage’. The category readiness


confidence in the perceived ability to understand code among survey participants based in countries with an institutional focus on information communication technology.
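The merging of the two equally weighted reports into one ‘institutional ICT focus’ score can be sketched as below. The scale ranges are assumptions based on how the two indices are reported (the IDI on a 0-10 scale, the NRI on a 1-7 scale), and the country values as well as the function names are purely illustrative.

```python
# Sketch: combine the ITU IDI and the Networked Readiness Index into one
# equally weighted 'institutional ICT focus' score (hypothetical inputs).
def rescale(value, low, high):
    """Map a score from its native scale onto [0, 1]."""
    return (value - low) / (high - low)

def ict_focus(idi, nri):
    """Equally weighted mean of the two rescaled indices."""
    return 0.5 * rescale(idi, 0.0, 10.0) + 0.5 * rescale(nri, 1.0, 7.0)

print(round(ict_focus(idi=8.0, nri=5.5), 3))  # 0.775
```

Rescaling before averaging keeps either report from dominating the merged score purely because of its native scale.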

Figure 7: Measurement model of Hypothesis 2

3.2.3.

Connecting the two

To measure the correlation between the dependent variable, ‘Algorithm-based trust’, and the independent variable, ‘Institution-based trust’, the DV will be assessed with primary data from the questionnaire, while the values for the IV will be adopted from the World Bank (2017). This is consistent with Hypothesis 1.

(Figure content: IV: Institutional ICT focus, secondary data: Measuring the Information Society Report 2017. DV: Algorithm-based trust, primary data (questionnaire): technology acceptance, crypto interest, financial involvement.)


Figure 8: Measurement model of hypothesis 3
(Figure content: IV: Algorithm-based trust, primary data (questionnaire): usage, crypto interest. DV: Institution-based trust, primary data (questionnaire): Government, Police, Court system, Large companies, Banks, Media, NGOs, Trade unions.)


3.2.4.

Proxy measurements of algorithm-based trust

The first proxy measurement, ‘Crypto ideology’, can be operationalized with the following set of questions.

Table 1: Measuring the proxy measurement ‘Crypto ideology’

Sense of injustice/justice
Q12 Perceived system failure: The system is biased against regular people and in favour of the rich and powerful (c.f. Edelman Intelligence, 2017). Technical background of blockchain: Decision-making procedures can be hardcoded in the protocol. Decision-making is decentralised17 and thus truly democratic18 (c.f. Risius and Spohrer, 2017). Crypto ideology item: Cryptocurrencies enable unbiased and truly democratic solutions.
Q13 Perceived system failure: The elites who run our institutions are out of touch with regular people (c.f. Edelman Intelligence, 2017). Technical background of blockchain: The governance of cryptoassets is inherently defined by the community (c.f. Risius and Spohrer, 2017). Crypto ideology item: The community plays a crucial role for the development and continuous operation of cryptoassets.

Lack of hope that the future will be better/Hope for the future
Q14 Perceived system failure: My hard work will be rewarded (c.f. Edelman Intelligence, 2017). Technical background of blockchain: The blockchain restores causality and enables conditional payments (c.f. Risius and Spohrer, 2017). Crypto ideology item: Cryptocurrencies enable a fair distribution of monetary rewards.
Q15 Perceived system failure: The country is moving in the right direction (c.f. Edelman Intelligence, 2017). Technical background of blockchain: The blockchain as a solution to contemporary problems (c.f. Risius and Spohrer, 2017). Crypto ideology item: Cryptocurrencies are an important and necessary development.

Lack of confidence in the leaders/Confidence in governance
Q16 Perceived system failure: I do not have confidence that our current leaders will be able to address our country’s challenges (c.f. Edelman Intelligence, 2017). Technical background of blockchain: The blockchain ecosystem is a new and highly attractive field (De Castillo, 2017). It is likely that many well-educated people with a desire to bring about change will flock to this area (c.f. Risius and Spohrer, 2017). Crypto ideology item: Cryptocurrencies will change the world for the better.
Q17 Perceived system failure: I do not see any institutional efforts that could successfully tackle our country’s challenges (c.f. Edelman Intelligence, 2017). Technical background of blockchain: As a general purpose technology, or an institutional technology, the array of potential areas of application is very vast (c.f. Davidson, De Filippi and Potts, 2018). Crypto ideology item: I am following multiple promising cryptocurrencies that could tackle our country’s challenges.

Wish for reforms/adoption
Q18 Perceived system failure: We need forceful reformers in positions of power to bring about much needed change (c.f. Edelman Intelligence, 2017). Technical background of blockchain: The blockchain as an evolution of ledger technology with its own and new surrounding ecosystem (c.f. Risius and Spohrer, 2017). Crypto ideology item: […] bring about much needed change.
Q19 Perceived system failure: We need to reset the way our country works (c.f. Edelman Intelligence, 2017). Crypto ideology item: We need to ‘reset the playing field’.

17 In reality, multiple concerns arise from a quickly growing wealth and thus influence (i.e. Ethereum improvement proposal voting) of the first people having bought into cryptocurrencies (Ethereum, 2018). This can be balanced, but not negated.
18 According to the Cambridge Dictionary (2018) democracy is defined as: ‘the belief in freedom and equality

The second proxy measurement, ’Usage’, aims to assess algorithm-based trust by measuring three components of blockchain users with a technology acceptance model (Pavlou, 2003). These components are the perceived risk, usefulness and ease of use.

Table 2: Measuring the proxy measurement ‘Usage’

Perceived risk
Q20 When making a transaction of cryptocurrencies I am anxious about making a user error that cannot be remedied.
Q21 I am generally anxious about losing my cryptoassets in an unplanned way.

Perceived usefulness
Q22 I can easily imagine multiple areas of application the blockchain could improve.
Q23 I am very glad that the blockchain technology is gaining adoption.

Perceived ease of use
Q24 I think cryptocurrencies are easy to use.


The third proxy measurement measures the general predisposition towards algorithm-based trust. The combination of the individual’s ‘wish for reforms’ and their ‘IT knowledge’ aims to depict this predisposition. Therefore, questions 18 and 19 will be used and enriched with data from the following three questions:

Table 3: Measuring the ‘IT knowledge’ component for the proxy measurement ‘General predisposition’

Q26 I have a strong educational/working background in or relating to IT.
Q27 I am very familiar with reading programming code.
Q28 I completely understand how the blockchain works.

The fourth proxy measurement measures the respondent’s financial involvement in cryptoassets. To gauge the level of financial involvement, these two questions are derived:

Table 4: Measuring the proxy measurement ‘Financial involvement’

Q29 Cryptoassets are the biggest position in my investment portfolio.
Q30 Ever since investing in cryptoassets the total sum of my invested money rose significantly.


Table 5: Measuring the proxy measurement ‘Crypto interest’

Q31 I am now spending significantly more time with reading news.
Q32 I would rather read news about cryptocurrencies than read general news.

In addition to these proxies, a set of control questions is introduced that aims to allow a ‘gold rush’ filtering. That is, the respondent’s interest in the blockchain as a technology versus as a means to get rich quickly will be assessed.

Table 6: Control questions for the ‘gold rush’ metric

Q33 I enjoy playing the lottery.
Q34 I enjoy gambling.
Q35 Currently, my most important goal is to become rich quickly.
Q36 I care more about the price development of cryptoassets than their technology development.

3.2.5.

Population


applications there is no better way to conduct user-based research in this area. It is, nonetheless, possible to make some estimations about the total population, i.e. people using cryptoassets. In the absence of a clear sampling frame, the sample size can thus be related to the total population. For this purpose, the number of unique Bitcoin addresses will be used as a starting point. BitInfoCharts (2018) shows 26,981,401 addresses holding a certain stake of Bitcoin. It can be assumed that every cryptoasset owner is counted within this number for two main reasons. Firstly, Bitcoin is by far the most popular currency; secondly, Bitcoin is the most commonly accepted exchange medium to convert fiat currency into altcoins. As there is oftentimes a float remainder left that cannot be sent without paying transaction fees that exceed its value, most altcoin users also own Bitcoin. This means that almost every owner of cryptoassets has a tiny amount of Bitcoin. Adopting the insight from Hileman & Rauchs (2017) that the average user has two wallets, the total number of addresses will be divided by two, leading to a population of 13,490,700 people.
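The population estimate above can be made explicit in a few lines; only the two figures cited in the text enter the calculation.

```python
# Population estimate: funded Bitcoin addresses (BitInfoCharts, 2018)
# divided by the average number of wallets per user (Hileman & Rauchs, 2017).
funded_bitcoin_addresses = 26_981_401
wallets_per_user = 2

estimated_population = funded_bitcoin_addresses // wallets_per_user
print(estimated_population)  # 13490700
```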


3.3.

Questionnaire design


Wansink, 1990). The scale goes from ‘I strongly disagree’, one point, to ‘I strongly agree’, seven points (c.f. Vagias, 2006).


deemed necessary for the individual questions, some context about cryptoassets will be provided at the beginning of the questionnaire.


4. Empirical results

Most intentional and unintentional respondent errors can be prevented by assuring the anonymity and data security of respondents, drafting a well-designed and clearly formulated survey, and guiding the survey answering process with prompters, graphical intersections and the like (Burns and Bush, 2014, p. 273). However, nonresponse cannot be completely controlled for. This has been continuously identified as a major hurdle of conducting survey research (c.f. Yan and Curtin, 2010) and also impacts this study in the form of item omission. Some items of the survey were skipped by survey respondents (Burns and Bush, 2014, p. 274) even though the survey is designed to be as short as possible to maximize response rates (c.f. Deutskens et al., 2004). In 0,13% of all cases a survey participant skipped one item of the questionnaire. For these cases the missing values were replaced with the respective mean value. A further data quality inspection resulted in an exclusion of the common Likert-scale ‘yay or nay’ and ‘middle of the road’ patterns, where respondents either state extreme positions or are centred around the middle value of the Likert item (Burns and Bush, 2014, p. 282).
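The mean-value replacement for skipped items can be sketched as follows. This is a toy illustration assuming the item (column) mean across the other respondents is used; the helper name and the response matrix are made up.

```python
# Sketch: replace a skipped Likert item (None) with the mean of the answers
# the other respondents gave to that item (rows = respondents, cols = items).
def impute_item_means(responses):
    """Replace None entries with the column (item) mean of observed answers."""
    n_items = len(responses[0])
    means = []
    for j in range(n_items):
        observed = [row[j] for row in responses if row[j] is not None]
        means.append(sum(observed) / len(observed))
    return [
        [means[j] if row[j] is None else row[j] for j in range(n_items)]
        for row in responses
    ]

answers = [
    [5, 4, None],
    [3, 6, 2],
    [4, None, 4],
]
print(impute_item_means(answers))  # third item mean = 3.0, second = 5.0
```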


Figure 9: Gender distribution of survey respondents (female: 51, male: 234, neither: 3)
Figure 10: Education distribution of survey respondents (counts: 3, 43, 135, 89, 18)
Figure 11: Income distribution of survey respondents (counts: 39, 37, 34, 43, 50, 41, 44; categories ranging from ‘no income’ and ‘less than 10k€’ upwards)


The respondents came from 46 countries across the globe. The evident predominance of German and American respondents is explained by the choice of the snowball-sampling method and the author’s nationality and connections.


Hypothesis 1 can be tested by assessing the correlation between the dependent variable, institutional trust, and the independent variable, institutional quality. ‘Institutional trust’ is a 7-point Likert-style scale with eight items. The scale comprises a more than sufficient number of Likert-style items, which are well balanced (c.f. Johns, 2010) and drawn from the OECD and the World Values Survey (c.f. OECD, 2011, 2017b, 2018; Inglehart et al., 2016). Respectively, the validity is high. The same holds true for the internal reliability (α = 0,868). Plotting the means of the Likert-style items results in the following histogram.


Taking the mean of ordinal data has been criticised, but is a commonly used method (c.f. Flora and Curran, 2004; Jamieson, 2004). The main point of critique states that it is impossible to quantify the exact distance between points on the Likert scale and thus ensure equidistance (c.f. Camparo, 2013; Sullivan and Artino, 2013). This led to an intense discussion about treating Likert data as ordinal or continuous (c.f. Heeren and D’Agostino, 1987; Flora and Curran, 2004; Derrick and White, 2017). To cope with this issue the scale has been designed very carefully, as outlined earlier. Subsequently it is acceptable to treat it as ‘quasi-parametric’ and thus continuous data (c.f. Costello and Osborne, 2005). It has, furthermore, been shown that most analysis methods are very robust (Derrick and White, 2017). In the attempt of the author to take a middle road in this debate, single-item Likert-style scales will, however, be treated as ordinal data, while multi-item Likert-style scales will be treated as continuous data. The mean value (3,69) for institutional trust is normally distributed (n = 288) with acceptable19 (George and Mallery, 2010) amounts of skewness (-0,32) and kurtosis (-0,21). The standard deviation is acceptable (1,07). Further analysis shows that the mean scores for crypto-users (3,5) and non-users (4,4) differ. To further determine whether there are any differences in the institutional trust scores of crypto-users and non-users, an independent-samples t-test was run, which confirmed that crypto-users (mean rank = 129,06) have significantly lower levels of institutional trust than non-users (mean rank = 195,42) (c.f. Figure 29 for procedure details).

19 If the statistical values of the skewness and kurtosis fall between the range of -2 to 2 the distribution is considered acceptable.
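The group comparison reported above can be illustrated with a hand-rolled Welch t-statistic for two independent samples. The two samples below are made-up toy data, not the actual survey responses, and the function is a sketch rather than the exact SPSS procedure used in the thesis.

```python
import math

# Sketch: Welch t-statistic comparing institutional-trust scores of two
# independent groups (toy data standing in for users vs. non-users).
def welch_t(sample_a, sample_b):
    """Return the Welch t-statistic for two independent samples."""
    n_a, n_b = len(sample_a), len(sample_b)
    mean_a = sum(sample_a) / n_a
    mean_b = sum(sample_b) / n_b
    var_a = sum((x - mean_a) ** 2 for x in sample_a) / (n_a - 1)
    var_b = sum((x - mean_b) ** 2 for x in sample_b) / (n_b - 1)
    return (mean_a - mean_b) / math.sqrt(var_a / n_a + var_b / n_b)

users = [3.0, 3.5, 3.2, 4.0, 3.3]       # hypothetical crypto-users
non_users = [4.5, 4.2, 4.8, 4.4, 4.1]   # hypothetical non-users
t = welch_t(users, non_users)
print(round(t, 2))  # negative: users score lower on institutional trust
```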


4.1.

Quantifying institutional trust

However, H1 tests the correlation between institutional quality (IV) and the levels of institutional trust (DV). As interesting as the differences in the institutional trust scores between crypto-users and non-users are, they are irrelevant for testing this hypothesis. To assess this correlation, the countries are coded and matched with the levels of institutional quality. As outlined earlier, the institutional quality is computed by taking the mean of the items scored by the World Bank (2017).


Figure 15: Descriptive statistics for the mean distribution of 'Institutional trust' and 'Institutional quality'

Variable     N    Min.   Max.   Mean   Std. Dev.  Var.     Skewness (Std. Error)   Kurtosis (Std. Error)
INS_T_mean   288  1,00   6,38   3,69   1,07       1,15     -,32 (,144)             -,21 (,29)
INS_Q_mean   288  15,83  98,71  76,72  17,91      320,80   -1,55 (,144)            1,51 (,29)


4.2.

Measuring algorithm-based trust

Before it is possible to measure H2 and H3 it is necessary to further focus on the assessment of algorithm-based trust. According to the theory derived in chapter 2.4, two Likert-style items have been developed to measure the two dimensions of algorithm-based trust. As outlined earlier, single-item Likert-style scales are not ideal, but can be sufficient. Generally speaking, the fit of a measurement depends on its reliability and validity. Subsequently, the reliability of the measurement of algorithm-based trust can be increased by expanding each one-dimensional, one-item Likert-style scale of algorithm-based trust with other items of the proxy measurements. By doing so, the results of the Likert-style items can be merged into one Likert-style scale and treated as quasi-parametric data (c.f. chapter 3.3). To continue with this analysis, it is necessary to ensure that all Likert-scale items are coded in the same direction20. Whenever this is not the case, respective adaptations are made.
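The recoding of reversely keyed items into a common direction can be sketched as follows for a 7-point scale; the helper name and the item values are illustrative.

```python
# Sketch: mirror reverse-keyed 7-point Likert responses so that all items
# point in the same direction (a response x becomes 8 - x).
def reverse_code(item_scores, scale_max=7):
    """Mirror 1..scale_max responses (1 <-> 7, 2 <-> 6, ...)."""
    return [scale_max + 1 - x for x in item_scores]

print(reverse_code([1, 2, 4, 7]))  # [7, 6, 4, 1]
```

Applying the function twice returns the original scores, which makes the recoding easy to verify.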


Subsequently, a factor analysis (FA) is conducted. More specifically, an exploratory factor analysis (EFA) is used, which enables the researcher to arrive at a factor structure without requiring an a priori theory about the underlying items. Therefore, it is well suited to identify the best items to measure algorithm-based trust. The EFA is computed by running a maximum likelihood analysis. It would be more appropriate to conduct this analysis with structural equation modelling, which, however, exceeds the scope of this research. Nonetheless, the outlined procedure will shed light on how to further proceed with this analysis, may be a starting point for future research, and is expected to drastically increase the validity of the measurement and thus its explanatory power. This is the case since all items in the FA are supposed to measure the same construct. The FA creates a set of artificial variables or latent factors, which can be used to identify relationships. As ‘Algorithm-based trust’ is a two-dimensional construct and the FA identifies the underlying latent factors on which the various items load, one FA will be computed with each dimensional factor of ‘Algorithm-based trust’ and all factors of the proxy measurements. In this case, the ideal factor rotation is Direct Oblimin, which allows non-orthogonal factors (c.f. Costello and Osborne, 2005; ‘Exploratory Factor Analysis’, 2014).

The FA requires the underlying data set to meet four assumptions. After ensuring the applicability21 of the FA for this data set, the eigenvalue-one criterion is used to distinguish latent factors (Kaiser, 1974). The respective factor loading scores indicate the correlation between the respective item and the latent factor. The cut-off value for FA loadings has frequently been chosen at 0,4 (c.f. Walker and Maddan, 2006). As this ‘rule of thumb’ is only of limited conclusiveness, the correlation matrices (structure matrix) are furthermore inspected for homogeneous patterns. As the pattern matrix of the SPSS output relates to the regression coefficients, the structure matrix is, in this case, the better choice to interpret the factor loadings (c.f. Thompson, 2004). Consistent with the theoretical introduction of the proxy measurements, a total of six latent factors has been identified, which are depicted in the scree plots of Figure 17 and Figure 18. The six latent factors with eigenvalues above 1 account for 49,8% of the entire variance.
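The eigenvalue-one criterion and the variance-explained figure can be illustrated as below. The eigenvalues are invented for illustration (chosen so that six exceed 1, like in the analysis above) and do not reproduce the 49,8% reported from the survey data.

```python
# Sketch: the eigenvalue-one (Kaiser) criterion - retain only latent
# factors whose eigenvalue exceeds 1 (illustrative eigenvalues).
def kaiser_retained(eigenvalues):
    """Indices of factors retained under the eigenvalue-one criterion."""
    return [i for i, ev in enumerate(eigenvalues) if ev > 1.0]

def variance_explained(eigenvalues, retained):
    """Share of total variance accounted for by the retained factors."""
    return sum(eigenvalues[i] for i in retained) / sum(eigenvalues)

evs = [4.2, 2.1, 1.6, 1.3, 1.1, 1.05, 0.9, 0.7, 0.5, 0.4,
       0.3, 0.25, 0.2, 0.2, 0.2]
kept = kaiser_retained(evs)
print(len(kept), round(variance_explained(evs, kept), 3))
```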

Figure 17: Scree plot of the FA of the institutional dimension of algorithm-based trust
Figure 18: Scree plot of the FA of the technical dimension of algorithm-based trust


Figure 19: Factor loadings of the EFA with the technical dimension of Algorithm-based trust


the primary loading is at least 0,2 larger than the secondary loading. Both of the newly composed, merged Likert-style scales have sufficiently high internal reliability scores (α(tech) = 0,71 and α(inst) = 0,73) and can thus be used for the further analysis.
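The internal-reliability score used throughout (Cronbach's alpha) can be computed as sketched below; the response matrix is a made-up toy example, not the survey data, and the function name is illustrative.

```python
# Sketch: Cronbach's alpha for a multi-item Likert-style scale
# (rows = respondents, columns = items; toy data).
def cronbach_alpha(responses):
    """alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)."""
    k = len(responses[0])

    def variance(values):
        mean = sum(values) / len(values)
        return sum((v - mean) ** 2 for v in values) / (len(values) - 1)

    item_vars = [variance([row[j] for row in responses]) for j in range(k)]
    total_var = variance([sum(row) for row in responses])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

scores = [
    [4, 5, 4],
    [2, 3, 2],
    [6, 6, 5],
    [3, 3, 4],
]
print(round(cronbach_alpha(scores), 2))
```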


Before continuing with the testing of Hypothesis 2 and Hypothesis 3, the other latent factors of the two FAs will be further assessed. Factor 2 of the FA, which includes the technical dimension of algorithm-based trust, is composed of the items from questions 35 (loading of 0,82), 36 (loading of 0,97), 37 (loading of 0,56) and 26 (loading of 0,24).
