
Hands off the internet! How the securitization of an internet governance approach impacted the World Conference on International Telecommunications

Harm Hoksbergen

Thesis supervisor: Dr. Stephanie Simon
Second reader: Prof. Dr. Marieke de Goede

Thesis submitted in partial fulfilment of the requirements for the degree of Master of Science (MSc) Political Science (International Relations) at the Graduate School of Social Sciences, University of Amsterdam.


Or is that not the case?

It is simply a matter of perspective. Do you see dirt on the window, or is there dust on your eye? It just depends on what you believe.

To each his own world of experience, good or bad; it just depends on how much value you attach to it.

- Glenn de Randamie (2007)

Three thousand miles of wilderness, overcome by the flow A lonely restitution of pavement, pomp and show

I seek a thousand answers, I find but one or two I maintain no discomfiture, my path again renewed.


Abstract

The 2012 World Conference on International Telecommunications (WCIT-12), hosted by the International Telecommunication Union (ITU), a United Nations specialized agency, ended in a puzzling outcome. The aim of the conference was to revise an old International Telecommunication Regulations (ITRs) treaty; the outcome was accepted by many member states, while others rejected it. The ITU mostly deals with non-controversial technical issues, which made the conflicted outcome and failure to reach consensus remarkable. How did this happen? This study analyses a securitization process in the run-up to, during and after WCIT-12, which aimed to securitize a ‘hands-off’ internet governance approach. A coalescence of actors used successful securitization moves to secure the hands-off vision on internet regulation at WCIT-12, resulting in an environment in which reaching consensus on a new ITRs treaty was unworkable. The study provides a new perspective for understanding the outcome of WCIT-12, and gives new insight into the employment of securitization theory and internet governance conceptions.

Keywords: WCIT-12, ITU, ITRs, internet governance, securitization theory


Acronyms and Abbreviations

ccTLD Country Code Top-Level Domain

CS Copenhagen School

DNS Domain Name System

EC European Commission

EP European Parliament

EU European Union

gTLD Generic Top-Level Domain

IANA Internet Assigned Numbers Authority

ICANN Internet Corporation for Assigned Names and Numbers

ICT Information and Communication Technology

IETF Internet Engineering Task Force

IGF Internet Governance Forum

IR International Relations

ITR(s) International Telecommunication Regulation(s)

ITU International Telecommunication Union

ITUC International Trade Union Confederation

NGO Non-Governmental Organisation

SG Secretary-General

TCP/IP Transmission Control Protocol/Internet Protocol

TLD Top-Level Domain

UAE United Arab Emirates

UN United Nations

US United States (of America)

WCIT/WCIT-12 World Conference on International Telecommunications 2012

WGIG Working Group on Internet Governance

WSIS World Summit on the Information Society


Table of Contents

Abstract
Acronyms and Abbreviations
Preface
Introduction
Chapter 1: Conceptual framework
1.1 Conceptualizing internet governance
1.1.1 Theorising the internet as technical infrastructure
1.1.2 The emerging internet governance debate
1.1.3 Reformulating the IR perspective on internet governance
1.2 Conceptualizing securitization theory
1.2.1 Securitization in three steps
1.2.2 Facilitating factors in securitization theory
1.2.3 Refining speech act theory
1.3 Reflection on the usage of securitization theory
1.3.1 Securitization theory and the handling of identities
1.3.2 Discharging desecuritization as normative aim
1.3.3 Setting out the research objective
Chapter 2: Research design
2.1 Case study research design
2.1.1 Level of analysis
2.1.2 Unit of analysis
2.2 Outlining methods
2.2.1 Discourse analysis as analytical tool
2.2.2 Analysing extra-linguistic forms of a speech act
2.2.3 Data collection & presentation
Chapter 3: Sound the alarm, the ITU plans an internet takeover!
3.1 Commencing the securitization move
3.1.1 Leaking WCIT-12 documents
3.2 Campaigning against WCIT-12
3.2.1 Google’s ‘Take Action’ campaign
3.2.2 Continued securitization in the media
3.3 Parliamentary securitization contributions
3.3.1 The US Congress concurrent resolution
3.4 ITU’s attempt to desecuritize WCIT-12
Chapter 4: Stonewalling WCIT-12 and its aftermath
4.1 Don’t mention the I-word
4.1.1 Progress in the midst of controversies
4.1.2 Measuring the temperature in the room
4.1.3 Crumbling consensus
4.2 WCIT-12 a failure or a success?
4.2.1 Pointing the blame
4.2.2 Controlling the hands-off approach
Conclusion
6.1.1 Theoretical conclusions and future research
6.1.2 Implications of the findings for the internet governance debate
Appendix
Appendix 1: ITU member states and WCIT-12 attendance


List of Figures and Tables

Figure 1: Penrose Triangle (Wikimedia Commons, 2007)
Figure 2: ‘On the spot’ signatories and non-signatories of the WCIT-12 Final Acts (ITU, 2012b)
Figure 3: Schematic reflection of a successful securitization process in which an illocutionary and a perlocutionary speech act are used
Figure 4: Fragment from the Google ‘Take Action’ website (Google, 2012a)
Figure 5: Fragment from the Google ‘Take Action’ website (Google, 2012a)
Figure 6: Delegates of different ITU member states show boards in support of Resolution 3 as part of ‘measuring the temperature’ in the room (ITU, 2012f)

Table 1: Different elements in a speech act sequence (Vuori, 2008, pp. 77-89)
Table 2: Four strands of securitization (Vuori, 2008, p. 76)
Table 3: Top 10 biggest ITU member state delegations in number of participants at WCIT-12 (ITU, 2012c)


Preface

The past three and a half years have been an intriguing encounter with the academic field of political science and International Relations. These years of studying have given me a full array of interesting new insights, on both the academic and the empirical level. This journey now comes to an end with this master’s thesis. Before presenting my research in detail I would like to seize this opportunity to briefly reflect on my years as a student of political science/International Relations.

My first political science class in 2010 was quite a disorienting experience. As I learned in class when talking about contested concepts, there is a total lack of consensus in the academic field on fundamental concepts like power. I felt social scientific research was flawed from the start if we were not able to settle such important definitional matters beforehand. If we could not agree on what it is we want to study, how can we study it? I discussed my indignation with my lecturer after class. In the discussion which followed I was told the answer to this question is quite difficult, but ‘there will be enough opportunities in the years to come to discuss this matter’, my lecturer assured me. Of course, in retrospect my lecturer was right: looking back, studying political science (and the social sciences in general) is all about this question of how and in what way we can explain or understand the social world around us.

This lesson in the pluriformity of academic insights made clear to me that thinking in a black-and-white dichotomy is generally too simplistic. For every viewpoint there is also an oppositional interpretation. The ‘truth’, for that matter, is probably grey instead of black or white. Academic thinking also taught me the relativity of most social scientific research. As Jackson notes, referring to the so-called ‘demarcation problem’ in scientific research: “Unfortunately, philosophers have come to no global consensus about what defines a field of inquiry as a “science” or a practice of knowledge-production as “scientific”” (Jackson, 2011, p. 11). This might be interpreted as an admission of weakness, but instead it offers a good learning school in life itself. How often is truth in daily life also just a matter of interpretation? Although this might sound like Nietzschean thinking, I would not go so far as to proclaim an ‘anything goes’ mentality or be needlessly pessimistic about the social scientific enterprise in general. Rather I would like to think of ‘practicing’ social science as a game of Scrabble, in analogy with Friedrich Kratochwil’s metaphor:


“We begin with a concept that makes certain combinations possible. In crisscrossing, we can “go on”, and our additions are justified by the mutual support of the already existing words and concepts. Sometimes, we cannot proceed as our attempts to continue are stymied. Then we begin somewhere else and might, by circuitous route, reach again some known terrain. Potentially there are innumerable moves, and no two games are the same, since moves at different times will have different consequences. On the other hand, none of them is free in the sense that anything can happen. But none of them could have been predicted by the “view from nowhere” as everything depends on the words that are put in place, the site that is chosen for extending the game, and the time” (Kratochwil, 2007, pp. 49-50).

Every student of International Relations (IR) learns about the different ‘grand theories’ in the academic field. The different ontological and epistemological debates in IR were amply dealt with in my years of studying. As I learned, every approach has its own strengths and weaknesses. This is why I always favour academic thinkers who critically reflect on their own work and openly admit the limits of their thinking.1 In my opinion a good social scientist guards against dogmatic thinking and dares to cross the scientific borders so often created by certain ‘labels’. Classifying the field of IR using different schematic labels (e.g. neorealism/classical realism) is above all an artificial exercise, which should not obstruct fruitful scholarly exchange (Knudsen, 2001, p. 356).

Because of the full array of ontological/epistemological positions in the field of IR, we should be careful not to attack each other’s work purely on the grounds of these academic differences. Otherwise the academic debate easily falls flat into a game of ‘oh yes it is’ - ‘oh no it isn’t’. Even anti-foundational approaches should be wary that their deconstructive attempts and critical assessments of ‘value-free’ social scientific research do not result in endless academic debates on unbridgeable viewpoints, or ‘paradigmatic wars’ as they are sometimes characterised (Lake, 2013). Rather, these approaches could render interesting new insights by their own premises. I would not say conflicting views should not be expressed; rather I would like to refrain from academic strife which solely ends in contrasting viewpoints without constituting any new insights.

Finally, to give an answer to my own question: there is no need for an eclectic conception of ‘the truth’; instead every academic viewpoint renders its own valuable contribution by its own premises. I would say my present academic position is more agnostic and looks like a Penrose Triangle (Figure 1). Unifying academic research into one exhaustive truth is, like the Penrose Triangle itself, simply impossible. Still every approach contributes its own valuable insights to the academic discipline and should therefore be considered a welcome contribution to the whole.

1 See for example the work of John Gerard Ruggie (1998, pp. 37-39).

Figure 1: Penrose Triangle (Wikimedia Commons, 2007)

At this moment I would like to briefly acknowledge (though not exclusively) the help of some persons who made my ‘scientific endeavour’ possible. First of all I would like to thank both my parents, who always supported me throughout the years in every possible way and gave me the opportunity to break new paths. Secondly I owe much gratitude to Maaike, who helped me to regain sight in times I lost track, and encouraged me to push through by helping in any way she could. To conclude I would like to thank Stephanie, who gave me free rein to develop this thesis but also provided helpful direction when I started to wander off. And without her watchful eye the title of this thesis might have implied the internet has hands ;-)


Introduction

It was with a ‘heavy heart’ and a ‘sense of missed opportunities’ that the delegation of the United States walked away from the World Conference on International Telecommunications (WCIT).2 The 2012 conference, held in Dubai, was organised by the International Telecommunication Union (ITU), a specialized United Nations organization, and aimed to revise an International Telecommunication Regulations (ITRs) treaty dating from 1988.3 Back then the internet was still in its infancy, and the old ITRs treaty therefore mainly focussed on the interconnectivity and interoperability of voice telecommunications (ITU, 2012a). Since 1988 much has changed: the telecommunications sector became a liberalized market in many countries and a shift has taken place from fixed to mobile telephony. Another transformation occurred in the ICT infrastructure, which increasingly relies on IP-enabled (data) networks instead of traditional voice telecommunications networks (ITU, 2013). This new reality created a global impetus to update the ITRs treaty in order to face challenges related to global ICT use.4

The ITU is a typical intergovernmental organization, which entails that all member states have one equal vote.5 At an ITU conference it is tradition and practice that decisions are reached by consensus; since the ITU mostly deals with non-controversial technical issues, it is usually possible to reach consensus. But a decision can also be taken by a simple majority of votes. In the closing days of WCIT-12 (or ‘wicit-twelve’ as it is usually pronounced), it became apparent that consensus had given way to distrust and discord between the different ITU member states. The newly proposed ITRs treaty was adopted by 89 ITU member states in an unusual majority vote (ITU, 2012b). This outcome was not acceptable to the US and other ITU member states, who refused to sign the new treaty (see Figure 2). It marked a split between ITU member states, which seemingly is an unusual outcome of the negotiations, since most of the treaty consisted of a consensual package deal. Moreover, ITU member states can make reservations to the new treaty, whereby a member state can decide not to commit to unwelcome parts of the treaty. Furthermore the ITU has no regulatory enforcement capabilities to police compliance with a treaty; each member state is itself responsible for ratification and implementation of the ITRs into national legislation. In brief, one would think there is little to fear from the new ITRs treaty, but reality seems to portray a different picture.

2 This thesis is written in UK English based on the Oxford English Dictionary (2005), using the APA Fifth Edition citation style.

3 The tasks of the ITU include the allocation of the global radio spectrum and satellite orbits, developing technical standards that ensure network interconnectivity, and advancing connectivity globally, especially in deprived regions.

4 These topics include: human right of access to communications; security in the use of ICTs; protection of critical national resources; international frameworks; charging and accounting, including taxation; interconnection and interoperability; quality of service; and convergence (ITU, 2012a, p. 2).

5 The ITU is an intergovernmental body with 193 full member countries. In addition, over 700 private-sector entities, non-governmental organizations and academic institutions are members of the ITU, but these members have no voting right.

Figure 2: ‘On the spot’ signatories and non-signatories of the WCIT-12 Final Acts (ITU, 2012b).6

After the conference different actors stated the new treaty had put the ‘free internet in jeopardy’ (US Department of State, 2013). Already in the run-up to the conference a heated debate had ignited in which different actors declared the ITU tried to ‘take over the internet’ (Albon, 2012). It was feared that ‘authoritarian’ ITU member states would gain control over internet regulation issues through the ITU. Adherents of a ‘hands-off’ internet governance approach argued the internet was built on a ‘multistakeholder’ model which had to be shielded from top-down government regulation. The internet had its own unique governance architecture, which had to be shielded off (hands-off) from attempts by ‘old-fashioned’ international regulation bodies to control the internet.

6 No political statements (e.g. on the political status of territories) are intended with this particular world map, which distinguishes signatories, non-signatories and non-participants. The map is made using Google Sheets.

Whether these concerns are true or false is probably a question of perspective and therefore difficult to answer. What remains puzzling, though, is how these security concerns over the hands-off approach to internet governance dominated WCIT-12, as the ITU kept repeating it did not seek to ‘control the internet’ nor wanted to interfere with the existing governance architecture of the internet (Touré, 2012c). Given the limited scope of the new ITRs treaty and the impossibility of imposing any regulations on a member state that does not want them, why was there so much heated debate over WCIT-12?

To understand this process this study builds on securitization theory (Buzan, Wæver, & De Wilde, 1998).7 This theory presumes that speech acts or specific rhetorical structures present an issue as an existential threat. When an actor initiates a securitization move, an issue is lifted out of normal politics by claiming the issue needs to be treated by extraordinary means. In the case of WCIT-12 this would signify that the conference poses a ‘threat’ to the ‘hands-off’ internet governance approach; therefore ‘normal’ international negotiations on new ITRs have to be set aside to protect the hands-off approach towards internet governance. The following research question thereby leads this study:

How did the securitization of the hands-off internet governance approach interfere with the World Conference on International Telecommunications in 2012?

This study thus seeks to understand the securitization process of the hands-off internet governance approach in the run-up to, during and in the aftermath of WCIT-12. Why is an analysis of this empirical case relevant from an academic perspective? First of all, this study uses an adapted form of the much debated securitization theory. By delineating extensively how securitization is applied, this study aims to give input into the academic debate on the status of securitization theory and shows that using the theory can generate interesting insights. Secondly, WCIT-12 provides a unique case to study the clash between different internet governance approaches; analysing this event tells something about which approach dominates the current global internet governance architecture. Third, studying the case provides interesting empirical results which improve our understanding of what happened at WCIT-12. Most published work on WCIT-12 focusses on analysing the ITRs treaty texts and what they signify for current internet regulations (Bennett, 2012) (Fidler, 2013) (Hill, 2013) (2014). This study does not seek to uncover whether the new ITRs treaty actually imposes stricter internet regulation. This study merely aims to understand how actors in the field think about internet governance issues and what this conveys in terms of lifting issues off the ‘normal’ decision-making agenda.

7 The use of securitization in this study is not to be confused with the use of the term in financial practice.

The adjoining broader aim of this study is to provide an understanding of what happened at WCIT-12. The results of this study could be used as input for a debate on the expedience of the hands-off internet governance approach used by certain actors. The findings of this study can thus provide a ‘thinking space’ for the reader in order to reflect on the outcome of WCIT-12 and how this fits within the broader current internet governance architecture.

The study builds on a case study research design focussing on the securitization of the hands-off internet governance approach before, during and after WCIT-12. Therefore the timeframe of May 2012 – May 2013 is chosen, within which written documents and audio-visual material are gathered and analysed. This material is analysed using an adapted framework by Vuori (2008), building on discourse analysis. The reader going through this thesis might occasionally wonder why some parts of this study are so extensively delineated. First of all, this is not done to ‘bully’ the reader; it is meant to be thorough in explaining why certain choices are made in this study, which helps to increase the reliability and replicability of its findings. The reader is thus invited to skip certain parts of this study, for example when one is not interested in reading about research choices but only in the findings.

In the following chapters, first the term internet governance will be defined by theorizing the internet as a technical infrastructure, after which the programmatic debate on internet governance is stipulated. These sections are meant to clarify what internet governance is about. The IR perspective on internet governance is then reviewed and the definition of the internet as a global governance architecture is introduced. The next section defines securitization theory and refines the use of the theory in this study; a reflection on the operation of the theory will clarify why and how it is used. The research design chapter will outline the methods employed in this study, building on a form of discourse analysis. The findings are presented in two chapters, the first covering the run-up to WCIT-12, and the subsequent chapter dealing with the conference itself and its aftermath. The conclusion will draw on the findings and present a conclusive answer to the research question.


Chapter 1: Conceptual framework

This chapter will conceptualize the two most important concepts in this study. The first concept to be outlined is ‘internet governance’. The term internet governance is mostly ill-defined and often used as a fashionable catch-all phrase; the concept will therefore be put under close scrutiny. The second conceptualisation this chapter addresses is the operationalization of securitization theory. Foremost, securitization is a conceptual move on how to understand ‘security’. Both internet governance and securitization are the basic building blocks of this study; because of the complexity of both terms they are conceptualized in detail.

1.1 Conceptualizing internet governance

The term internet governance is inherently connected with programmatic aims or normative ideas on the ‘governance of the internet’. Internet governance namely implies there is ‘governance’ of the internet. But, as some will argue, the internet has to be conceived as a ‘self-regulating network’ which has to be shielded off from any ‘governance’ attempts. This brings a duality in meaning between those conceptualising internet governance as an analytical lens concerning the empirical ‘governance’ of the internet, and others using the term as a normative device to prescribe or prevent certain programmatic ‘governance’ attempts.

This study uses internet governance as an analytical device by conceptualising the term as a ‘global governance architecture’. But references to certain opposing visions in the practical use of the term are important to address, since these visions play an important role in contested ideas about the future working of the internet. To give a clear understanding of the multi-faceted meaning of internet governance, the term is peeled apart by defining its three basic elements. First, ‘the internet’ as technical infrastructure is conceptualised. Secondly, the controversy surrounding the programmatic ideas intertwined with internet governance as a term is outlined; because the term internet governance is closely related to empirical governance issues, this section will be somewhat descriptive and empirical. The last section will sketch the use of ‘internet governance’ in the IR literature, and will conceptualise internet governance as a global governance architecture.


1.1.1 Theorising the internet as technical infrastructure

The internet in this study is defined by outlining its three main empirical characteristics: first, its decentralized and non-hierarchical architecture; second, the global scope of the internet; and third, the use of ‘critical internet resources’. Although these characteristics are sometimes elusive due to their multi-faceted nature, they capture the broad technical characteristics of the current internet architecture.8

To understand the decentralised and non-hierarchical character of the internet, the technical functioning of the internet first needs further conceptualisation.9 The internet is a globally distributed network, which consists of different autonomous networks that are interconnected through a digital networked communication method called packet switching. Before a message is sent it is divided into different packets. Each packet is transmitted individually and can use different routes over the network towards its destination. To accomplish this task a set of agreed rules is formulated in different technical protocols, of which the Transmission Control Protocol/Internet Protocol (TCP/IP) is the most widespread protocol package.
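To make this mechanism concrete, the following minimal sketch splits a message into numbered packets and reassembles it at the receiving end regardless of arrival order, mirroring how packet switching tolerates packets travelling different routes. The packet size and header fields are illustrative assumptions, not values taken from any real protocol.

```python
import random

PACKET_SIZE = 8  # illustrative payload size, not a real protocol value


def to_packets(message: bytes) -> list[dict]:
    """Split a message into individually routable, numbered packets."""
    return [
        {"seq": i // PACKET_SIZE, "payload": message[i:i + PACKET_SIZE]}
        for i in range(0, len(message), PACKET_SIZE)
    ]


def reassemble(packets: list[dict]) -> bytes:
    """Reorder packets by sequence number and rebuild the original message."""
    return b"".join(p["payload"] for p in sorted(packets, key=lambda p: p["seq"]))


packets = to_packets(b"Hands off the internet!")
random.shuffle(packets)  # packets may arrive out of order via different routes
assert reassemble(packets) == b"Hands off the internet!"
```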

The TCP/IP protocol is based on the ‘end-to-end’ principle. This entails that the technical design of the internet is based on a ‘dumb’ network. First, the Internet Protocol addresses the different nodes in a network; it ensures unique identifiers (IP addresses) for all the different nodes in the network (DeNardis, 2010, p. 6).10 Having a unique identifier, the nodes in the network are able to send packets to all other nodes in the network, without requiring the network to maintain the status of a transmission. The network thus functions as a ‘neutral’ intermediary, pushing through traffic without understanding its nature (Mueller, Mathiason, & Klein, 2007, p. 247). The communication about a transmission takes place between the different end points in the network. For example, based on information in a received packet from another node, a client can know whether there is more to come or the transmission is complete; the network itself does not know or give such information. This reflects that there is no hierarchy between the different end-points in the network. The network acts as a simple intermediary for all the nodes on an equal basis. Moreover, the network is decentralised: all end-points communicate directly with each other without central steering.

8 Formulating what the internet is brings in the danger of conceptualising the internet in relation to a certain programmatic idea about what the internet should be. For example, the internet could be conceptualised in analogy with the telephone system, which would suggest regulation of internet content is not desirable, as it is not commonly accepted to interfere with telephone conversations either. Conceptualising the internet as an ‘information superhighway’ would in contrast legitimize stricter regulation, since a safe and good highway needs strict traffic controls (Kurbalija, 2010, pp. 22-27). Therefore this study conceptualizes the internet solely by its main empirical characteristics.

9 Since some of these technical aspects are strongly connected with the governance issues of the internet, they need to be outlined in more detail. For reasons of time and space, only the most relevant technical aspects are briefly outlined here. More exhaustive resources on the technical working of the internet are given by (Mueller, 2002) and (DeNardis, 2009).

10 Although a node is a complex technical term, for the sake of convenience it is used in this study as the end-point or redistribution point in a network. A node can be for example a computer, router or even a ‘smartphone’.

Related to the end-to-end architecture of the internet is its global and non-territorial character. Nodes communicate based on an IP address, which has no geographical reference. This makes the internet in essence geographically unbounded and global in character. Naturally a subtle distinction on the boundless character of the internet needs to be made to account for the physical infrastructure of the internet. This physical infrastructure, consisting of power grids, routers, fibre-optic cables and so on, obviously is geographically bounded. This highlights the multi-faceted character of the internet. Sometimes the internet is territorially fixed, for example when the internet is ‘blacked out’ in a certain region due to a power breakdown. In other instances the internet is truly global in character, for example when a computer virus spreads throughout cyberspace without paying attention to geographic boundaries.

The last important characteristic of the internet is the existence of so-called ‘critical internet resources’. An intrinsic feature of the technical infrastructure of the internet is the need for some unique resources without which the internet cannot function. The first critical resource is the management of the different IP addresses. These addresses need to be allocated by some mechanism to the different nodes in the network. Without centralising the issuing of IP addresses it is impossible to preserve the uniqueness of every IP address, since nobody knows which address is used and which is free. A special institution is set up for this task: the Internet Assigned Numbers Authority (IANA), a department of the Internet Corporation for Assigned Names and Numbers (ICANN). The ICANN is responsible for allocating clusters of IP addresses to Regional Internet Registries for regional assignment (DeNardis, 2010, p. 5).
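To make the uniqueness argument concrete, the toy sketch below mimics this hierarchical hand-out of address blocks: because a single authority advances the allocation counter, delegated blocks can never overlap and no address can be issued twice. The block size and registry names are illustrative assumptions, not real allocation policy.

```python
class AddressRegistry:
    """A toy central registry handing out non-overlapping address blocks."""

    def __init__(self, block_size: int = 256):
        self.block_size = block_size
        self.next_free = 0  # first address not yet delegated

    def allocate_block(self, registry_name: str) -> range:
        """Delegate the next free block; uniqueness holds because only this
        single authority ever advances the next_free counter."""
        block = range(self.next_free, self.next_free + self.block_size)
        self.next_free += self.block_size
        print(f"{registry_name} received addresses {block.start}-{block.stop - 1}")
        return block


iana = AddressRegistry()
ripe_block = iana.allocate_block("RIPE NCC")
arin_block = iana.allocate_block("ARIN")
assert set(ripe_block).isdisjoint(arin_block)  # no address issued twice
```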

Another critical internet resource which needs central coordination is the Domain Name System (DNS). The DNS is built merely to make it easier for humans to use the internet: it links a numerical IP address to an alphanumeric domain name.11 Just as IP addresses, these domain names need to be unique in order to function properly. If there is no univocal addressing system, the technical infrastructure will be plagued by unresolvable conflicting requests. The DNS performs this critical function by keeping track of all the globally distributed domain names. In simple terms the DNS is a huge database or telephone book in which information on the different IP addresses and domain names is stored (Klein, 2002, p. 195). Like the allocation of IP addresses, the DNS is coordinated in a hierarchical way. On top of the hierarchical tree is the ICANN, managing the root of the DNS (Mueller, 2002, p. 42). The DNS root dictates which generic top-level domains (gTLD) and country code top-level domains (ccTLD) can be assigned.12

11 For example the domain name uva.nl is linked with the IP address 145.18.12.42. The website of the UvA can be accessed by using only the IP address, but since this is difficult to remember, the domain name serves as a useful mediator between man and machine.
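The name-to-address mapping that the DNS provides can be observed from any networked machine; the short sketch below resolves a hostname to its registered IPv4 addresses using Python’s standard library. The hostname follows the footnote’s example; the addresses actually returned depend on the DNS records at the time of the query.

```python
import socket


def resolve(hostname: str) -> list[str]:
    """Ask the DNS for the IPv4 addresses registered for a hostname."""
    infos = socket.getaddrinfo(hostname, None, family=socket.AF_INET)
    # getaddrinfo returns (family, type, proto, canonname, sockaddr) tuples;
    # for IPv4 the sockaddr is an (address, port) pair.
    return sorted({info[4][0] for info in infos})


print(resolve("uva.nl"))  # e.g. ['145.18.12.42'] in the footnote's example
```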

The management of the DNS and IP addressing informs us that the internet is organised in a decentralized and non-hierarchical way only to some extent. For the network to function, some sort of hierarchically structured architecture is needed in order to maintain a unique addressing system and to coordinate the scarcity of available IP addresses and domain names. This duality of (contrasting) governance structures rooted in the technical infrastructure of the internet highlights once more the multi-faceted character of the medium.

1.1.2 The emerging internet governance debate

In the present architecture the IANA/ICANN stands on top of the tree in managing scarce critical internet resources. The ICANN is a private, non-profit and US-based organisation. This signifies that the most critical internet resources fall under US jurisdiction. Even more than that, the US government retains ‘residual authority’ over the DNS root zone (Mueller, 2002, p. 186). When the ICANN was set up in 1998 it was delegated policy authority over the DNS and the IP addressing system, but not without the US Department of Commerce having final unilateral oversight over the core critical internet resources the ICANN coordinates.13 The ICANN is a private self-governing entity based on a globally operating ‘multistakeholder’ model, but the ultimate authority over the core internet resources remains in the hands of the US government (idem, pp. 197, 211).

In a desire to avoid existing international institutions, the US government blocked earlier attempts to incorporate the coordination of critical internet resources into a more traditional intergovernmental model. The International Telecommunication Union (ITU) resolves technical and financial questions related to Information and Communication Technology (ICT) by arranging agreements between sovereign states. Since the ITU plays a key role in the coordination of telephone networks, the ITU anticipated that the internet infrastructure would be a new domain in need of its coordination. The organisation tried at an early stage to establish a UN-based DNS governance structure.14 But the US government effectively side-lined this initiative by creating the ICANN. The US has a long-standing apathy towards the ITU, despising its ‘one-country-one-vote structure’ and centralised intergovernmental character, which clashes with the ‘decentralised and privatized domain of the internet’ (Mueller, Mathiason, & Klein, 2007, p. 239). By creating the ICANN the US downplayed the very need for any governance of the internet.

12 Examples of generic TLDs are .com, .net or .org. Examples of ccTLDs are .us for the United States and .nl for the top-level domain of the Netherlands.

13 In 2009 the US Department of Commerce relaxed its supervision over the ICANN. In the so-called ‘Affirmation of Commitments’ the US government surrendered most formal and visible legal control over ICANN, while less visible power structures remained in place. The ICANN promised to remain located in the US, which constitutes a commitment to remain under US jurisdiction, and thus the US shadow of hierarchy remains in place (Froomkin, 2011).

The unilateral creation of the ICANN, and the indirect US authority over the critical internet resources it coordinates, created fierce debates over the legitimacy and expedience of this arrangement. Some argue the US acts as a ‘unilateral hegemon over cyberspace’, exerting its political leverage to impose a market-based internet governance approach based on self-regulation (Drissel, 2006, pp. 116, 118). Others are concerned with the stability of the internet and portray the situation as a ‘ticking time bomb’, due to the danger of political interference with the coordination of critical internet resources (Mueller, 2002, p. 223).15 Altogether these critiques of the present ICANN regime range from questions over accountability, legitimacy and democratic control to internet stability in general. Out of geopolitical tensions and rising criticism of the mandate of the ICANN, the ‘internet governance’ debate emerged.

To give a better picture of how the internet governance debate was ‘institutionalised’, this section gives a brief overview of its historical development. During the World Summit on the Information Society (WSIS), an unusual UN-hosted two-phased conference held in Geneva (2003) and Tunis (2005) which aimed to bridge the ‘global digital divide’, critique on ICANN’s mandate was expressed by several governments (Mueller, Mathiason, & Klein, 2007, p. 240). Developing countries like China, Brazil and South Africa expressed their preference to recast the internet regulation architecture into existing international frameworks like the ITU. Several European countries also denounced the current ICANN mandate and endorsed the view that the regulation of critical internet resources needed a new international treaty or charter (idem, pp. 239-240). Although the restructuring of the DNS in the 1990s had already led to the recognition that internet governance is not primarily about technical issues but includes legal and political controversies (Paré, 2003, p. 1), it was during the WSIS that the critique on the role of the ICANN in internet governance arrangements gained formal international recognition. The disaffection with the current regulation regime under the ICANN created the impetus to set up a Working Group on Internet Governance (WGIG), which marked the kick-start of a formal, institutionalised global internet governance debate.

14 This came to be known as the International Ad Hoc Committee (IAHC), which came up with a Generic Top-Level Domain Memorandum of Understanding (gTLD-MoU), in which a procedure for allocating and managing domain names was constituted. The initiative became irrelevant and soon dissolved with the creation of the ICANN.

15 The adoption of the .xxx TLD marked the latest tense episode in the relationship between the ICANN as ‘independent’ organisation and the US government. The new TLD was approved by the ICANN but its introduction was delayed due to protests by the US government. See (Richards & Calvert, 2011).

The UN-based WGIG developed a broad definition of what internet governance should entail; its recommendation was to create a new ‘multistakeholder forum’ to deal with internet issues.16 This multistakeholder forum needed an open character in which all relevant stakeholders, from the private sector, civil society and governments of both ‘developing’ and ‘developed’ countries, could participate. The WSIS mandated the creation of a so-called Internet Governance Forum (IGF), which institutionalised the internet governance debate. The IGF, first held in 2006 and mandated until 2015, is held yearly and functions as a policy dialogue on internet regulation issues based on an open multistakeholder model. The results of the IGF remain inconclusive; some argue there is ‘continued deliberation’ without ‘contributing concrete change to ICANN’s operationalization’ (Mueller, Mathiason, & Klein, 2007, p. 242).17 But the IGF being mainly a ‘talking shop’ is partly an evident outcome given the forum’s lack of decision-making authority; on the other hand, the ‘lack of progress’ marks the contesting views on internet regulation.18

The first view on internet regulation is strongly related to the genesis of the internet. The internet originated mainly in the realm of the academic world and the private sector, without much government interference. In this early stage the internet was solely governed through ‘running code’ based on ‘rough consensus’.19 This means that the intrinsic technical usefulness of an adaptation decided, on an ad-hoc basis, how the internet was designed. The adage ‘if it isn’t broke, don’t fix it’ marks the preference of this vision, which still dominates in standard-setting organisations like the Internet Engineering Task Force (IETF). In this vision technical operability is more important than anything else. The US government initially remained passive, letting the internet engineers ‘do their thing’ (Goldsmith & Wu, 2006, p. 32). This decentralised and anarchic character of internet operationalization led many to believe the internet was a new ‘uncontrollable’ medium. This so-called ‘cyberlibertarianism’ was most famously articulated by John Perry Barlow in his ‘declaration of independence of cyberspace’.20

16 The WGIG created the following definition: “Internet governance is the development and application by Governments, the private sector and civil society, in their respective roles, of shared principles, norms, rules, decision-making procedures, and programmes that shape the evolution and use of the Internet” (WGIG, 2005, p. 4).

17 Some state that the UN-led internet governance debate was flawed from the start, due to the ‘harnessing power’ of some powerful actors which pushed for a ‘neo-liberal entrepreneurial spirit’ (McLaughin & Packard, 2005, p. 358).

18 Some authors argue the IGF is somewhat of a ‘red herring’, for it is only a dialogue on diverse topics without any policy-making authority (DeNardis, 2010, p. 3).

19 As Dave Clark claimed in the early 1990s, the motto of the IETF is: “We reject kings, presidents and voting. We believe in rough consensus and running code” (Hofmann, 2007, p. 4).

With the growth of the internet, the number of different actors stepping into the internet governance debate has grown correspondingly. Particularly the eagerness of governments to interfere with internet regulation issues created tension with the visions of the cyberlibertarians. However, the US government holds an exclusive position.21 By prioritising self-regulation, the US government in general favours privatised and market-based internet regulation over strong centralised public steering. This view is upheld by arguing that internet governance is about technical issues, which are best coordinated by technicians. Internet governance in this sense is thus a bottom-up and pragmatic matter, which has to be free from ‘suffocating’ government restrictions. Nevertheless, self-regulation has to ‘leave room for political oversight and dispute resolution’, in order to facilitate the self-governing framework (Holitscher, 2004, p. 7). It marks a complex public-private partnership which favours the present status quo in internet governance, and can best be summarised as the ‘hands-off’ approach.

The antithetical view to the hands-off approach is a stew of different views on internet governance issues. The only baseline all these visions share is a general dissatisfaction with the current internet regulation framework. The reason there is not one antithetical vision contrasting the hands-off approach is the coalescence of actors involved in the internet governance debate. For example, civil society organisations argue the current ICANN regime is undemocratic and illegitimate; they push for a ‘true’ multistakeholder model. Another position in this spectrum is the stance of some nation-states arguing that internet regulation issues should have a stronger intergovernmental mandate.

A second cause of the cacophony of contesting visions in the internet governance debate is a difference in perception of which issues have to be addressed. So far the focus has been on the ‘narrow’ technical issues of internet regulation, but there are many other issues which can be incorporated into the governance debate. This broader ‘governance mosaic’ is ‘multi-layered, fragmented, complex and generally highly distributed’ (Dutton & Peltu, 2005, p. 5). Broadly, three types of internet governance issues can be distinguished.22 The first type of issue is internet-centric, and focuses for example on the management of core internet resources, web standards, and the functionality of the internet in general. The second category is internet-user-centric. These issues deal with what is legal and illegal behaviour on the internet and whether or how this behaviour should be regulated, for example cybercrime, fraud or malicious internet attacks. The third category of governance issues is non-internet-centric. These issues originate at the intersection between local and international policy contexts, for example issues of censorship, political expression, intellectual property rights, trademarks or hate speech (Dutton & Peltu, 2005, p. 7).

20 In this pamphlet cyberspace is declared a ‘sovereign’ space, which should be shielded off from government interference. The following much-quoted part illustrates this vision: “Governments of the Industrial World, you weary giants of flesh and steel, I come from Cyberspace, the new home of Mind. On behalf of the future, I ask you of the past to leave us alone. You are not welcome among us. You have no sovereignty where we gather.” […] “Governments derive their just powers from the consent of the governed. You have neither solicited nor received ours. We did not invite you. You do not know us, nor do you know our world. Cyberspace does not lie within your borders. Do not think that you can build it, as though it were a public construction project. You cannot. It is an act of nature and it grows itself through our collective actions” (Barlow, 1996).

21 This ‘exclusive position’ derives from early critical internet innovations, and early adoptions of these innovations, which took place in the US. For an analysis of why the US provided the perfect breeding ground for the development of the internet see (Mowery & Simcoe, 2002).

This ‘broader’ vision of what internet governance should encompass has only one thing in common: a willingness to change the current governance architecture. This study therefore summarises this vision as the ‘open-up’ approach, being the opposite of the hands-off approach. This vision tries to open up the internet governance debate and proposes a reconfiguration of the current status quo in internet architecture. It can be understood as ‘an open-ended, collective process of searching which aims to fill a global regulatory ‘void’ both conceptually and institutionally in a legitimate way’ (Hofmann, 2007, p. 2). Although the broad goal of this quest overlaps (namely changing the current status quo), this vision harbours a range of diverse stances towards internet governance issues (narrow versus broad) and governance arrangements (centralized versus decentralized).

1.1.3 Reformulating the IR perspective on internet governance

The concept of internet governance gained currency in the academic literature mainly as a result of new academic collaborations and funding initiatives which were a spin-off of the WSIS and the IGF.23 The programmatic debate on internet governance is thus closely intertwined with the academic literature, which sometimes makes it difficult to distinguish theoretical discussions from programmatic aims.24 Different authors ‘quarrel’ over how to interpret the term internet governance (see for example (Drake, 2004) (MacLean, 2004) (Van Eeten & Mueller, 2012)). An important puzzling factor in how to use the term is the interdisciplinary nature of the concept of internet governance: computer scientists who focus on technical operability, for example, take a different approach to the term than IR scholars focussing on international power structures. This study operationalizes internet governance as an ‘analytical device’ by building on the definition of a ‘global governance architecture’ (Biermann, Pattberg, Van Asselt, & Zelli, 2009, p. 16). To comprehend this operationalization, its place in the existing literature will be shortly outlined.

22 Some categorize the issues by empirical themes into different ‘baskets’; see (Kurbalija, 2010). Others map the different empirical issues on two axes: the Y-axis differentiates types of international governance tools, ranging from ‘soft’ to ‘hard’, and the X-axis differentiates the scope of international governance arrangements, ranging from narrow to broad (MacLean, 2004, p. 88).

23 In 2006, in conjunction with the first IGF, the Global Internet Governance Academic Network (GigaNet) was set up. This scholarly community tries to promote the development of internet governance as an interdisciplinary field of study, stimulate dialogue and support different theoretical and applied research on internet governance issues (GigaNet, 2014).

The classical question often asked in the IR literature when it comes to ‘internet governance’ issues is ‘who controls the internet?’. Some authors depart from a classic realist stance, examining the role of the state in internet governance issues. This perspective claims state authority has not declined with the growth of new global networks like the internet. Instead these authors claim nation-states still have, or have regained, autonomy over internet governance issues (Drezner, 2004) (Goldsmith & Wu, 2006) (Rundle, 2005). Other studies take a more diffuse point of departure. Rooted in the governance literature, these studies focus on the changing role of the state in the governance order; a role that changes from top-down steering towards post-regulatory ‘gardening’ (Christou & Simpson, 2009, pp. 601-602).

The opposite vision to the realist stance can be found in what is usually labelled the ‘governance’ literature. This branch of literature expands the focus to new centralised and formalised institutions which are based not on government but on governance. Departing from concepts like networked governance, these studies try to grasp the workings of new modes of steering in which institutions like the ICANN, the WSIS and the IGF are involved; see for example (Klein, 2002) (Kleinwächter, 2004) (Malcolm, 2008) (Mueller & Schmidt, 2013) and (Wilson, 2005). Others base their studies on regime theory, analysing the process of formation of these new institutions by analysing certain governance regimes (Franda, 2001) (Mathiason, 2009) (Mueller, 2002) (Mueller, Mathiason, & Klein, 2007).25

A third group of authors takes up this lacuna in the literature by departing from a ‘techno-governmentality’ perspective. This strand of literature is mainly concerned with the analysis of implicit power structures which are rooted in the internet governance architecture (Brown & Marsden, 2013) (DeNardis, 2009) (2012) (Elmer, 2010) (Galloway, 2004) (Shah, 2013). Technological structures are not perceived as neutral artefacts, but inextricably harbour all kinds of implicit political choices and preferences. These authors analyse the technical infrastructure underlying the working of the internet. By analysing, for example, how the technical protocols of the internet are designed, these authors expose the political preferences which are ingrained in the technical infrastructure of the internet. Internet governance in these studies is thus understood as a technological form of ‘governmentality through code’.

24 A good example of more ‘applied’ studies on internet governance, which try to ‘develop a coherent strategic vision for internet governance’, is the internet governance paper series published by the think tank Centre for International Governance Innovation (CIGI, 2014).

25 These studies build on the work of Krasner, who defines regimes as: “[…] sets of explicit or implicit principles, norms, rules, and decision-making procedures around which actor expectations converge in a given area of international relations and which may help to coordinate their behavior” (Krasner, 1983, p. 275).

The different approaches towards internet governance can be grasped in an eclectic approach which captures all different explicit and implicit notions of governance. First it has to be noted that this study perceives the binary between non-regulation and regulation as artificial and unhelpful, because for the internet to function some sort of steering or ordering inherently has to be present.26 Whether these governance arrangements are loosely and decentrally organised, steered from the top down, or implicitly organised by a coalescence of actors cannot be determined a priori. Internet governance arrangements are complex and often a mix of different arrangements, and are therefore better captured by a broader definition. To capture all different forms of steering in one approach, this study uses the term ‘global governance architecture’, which can be defined as an:

“[…] overarching system of public and private institutions that are valid or active in a given issue area of world politics. This system comprises organizations, regimes, and other forms of principles, norms, regulations, and decision-making procedures” (Biermann, Pattberg, Van Asselt, & Zelli, 2009, p. 15).

A global governance architecture is located between two concepts of governance often used in the IR literature, namely regimes and orders. First of all, the concept is broader than regimes, capturing more than distinct institutional elements by including the larger governance architecture. On the other hand, the concept is narrower than the notion of order, which focuses on the organisation of the entire system of international relations (Biermann, Pattberg, Van Asselt, & Zelli, 2009, p. 16). A global governance architecture is more than single regimes but does not reach beyond the ‘global order’.

A second advantage of using global governance architecture is the possibility to analyse conflicts over different norms, principles, responsibilities and capabilities inside a distinct governance architecture. This is exactly what this study aims to do, namely analysing a securitization process in the global internet governance architecture. A third advantage of the concept is its more nuanced character.27 The concept of ‘international order’ often ‘implies an optimistic bias regarding the coherence and international coordination of the international system’ (Biermann, Pattberg, Van Asselt, & Zelli, 2009, p. 16), while architecture accounts for the whole ‘order’, including non-intended or adverse effects as well.

26 Evidently this does not entail a normative statement on the preferability of a certain governance arrangement.

From now on this study will use the term internet governance as global governance architecture. It signifies that this study has reformulated internet governance, using it as an analytical device in order to escape from the strict programmatic visions connected with the term.28 Conceptualising internet governance as a broad architecture, by contrast, helps to understand the clash of different governance approaches within this governance architecture, namely the hands-off versus the open-up approach. This study tries to illuminate how these contradicting governance approaches conflict with each other. This is done specifically by analysing the WCIT-12 conference, in which a hands-off internet governance approach dominated the negotiations. To understand how this approach ‘stonewalled’ the conference, securitization theory is used in this study.

1.2 Conceptualizing securitization theory

At the end of the Cold War, new conceptions of security beyond the traditional political-military outlook mushroomed in the security literature.29 The securitization theory used in this study is an offshoot of this ‘widening-deepening’ debate in security studies.30 A circle of European scholars widely labelled the ‘Copenhagen School’ (CS) coined the idea of securitization in the 1990s.31 The core theorists of the CS are Barry Buzan and Ole Wæver, connected to the Copenhagen Peace Research Institute (COPRI) (Wæver, 1995) (Buzan, Wæver, & De Wilde, 1998) (Buzan & Wæver, 2003) (Buzan & Wæver, 2009).

27 A more nuanced conceptualisation of internet governance involves a trade-off between adding more complexity (running the risk of disorder) and being more inclusive (which helps to be nuanced). Because this study deals with internet governance as a controversial phenomenon, nuance is considered more important than conceptual rigidity. Therefore internet governance is conceptualised as a governance architecture.

28 E.g. internet governance is more than what happens at the IGF; it is about the complete global governance architecture.

29 The field of security studies is notably influenced by the end of the Cold War. During the Cold War, security studies were set in a nation-state framework oriented towards problem-solving theories and strategic thinking. The concept of security was woolly, commonly underdeveloped and unproblematized (Buzan B., 1983, pp. 1-9). The end of the Cold War saw an influx of new approaches to security in which the conception of security was broadened and widened (Buzan & Hansen, 2009, p. 13).

30 For a comprehensive overview of the widening and deepening debate in security studies see the work of Buzan & Hansen (2009).

31 Two works by CS theorists Buzan and Wæver are acclaimed to be the ‘founding’ texts of securitization theory: first the article by Wæver (1995), ‘Securitization and desecuritization’, and second the book by Buzan, Wæver & De Wilde (1998), Security: A New Framework for Analysis.

1.2.1 Securitization in three steps

A securitization process consists of three main components or steps. First, there is a speech act or specific rhetorical structure in which an actor presents an issue as an existential threat. The focus is on the explicit expression in speech which presents an issue as a matter of survival: ‘we must act or otherwise it will be too late’. The second step is lifting the issue above politics by claiming that the issue needs treatment by extraordinary means. The securitizing actor claims the right to secure and defend the issue at stake, and it is therefore legitimate for the securitizing actor to break the normal rules of the political game. But a securitizing move is only successful when the ‘audience’ accepts it as such. This third step focuses on the intersubjective social constitution of a security issue. The CS emphasizes that acceptance by an audience does not mean forms of coercion or dominance are ruled out; rather, securitization is only successful when the emergency measures taken would not have been legitimized or accepted had the issue not been securitized. So if there are signs of a securitization process but a securitizing actor fails to convince an audience that an issue is an existential threat, it becomes an unsuccessful securitization move (Buzan, Wæver, & De Wilde, 1998, pp. 23-26). Out of this conceptualization of securitization theory the CS defines the following research agenda:

“Based on a clear idea of the nature of security, securitization studies aims to gain an increasingly precise understanding of who securitizes, on what issues (threats), for whom (referent objects), why, with what results, and, not least, under what conditions (i.e., what explains when securitization is successful)” (Buzan, Wæver, & De Wilde, 1998, p. 32).

Securitization theory therefore fits in the discursive, socially constituted and intersubjective realm. Security rests neither on the object of security nor on the subjects, but among the subjects. But the CS does not descend into a phenomenological approach towards linguistics. A speech act according to the CS is not about linguistics alone; it is situated in a context. This means the internal features of a speech act (e.g. the quality of its rhetorical structure) do not completely determine successful acceptance; it is a kind of ‘social magic’ by which some speech acts draw on a base of authority and get accepted while others would not (Buzan, Wæver, & De Wilde, 1998, p. 46).


The understanding of a speech act by the CS draws heavily on the work of John Austin (1962). Other theorists who influenced the CS are Carl Schmitt (2007), Pierre Bourdieu (1991), Jacques Derrida (1982) (1988) and Judith Butler (1996) (1997). Without elaborating too much on the legacy of these theorists, key aspects of a speech act will be examined more thoroughly, for a good understanding of what the CS takes a speech act to be and, more importantly, how a speech act operates.32

A speech act consists of an utterance (or sentence) which can be constative (simply describing something) and/or performative (‘doing’ something) (Austin, 1962, pp. 3-6). A performative utterance performs or does something instead of simply describing or reporting something. For example, when saying ‘I do’ before an altar or registrar, one is not simply reporting on a marriage but actually indulging in it (ibid., p. 6). Other examples of performative utterances are naming a ship, declaring a war or betting money. The action of such an utterance comes forward in or by saying it. It is useful to analyse the performative power of a speech act because it goes beyond the question whether a statement is true or false. In analysing what a speech act does, one prevents burning one’s fingers on interpreting the truth condition of a speech act (as this is always contestable).

According to Austin a performative speech act can contain three elements: the locutionary, the illocutionary and the perlocutionary act. The locutionary act is the actual meaning in the verbal, syntactic and semantic sense. The illocutionary act is the act performed in saying something (e.g. ‘I declare war’). The power of this performative act is thus self-referential: in saying it, something is done. The perlocutionary act is performing an act by saying something (for example ‘please open the window’). This kind of performative is more relational and builds on persuasion: something is done by saying it, where the hearer performs the action (Austin, 1962, p. 108). The CS does not adopt Austin’s terminology of the different elements of a speech act and mainly focuses on the illocutionary speech act, while elements of the locutionary speech act are used to formulate the ‘facilitating conditions’ for a securitization move to succeed (Buzan, Wæver, & De Wilde, 1998, p. 32).

The central element in securitization theory is to analyse how an illocutionary speech act transforms an issue into a security issue. The main focus of analysis is therefore on the performative power of a speech act. In line with Austin, the CS distinguishes the meaning and the force of a speech act (Austin, 1962, p. 100). The meaning of a sentence is referential or situated and therefore not universal, while the performative power of a sentence, or what it does, is more univocal and therefore a useful point of reference.

32 Much of the misinterpretation and misunderstanding (as well as misuse) of securitization theory can be ascribed to the often ill-defined and ill-explained theoretical roots of the CS. For example, most of the information on the theoretical use of a speech act is hidden in a footnote in the most prominent book on securitization theory (Buzan, Wæver, & De Wilde, 1998, pp. 46-47).

1.2.2 Facilitating factors in securitization theory

In a speech act there is a performative force, but the ‘magic’ happens in the intersubjective interpretation of the speech act, or in the extra-linguistic domain (Butler, 1996, p. 124). “Anybody can shout in the public square, ‘I decree a general mobilization’, and as it cannot be an act because the requisite authority is lacking, such an utterance is no more than words; it reduces itself to futile clamour, childishness, or lunacy” (Bourdieu, 1991, p. 74). To include this aspect of the role of the social order in securitization theory, the CS introduces ‘facilitating conditions’, which are made up of three elements.

The first element focuses on the feasibility of the internal aspect of a security argument. Security threats are always about the future; they are hypothetical and counterfactual. A security argument consists of an observation of the threat and a proposed security action. A security argument is therefore interpreted both as a matter of degree (how threatening is the issue?) and as a qualitative question (is labelling something a security issue and taking the proposed course of action an attainable way to deal with the threat?) (Buzan, Wæver, & De Wilde, 1998, pp. 32-33). The second facilitating factor regards the position of authority of the securitizing actor. This factor examines the relationship between different actors, including formal position and resources.33 The third element is whether the alleged threat is generally held to be threatening. This external factor of a security argument can either facilitate or impede securitization (ibid. p. 33). When, for example, a certain issue is not considered to be a threat in a certain discourse, the chances are slim that this issue can be securitized. In general, securitization theory studies the social and intersubjective process of securitization. For a good understanding of a securitization process, both the utterances of speech acts and the facilitating conditions which make them possible need to be taken into account.

33 While the conception of security draws from a state-centric conception, the focus is on in whose name the security operation is conducted. Asking in whose name a securitization operation is conducted is a fundamentally different question than asking for whom security is provided. The first focuses solely on the process of securitization as a practice, while the latter suggests we can objectively determine for whom security is provided. The referent object in securitization theory is simply that in whose name a securitization move is made (Wæver, 1996, p. 107).

1.2.3 Refining speech act theory

When considering securitization as an intersubjective process between a securitizing actor and an audience, it seems the acceptance of a securitization move is an essential element in a successful securitization strategy. Indeed, securitization theory takes into account the capabilities of the securitizing actor, like position and power resources. But these are not considered to work as a mechanism of persuasive power targeted at an audience. The reason the role of the audience in securitization theory is ill-defined stems from the one-sided focus of the CS on the illocutionary speech act, resulting in an ‘insufficiently’ theorised role of the audience (Taureck, 2006, p. 19).

Including perlocutionary speech acts in this study gives the ability to study how both forms of speech act are part of a securitization process. Especially in an empirical field with multiple actors possessing different power positions and resources, it is likely that no single actor can individually push for securitization. Securitization in this study can therefore better be staged as a collaboration between different actors, in which different forms of speech acts can play an important role. An interesting contribution of this study is its focus on the interplay between these different types of speech acts. Including different types of speech acts gives a new perspective of a multi-phased view of speech act theory, in which different speech acts complement or reinforce each other in a more organic process.

Thierry Balzacq proposes a complete recast of the speech act model in order to integrate the perlocutionary form of a speech act into securitization theory (Balzacq, 2005, p. 173). This endorsement to completely reformulate securitization theory stems from two different views on securitization theory. The original CS idea of illocutionary speech acts possessing their own performative power is often labelled the internalist view. The so-called externalist or sociological view, adhered to by Balzacq and Stritzel, argues that securitization is about perlocutionary speech acts and takes place in an intersubjective process (see Figure 3) (Roe, 2012, p. 254). But including both types of speech acts does not necessarily entail a hard either-or choice. The original framework can be preserved while integrating the perlocutionary speech act. This incremental path prevents throwing out the baby with the bathwater.


Figure 3: Schematic reflection of a successful securitization process in which an illocutionary and a perlocutionary speech act are used. [Diagram labels: Securitizing actor; Illocutionary speech act; Securitizing move (self-referential); Constraint by facilitating conditions; Referent object; Successful securitization — Securitizing actor; Perlocutionary speech act; Securitizing move (persuasion); Constraint by facilitating conditions; Audience; Audience accepts securitization; Referent object; Successful securitization.]

To include the perlocutionary speech act, the role of the audience in securitization theory becomes more relevant, because a perlocutionary act is all about persuasion. Balzacq describes it as a discursive technique to induce or increase the public mind’s adherence, swinging the audience’s support toward a policy or course of action (Balzacq, 2005, pp. 172-173). But who has to be convinced in a securitization process, or what is the audience? This question cannot be answered beforehand, because the audience varies according to the political system or

