
Cybernetic Urbanism in the Smart City

Urban Dashboards and the Reformulation of Reason and Vision

By Clemens Buchegger

UNIVERSITY OF AMSTERDAM

Graduate School of Humanities
New Media and Digital Culture

Date of completion: June 26th, 2017
Supervisor: dhr. dr. M.D. Marc Tuters
Second reader: dhr. dr. J.A.A. Simons
Programme: New Media and Digital Culture (Media Studies)
University of Amsterdam


Table of contents

1. Introduction
2. Methodology
3. Infrastructure, Space and Code
3.1. Infrastructuralism
3.2. Code / Space
4. The Smart Mandate
4.1. Cybernetics as Universal Science
4.2. Cybernetics and the City
4.3. Reformulation of Reason and Vision
5. Urban Dashboards
5.1. Choice of Objects
5.2. Technology Stack
5.3. The Paradigm of Control
5.4. Closing the Gap
5.5. Transparent Realist Epistemology
5.6. Cybernetic Urbanism
6. Conclusion
7. Bibliography


1. Introduction

In 2014 the Dutch architect Rem Koolhaas was invited to speak before the EU High Level Group about his thoughts on the “smart city”. In this talk he suggested that by calling the city of the future “smart”, the existing “city is condemned to being stupid” (Koolhaas), and went on to argue that the smart city presents itself as “apolitical in its declarations” (Koolhaas) while its implicit political values have “barely been registered” (Koolhaas). The rhetorical question Koolhaas posed is: what kind of urban intelligence is promoted through smart cities, and how does it affect the urban environment? As the smart city is one of the prevalent concepts for tackling urban development in the 21st century, this thesis takes up Koolhaas’ critique and further explores the aspects of urban intelligence in smart city practices and their influence on the city. Definitions of the smart city are quite diverse, and the term can also be read as a discursive label for a “political agenda of ‘high-tech urban entrepreneurialism’” (Hollands, “Will the Real Smart City Please Stand Up?” 314). In this thesis I focus primarily on the smart city concept that describes urban processes mediated and synchronized by computational technologies (Gabrys 31). Furthermore, I investigate the “smartness” of the smart city concept, which refers to the use of computation for analyzing urban data. This brings forth practices that aim at “better” governance based on seeing the city as a source of “data feeding the algorithms” (Mattern, “Interfacing Urban Intelligence”), such as algorithms used for statistical calculations as well as visualizations of the city and its data to “infuse intelligence into decision-making” (Council on Foreign Relations).
The faith currently held in using data and computation to solve large and complex problems (Franke, Tuszynski, and Hankey 12), such as public safety, energy or transport efficiency, is at the heart of the smart city discourse (Halpern, “The ‘smart’ mandate.” 230). The smart city, by virtue of its name, claims to be smart. Smart is a commonly understood but also rather vague term. It can be associated with intelligence, acumen or wit. But what does smart mean in the context of the city, and why is computational analysis associated with smartness? Halpern postulates the “smart” mandate as a current mode of epistemology based on data and computation (Halpern, “The ‘smart’ mandate.”). The smart mandate casts computers as “a necessary corrective to the human inability to deal with ‘large-scale problems’” (“The ‘smart’ mandate.” 230). It proposes an urban environment imagined as a networked “world of neural processing” (“The ‘smart’ mandate.” 223). For Halpern the smart mandate is “one of the governing notions of global polity, creating novel infrastructures, and methods, […] which now underpin our design practices across many fields and have come to organize environment, economy, energy-supply chains, food and medicine distribution, and of course, security” (“The ‘smart’ mandate.” 223).

This leads back to Rem Koolhaas’ questions, now rephrased: What kind of knowledge emerges from the smart city? How does the smart mandate influence and shape urban governance? This thesis proposes to show that these questions are strongly connected and in close interplay with each other. To answer the research question, this thesis relies on Foucault’s concept of problematization “to objectify what is considered to be objective” (Deacon 127). It guides the thesis in problematizing current computation- and data-based smart city practices, which present themselves as apolitical and objective. Foucault regards practices that involve spatial ordering and urban regulation as inherently political and entangled in specific constellations of knowledge and power. Foucault’s concept is not used to uncover a truth or develop a normative statement about smart city practices and power; rather, by investigating smart city practices with a focus on data and computation in the urban context, this thesis wants “to reveal what is so obvious and so superficial that it is passed over and accepted without further comment” (129). The way digital technologies are adopted in our society does not just appear out of thin air. Rather, they “are underpinned by their own particular, distinctive discursive regime” (Kitchin and Dodge 19). The smart mandate can be considered such a discursive regime. Consequently, the thesis relies on Halpern’s terms and uses her critique of the smart city to investigate current smart city practices. Specifically, the thesis looks at urban dashboards (UDs). UDs display data about cities on a screen. They are often web-based software applications and use a plethora of visualization types, such as graphs, gauges or maps, to visualize urban data. They are gaining widespread popularity as a tool to deal with the growing demands of analyzing ever larger amounts of urban data (Kitchin and McArdle 4).
The rest of this chapter gives a brief overview of the structure of the thesis.

In the second chapter I will discuss the methodology used for the thesis in more depth, clarify the key terms introduced by Foucault, and explain why his concepts are useful for this thesis. I will discuss how Foucault uses the terms power, knowledge and governmentality, what he means by “power-knowledge vectors”, and why they are fruitful for his method of problematization.


In the third chapter I present the concepts of infrastructure, space and code that guide this thesis. These concepts should, first, allow a critical examination of the distinctive smart city narrative and, second, enrich the problematization of smart city practices. In the first section I elucidate the relationship between media and infrastructure. Smart city artifacts (smart infrastructure) are not apolitical technological objects but objects that underwent negotiations and reflect political partisanship and situatedness (Peters). I will use John Durham Peters’ concept of infrastructuralism and position it within the context of media studies and Science and Technology Studies (STS). In the second section I turn my attention to the relationship between code and space. I will show why software influences and forms spatial relations (Kitchin and Dodge). Since the technological implementation of the smart city relies heavily on software, I want to show how software shapes the city and why it is productive to analyze it. For this I elucidate Kitchin and Dodge’s concept of “code/space”, which describes a dyadic relationship of software and space. As software has become an integral part of everyday life, it matters as a social-material process, specifically in its ontogenetic function of bringing spaces into being by collecting data and applying computation. Lastly, I present Rob Kitchin’s concept of the digital socio-technical assemblage to conceptualize UDs as an assemblage and discuss them in the context of their framing.

The fourth chapter shows the assumptions the smart mandate makes about the world and how these affect the smart city. First, I trace how the smart mandate came into being through cybernetics and how cybernetics’ claim to be a universal science influenced other scientific fields. Second, I outline how cybernetics reformulated reason and intelligence and consequently redefined the world as a world of chance and indeterminacy. Third, I show how the postwar period influenced architecture and urbanism at a “techno-social” moment (Dutta) and brought computation and data to urbanism.

The fifth chapter problematizes UDs by applying the critique of the smart mandate to two cases. The conceptualization of UDs as a digital socio-technical assemblage allows the thesis to frame the software in the context of the smart mandate. Two cases – the KCStat Dashboard (KCD) and the DublinDashboard (DD) – will be discussed. Through these cases the thesis wants to contribute to the discussion of the relationship of space and software in the context of techno-scientific urbanism (Brenner and Schmid 156), of which the idea of the smart city is an example. This thesis is not intended to be read as a treatise against data or visualization in the context of the city per se; rather, it wants to answer what the consequences of the most common UDs are.


2. Methodology

Many city governments today have a smart city division or initiative running and are thinking of ways to embed smart city practices (Hollands, “Will the Real Smart City Please Stand Up?” 304). Smart city practices are rarely contested (Greenfield) but are rather encountered and marketed as the solution for the city of tomorrow (Vanolo). The smart city narrative of an urbanism enhanced with computation as the future of the city can be considered a taken-for-granted phenomenon. While views of what a ‘bad’ or a ‘good’ smart city is diverge (Glasmeier and Christopherson; Vanolo; Hollands, “Critical Interventions into the Corporate Smart City”), the idea of the smart city itself is rarely contested: “The critique of the smart city and the suggested resolution of the problem both largely foreclose the actual question of technology, assuming the necessity, even the demand, for attentive, even nervous stimulation as an answer to social questions” (Halpern, Beautiful Data. A History of Vision and Reason since 1945 243). For this thesis, looking at the smart city in a productive and critical way means problematizing these foreclosures. To accomplish this I will use Foucault’s concept of problematization, because it allows the thesis “to examine phenomena that are taken for granted” (Deacon 127) and “to objectify what is considered to be objective” (127). The thesis will thus look at the smart city through the minutiae of its practices rather than its dominant discourses. A genealogy helps to analyze current urban practices within their conditions of possibility. Via a problematization of smart city practices and their embedded urban imaginaries, this thesis describes the power-knowledge vectors that frame them. To problematize taken-for-granted assumptions of smart city practices, this thesis analyzes the widespread practice of urban dashboards (Kitchin, Lauriault, and McArdle; Kitchin, Maalsen, and McArdle). In order to position Foucault’s method of problematization, I start by discussing Foucault’s key terms and by showing the significance of power-knowledge vectors. Then I discuss, first, what problematization is and, second, how a genealogy furthers the research objective of this thesis.

When thinking about power and government, one mostly relies on an implicit, common-sense theory of the state. Its basic assumptions are that the state is a sovereign body holding a monopoly of power over a specific territory, that specific institutions form a political authority, and that there is a particular division between rulers and ruled (Dean 16). Foucault takes a different approach to government. He understands government as the “conduct of conduct” (qtd. in Dean 17). Put more precisely,

“[g]overnment is any more or less calculated and rational activity, undertaken by a multiplicity of authorities and agencies, employing a variety of techniques and forms of knowledge, that seeks to shape conduct by working through the desires, aspirations, interests and beliefs of various actors, for definite but shifting ends and with a diverse set of relatively unpredictable consequences, effects and outcomes” (Dean 18).

This means that when looking at the smart city, the analysis must look beyond a city government’s policies and extend its view to “practices that try to shape, sculpt, mobilize and work through the choices, desires, aspirations, needs, wants and lifestyles of individuals and groups” (Dean 20). Smart city practices form such a “conduct of conduct”: they rationalize forms of governance by basing them on data and computation, which represent a certain form of knowledge. For Foucault, power and knowledge are not identical or reducible to each other; at the same time, they cannot be identified as separable entities (Koopman 37). They rather have to be conceptualized and analyzed as standing in an interactive relation to each other. Consequently, power is not knowledge or vice versa, but they are entangled in a coproduction of power-knowledge vectors (37). Foucault’s intention was not to find a theory of power or knowledge but to “establish historical points concerning the relations that have contingently held between certain prominent forms of power and knowledge in our modernity” (37). With Foucault, the thesis can inquire into the relation between power and knowledge to analyze these conditions. “We should admit . . . that power produces knowledge […]; that power and knowledge directly imply one another; that there is no power relation without the correlative constitution of a field of knowledge, nor any knowledge that does not presuppose and constitute at the same time power relations” (qtd. in Koopman 35). For Foucault, power-knowledge vectors can be a multiplicity of practices:

“including ritualized temporal ordering in monasteries, spatial ordering as instantiated in military encampments, spatial partitioning techniques developed in architecture, mechanisms of narration and confession in religious practice, surveillance strategies originating in industrial and educational contexts as well as in military contexts to control against desertion, planning strategies perfected in urban regulations for controlling plague, medical techniques of individualization and practices of examination developed in clinical contexts, numerical technologies perfected by the rise of the statistical sciences, and religious themes of caring for a flock of wayward souls” (35).

These vectors are not static; they form matrices “in which transformations become conceptualizable” (42). By increasing the complexity of an object, these vectors become sayable and “gain a sense of historical transition” (42).

To understand smart city practices as a “conduct of conduct”, it is helpful to problematize aspects of them. Problematization focuses on “how and why, at specific times and under particular circumstances, certain phenomena are questioned, analyzed, classified, and regulated, while others are not” (Deacon 127). It helps to take a closer look at “taken for granted” phenomena that seem objectively clear and understood. To examine these phenomena, Foucault suggests an analysis called genealogy. Genealogy is not concerned with a search for origins but problematizes what is considered a given. It wants to disrupt, fragment and show the heterogeneity of a previously homogeneous object (127–128). In contrast to a traditional history that looks for large-scale patterns, genealogy targets details that act as a counterpoint to the grand narrative and undermine proposed continuities (Prado 40). Foucault did not want to find a secret or hidden truth, as he did not believe such a kernel could be uncovered within a discourse. Rather, he was interested in showing the relations within the discourse (Deacon 129). His focus was on what people considered to be true, and on “problematizing” it by looking at the discursive and non-discursive practices that made it an object of knowledge (131). This should lead to a description of the totality (“apparatus”) of (discursive or non-discursive) practices, which includes “discourses, institutions, architectural forms, regulatory decisions, laws, administrative measures, scientific statements, philosophical, moral and philanthropic propositions,” and “strategies of relations of forces supporting, and supported by, types of knowledge” (qtd. in Deacon 131). The analysis should then be a history of the present, tracing the conditions of the emergence of the present (Koopman 24). Genealogy shows not only that these practices of knowledge and power are contingent and subject to change, but also how they are formed (44). This enables us “to make available the materials we would need to constitute ourselves otherwise” (44). Analyzing the conditions of possibility and bringing forth power-knowledge vectors helps to understand current practices better (57).

Orit Halpern (Beautiful Data. A History of Vision and Reason since 1945) developed a genealogy of cybernetics correlated with a genealogy of vision. She problematized the assumptions of cybernetics and how they influenced contemporary forms of interactivity and perception (249). Furthermore, she developed a view that aligned the history of cybernetics with the history of a new form of rationality. This thesis builds on what she later called the “smart” mandate (Halpern, “The ‘smart’ mandate.”; Chapter 4), which tied a reformulation of cognition, rationality and vision to a critique of the smart city. I will use Halpern’s Foucauldian analysis of cybernetics to investigate the smart city practice of UDs. The aim is not to determine whether this practice is good or bad but rather to discuss what Koolhaas hinted at when stating that the existing “city is condemned to being stupid” (Koolhaas), and to denaturalize the assumptions embedded in UDs. I want to show the relations between the smart mandate and its assumptions on the one hand and concrete smart practices on the other. UDs are objects of knowledge as well as objects of power. They propagate a specific form of vision and rationality, which this thesis will explore.

To structure the problematization, I align the analysis with the conceptual framework of programmable urbanism, which Rob Kitchin suggested for understanding and researching relationships between code (software) and the city (Kitchin, From a Single Line of Code to an Entire City: Reframing Thinking on Code and the City; Ch. 3.2). He argues that “[D]igital technologies and services augment and facilitate how we understand and plan cities, how we manage urban services and utilities, and how we live urban lives” (2) and that “[s]oftware is used to produce, mediate, augment, and regulate systems and tasks” (2). Furthermore, he emphasizes that the relationship between code and the city is complex and diverse, and that most research either focuses on the study of code and fetishizes it, or focuses on the study of the city while neglecting “the constitution and mechanics of the code” (12) which affect the city. His approach intends to bring together software studies and urban studies (12), since “none of these can be fully understood without being considered in relation to one another, nor outside of their wider context” (6). For Kitchin this means that to analyze an application, one first has to position it within its technological environment (“technology stack”); second, the technological environment has to be framed within its context; and third, to interpret the application in its contexts, an analysis has to draw “on ideas and empirical insights from a range of fields within critical social science and science and technology studies” (8). His suggestion is “to produce a holistic analysis, examining how programmable urbanism is framed within a wider discursive, political and economic landscape (the rhetoric of smart cities, for example) and how it is built, functions and has effects in practice” (11). The term programmable urbanism connects code and the city by showing “their dense interconnection and embedding within a wider discursive, political and economic landscape” (11). In this study I examine two urban dashboard implementations as such applications. I explore their technological environment and frame it within the context of its production and use. To accomplish this I look at the interface, the software, the data and the platforms they use. Furthermore, the main focus of the analysis draws on the ideas of Halpern’s smart mandate (Chapter 4) to problematize its assumptions and their consequences, and to frame urban dashboards within that wider context.


3. Infrastructure, Space and Code

When discussing the “smart city”, Koolhaas, as an architect well aware of material and its effects, looked at how the material of the architect (the wall, the floor, the ceiling, the stair) is “evolving in the current moment” (Koolhaas). He concluded that a “necessary component” of future homes will be a Faraday cage “to retreat from digital sensing and pre-emption” (Koolhaas). While at first this seems an extreme thought, the idea addresses several dimensions this thesis would like to explore. It connects software and space through infrastructure. Digital sensing is not an abstract process happening on screens; it involves material processes such as microprocessors or electromagnetic waves. The Internet of Things (IoT) is advancing rapidly, and besides the smartphone as a constantly accompanying digital sensor, more and more infrastructure is equipped with sensing technology while at the same time being networked via the internet. The urban environment is part of that digital expansion. This thesis deals with space, the city and urban development. On a material level this means objects like buildings, transportation systems and environmental sensors, but also software as interfaces. To ask what kind of knowledge emerges in the smart city and how the “smart” mandate manifests itself materially, I want to explore the infrastructure that shapes the city. In the smart city, infrastructure (the material) and data collection will be increasingly tightly knit: when planning urban infrastructure, data collection, environmental sensing and digital infrastructure are part of the considerations. Thus, in the following chapter I want to expand the view of what media is, so as to elucidate the relationship between media and the city. UDs represent a coupling of software and space, and to investigate its consequences an expanded view of media is required.

To this end, I would like to begin by explaining the concept of infrastructuralism as defined by new media theorist John Durham Peters. For Peters, infrastructuralism helps to understand media as more than means for sending messages: as “providing conditions for existence” (Peters 14). This broadens the view of what media is and includes “mundane” artifacts. In particular, it allows this thesis to look at smart city artifacts not as seemingly apolitical technological objects but as objects that underwent negotiations and reflect political partisanship (38). “Smart” infrastructure is not a black box added on top of the city; it is part of forming and influencing the city by forming the conditions of urban existence.


3.1. Infrastructuralism

In overlapping contemporary debates within media studies and the related field of Science and Technology Studies (STS), materiality has been defined “as the physical character and existence of objects and artifacts that makes them useful and usable for certain purposes under particular conditions” (Lievrouw 25), which puts the focus on the materiality of things in contrast to “the materiality of practices or social or institutional forms” (26). Since, as the media theorist Leah Lievrouw notes, most scholarship in media studies follows a “broadly constructivist, culturalist line, privileging the technologies’ social and cultural meanings” (24), she characterizes these debates concerning the material character of media as “something of an unfinished project” (24). By contrast, she claims that STS recognizes the co-determinacy of the material and the social through the stabilization and standardization of material objects (32). She argues that much of media studies has tended to focus on “message production and consumption, the generation and contestation of cultural discourses, and the institutional/political power of media industries and regulators” (33). Moreover, Lievrouw contends that what she refers to as “medium theory”, as inaugurated by the “Toronto School” of communication research (Harold Innis, Marshall McLuhan) and enriched by German media theory, addresses the question of material infrastructure or “infrastructuralism”, a term that John Durham Peters, a preeminent scholar at the forefront of these debates, has coined in order to refer to media as “modes of being” (Peters 17).

In 1964 Marshall McLuhan famously exclaimed that the “medium is the message” (McLuhan 9), thereby ontologizing media. He understood media as more than a means of sending and receiving messages and in turn pluralized the understanding of what media can be (Peters 16). John Durham Peters’ concept of media as “modes of being” builds on and extends McLuhan’s premise. He takes “media less as texts to be analyzed, audiences to be interviewed, or industries with bottom lines than as the historical constituents of civilization or even of being itself” (18). This is a far-reaching interpretation of media, strongly influenced by the French paleoanthropologist André Leroi-Gourhan’s view of an “intertwinement of embodied practice and technical objects” (17) and, by consequence, the inseparability of nature and culture. This strand of theory sees “media as the strategies and tactics of culture and society, as the devices and crafts by which humans and things, animals and data, hold together in time and space” (18). This builds on the view held by Harold Innis, a Canadian political economist, who put infrastructure in the main focus of media theory and argued that the factual existence of media was more important than what is transmitted (18). Innis also argued that media have the power of leverage by documenting or not documenting, by gathering data or not gathering data. Media have “narrow pass-through points” (21) that bear the possibilities for control: “data can both picture and manage the world” (20). Media are not seen as objects that merely react to the world; they are rather understood as intervening by representing and representing by intervening (20). Friedrich Kittler adds to this view by seeing media as “world-enabling infrastructures; not passive vessels for content, but ontological shifters” (26). He goes as far as saying that “media studies was a privileged form of seeing being as mediated” (27). In the context of the city, UDs are ontological shifters. They can make specific parts of the city more visible and others less so. This theoretical approach allows us to interpret UDs not just as reacting to – that is, showing – the city but also as intervening by representing the city and representing by intervening in the city. Thus, when analyzing UDs, it is not just their content that is relevant but also how they function as a “world-enabling infrastructure” that is actively involved in making the city.

This ontological shift of media pluralizes the objects that can be understood as media. Consequently, Peters suggests that to “understand media we need to understand fire, aqueducts, power grids, seeds, sewage systems, DNA, mathematics, sex, music, daydreams, and insulation” (29). Peters advances the definition of media by introducing the term infrastructuralism. When one talks about infrastructure, large objects like railways, airports, water systems or electrical grids come to mind. They can be defined as “large, force-amplifying systems that connect people and institutions across large scales of space and time” (qtd. in Peters 31). While large scale is a defining feature, infrastructure often appears through comparably small interfaces like water faucets, electrical outlets or computer terminals. Infrastructure tends to be boring and mundane and, despite being there all the time, remains “behind the scenes”. Infrastructuralism is interested in these demure objects. Peters introduced the term as a counterpart to structuralism and post-structuralism, to call attention to “the basic, the boring, the mundane, and all the mischievous work done behind the scenes” (33). Like media, the modus operandi of infrastructure is withdrawal: “to sacrifice their own visibility in the act of making something else appear” (34). As infrastructuralism is interested in how the taken-for-granted gets constructed, it helps uncover the hidden negotiations that everydayness conceals (34). Infrastructure is often invisible by nature (35). It can be buried in the ground (sewers) or blended with the scenery (mobile network antennas disguised as trees). Infrastructuralism sees the work of media as fundamentally logistical. Logistical media organize, sort and arrange; thus they organize relationships among people and things (37). At the same time they claim to be neutral, abstract and given. Infrastructuralism intends to make the hidden negotiations, the political partisanship and the situatedness of infrastructure visible (38). Smart city processes are part of the infrastructure of the city. Sensors and other means of data collection are deployed throughout urban space. Objects such as traffic cameras or environmental sensors are such an “invisible” part of the urban environment, as well as being part of smart city processes that collect, analyze and visualize this data. In the following section I will explore the relationship between software and space in more detail to show the influence of software on the city.

3.2. Code / Space

Software is omnipresent and yet remains invisible. While it stays hidden inside machines, it forms, controls and influences the visible material world. Thus software and the space it operates in have a relationship. In smart cities we encounter software that interacts with space directly. Urban dashboards and similar software, which visualize, analyze and order space, are part of that relationship. In this section I want to elucidate this relationship between code and space and show how it has been theorized by select authors in new media theory. My focus will be on the ontogenetic understanding of space, as it grounds the analysis of urban dashboards as material to the urban environment.

Historically space has been understood and theorized in multiple ways. Spatial relations have been most thoroughly investigated in human geography where three different conceptions of space can be found: absolute, relative and relational space (Elden 264). Until the 1950s only little philosophical work on conceptualizing space has been done and it was implicitly understood as a naturally given container “in which life took place” (Kitchin, “Space II” 269). In the 1950s the conception of absolute space came up stating that space can be understood and fixed through measurement and calculation (Elden 264). An example for this conception of space are the x, y and z coordinates of Euclidian geometry where space and objects are distinct entities that can be located, described and analyzed objectively and scientifically (264). In this conception of space


cities are then understood as an "absolute system of geometry" (Kitchin, "Space II" 269) which shapes spatial processes that can be calculated and predicted. Relative space is a concept that builds on the idea of absolute space but recognizes that space is not just an empty container: its objects stand in relations, and space is dependent on these relations between objects (Elden 264). It grants the observer a key role and acknowledges that the experience of space is relative to other things such as time, economic factors, social interactions or cognition (264). The city is then seen as consisting of absolute spaces, but the spatial processes happening in the city are relative to how people experience and imagine it spatially (Kitchin, "Space II" 270). Further theorization led to a perception of space as relational. In this conception of space "objects exist only as a system of relationships to other objects" (Elden 265). Space follows as a product of these interrelations. It is produced and constructed through social relations and practices, and, moreover, it is constitutive of such relations. The environment does not simply exist with meaning added on top of it, as the conception of relative space suggests; rather, the environment is produced as spatial relations by material and discursive practices. The environment in turn shapes these social relations (Kitchin, "Space II" 270). According to this conception cities are composed of relative spaces "produced in contingent and relational ways by people" (Kitchin, "Space II" 270). The relational understanding of space renders the seeking of spatial laws less important than seeking answers to how space is produced and managed, inducing certain sociospatial relations ("Space II" 270). Henri Lefebvre developed a thorough understanding of space as relational in his book The Production of Space.
For Lefebvre space is a product of three elements that are entwined with each other: "spatial practices", "representations of space" and "spaces of representation". Spatial practices are processes, flows, movements and behaviors of people and things. Representations of space are discursive media, i.e. images, books, films or maps, which try to make sense of the world. These representations adhere to ideologies that legitimize particular spatial practices. Spaces of representation are spaces that are produced by the body in everyday life; they are lived and felt by people. These three elements stand in a complex relationship, influencing each other. These spatial relations are historically contingent and thus change over time. Lefebvre connects Marx's periodization of capitalism to different configurations of spatial relations. This shows that spatial configurations have to be understood and analyzed in the context of time and place ("Space II" 270).


In the smart city these conceptions of space overlap. The GPS system, used for location-based services by pinpointing exact locations on a geographic coordinate system via longitude and latitude, owes its concept to a thinking of space as absolute, while relative and relational space overlap in location-based annotation systems such as Google Maps, Yelp or Foursquare. Software complicates these spatial relations as it increasingly becomes the "background of the world" (Thrift; Kitchin and Dodge). New media theorists like Nigel Thrift, Martin Dodge and Rob Kitchin conceptualized space against the background of a growing influence of software in the world. In the second half of the 20th century the proliferation of software has been facilitated by advantageous developments. On the one hand, advances in the production and innovation of technical objects helped: computation has become faster and cheaper while memory and disk space have become abundant. In addition, network speed and network bandwidth grew immensely (Kitchin and Dodge 7–9). This led to a rise in computing power and an increasing ubiquity of hardware and software (Thrift 587). On the other hand, software-enabled technologies became widely accepted as part of everyday life. Improved interface usability made usage easier, and a favorable physical design, which furthered the miniaturization of computational devices, helped wide-spread adoption and advanced the integration of software into everyday life. As such, software can be considered to mediate "almost every aspect of everyday life" (Kitchin and Dodge 9).

In the second half of the 20th century cities have also been part of this transformation. In the 1980s computational processes started to play a greater role in urban planning, and networked or computable cities appeared as regular features in urban development plans (Gabrys 30). Smart city infrastructure and software operate in this domain and influence spatial and social relations. More recently the conceptions of absolute, relative and relational space have been further developed. The idea that space is a fixed and definable entity is questioned by an understanding of space which fosters the view that "space is not ontologically secure" (Kitchin, "Space II" 272). The concept of ontogenetic space asks not what space is but how space becomes. Space is in a constant process of becoming, "a material and social reality forever (re)created in the moment" ("Space II" 272), and emerges from a process of ontogenesis. An ontogenetic position sees the city not as a space that is but as a space that constantly emerges and reconfigures through overlapping spatial practices ("Space II" 274). Smart city practices work with space and constantly add to a reconfiguration of the city as they influence social and spatial


relations. The ontogenetic interpretation of space enables this thesis to position software in the production of space as part of the constant process of coming into being.

An ontogenetic understanding of space leads to additional questions when software is understood as a constant background of human activity. The notion of software as a constant background has been further developed by Nigel Thrift, who introduced the term qualculation to describe the relationship between software and the world. Thrift sees a new calculative background that creates a world based on continuous calculation (583). He attests to a growth of artificial "paratextual forces" that form the background of human activity. This background "has been filled with […] 'artificial' components" such as roads, lighting and pipes (584), and these build a form of "technological unconscious" as they "no longer receive attention because of their utter familiarity" (585). The logic of this background system will gradually and continually become the logic of the world (586). Computing power through the medium of software has been widely applied, and the "sheer amount of calculation" (586) has become part of everyday life, where it produces a new calculative sense which Thrift calls qualculation. This ontogenetic process - through a new type of background - produces a new sense of space (595). It now seems second nature to navigate the world with navigation software, which requires extensive calculations and a world that is understood as addressable via GPS coordinates. A certain predictability is expected from location-based services, as they allow one to pinpoint to the minute when one will arrive at any chosen destination. This calculative background forms the way one relates to space and time and, consequently, how one sees the world.
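The kind of calculation this calculative background performs constantly can be illustrated with a small sketch: estimating distance and arrival time from two GPS coordinates via the haversine formula. The coordinates (roughly Amsterdam and Utrecht) and the assumed travel speed are illustrative assumptions, not data from any of the sources discussed here.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS coordinates, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical trip: roughly Amsterdam to Utrecht, at an assumed 50 km/h.
distance = haversine_km(52.37, 4.90, 52.09, 5.12)
eta_minutes = distance / 50 * 60
print(round(distance, 1), "km,", round(eta_minutes), "min")
```

Every routing query a navigation app answers runs such calculations invisibly, which is precisely what Thrift means by a calculative background that "no longer receive[s] attention".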

Dodge and Kitchin further this view by proposing that software does not just produce space but transduces it, by which they mean that it transforms space from one state to another. Software constantly tries to solve relational problems and helps bring different formations into being (Kitchin, "Space II" 273). The two media theorists also assume a certain technological unconscious. They argue that the nature of software appears to be "automagical" (Kitchin and Dodge 5), by which they mean that its inner workings are not clearly visible while at the same time leading to complex outcomes that influence everyday life. Kitchin and Dodge suggest a framework of concepts to help understand this influence, specifically in relation to the making and reconfiguration of space. This thesis will use Kitchin and Dodge's concept of "code/space" to contextualize the smart city as being engaged in a domain that is influenced by software, and to show that


software itself has material effects on the city. They differentiate between four levels of activity of software in everyday life: coded objects, coded infrastructures, coded processes and coded assemblages (5). Coded objects are objects that rely on software to function (5).

Coded infrastructures are infrastructures that "are monitored and regulated, fully or in part, by software", or networks which connect coded objects. Coded processes are transactions and flows of digital data via coded infrastructures (6). Coded assemblages appear where coded infrastructures converge via coded processes to create highly complex systems. With these terms, software and its entanglement with the material world can be described more precisely. Software is programmed to read inputs and react to them with outputs. These reactions consist of pre-written algorithms, but they function on their own, giving software a degree of autonomy. In coded assemblages these outputs can be very complex and substantial. As software shapes and influences everyday life, it can be observed that software both empowers and disciplines. It can empower by making "social and economic processes more efficient, effective, and productive" (10) or by enabling "many new forms of creative technology and novel kinds of art, play, and recreation" (10).

While a great number of new forms of objects and expressions are created through software, it is also used to sort, categorize and regulate more efficiently. The same empowering possibilities that software opens up allow for "new modes of invasive and dynamic surveillance" (10) to discipline. Just as a social network such as Facebook facilitates social processes, it makes users willingly and voluntarily "subscribe to and desire their logic, trading potential disciplinary effects against benefits gained" (11). It is hard to argue against the many possibilities software opens up in everyday life, so it is easy to ignore or overlook the disciplinary effects it has (10). Kitchin and Dodge argue that this power of software unfolds in social and in spatial contexts. Furthermore, when discussing the relationship between software and space, they identify an overemphasized focus on the study of software and attest to a lack of focus on the spatial context software operates in (12). While they acknowledge that software studies show the embeddedness of software in everyday life and that software is a social-material production - that "code makes digital technologies what they are and shapes what they do" (13) in relation to social processes - they argue that "[a]ll too often, however, they focus on the role of software in social formation, organization, and regulation, as if people and things exist in time only, with space a mere neutral backdrop" (13). Software influences social formation, organization and regulation as well as spatial relations. People and objects exist in space and, as such,


they cannot be considered independent of space. Context and practices create space by putting people and objects in relation to each other, and software is part of this process by contributing to discursive and material practices. "Software matters because it alters the conditions through which society, space, and time, and thus spatiality, are produced" (13). They further this ontogenetic position by giving software an important role and argue for an understanding of the relationship between software and space which assumes that software "mediates almost every aspect of everyday life" ("Space II" 9) and, consequently, as part of the process of becoming space, software has to be taken into account when contemplating spatial relations.

Kitchin and Dodge introduce two central terms to specify the relationship between code and space. The first term, code/space, describes a space which becomes when "software and the spatiality of everyday life become mutually constituted, that is, produced through one another" (16). It denotes a dyadic relationship in which code and spatiality depend on and mutually constitute each other (17-18). This relationship is neither universal nor deterministic; rather, it can be thought of as situated. Code/space always emerges in specific contexts and relations and is dependent on code. An airport check-in area, which delineates a space, no longer works without the software that processes a boarding ticket. Code/space differs from coded space, which occurs when software augments a space without the space being dependent on software in its entirety. The transduced space is still mediated by software but also works without it: a lecture hall is transduced by presentation software but would still be a lecture space if the software failed. Kitchin and Dodge's perspective on how space is becoming enables this thesis to understand smart city processes not just as black-box processes on top of the city. Creating "smart" spaces does not just mean adding sensors to a space that make the city more efficient or safe. An ontogenetic view of space rather helps to see these processes as part of ontogenetic space, constantly bringing spaces "automagically" into being by collecting data and applying computation. Instead of treating smart cities as black boxes with a fixed input-output relation that, consequently, create a fixed space, they can be interpreted as being in a dyadic relationship of code and spatiality. In consequence, UDs do not just describe or enhance a city; their depiction or visualization of the city influences its spatial relations.
For instance, heat maps, which overlay a map with varying intensities of a parameter, are at first a visualization that enables a visual overview of an otherwise difficult to


grasp relation between that space and a specific parameter. If this parameter were a crime statistic, the heat map would instantly change the way this space is perceived and treated in a city. UDs inscribe information into space when they add such spatial annotations.

These annotations make some places more visible than others. The critique of this "stratification of space" usually turns to representation and inclusion (Barreneche). Instead, Barreneche suggests shifting "the focus from the politics of representation to an analysis of the politics of code and appearance". The focus thus shifts to the inner workings of technologies such as algorithms. Barreneche shows in his case study of Flickr's "interestingness" algorithm that being included in a database does not suffice to be visible. The algorithm which determines the visual order of photo listings has a much greater impact, and being in the database has a lesser effect than "the specific ordering logics of the visible" (Barreneche). Consequently, how code orders and visualizes data should be engaged with critically. He concurs with Kitchin and Dodge that an ontogenetic understanding of space is needed to see "how the technicity of software brings space into being" (Barreneche). To critically engage with the inner workings of a technological object helps to better understand the mediated space of the city. "[H]ow space as such is coded and computed, directly frame[s] what spaces are geoannotated (visible space) and subsequently what can we know about them (epistemic space)" (Barreneche). UDs rely on such geoannotated visualizations, which influence the spatial relations.
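The ordering work of such geoannotated visualizations can be made concrete with a minimal sketch: a heat map is, at bottom, a binning of point data into grid cells whose counts become "intensities", and already this binning decides what becomes visible. The incident coordinates and cell size below are hypothetical, not drawn from any actual dashboard.

```python
from collections import Counter

def heatmap_grid(points, cell_size=1.0):
    """Bin (x, y) points into grid cells; the count per cell is the 'intensity'."""
    counts = Counter()
    for x, y in points:
        cell = (int(x // cell_size), int(y // cell_size))
        counts[cell] += 1
    return counts

# Hypothetical incident locations: three cluster in one cell, one lies elsewhere.
incidents = [(0.2, 0.3), (0.7, 0.9), (0.5, 0.1), (3.4, 2.8)]
grid = heatmap_grid(incidents)
print(grid[(0, 0)], grid[(3, 2)])  # 3 1
```

Note that the choice of `cell_size` alone already changes which places appear "hot" - a small instance of the "ordering logics of the visible" that Barreneche asks us to examine.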

This thesis discusses not only the relationship between software and space, but also the relationship between software and the city. A city is a complex web of interactions, processes and systems. Based on the concepts of code/space and infrastructure, I want to conceptualize UDs as a digital socio-technical assemblage (Fig. 1), as suggested by Rob Kitchin (Kitchin, From a Single Line of Code to an Entire City: Reframing Thinking on Code and the City).


Fig. 1. Conceptualizing the constitution of a digital socio-technical system. Graph from Rob Kitchin, From a Single Line of Code to an Entire City: Reframing Thinking on Code and the City, 9. N.p., 2014. Web.

A digital socio-technical assemblage consists of a technology stack and the context which frames this stack. Kitchin defines the digital technology stack as consisting of multiple layers: a material platform, a code platform, data and databases, software, an interface, and reception and operation (7). These elements work together and have effects on each other.

A specific hardware allows only specific operating systems, an operating system limits the choice of programming environments, a code platform shapes data structures which in turn frame and enable specific interactions or presentations. Kitchin advocates this holistic understanding of an application as a digital stack (7) and argues that software is just one - although critical - element of it (8). To study smart city software means not to limit the analysis to an interface or a database, but to reflect on its interrelationships with all elements of the stack. Moreover, to understand a data-based product "one needs to examine its entire data assemblage" (6). Data are the product of an assemblage, and the assemblage is structured to produce data. They are "mutually constituted, bound together in a set of contingent, relational and contextual discursive and material practices and relations" (6). The apparatus and elements (Fig. 2) interact with each other and are connected through a "contingent and complex web of multifaceted relations" (6) to build a data assemblage.


Fig. 2. The apparatus and elements of a data assemblage. Table from Rob Kitchin, From a Single Line of Code to an Entire City: Reframing Thinking on Code and the City, 7. N.p., 2014. Web.

This shows that a specific digital socio-technical assemblage is not only a complex set of relations but that it also has hidden negotiations embedded. When a digital socio-technical assemblage is part of the smart city, its elements have a relationship with space and bring space into being. To problematize such an assemblage then means to uncover those hidden negotiations and relations, that is, to put the smart city processes in the context which frames them. The following chapter will focus on the context of the smart mandate, which is relevant to the case study of UDs.


4. The Smart Mandate

The city and software are in a dyadic relationship. Software mediates the becoming of space and thus influences its spatial relations. In smart city practices computation and data play an essential role and are part of mediating and making the city through the specific spatial practices they bring forth. The spatial relations of the city are dynamic and not in a statically fixed configuration. They are constantly reconfiguring and changing over time (Kitchin, "Space II" 272) and depend on context (274). With the rise of cybernetics in the second half of the 20th century the world has been reimagined as a world of neural processing and "smart" networking (Halpern, "The 'smart' mandate." 223). Orit Halpern calls this the smart mandate, which "underpin[s] our design practices across many fields" (223) today. This context of the smart mandate is in full effect in today's showcase smart cities such as the Songdo International Business District in South Korea. Songdo's spatial relations are influenced by the reimagining of the city as networked and "smart". The core idea of the marketing of a smart city such as Songdo is a "fantasized transformation in the management of life - human and machine - in terms of increased access to information and decreased consumption of resources" (Halpern et al. 278). It is a fantasy fueled by technology providers like Cisco, who "envision a totalizing sensory environment in which human actions and reactions […] can be traced, tracked, and responded to in the name of consumer satisfaction and work efficiency" (279). This is paired with a modernist interpretation of an ideal city concerned with indexing and archiving for better administration, where "[s]ystematic management and its material techniques and technologies" (Mattern, "Indexing the World of Tomorrow") are understood as "powerful modernist city-building tools" (Mattern, "Indexing the World of Tomorrow").
Both of these visions, a world filled with sensors and the systematic management through these sensors, can be found in today’s smart city practices such as urban dashboards:

“The models of technologized efficiency, automated information management, and ‘scientific’, ‘fact-based’ decision making […] are today apparent in our data-driven, cognitive-computing-powered government and corporate offices, online retailers, dating sites, logistical systems, social services, schools, and universities” (Mattern, “Indexing the World of Tomorrow”).


The smart mandate influences the current configuration of spatial practices which make the smart city. Through a deeper investigation of the smart mandate, following Halpern's genealogy of it, I aim to extend my framework for analyzing UDs. Halpern's analysis of how reason and vision changed in the second half of the 20th century denaturalizes the way today's forms of interactivity and ubiquitous computing are seen. She argues that "a great deal of work, and imagination, went into convincing us that vision could be designed and machines and minds both 'thought'" (Halpern, Beautiful Data. A History of Vision and Reason since 1945 136). Halpern's analysis denaturalizes what is assumed as natural in smart city practices, which are based on a specific form of epistemology, perception and cognition (Beautiful Data. A History of Vision and Reason since 1945 17). She traces three movements that stem from the influence of cybernetics: first, the "reconceptualization of the archive and the document" (17), second, the "reformulation of perception and the emergence of data visualization and the interface" (17) and, third, the "redefinition of consciousness as cognition" (17). UDs, as a common smart city practice, function and are legitimized on the grounds of philosophical assumptions that stem from these movements. Halpern's analysis of the change of reason, intelligence and vision in the second half of the 20th century provides a fruitful starting point for exploring the often hidden implications of data and computation that urban dashboards entail. Since cybernetics is of great importance when analyzing the smart city, this chapter has two intentions. On the one hand, I want to trace the influence of cybernetics on other sciences generally, and on architecture and urbanism in particular. On the other hand, this chapter will complete the framework for critically investigating urban dashboards by discussing cybernetics' implicit assumptions as they are problematized by Orit Halpern. This will enhance the analysis by making it possible to denaturalize specific implementations of urban dashboards along the lines of Halpern's analysis. More specifically, first, I want to draw a history of the smart mandate to show the circumstances out of which it developed and came to its status today. I will trace the development of cybernetics and its influence on other sciences and thus show how it came to position itself as a universal science. Second, I will show the influence of cybernetics on architecture, urban development and urban governance in the second half of the 20th century and how that relates to the smart city. Third, I want to discuss the three themes that Halpern identifies as being "the foundations for producing new techniques of calculation, measurement, and


administration” (17) such as urban dashboards: “reconceptualization of evidence, vision, and cognition” (17).

4.1. Cybernetics as Universal Science

In this section I want to trace the development of cybernetics and its influence on other sciences. For Halpern cybernetics was the basis of an "epistemic transformation involving the relations between temporality, representation, and perception" (Halpern, Beautiful Data. A History of Vision and Reason since 1945 12). These transformations frame and underlie the faith in the data science and analysis being used in smart city practices today. As one of the early influences on cybernetics, Claude Shannon, a researcher at Bell Labs, developed a theory of information in the late 1940s. He developed this theory to conceptualize the technical transmission of messages and communication. In its basic form it says that a message is sent from a sender to a receiver via a communication channel. Along this channel the message is exposed to noise, which is "the chance fluctuations, interference, and transmission errors that inevitably degrade signals as they make their way through an error-ridden and analog world" (Davis 82). Furthermore, his second theorem states that "any message can be coded in such a way that it can be guaranteed to survive its journey through the valley of noise" (82). In order to guarantee a successful transmission, redundant information has to be added to the message (redundancy). Redundancy itself is limited only by the capacity of the communication channel (bandwidth). In the 1950s and 1960s this abstraction of communication as a basis of information theory started to influence other fields, such as social science, psychology, biology and management theory, with its concepts and language. Its appeal lay in the reformulation of complex problems concerning communication, learning, thought and social behavior as systems of information processing: "The budding technocracy of postwar society seemed to have found its lingua franca: an objective, utilitarian, and computational language of control with which to master the carnival of human being" (83).

The translation of a societal problem into a mathematical equation was no trivial philosophical problem to solve. While information in the world of computation is understood as a set of correctly transmitted binary codes, in the world of social science and of interpreting the world, a successful transmission depends on the successful conveyance of meaning (84). As a mathematician Shannon was mainly focused on


creating a mathematical equation for information transmission in the context of a technical analysis of transmitting messages electronically. He conceptualized that genuine information primarily has to be new, but only in part uncertain to the receiver - what the receiving end receives has to be somewhat predictable (84). As an example, the structure of a language gives us a certain amount of structural redundancy, such as grammar and syntax, to allow for successful communication. He paired the concept of uncertainty with the probability that a message is received (85). "At the heart of information theory, then, is probability, which is the measure of the likelihood of one specific result […] out of an open-ended field of possible messages" (85). Information needs to be new and, to a certain degree, familiar, so it can be understood. Shannon's information theory was very influential and was generalized beyond its technical application to other fields.
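Shannon's pairing of information with probability can be made concrete in a small illustrative sketch: the entropy of a message source, H = −Σ p·log₂(p), measures how uncertain - and thus how informative - that source is. The function and the example probabilities below are my own illustration, not drawn from Shannon's text.

```python
import math

def shannon_entropy(probabilities):
    """Average information content (in bits) of a message source."""
    return sum(-p * math.log2(p) for p in probabilities if p > 0)

# A perfectly predictable source carries no new information.
print(shannon_entropy([1.0]))        # 0.0
# A fair coin flip is maximally uncertain for two outcomes: one bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased source is partly predictable, so it carries less information.
print(shannon_entropy([0.9, 0.1]))   # about 0.47 bits
```

The biased source illustrates Shannon's point exactly: a message that is largely predictable (redundant) conveys less information per symbol, but that redundancy is what lets it survive the "valley of noise".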

In 1954 Norbert Wiener published the book "The Human Use of Human Beings", in which he coupled information theory - the transmission of information - with the conveyance of meaning. Wiener's basic view of the universe was that it tends towards entropy - randomness and disorganization - and that information keeps entropy at bay. Furthermore, he connected entropy to meaninglessness by saying: "In control and communication [cybernetics] we are always fighting nature's tendency to degrade the organized and to destroy the meaningful" (Wiener 17). Thus he elevates information from mere transmission to that which saves the universe from entropy - even though he was convinced that in the (very) long run maximum entropy is inevitable (38). In his book he continues "to stray beyond the dispassionate scientific measure of bits and provocatively links the behavior of information systems to meaning, value, and life itself" (Davis 87). In addition, he extended his idea of information and information systems to viewing human beings as "information-processing machines" (88). He argued that "biological, communicational, and technological 'systems' could all be analyzed with formalized descriptions of how such systems processed and stored messages, memories, and incoming sensory data" (88). He subsumed these different objects under the notion of systems and called the science of these formalized descriptions cybernetics, or the science of "control and communication". In cybernetics the main epistemological dilemma is "how to relate the micro actions and macro systems" (Halpern, Beautiful Data. A History of Vision and Reason since 1945 36). Wiener saw the cybernetic practices of feedback and control as a way of solving that dilemma. In the cybernetic view of systems, feedback loops are central to understanding and conceptualizing the processes within them.


These systems have constant loops in which the output of the system (or parts thereof) is fed back as new input. Via these feedback loops a system can adjust itself, "learn" and improve its output. Those circuits are constantly adjusting themselves by processing and communicating information through their feedback loops (Davis 89). This constant stream of transmissions controls the system for efficiency, to "resist the evil deathlord of entropy through information, communication, and feedback" (89). The term cybernetics originates from the Greek word kubernetes, meaning steersman, and signifies its connection to the idea of steering or controlling - thus the science of control. The mechanics of feedback and control negotiate the difference between the expected outcome of a cybernetic model and the actual outcome (Halpern, Beautiful Data. A History of Vision and Reason since 1945 36). This type of epistemology is "encoded into the very infrastructural logics of [smart] cities like Songdo" (36) and thus is essential to the understanding of contemporary smart city practices.
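The feedback mechanic described above - output fed back as input so that the system negotiates the difference between expected and actual outcome - can be illustrated with a minimal proportional control loop. The thermostat scenario, target value and gain below are hypothetical illustrations, not taken from the cybernetic literature discussed here.

```python
def feedback_step(actual, target, gain=0.5):
    """One cybernetic feedback loop: compare output to the goal, feed the error back."""
    error = target - actual          # difference between expected and actual outcome
    return actual + gain * error     # adjustment proportional to the error

# A hypothetical room temperature steered toward a 20.0 degree target:
temperature = 10.0
for _ in range(6):
    temperature = feedback_step(temperature, target=20.0)
print(round(temperature, 2))  # prints 19.84, approaching the target
```

Each iteration "measures" the output, compares it against the goal and corrects accordingly: the system steers itself, which is precisely the steersman (kubernetes) image behind the term cybernetics.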

The strength of cybernetics lay not just in its convincing epistemology but in its creation of a language for analyzing and comparing systems that would be applicable and exchangeable across fields as diverse as biology, cognitive science, mathematics, robotics and urbanism. By explaining (cybernetic) systems in the language of patterns and processes, it made itself available for universal application. Cybernetics created the language of feedback, communication, control and system, which "provided a site where this exchange could occur" (Bowker 116). It allowed for a "discontinuous transmission of ideas: conceptual tools could be yanked out of one context (philosophy of mind) and plugged into another (automata theory), with the translation into the language of cybernetics doing the work of glossing the discontinuity" (116). The universalization of cybernetic thought had consequences for the idea of the distinction between mind and machine (Davis 90). "Where metaphysics deals with the bilateral expressions of essence and instance, cybernetics assumes a baseline multilateralism across broad swaths of different instances interacting and influencing one another" (Galloway 115). Electrical circuits, DNA or human interaction are all systems that can be connected by a similar language. "Where phenomenology addresses itself to the relationship between subject and world, cybernetics concerns itself with systems of multiple agents communicating and influencing one another" (115). For cybernetics the world is a system of mediation, and the "cybernetic language" was needed to span the realms of the mind and the machine at "an historical conjuncture where machines were

(29)

becoming sufficiently complex and the relationship between people and machines sufficiently intense” (Bowker 117).

The universalization of cybernetics and its ability to cross the boundaries of scientific fields enabled cyberneticians to advance two types of rhetoric. First, cyberneticians argued that other sciences should be subsumed under cybernetics and, second, that cybernetics “could play the same structural role as mathematics in the quantifiable sciences - or statistics in many of the social sciences” (121). The first argument was based on the movement between disciplines organized by the concepts and language of cybernetics. The second argument stemmed from the idea “that the computer provided a new technology that spanned all of knowledge” (122). As such, cybernetics was meant to provide the analytical tools for other disciplines. Both arguments aimed at establishing cybernetics as a universal science:

“Cyberneticians, like other interdisciplinarians, directly appropriated both religious and political discourse, and argued that their science produced the most faithful possible description of society.” (123)

Taking its cue from Tiqqun’s publication of the same name, Galloway calls the extension of this universalization the cybernetic hypothesis, which refers to a specific “epistemological regime in which systems or networks combine both human and nonhuman agents in mutual communication and command” (Galloway 111). This regime, in turn, has “come to dominate the production and regulation of society and culture” (111). The idea of universalizing the principles of cybernetics “impacted everything from postwar architectural movements to genomics to politics” (Halpern, Beautiful Data. A History of Vision and Reason since 1945 47). In the following section I will look more closely at how this universalization affected architecture and urbanism, and subsequently show how it shaped the idea of the smart city today.

4.2. Cybernetics and the City

In the previous section I showed how cybernetics positioned itself as a universal science applicable across different scientific fields. In this section I will demonstrate how the smart mandate and its cybernetic worldview influenced architecture and urbanism as one of those fields. The way smart cities and their practices are built today stems from this cybernetic influence on architecture and urbanism, which subsequently changed the way the city was seen and treated. After the Second World War the idea of an ideal city was based on a “neo-Corbusian city: streamlined, rational, orderly, efficient” (Mattern, “Indexing the World of Tomorrow”). This ideal met with a cybernetic interpretation of the world and, consequently, of the city. To trace this influence, I will first show how the postwar period influenced architecture and urbanism in what Arindam Dutta calls a “techno-social” moment (Dutta 2). Second, I will show how the Architecture Machine Group at the Massachusetts Institute of Technology (MIT), heavily influenced by cybernetic ideas, was instrumental in implementing computer-aided design as a cybernetic practice and furthered cybernetic thinking in architecture and urbanism.

In the first half of the 20th century modern architecture developed and was based, amongst other themes, on the notion that “form follows function”. As one of its most acclaimed representatives, the French architect Le Corbusier stood for modernism in architecture. In his book “The City of Tomorrow and its Planning”, published in 1929, Le Corbusier imagined the “city of tomorrow” as a city “that would be perfectly statistically managed, showcase the latest technologies, and eliminate disorganization and could be built and replicated through systemic, machine-like principles and the application of careful statistical social science” (Halpern, Beautiful Data. A History of Vision and Reason since 1945 9). In his definition the home was a “machine for living”, for which he invented a method to solve the architectural problem of scale. This would allow his designs to be used for an individual home as well as for an entire city (9). This machinic interpretation fit well with the 1940s desires for “’scientific rationality, technological progress, modernist aesthetics, industrial design … consumer prosperity, and … corporate capitalism’ in spatial form, via rational urban planning and progressive civil engineering, modernist architecture and sterilized suburbs” (Mattern, “Indexing the World of Tomorrow”).

After WWII the architectural discourse developed a tremendous interest in “linguistic, behavioral, psychological, computational, mediatic, communicational, and cybernetic paradigms" (Dutta 2) - a "techno-social" (2) moment in postwar architecture. The discipline underwent a professionalization that was based on expertise strongly connected to “assembling, collating and processing larger and larger amounts of data” (3). This implied an “emphasis on the systemic and mathematical elements of judgment" (3). It applied the cybernetic worldview of solving complex problems through computation to avoid human misjudgment in the urban context because "[c]omplex systems are
