The Numbers Game - A Qualitative Study on Big Data Analytics and Performance Metrics in Sports.

Academic year: 2021

The Numbers Game

A Qualitative Study on Big Data Analytics and Performance

Metrics in Sports.

11107766 Kyra Teklu

Faculty of Humanities

MA New Media and Digital Cultures


Table of Contents

Chapter 1. Introduction

Chapter 2. The Influence of Statistics

2.1 The Role of Quantification

2.2. The Role of Statistics and Metrics

2.3. Data and The Databases

Chapter 3. The Commercialisation of Sports

3.1. The Early Years

3.2. Media Rights and Sponsorship

3.3. Big Data shapes the Sport Industry

Chapter 4. Methodology

Chapter 5. Findings

5.1. The Tools

5.2. Player Recruitment

5.3 Rise in More Interesting Data

5.4. Unpredictability of Data

5.5. Economic Value

Chapter 6. Conclusion

References

Chapter 7. Appendices

Appendix A - Examples of Metrics

Appendix B - Description of Participants

Appendix C - Participant Demographic Table


Chapter 1. Introduction

A New Science of Winning

The image you see on the first page of this thesis is a visualization of passes. This web depicts the England football team's passes during the first half of a game.1 The blue arrows indicate successful passes and their direction. Red indicates the failed attempts. Such examples of statistical visualizations focusing on performance are not uncommon nowadays, due to the advanced nature of data analytics and performance metrics within the professional sports industry. This is perhaps best indicated by the fact that, today, 19 of the 20 Premier League2 3 teams use Prozone (Medeiros 2014). Prozone is a performance analytics company which, in its own words, aims to "empower data with meaning to deliver insights that create competitive advantage on and off the field" (Prozone Sports 2016). Prozone highlights a competitive nature that is innate in sport, whilst also alerting to gains 'off' the field. These gains are economic, monetary and even moral. To this end, this research seeks to gauge how current uses of data in sport are impacting the sports industry as a whole, from sports management to the game itself.

This research provides a theoretical perspective on quantification and how the practice of quantifying, turning qualitative differences and phenomena into numerical information, enables measuring (Espeland and Sauder 2007). Today, society is awash in quantification and measurement, and these practices facilitate 'ordering' (Beer 2015), which encourages the 'ranking' (Guyer 2010) and 'comparison' of individuals (Espeland and Stevens 2008). It is through the collection of quantified information that individuals can use this information in statistical analysis (Porter 1995). The use of statistics4 is key to this research in the commercial and technological influence it has on the sports industry. This key influence comes in the introduction of 'metrics'.

1 On 11 October 2013, England played a World Cup qualifier against Montenegro at Wembley Stadium; the visualization presents some insights on the game from Prozone's analysis.

2 The Premier League is the highest English professional league for men's association football.

3 Further to this, each club has its own team of performance analysts and data scientists looking for the indicators that quantify player performance, the events that determine matches and the trends that characterize seasons (Medeiros 2014).

4 Statistics is a branch of mathematics concerned with the collection, classification, analysis, and interpretation of numerical facts, and with drawing inferences on the basis of their quantifiable likelihood (probability) (Business Dictionary 2016).


This specific enquiry looks at the role of big data and performance metrics, and how these concepts facilitate measuring, leading to a new 'value' of data. Rob Kitchin (2013) details that Big Data is: "huge in volume, consisting of terabytes or petabytes of data; high in velocity, being created in or near real-time; diverse in variety, being structured and unstructured in nature; exhaustive in scope, striving to capture entire populations or systems" (262). By performance metrics, I refer to 'systems' of measurement that sportsmen are assessed by (Beer 2015). For example, in football, metrics such as 'passing accuracy'5 and 'player shooting accuracy'6 are commonplace. Fundamentally, metrics are statistical formulae (Tracy 2016).
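For illustration, the two footnoted metrics reduce to simple ratios. The sketch below is a hypothetical rendering of those formulas; the function names and match figures are my own, not the actual tooling of providers such as Opta or WhoScored:

```python
def passing_accuracy(completed, attempted):
    """Share of attempted passes that found a teammate, as a percentage."""
    return 100 * completed / attempted

def shot_accuracy(on_target, total_shots, blocked):
    """Shots on target divided by all shots, excluding blocked attempts."""
    return 100 * on_target / (total_shots - blocked)

# Illustrative first-half figures for one team
print(f"Passing accuracy: {passing_accuracy(412, 489):.1f}%")  # 84.3%
print(f"Shot accuracy:    {shot_accuracy(5, 12, 2):.1f}%")     # 50.0%
```

Such ratios are the elementary building blocks from which the more elaborate composite metrics discussed later are assembled.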

The commercialisation of the sports industry will be detailed to strengthen the scope for discovering how and why data analytics7 and metrics descend from older forms of quantification and statistics. I describe the commercial development of the American professional sports of baseball, basketball and American football, as the three have rich commercial histories and were most influential in the introduction and commercialisation of statistical analysis in sport. With a historicization of the amalgamation of commerce, sports and data, I provide a framework for critiquing the impacts of big data analytics and performance metrics on the sports industry as a whole.

The merger of quantification, statistics and metrics with the commercialization of the sports industry will provide a framework to answer what is really being done with data, outside of the immediately obvious: scoring a goal, or making a touchdown. Furthermore, football in the Netherlands will be looked at more closely, as football is often cited as the most widely played and consumed sport in Europe. In the empirical study I undertake, a selection of individuals based in the Eredivisie8 are interviewed. This research focuses on the individuals who collect, analyse and organize data, as these are the data professionals in the field who use data day-to-day in influential ways.

5 The percentage of attempted passes that successfully found a fellow teammate (WhoScored 2016; Squawka 2016).

6 A calculation of Shots (goal attempt) on target divided by all shots (excluding blocked attempts) (Opta Sports 2016).

7 Analytics is the process of analyzing/studying information (data) (Gartner 2016).

8 The Eredivisie is the highest league of professional football in the Netherlands.


I undertook this research topic due to being a sports enthusiast with career experience in the industry. However, a lack of insight into the 'science' behind sport led to curiosity about how this element is being used today. With a growing economic debate behind data, and with sports already a formidable economic contributor, this proved an in-depth area of study. The significance of the social aspect of sport, a national pastime that now endorses new media technologies, is intriguing: to what extent are these technologies changing the sporting sphere? In big data studies there is little focus on individual sports, which are industries in themselves that use big data. Therefore, the Dutch context, in which my research focuses on football, offers a comprehensive study of the impact of big data and metrics on the sports industry.

The Research Question:

“What is the socio-economic impact of big data analytics and performance metrics on the sports industry?”

Chapter 2 introduces a framework for the study of big data and performance analytics. The concepts of quantification, statistics and metrics provide a socio-economic approach that allows us to unravel how big data and performance metrics are constituted today. Additionally, these concepts will convey the increased importance of quantification and why this led to statistics. Crucial ideas such as 'ranking' and 'comparison' will highlight the difference in uses of statistics and metrics, past and present. In terms of the present, I then align the turn to data, big data and databases to reflect a shift in technological infrastructures for measurement. I then move on to focus on the sports industry, where these concepts hold impetus, in Chapter 3.

Chapter 3 focuses on the significance of the professionalization of sports and how financial incentives (entry fees, sponsorship, etc.) allowed for the progression of the commercialisation of sports. Statistics has a powerful commercial role, and this will be isolated in this chapter. I will depict how statistics came to be introduced into sports and how statistics thus helped commercialize sports. The shift from statistics to big data in sport is a key economic shift that will be isolated, as big data begins to shape the sports industry.

Chapter 4 offers an extensive methodological framework. This chapter follows the revelations of Chapter 3 concerning the economic significance of big data and metrics. The main research method conducted was interviews. This method allowed for an inclusive understanding of how data professionals who work in football collect, analyse, and ultimately use data. Additionally, this chapter notes the details of my research, and the limits of my research method.

Following the trail of data shaping the sports industry in Chapter 3 and the influence of the economy on sports, Chapter 5 describes the relation between contextualized literature in Chapter 1 and key concepts in Chapter 2. This chapter is devoted to discussing the results of my empirical research. Sections illustrate the main themes connected to my research question, and how these themes define a new value of data.

Chapter 6 concludes my thesis by answering what is being done with the data that is collected, as opposed to how teams are winning world sporting competitions with advanced statistics and metrics. These results provide an explanation and offer a critique of a new discipline of statistics, far removed from the days of 'traditional' statistical analysis, now that data has a different source and provenance. How this data is made valuable is key to uncovering the current foundation of the sports industry amidst analytical fruition.


Chapter 2. The Influence of Statistics

2.1 The Role of Quantification

This chapter will introduce the subject, firstly by briefly explaining the difference between cardinal and ordinal numbers (Guyer 2014), and how the two types of number function differently. Following on from this, I will discuss the concept of 'quantification', associated with the arguments of Wendy Espeland and Michael Sauder (2007). With this concept, the process from quantification to statistics, which is the mathematical processing of quantified information (numbers), will be analyzed. To end this chapter, I will consider how the mutual shaping of quantification, statistics and metrics has assisted the move toward big data and the significant economic dimension of data: how, to this end, the practice of quantification has been facilitated by, and now constitutes, the economy in the form of big data.

Guyer (2014) characterizes cardinal numbers as being based on "spatio-temporal particulars and thingness" and a "uniformity of practice" (159); for example, 8 puppies, 14 friends: simply, 'counting numbers'. Wendy Espeland and Mitchell Stevens (2008) describe a cardinal number as leaving a 'mark'. Espeland and Stevens state that "the numerical marks are used to identify particular persons, locations, or objects" (407). These 'marks' can be found arbitrarily on football jerseys, distinguishing players from other players on TV screens and team sheets. Such numbers can begin to take on a "character of names" (407), as when commentators referred to David Beckham as "Number 7". In essence, cardinal numbers merely tally and sort. However, the ordinal number, which 'orders' and positions and is familiar to most in modal reasoning (Verran 171), holds more significance in its proliferation of 'ranking'.

Guyer (2010) describes the origin of ordinal numbers and how these numbers encouraged 'ranking'. Ranking emerged from an era of commercial trades in "extractive commodities of slaves, ivory and gold" in Atlantic Africa (123). These 'ranks', a powerful form of numerical expression in mathematics that position and have "benchmark or 'tropic' thresholds" (123), were used to derive and justify relative monetary valuation in Atlantic Africa. For example, the Igbo and Ibibio people in Eastern Nigeria developed a formalized and monetized scale of ranking in the late nineteenth and early twentieth centuries (Guyer 2004). Guyer (2004) notes that the intervals in ordinal ranking were quantified in money and associated with competitive acquisitiveness (69). The difference between numerical calculations and ordinal numerical expressions invokes different political philosophies. But what is the role of quantification in these number forms?

Quantification

As established, numbers can take different forms, but how does the concept of 'quantification' allow these numbers to function? 'Quantification' is the practice of turning qualitative differences, phenomena, into numerical information. Before and during World War I, financial and governmental pressures intensified the movement toward quantification. Quantification became almost mandatory as a response to World War II conditions of mistrust and disunity (Porter 1995).9 A key factor in this development was the increasing legitimacy of quantification (Porter 1995). For example, via the idea that numerical information allows for impersonality and thus 'objectivity', quantification was accepted (Porter 1995). Porter defines objectivity as the "subordination of personal interests and prejudices" (74). For example, Porter cites Karl Pearson's argument for the school curriculum to be reorganized around science, to encourage "no interested motive" (75) and truth-seeking from an unbiased standpoint (Pearson 1888). This suggests quantification offers neutrality and thus accuracy.10

Mark Smith (1994) points out that in order to comprehend the circumstances under which quantitative objectivity has become a need, we need to look not only at the pursuit of objectivity, but more importantly at the social basis of 'authority'. Certain scholars argue that quantification allows for 'uniformity'. For example, "[c]hurches concerned with observing Lent, and states needing to track the timing of taxes and military service" benefited from a more precise and rigid calendar (1116). Thus, the standardization of time was closely aligned with the discipline demanded of industrial workers.

Further to the rise of standardized quantification, Porter (1995) illustrates the example of American accountants after the Great Depression, who feared intervention by the new Securities and Exchange Commission and were forced to adopt rules standardizing accounting practice that constrained their expert judgment. This shift toward standardized objectivity meant a loss of autonomy and, as Porter (1995) describes, a failure of the accounting profession. Economic quantification developed as "the natural language" (Porter 1995), but also as a strategy for producing uniformity under conditions of intense political conflict. Accounting has long provided a means for monitoring the performances of distant and diverse subordinates. It is evident that the practice of rigorous quantification is in higher demand because subjective discretion is not trusted, whereas 'faith' lies in numbers (Porter 1995).

9 In the following decade, research findings started to circulate across continents and oceans, and quantification served international communication well (Porter 1995).

10 Porter's analysis is of quantification's meaning for Western Europe and North America, in which quantification is suited for far beyond the "boundaries of locality and community" (ix); he thus leaves a significant part of society out of his research.

Quantification and written assessments first emerged in education around the beginning of the 17th century, amidst efforts to control human performance. It was not until the 19th century that these practices were incorporated into the workplace (Hoskins 1996). Today, quantitative authority and its link to accountability and evaluation are so bound up with being modern that we have trouble imagining other forms of coordination and discipline, or other means of creating transparency and accountability (Espeland and Sauder 5). By accountability I refer to Strathern (2000) and the meeting of the financial and the moral in the principles of economic efficiency and ethical practice, and being held responsible. Accountability can be linked to higher education, notably with assessments likened to audit, i.e. reviewing. This audit-driven administrative bloat is found in universities responding to the demand for audit by appointing new quality assurance officers, creating monitoring committees, and inventing structures, evidence systems, etc. (Shore and Wright 2000).11 This form of accountability derived from protocols of financial accountability, which suggests 'internal controls' in the form of monitoring techniques are in place (Strathern 2000). As has become clear, quantification provided authority, but this is authority as Barry Barnes (1984) defines it: not power plus legitimacy, but power minus discretion (23), as will become evident. The collection of quantified information is necessary to produce results and to use them in statistical analysis. The following section describes how refined methods of quantification revolving around statistics and metrics were introduced.

11 Not only were several new layers of administration created, but lecturers and researchers are also spending more time producing auditable records instead of spending that time teaching or doing research (Shore and Wright 2000).


2.2. The Role of Statistics and Metrics

Statistics

Within the scope of this research, regarding the practice of examining and processing large sets of data, the position of statistics is central. Statistics is a branch of mathematics concerned with the collection, classification, analysis, and interpretation of numerical facts, and with drawing inferences on the basis of their quantifiable likelihood (probability) (Business Dictionary 2016). Ultimately, statistics is the mathematical processing of quantified information (numbers). There is a need to understand our relationship with quantification, which is central to such practices of 'ranking', 'accountability' and 'objectivity'. Only by deciphering these associations, and the convergence between quantification, statistics and performance metrics, can we really comprehend what is being done with data and performance metrics.

The social study of the number largely focuses on the situated histories of statistics, regarding large numbers and probability (Porter 1986). The first instance of systematic statistical collection was the Summarisk Tabell, developed in Sweden (Rosling 2011). Since 1749, the Tabell has collected statistics on every birth, marriage, and death, recording information from every parish in Sweden. This was the first time any government could get an accurate picture of its people (Rosling 2011). With states expanding, centralizing and responding to new demands for public services in the 19th century, there was a requirement for better information about their empires (Espeland & Sauder 2007). The result of this climate was an "avalanche" of numbers which profoundly transformed "what we choose to do, who we try to be and what we think of ourselves" (Hacking 3-10).

According to Rosling (2011), it was our past leaders who began collecting quantified information and then using it in statistical analysis and calculation, in the first place in order to monitor us. Rulers and church leaders had extensively tried to "calculate how many men were eligible for conscription or the volume of crops or acreage subject to taxation or tithe" (Espeland & Sauder 4). Revealingly, these social statistics were treated as secrets in such institutions, being used to describe and evaluate the performances of nations (Espeland & Sauder 2007). One can already see an emerging competitive nature due to the statistical and evaluative measuring of nations. Interestingly, this was already prevalent decades prior to commercialization.


In the 18th century, the application of statistics to legal issues such as "crime, prostitution and suicide" (Hacking 3) contrasted with statistics applied to health and morals in the 19th century. With 19th-century progression, measurement found its place in physics – the study of light, sound, heat, electricity, energy, matter – which was facilitated by manufacture in "mining, trade, health, railways, war, empire" (Hacking 3,5). Out of this statistical period, the concept of 'probability' gathered support, with decisions couched in terms of likelihoods being made by officials on "military strategy" and "environmental impact" (4). 'Probability' became telling of forthcoming measurement possibilities, in its capacity to enhance society.

The use of statistics is often linked to developments in probability12 and the overall success and development of nation-states and disciplines (Alonso and Starr 1987). As mentioned, statistics meant the systematic collection of demographic and economic quantified information about states, and gave rise to inferential statistics. Following World War I, the concept of 'probability' developed, by which I refer to the 'likelihoods' and 'chances' assumed as a result of what statistics produce. Not only the 1940s but also the 1950s were times of extreme disorganization and disunity: Europe was literally divided in two and the United States was in the grip of the Cold War (Porter 1995). According to Porter (1995), "the extraordinary modern success of inferential statistics must be understood partly as a response to conditions of mistrust and exposure to outsiders" (200), similar to the history of accounting. Quantification served as a remedy to the conflicts of the twentieth century, because it made observations and experiments repeatable, and hence scientific knowledge was not dependent on faith (Porter 1995).

This probability mathematics can be critiqued in that probabilities are in every case artifacts, created (but not arbitrarily) by instruments and by well-disciplined human labor (Porter 213), and are thus interfered with. Hans Rosling (2011) argues that statistics tell us whether the things we think and believe are actually true. Though according to Ian Hacking (1990), statistics have a manufactured nature and are neither objective nor always truthful, and thus should be thought of as historically constructed. With this, Hacking refers to assertions made upon chance, which are drawn from a period of flux; as outlined previously, nation-states' massive reliance on quantification was the precondition for advances in statistics. It is important for a historical approach to be considered for widely used numerical forms such as statistics in all contexts, as they mediate public understanding and aspiration. Thus, there is a necessity to hone in on the purpose of statistics in order to assess its role and influence on society, politics and the economy as a measurement system.

12 In order to formulate the law of probability for a statistical estimate, one must first agree on certain laws for the mathematical errors of each observation, and then combine them mathematically, from this deducing a law for the statistics being calculated (Desrosieres 1998).

Metrics

As has slowly become clear, with the growing reliance on quantification and statistics' scale of dominance over the centuries, new possibilities for measurement arose. The combination of the roles of quantification and statistics, and the concepts of 'probability' and 'objectivity', has enabled new measurement practices. Stevens (1951) offers a classic definition of measurement simply "as the assignment of numerals to objects or events according to rules" (22), or the action of measuring the size, length or amount of something. Measurement is "the process by [which] infinitely varied observations are reduced to compact descriptions or models that are presumed to represent meaningful regularities in the entities that are observed" (Fiske et al. 181). Primarily, measurement needs numbers and the quantification of information to produce meaningful results.

For the purpose of this thesis, I refer to measurements as metrics, the results obtained from measuring. In this section I isolate metrics, a metric being a statistical formula (Tracy 2016). For example, metrics can be found in market performance, such as customer lifetime value (CLV) or customer retention rates (Donlon 2013). Martin Zwilling alerts us to the significance of the discipline and skill necessary to collect and analyze the relevant business data, known as metrics. Zwilling (2011) cites the key metrics for businesses as 'sales revenue', 'operating productivity' and 'size of growth margin'. The content marketing industry shows further examples of metrics. One example is 'consumption metrics', which look at the number of readers who consume your content, the channels they use, and the frequency of their consumption (Deshpande 2015). For example, for a website or blog one would look at page views, unique visitors, and average time on your website using a tool like Google Analytics.13
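As a concrete sketch of such consumption metrics, the toy visit log below computes page views, unique visitors, and average time on page. The data, field names and computations are illustrative assumptions, not Google Analytics' actual definitions or API:

```python
# Hypothetical visit log: (visitor_id, seconds_spent_on_page)
visits = [
    ("anna", 30), ("ben", 45), ("anna", 60), ("cara", 15), ("ben", 90),
]

page_views = len(visits)                                   # every recorded visit
unique_visitors = len({visitor for visitor, _ in visits})  # distinct visitor ids
avg_seconds = sum(sec for _, sec in visits) / page_views   # mean time on page

print(page_views, unique_visitors, avg_seconds)  # 5 3 48.0
```

The point of the sketch is that each 'metric' is again a simple aggregation over quantified records, echoing the chapter's argument that metrics are statistical formulae applied to collected data.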

Sociologist David Beer asserts metrics to be a "form of measurement and pattern recognition" relating to "social ordering" (Beer 1,2). In "Productive measures: culture and measurement in the context of everyday neoliberalism" (2015), Beer specifically identifies metrics as central to the performance of football players and the recruitment and organization of teams. The role of metrics is, in essence, to manage performance. In this instance, performance refers to how well a sportsman plays, in terms of skill and effort. Beer's work on metrics is closely related to contemporary society, especially as his research focuses on football and the current climate of 'big data'. In a blog for the London School of Economics, Beer describes the productive role of metrics in their ability to "vindicate and limit, cajole and incentivize, and legitimate and justify" (2015). Metrics accomplish this as a system of measurement. Metrics are frequently used to manage performance, to facilitate competition, and to judge or compare individuals with others. As established already, statistics similarly allow for 'ordering' and 'probability' counts.

Metrics allow the collection of 'meaningful data' for trending14 and analysis (SAS 2016). Today, one finds such metrics being used in a variety of industries, such as business management and recruitment. Metrics have allowed management teams to fine-tune, focus people and provide organizational efficiency (Kua 2013). These metrics can be found in the financial sector, relating to 'profit'. Analyzing 'gross and net profit margins' and measuring 'cost effectiveness' by finding the best ways to reduce and manage costs both allow for financial improvement. Metrics are a form of technology: they produce information. However, as development consultant Patrick Kua (2013) notes, with a love of numbers came metrics with limitations, in that they detract from what is present. See Appendix A for a further example of a business metric. Financial-sector metrics, generated by statistics, promote the "classifications and enumeration" of individuals (Hacking 3) through measurement. For example, Beer signals the micromanagement of footballers by metrics. 'The Secret Footballer'15 describes: "[I]n open play, a huge amount of study, from my own experience at clubs, is devoted to the calculation of what are described as final third entries, penalty box entries and regains of the ball in the final third16" (Beer 6). Beer adds that this statistical analysis extracted from football games is then used to "coach and shape playing practices" (7). Statistical use in "income, wealth, educational attainment and scores on standardized tests", in which individuals are judged "higher" or "lower", has a hierarchical relationship in which classification is inevitable (Espeland and Stevens 2008, 410). Metrics facilitate classification through their statistical organization and subjectivity. It is significant to reflect on how metrics are regularly used to facilitate competition, manage performance, and judge or compare what we do with others; thus we should question the central dynamic of metrics in the current technological climate we live in (British Politics and Policy/LSE).

To further the research, I now align specifically with 'performance metrics', found today wherever individuals are assessed on their performance, referring to efficiency and speed (Eusgeld et al. 2008). Performance metrics emerged in 1943, when the International City Management Association published an article on measuring the performance of municipal activities. During the Kennedy administration, systems analysis processes were introduced to the Department of Defense, which fueled interest in performance measurement in the federal government (Poister 2008). Performance measurement was reenergized as the demands for holding government entities accountable to their publics increased. A number of resolutions were passed by associations such as the National Academy of Public Administration, urging governments to set goals and measure their performance, and in 1993 the Government Performance and Results Act was passed by the federal government, requiring its agencies to become involved in strategic planning, goal-setting, and performance measurement (Poister 2008).

13 Google Analytics is a freemium web analytics service offered by Google that tracks and reports website traffic (Google Analytics 360).

14 Trend analysis is the practice of collecting quantified information from multiple time periods and plotting it on a horizontal line. The intent of this analysis is to spot actionable patterns in the presented information (Bragg 2013).
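The trending mentioned in footnote 14 can be sketched in a few lines: collect a metric over several periods and compare period over period to spot a pattern. The seasons and figures below are invented for illustration, not drawn from any real player's data:

```python
# Hypothetical season-by-season pass-completion percentages for one player
seasons = ["2012/13", "2013/14", "2014/15", "2015/16"]
completion = [78.0, 80.5, 79.0, 83.5]

# Period-over-period change: the "actionable pattern" a trend analyst looks for
changes = [round(curr - prev, 1) for prev, curr in zip(completion, completion[1:])]

for season, change in zip(seasons[1:], changes):
    direction = "up" if change > 0 else "down"
    print(f"{season}: {direction} {abs(change)} points")
```

A recruitment analyst reading such a series would treat the overall upward drift, rather than any single season's figure, as the signal.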

Furthermore, performance metrics are evident in education. Within universities, metrics have been widely adopted, not merely for institutional benchmarking but also, increasingly, for managing the performance of academics (Wilsdon et al. 2015). The Metric Tide: Report of the Independent Review of the Role of Metrics in Research Assessment and Management (2015) attributes this state of affairs to the increasing pressure on universities to be "more accountable to government and public funders of research" (Wilsdon 2015).17 The use of metrics as targets for individual academics to achieve is another major concern of commentators.18 There are concerns that such targets can even undermine the mental health of those struggling to meet the goals. The latter argument was voiced particularly vociferously at the end of last year with the case of Stefan Grimm.19 Queen Mary and King's College London have both run into controversy in recent years for using metrics, including grant income, to select academics for redundancy. One can note that metrics play a variety of roles, skewing individuals' behaviour but also structuring and controlling lives.

15 'The Secret Footballer' is a pseudonym for an anonymous Premier League football player, who regularly contributes to news articles.

16 A 'final third' is the third of the football field that contains the other team's goal. A 'penalty box' is the rectangular area marked out in front of each goal, within which a foul by a defender involves the award of a penalty kick and outside which the goalkeeper is not allowed to handle the ball. These terms are used when discussing football tactics and strategy.

With the expansion of public measures, one can observe that the relationships between metrics and our daily lives, from business to education, are not dissimilar. The latter illustrate how our lives are shaped by statistical metrics (Beer 2015). Espeland and Sauder (2007) confirm a chronic tension of measures, as has been conveyed: a tension between "validity" and "accountability" (35). Whether a scientific measure is legitimate or not, and who can be held accountable for quantified measures, is the result of metrics. Mass-scale information collection is in high demand; metrics have proven to derive from a desire to reach one's optimum performance level, which statistical advancements are facilitating. There is a growing argument to suggest a blurring between object and measurement, threatening the validity of measures. As is evident with statistical influence, metrics ascend from a sociotechnical and historical context, deriving from technologies that were devised with a certain purpose, and objective, in mind. With technological advancements, statistics have developed, as have the metrics they create; with this began an ongoing move from statistics to data.

17Also due to the financial pressures imposed on institutions by constrained funding and

globalization.

18 Critics fear that this increases incentives to cut corners or to cheat outright, undermining the integrity of research literature.

19 Grimm, Professor of Toxicology at Imperial College London, committed suicide after being told that he was failing to bring in the level of grant income expected of an Imperial professor.


2.3. Data and The Databases

The last section showed a demand for statistical analysis that allows for meticulous metrics. This demand has accelerated because nowadays, through statistics,


one can turn ‘data’ into information. A dictionary defines data as “facts or figures from which conclusions may be drawn”, but data are not facts. The patterns, associations, or relationships among all this data can provide information. For example, analysis of retail point-of-sale transaction data can yield information on which products are selling and when (Han and Kamber 2000). The key shift from statistics to data comes in methods of collection. The advent of Web 2.020 resulted in an explosion of social media, photos and video, in addition to the data gathered by sensing devices including smartphones. Consider Facebook’s development as a platform, a process Anne Helmond (2015) calls platformization: the rise of the platform as the dominant infrastructural and economic model of the social web. Data became ‘big’ through the transformation of the networked digital technologies of the internet. These developments created the technological conditions of possibility for big data to emerge.
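The move from raw data to information described above can be sketched in a few lines of code. This is a minimal illustration with invented records and field names (not drawn from Han and Kamber): simple aggregation over hypothetical point-of-sale transactions already yields “information” about which products sell and when.

```python
from collections import Counter

# Hypothetical point-of-sale transaction records (invented for illustration).
transactions = [
    {"product": "umbrella", "hour": 17},
    {"product": "umbrella", "hour": 18},
    {"product": "sunscreen", "hour": 12},
    {"product": "umbrella", "hour": 17},
]

# Raw records are not yet information; counting reveals a pattern.
sales_by_product = Counter(t["product"] for t in transactions)
sales_by_hour = Counter(t["hour"] for t in transactions)

print(sales_by_product.most_common(1))  # best-selling product and its count
print(sales_by_hour.most_common(1))     # busiest hour and its count
```

The data themselves say nothing; the counts, relationships and rankings derived from them are what constitute information in the sense used here.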

As alluded to in the introduction, Big Data concerns statistical information: huge in volume, consisting of terabytes or petabytes of data; high in velocity, being created in or near real-time; diverse in variety; and exhaustive in scope (Kitchin 2013). Big data describes data sets so large and so complex that they require rapid processing21. With this development came the recognition and necessity of big data analytics. New methods of statistical analysis have transformed how ‘new’ statistics are being used.

Whereas, as illustrated earlier in this chapter, numbers once merely formed ‘marks’ on home addresses, the accelerated move from statistics to data analytics reflects a shift in the technological infrastructures of measurement and data collection. With today’s technology, it is possible to analyze your data and get answers from it almost immediately. But even in the 1950s, decades before anyone uttered the term “big data”, businesses were using basic analytics (essentially numbers in a spreadsheet that were manually examined) to uncover insights and trends (SAS 2016).

An examination of ‘the database’ can better explain a shift in technological infrastructures of measurement and analysis. A database can be defined as “a collection of information that is organized so that it can easily be accessed, managed and updated” (Rouse 2006). Regarding ‘databases’ that existed at the end of the 19th

20 Web 2.0 – “the network as platform, spanning all connected devices and making the most of the intrinsic advantages of that platform: delivering software as a continually-updated service that gets better the more people use it, consuming and remixing data from multiple sources, including individual users, while providing their own data and services in a form that allows remixing by others, creating network effects through an “architecture of participation”” (O’Reilly 2005)


century, public and private bureaucratic institutions were growing increasingly large and geographically dispersed, creating a rising demand for new systems to manage information (Driscoll 7). In this early period, analog data collection began with the ‘paper clip’ and ‘filing cabinet’ (Beniger 1986). Ahead of the US census of 1890, officials anticipated the volume of data that would soon be coming in from around the country. It was necessary to replace the “unwieldy handwritten paper rolls”, and ideas for information processing were suggested (Driscoll 8).

Herman Hollerith22 introduced an electromechanical machine for tabulating information, which was then recorded onto punched cards. Punched card machines provided random access to any single record, unlike the paper roll, which needed to be parsed serially from the beginning in order to reach a given record (Driscoll 2012). The need for random access and an expressive ad hoc query method continued to characterize the development of information processing systems; the punched cards became in effect not only a copy of the information but its principal medium23. So much so that punch card tabulators were soon adopted by bureaucracies throughout the federal government, and so began the mobilization for large-scale information processing.

The arrival of mass-scale information processing was urged on in the 1930s with the nationwide formation of social programs (Driscoll 2012). For example, the insurance industry largely depended on its ability to run statistical analyses on large populations; telephone and utilities companies used tabulators to bill large numbers of customers for small amounts; and manufacturing first developed real-time accounting methods for tracking fluctuating costs and sales across large multidivisional corporate structures (Beniger 1986). Fear and anxiety marked much of the popular response to database technology in the early period.24 Punched card tabulators and later programmable computers were essential tools for central economic planning.

22 U.S. Census employee Herman Hollerith drew inspiration for tabulating information onto punched cards from the railway ticket system at the time. Hollerith proposed that each record in the 1890 census be stored on an individual punched card.

23 This was emphasized by the fact that punch card tabulators were soon adopted by bureaucracies throughout the federal government, and so began the mobilization for large-scale information processing.

24 The punched card provided material evidence of the dehumanizing power of the bureaucracy; rather than communicate with individuals, in all their messiness, large bureaucracies preferred inscrutable slips of perforated paper. Such systems were used to track taxes, insurance rates, debt, etc. (Driscoll 10)


In the late 1950s early programmable computers further abetted data measurement, collection and analysis (Driscoll 2012). During the intervening decades, mechanical tabulators and magnetic storage were replaced by general purpose computers (Driscoll). IBM25 dedicated significant resources to the development and design of new data storage and information processing technologies. Prevailing database management systems required some facility with a programming language like SQL26 in order to make queries about the information they contained. By the 1980s organizations found themselves with databases that had grown to the point where the amount of data they held had become too large for humans to be able to analyze it on their own (Finlay 2014).
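The “facility with a programming language like SQL” that such database management systems demanded can be sketched with an in-memory database. The table, columns and data below are invented for illustration; the point is only the pattern of declaring a schema and then posing an ad hoc query against it.

```python
import sqlite3

# An in-memory relational database standing in for the database management
# systems described above (the schema and rows are invented examples).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (name TEXT, city TEXT, balance REAL)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?, ?)",
    [("Ada", "London", 120.0), ("Ben", "Leeds", 15.5), ("Cleo", "London", 80.0)],
)

# An ad hoc query: which London customers hold a balance over 50?
rows = conn.execute(
    "SELECT name FROM customers WHERE city = ? AND balance > ? ORDER BY name",
    ("London", 50.0),
).fetchall()
print(rows)  # [('Ada',), ('Cleo',)]
```

Posing even this simple question required exactly the kind of specialist fluency Driscoll describes, which is why, as the next paragraph notes, personal database software later tried to put such capabilities in non-specialist hands.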

In the 1980s came the popularization of personal computing, and databases were imagined in one of two forms: remote collections of information accessed via a telecommunication system, or locally produced, grassroots systems for the storage, analysis and production of information. The personal database ran on a single-user home computer with no hard disk. Personal database software invited home and small-business users to produce their own databases. It was from this hands-on experience with small-scale databases that proponents of the computing counterculture hoped non-specialists might develop a critique of much larger information processing systems. With the introduction of analog and later digital computers, mass-scale computational processes, in calculation and information processing, were introduced.

Big data became big with the process of digitization, “converting information into a digital format”, which progressed the organization of information into discrete units of data (Tech Target 2007). The digital revolution can be twinned with ICT, and with that came the dawn of ‘network computing’. As defined by Techopedia, “Network computing refers to the use of computers and other devices in a linked network, rather than as unconnected, stand-alone devices” (2016). ARPAnet27 was the first large-scale, general-purpose computer network to connect different kinds of computers together (Lehtinen et al. 2006). With this advent, further technological transformation in network

25 IBM (International Business Machines) grew out of Hollerith’s firm, the Computing-Tabulating-Recording Company (CTR). In 1924 CTR was renamed IBM.

26 Structured Query Language – “an accessible language for expressing queries to an online database” (Driscoll 16).

27 ARPAnet was funded by the Department of Defense Advanced Research Projects Agency (ARPA, now DARPA).


computing has influenced how individuals collect and measure data. Nowadays many larger business networks also share hard drive space, where any networked computer has access to the same data through a server or other hardware set-up. Such innovative yet multifaceted infrastructures have shifted the way individuals collect, and thus scrutinize, the statistical information retrieved. More recent developments have made network computing more sophisticated but also more ambiguous.

A key development was the process of network ‘virtualization’28, which allowed for massive scalability, giving individuals virtually unlimited resources and enabling on-demand network access to a shared pool of resources (Zhao et al. 2014). Starting in the mid 2000s, the computer utility model became fashionable again, under the name “The Cloud”2930. Cloud computing is the perfect complement to big data since it offers unlimited storage space on demand; however, it introduces concerns surrounding data protection and privacy infringement (Millard et al. 2012). Big Data became big because of the transformation of networked digital technologies, from the introduction of digital computers to Web 2.0 and the platformization of the web. With the rise of such powerful data infrastructures, the constitutive and productive power that these systems of measure may have comes into question (Beer 4).

The advancements listed above have transformed data analysis, introducing the sought-after method of data analytics. SAS (Statistical Analysis System), a leading data analytics company, defines data analytics as “examining large amounts of data to uncover hidden patterns, correlations and other insights” (SAS 2016). The new benefits that big data analytics brings in speed and efficiency are desirable. A few years ago a business would have gathered information, run analytics and unearthed information that could be used for future decisions; today that business can identify insights for immediate decisions. The ability to work faster and stay agile gives organizations a competitive edge they did not have before (SAS 2016). Competitiveness, as noted earlier, is a key feature of performance metrics.

28 Virtualization is supplementary to ‘cloud computing’; providing flexible, location-independent access to computing resources that are quickly and seamlessly allocated or released in relation to demand; the services are abstracted and typically virtualized, generally being located from a pool shared as a fungible resource with other individuals (Millard et al. 2012).

29 ‘The Cloud’ referring to a ‘network’ diagram of the computers and storage devices there and a major trend in both networking and computing today.

30 The term ‘The Cloud’ really caught on when Amazon Web services launched its Elastic Compute Cloud (EC2) in 2006.


‘Data analytics’ differs from statistical analysis in that analytics involves extracting valuable information out of data, a process also referred to as ‘data mining’. At this point, ‘data mining’31 is a key concept for this research, as it is a fundamental example of data analytics in sport, being a set of techniques used to find patterns and make predictions in many fields to study performances in a variety of ways (Schumaker et al. 2010). It is important to note, though, that “data mining” is a broad umbrella term, used to describe many aspects of data processing.

Data mining can be used for prediction through a rigorous procedure: from the data processing phase, using specialized hardware such as sensor networks to collect documents, via feature extraction and data cleaning, to the analytical phase, designing methods and algorithms for effective analysis via pattern mining, data clustering, etc. (Aggarwal 2015). The previous section on statistics drawn out of quantification alerted us to metrics related to “predictability”, commonly referred to as ‘predictive analytics’. From elections to sporting events to the stock market, one can find countless opinions on what the future will bring. But without supporting data, any opinion is nothing more than an educated guess. How do you go from a guess to a prediction? By using data to inform decisions about the future (SAS 2016). A predictive model is a mathematical function that is able to learn the mapping between a set of input data variables, usually bundled into a record, and a response or target variable (Guazzelli 2012). One of the earliest applications of predictive analytics was credit scoring, first used by the mail order industry in the 1950s to decide who to give credit to (Finlay 2014). This confirms that predictive analytics was a practice before the advent of digital technologies and big data. By the mid 1980s credit scoring had become the primary decision-making tool across the financial services industry (2014). Currently, predictive analytics has transformed and continues to transform the marketing industry.32
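Guazzelli’s definition can be made concrete with the simplest possible predictive model: a straight line fitted by least squares, which learns the mapping from one input variable to a target variable. The numbers below are invented toy data, loosely evoking credit scoring (income against a repayment score); they are an assumption for illustration, not a real scoring method.

```python
def fit_line(xs, ys):
    """Least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (
        sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
        / sum((x - mean_x) ** 2 for x in xs)
    )
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Historical records: an input variable and the observed target variable
# (invented figures: income in thousands -> repayment score).
incomes = [20.0, 30.0, 40.0, 50.0]
scores = [40.0, 50.0, 60.0, 70.0]

slope, intercept = fit_line(incomes, scores)

def predict(x):
    """The learned mapping: estimate the target for an unseen input."""
    return slope * x + intercept

print(predict(35.0))  # 55.0 with this toy data
```

The “learning” here is nothing more than estimating the line’s parameters from past records; real credit-scoring models use many input variables and richer functions, but the principle of mapping inputs to a target is the same.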

Today, new measurements that are intended to limit possibilities and to reduce the chances of error are seen as objective and are often trusted ahead of human agency and discretion (Beer 2016). Earlier sections revealed quantification and statistics to have probability prospects, which in turn questioned their root cause

31 Charu Aggarwal (2015), an extensively published IBM researcher in data mining systems, describes the procedure as “the study of collecting, cleaning, processing, analyzing, gaining useful insights from data” (1).

32 From telecommunications to education to gaming and beyond, organizations are leveraging even more diverse data and crunching this to optimize marketing focus and spend (Wallace 2015).


and objectivity. Nowadays we are thrust into a big data climate in which predictive analytics is far more influential in terms of data analysis and the metrics now created. This is largely due to the economic role of big data, in its explanatory and predictive power of large data sets (Driscoll 2012).

Big Data Economy

The development of the new database technologies cited above is driven by the demands of extremely large data sets, especially those produced by highly centralized web services such as Google and Facebook (Driscoll 2012). Following the shift from statistics to big data, new foundations for measurement and data collection formed, and big data began to be used in the commercial world. “Big Data” has quickly become a buzzword, but it encapsulates the dramatic rise of statistical and computational technology that allows for the collection and analysis of data on everything from industrial systems to our online interactions (Friedman 2015). Wal-Mart, a retail giant, handles more than 1 million customer transactions every hour, feeding databases estimated at more than 2.5 petabytes – the equivalent of 167 times the books in America’s Library of Congress (Cukier 2010).33 These examples tell the same story: the world contains an unimaginably vast amount of digital information. In turn, this makes it possible to do many things that previously could not be done: spot business trends, prevent diseases, combat crime and so on (Cukier). Data can now be used to unlock new sources of economic value.

Big Data is already an integral part of every sector in the global economy, as essential a factor of production as physical and human capital. Much of our modern economic activity simply could not function without it. The effect of big data is already evident in manufacturing, finance, and especially retail, where companies have been assembling increasingly sophisticated consumer profiles (Bradshaw 2013).34 The economic and business possibilities of Big Data and its broader significance for social and technological advances are critical issues when it comes to the new capacities of measurement.

33 Facebook, a social-networking website, is home to 40 billion photos. And decoding the human genome involves analyzing 3 billion base pairs – which took ten years the first time it was done, in 2003, but can now be achieved in one week (Cukier 2010)

34 British retailer Tesco collects 1.5 billion pieces of data on its customers shopping habits every month and uses that information to adjust prices and send targeted coupons (Schumpeter 2013).


The business of information management – helping organizations to make sense of their proliferating data – is growing by leaps and bounds (Cukier 2010).35 This industry is estimated to be worth more than $100 billion and growing at almost 10% a year, roughly twice as fast as the software business as a whole (2010). As the capabilities of digital devices soar and prices plummet, sensors and gadgets are digitizing lots of information that was previously unavailable. And many more people have access to far more powerful tools.

Big data has infiltrated the key economic segment of marketing by allowing marketers to identify, measure and manage in detail. Advanced data collection and analysis has led to insights that create immediate performance improvements, assisted by metrics (Bodensteiner 2014). From media companies, to startups, to airlines, to sponsorship in sport, data-driven marketing has delivered impressive results in terms of engagement and growth (Cukier 2010).36 The same idea is being extended to hotel rooms, cars and similar items. Personal-finance websites and banks are aggregating their customer data to show up macroeconomic trends, which may develop into ancillary businesses in their own right (Cukier 2010). Economic production used to be based in the factory, where managers pored over every machine and process to make it more efficient. Now statisticians mine the information output of the business for new ideas (Cukier 2010).

In recent years, computational technology has broadened the scope of statistical techniques available to us (Mayer-Schonberger and Cukier 2013). It is now possible to combine quantitative methods and organize the data in such a way as to observe shifts at individual and aggregate levels (Gray et al. 2015). Big data is transforming how policymakers and business leaders make decisions. Vast stores of data also open new avenues to researchers seeking to confront and understand the economic and statistical challenges of our time (Friedman 2015). As Web 2.0 took flight, more and more data was created on a daily basis, and innovative startups slowly

35 In recent years Oracle, IBM, Microsoft and SAP between them have spent more than $15billion on buying software firms specializing in data management and analytics.

36For example, Farecast, a part of Microsoft's search engine Bing, can advise customers whether to buy an airline ticket now or wait for the price to come down by examining 225 billion flight and price records (Cukier 2010).


began to dig into this massive amount of data, while governments also started working on Big Data projects37 (Rijmenam 2016).

Big Data has infiltrated the economy via the employment and marketing sectors. With the expansion of big data, understanding and processing data becomes a daunting task without the right support. As businesses expanded, it made sense to hire full-time data directors who could produce insightful and accurate analysis in a timely fashion. Louis Columbus reported that demand for Computer Systems Analysts with big data expertise increased by 89.9% in the preceding twelve months, and by 85.4% for Computer and Information Research Scientists (Forbes 2014). Statistician is the new ‘it’ job. Prominent research consultancy firm Gartner revealed in its 2015 Data-Driven Marketing Survey that marketers expect most of their decisions to be quantitatively driven by 2017. As a result, more than 50% of companies plan to grow their analytics teams. This stems from a necessity to identify and cultivate new sources in an already competitive market; roles such as data architect, data visualization specialist and data scientist are among the most desirable (Pemberton 2016).38 Big data is infiltrating not only the economy but also education, via the growth of courses and degrees specifically related to data (Pemberton 2016), and so also the future of the economy.

The role of numbers today is often lost in words like statistics and data, due to the vastness of technological evolution and the frequency with which these terms are brandished interchangeably in broadcast media and journalism. What has been apparent is a shift in data aggregation practices that are infiltrating boundaries between commercial, scientific, technological and even regulatory domains of expertise. Digital storage has become increasingly cheap over the past decade with advances in both hardware (magnetic disk technology) and online storage models (cloud computing). As is apparent, data is important as it accelerates the processes of commodification and profit seeking. With statistics, data, quantification and metrics, it is apparent we live in an age where there is an effort to capture everything in numbers and to compare, evaluate and act on those numbers. What has been significant in this

37 In 2009 the Indian government decided to take an iris scan, fingerprint and photograph of all of its 1.2 billion inhabitants. All this data is stored in the largest biometric database in the world (Rijmenam 2016).

38 Gartner predicts that 6 million jobs in the United States will be generated by the big data-driven economy over the next five years (Gartner 2012).


chapter is the growing desire for a competitive edge, in all industries, which is supported by the increased use of metrics. With this new infrastructure of measurement, what are the consequences of these advanced metrics in terms of manipulation? Following on from the significant role of big data in the economy, I now continue this thesis by establishing how the sports industry explicates the impact of big data analytics and performance metrics.

Chapter 3. The Commercialisation of Sports and Statistics

3.1. The Early Years

Following on from the role of the economy in data analytics, the development of the commercialisation of the sports industry will be traced, to understand how, fundamentally, sports leagues were turned into industries. This transition is a pivotal starting point for an assessment of the current ‘big data’ influence on the sports


industry, as the production of statistics in sport derived from the commercialization of sport we see today. I will begin with a history of the rise of commerce and media attention in sport, with a focus on the United States and the United Kingdom. I focus on these countries due to their contrast in commercialization and historical lineage, the USA having a rich history of sports commodification and the UK a steadier yet influential commercialisation. Within these two countries I will focus on the sports of baseball, basketball, American football and football (soccer), due to their large participation and popularity rates and their significance in sport’s historical commercial development. Throughout this trajectory I chronologically describe how the use of data and statistical analysis in sports helped to turn sport into an industry, i.e. helped commercialize sports, but additionally how the ongoing commercialisation of sports, i.e. its “becoming industry”, affected the importance and value of data and its methods of analysis.

Sport has not always been the billion-dollar money maker it currently is. As with every industry, inevitably there is a point of transfer where capitalism seeps through. Trevor Slack (2004) notes how sport is now a highly commercialized commodity.39 Sports have come a long way from the 1900s and folk games: earlier times when sport meant pastimes and recreation, such as hunting, falconry and fishing (Nauright & Parrish 2012). Nowadays, sport is regarded as a modern spectacle.

Today, sporting leagues have become industries in themselves. When did this all begin? By 1958, baseball’s professionalization was evident in ticket prices: 50 cents for adults and 25 cents for high school students to witness the game between the Los Angeles Dodgers and the Palm Springs Merchants (The Desert Sun 1958). Baseball was the first sport to experience a transition into professionalization; dubbed the “national pastime”, it dominated the sporting scene in the early 1900s. It was not simply a sport of fun; baseball’s ideology fit prevailing values and beliefs.40 It was in this climate that baseball transformed in the latter half of the nineteenth century from a recreational game into a professional sport. The drive for people to begin paying money to see players and teams compete stemmed from a swath of

39 Trevor Slack, Professor of Physical Education and Sport at the University of Alberta, specialized in and researched the field of sport management, including in “The Commercialization of Sport” (2004). He is now deceased.

40 Baseball is often considered a sport of pastoral American origins that improved health, character and morality; taught traditional rural values; and promoted social democracy and social integration.


Americans becoming wealthier and players becoming better skilled at their craft (Ross 2016). With nineteenth-century sport being primarily participatory, the era’s most significant development was the rise of professional spectator sports, a product of the later commercialization of leisure and the large potential audiences created by urbanization (Adelman 1990).

In terms of professionalization in America, basketball players began receiving salaries even earlier in their sport’s development than baseball players, whose clubs were still touting tickets to spectators. The first competitive basketball league was played between the larger east coast cities like Philadelphia, New York and Boston. On the 7th of November 1896 the first known professional basketball game was played in New Jersey between the Brooklyn YMCA and the Trenton YMCA.41 Following this match, for the first time each participating player got a $15 fee except Fred Cooper, who got $16, making him the first highest-paid basketball player (Peterson 1990). American journalist and writer Robert Peterson describes how the exclusive game would have had “700 people jammed into the hall with the box office taking $150”, in addition to the team having had to pay the hall rental and fees for the referee and umpire (Peterson 36).

Early salary distribution signaled very early on sport’s inevitable commercial future. Industry developments impacted player earnings and competition excitement, born out of the individuals working in and running the stadiums, coaches and trainers included, who needed to be paid. This began the stream of business into sports. When somebody is “The Best in the World” in their field, ordinary people want to watch those people perform, whether they are actors, musicians or athletes. Subsequently basketball began to develop as an industry in itself: decades after the historic match featuring Cooper, the National Basketball League was announced in 1937 (Nelson 2009). The NBL lasted five seasons, and in 1946 the Basketball Association of America was created. The top teams were made professional.42 It was then three years later that the league became the NBA, the National Basketball Association (Nelson 2009).

Returning to baseball and its professionalization, a key component to this was the existence of mass media that promoted and carefully analyzed the sport. The

41 The match was played at the Trenton masonic temple of New Jersey and it was here, similarly to baseball, an entry fee was charged to people who came to watch the game.

42 The very first professional basketball game was played in Toronto, Canada, but with American teams.


media notwithstanding, one of the most effective tools for expanding the popularity of baseball was the Civil War: “both confederate and Union soldiers played the game among themselves” (Ross 6). Following the war, with mounting interest in information about the game, sports media expanded and multiplied. The National League (the forerunner of what is now commonly referred to as MLB, Major League Baseball) was founded in 1876 and, as an industry, was the first association of professional baseball clubs to become consistently profitable (Ross xvi). Even earlier, in March 1869, the baseball team the Cincinnati Red Stockings had become the first sports team in history to have each member of its team on salary (Hiskey 2011). A key development throughout the 1900s came with the rise in attendance at games and a growth in club profits that led to players demanding better salaries. Attendance in the league doubled between 1903 and 1908, and by 1913 more than “300 cities had pro baseball teams” and the game had “gained social respectability”, a status exemplified from 1909 by the practice of the President starting each season by ceremonially pitching the first ball (Smart 51).

Statistics in Sport

Statistics has a long history in sport, existing before the progression of commercialization discussed above. For the greater part of the century a myriad of statistics has been kept as records of sports events, though the statistics themselves were often taken for granted and rarely questioned to the level they are today (Schumaker et al. 34). Henry Chadwick, a 19th-century sportswriter and statistician, is credited as one of, if not the, earliest statisticians of sport, with many of today’s familiar statistics, e.g. batting average and earned run average, owing their existence to him (Lewis 2003). In the US, baseball was the first sport to adopt heavy statistical use for insight, with Chadwick being the preeminent writer on baseball for over half a century and developing many of the standards by which we evaluate players and teams today. This is noteworthy given that statistics originated in Sweden and then other European countries, only later making it to the US.

Researchers at SABR, the Society for American Baseball Research, argue that Chadwick was a key pioneer, particularly in the invention of box scores. Box score43 data analytics is a fundamental aspect of sports data journalism, in the USA, and of statistical analysis (SABR). However, Schumaker et al. (2010) insist Chadwick had an inadequate understanding of baseball and based many of his baseball statistics on his experience with the game of cricket.44 In this sense, it is key to mention, in the case of the historical use of statistics in baseball, the imprecise nature of its original usage. The problem with a traditional formula lies in what the statistic is intended to measure. For example, the rebound statistic, or the number of times a player received the ball after a missed shot attempt, did not imply that points would be scored. This was traditional statistics, with basketball experts at the time going on to use these statistics as “measures of performance” (Schumaker et al. 35). The traditional formulae used for this statistical analysis lacked the realization that there are some problems that cannot be answered through statistical examination alone (Schumaker et al. 2010). The question of precision is always present when regarding decision making based on statistics, though someone like statistician Hans Rosling (2011) would affirm the accuracy of statistics and the potential of their ability to measure.
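Chadwick’s batting average illustrates just how simple these early performance metrics are: hits divided by at-bats. The figures below are invented for illustration; the point is that the metric summarizes performance as a single ratio, which is precisely what makes it both convenient and, as argued above, potentially imprecise about what it actually measures.

```python
def batting_average(hits, at_bats):
    """Batting average as popularized by Chadwick: hits per at-bat."""
    if at_bats == 0:
        return 0.0
    return hits / at_bats

# Invented example: 27 hits in 90 at-bats gives a .300 average.
avg = batting_average(27, 90)
print(f"{avg:.3f}")  # 0.300
```

Like the rebound statistic discussed above, the ratio says nothing about context (who was pitching, when the hits came), which is exactly the kind of limitation later analytics sought to address.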

3.2. Media Rights and Sponsorship

Media Rights

The sports explosion was directly abetted by the technological revolution. Communication innovations like telegraphy and telephony helped newspapers report events at distant locations. In the mid-1890s The New York World introduced the first sports section (Juergens 119).45 Baseball began to further resemble a business in the 1860s following the hiring of the first paid players, the opening of Brooklyn's Union Grounds, and the 1869 national tour of the all-salaried Cincinnati Red Stockings. Spectator sports grew rapidly in the prosperous 1950s and 1960s. There were more major sports, the number of franchises rose, and television enabled millions to watch live events. Air travel facilitated major league baseball's opening up of new markets in 1953.46 This expansion was accompanied by the construction of arenas and stadiums by local governments to keep or attract sports franchises, thus increasing the need for ticket sales (Siegfried and Zimbalist 2000).

43 A box score is a summary of the results from a sports competition, usually listing the game score as well as individual and team achievements in the game, and commonly used in baseball and American Football (Schumaker et al. 2010).

44 This explains why walks (i.e. advancing to a base without a hit) are not included in these formulae: the walk had no equivalency in cricket.

45 Daily coverage was supplemented by weeklies beginning with the American Turf Register and Sporting Magazine (1829) and William T. Porter's urbane Spirit of the Times (1831), which promoted angling and horseracing.

Advances in technology have revolutionized sports. In the 1920s, for example, radio stations began airing boxing matches, bringing the sounds of live competition to millions of people for the first time (Lehew 2015). By the 1940s, television networks were broadcasting games, and fans could actually see their favorite athletes run, jump, and hit from the comfort of their homes. As the sports media grew, so did our ability to engage with the competition we crave. Television broadcasts promoted growing interest in college football and created a huge boom in professional football during the 1960s. Roone Arledge47 (1931-2002) was a key figure during television's infiltration into sport. Media moguls regard Arledge as one of the most important behind-the-scenes figures in the television coverage of major events of the last half century, from the Olympics to the boxing matches of Muhammad Ali in the 1960s (New York Times 2002). Media experts would argue that Arledge's most significant achievement was leading sports programming out of the limited window of weekend television coverage, which he did for ABC first with the Olympics from Mexico City in 1968 and then with "Monday Night Football" in 1970 (Meserole 2002). Arledge's actions paved the way for sports events of all kinds to be moved into prime time, which snowballed into higher advertising and sponsorship revenue, further fuelling the commercialization of sport. Reflecting Arledge's impact, only a year after his success baseball moved one game of the World Series into the night hours; in baseball's entire history, all the games had been played during the day. Only a few years later, all the World Series games were at night as networks recognized the opportunity to get

46 Air travel facilitated expansion: the Boston Braves moved to Milwaukee, and again five years later the New York Giants and Dodgers moved to the West Coast. The thirty teams in Major League Baseball and the thirty-nine in the National Basketball Association (founded in 1949) were located throughout the country.
