
From Linear to Non-Linear Television: An analysis on the shift of Audience Measurement Technologies


Academic year: 2021




William F. Ruiz Richter
Student Number: 11595205

Wednesday August 14th, 2019

From Linear to Non-Linear Television:

An analysis on the shift of Audience Measurement Technologies

UNIVERSITEIT VAN AMSTERDAM MA TV AND CROSS MEDIA CULTURE

Supervisor: Sudeep Dasgupta Second Reader: Toni Pape


Contents

1 Introduction
2 Televisual Flow and the Implementation of Metadata
  2.1 Televisual Flow across the Eras
  2.2 Metadata across the Eras
  2.3 Time in Television
  2.4 Conclusions
3 Audience Measurement Technologies in Linear Television
  3.1 The Broadcasting Era (1920s – 1950s)
    3.1.2 The Early Audience Flow
  3.2 The Network Era (1952 – 1980s)
  3.3 The Multicast Era (1986 – 1996)
    3.3.1 The Set-Top Box and the Integration of AMTs
  3.4 The Multicast Era (1986 – 1996)
4 Audience Measurement Technologies in Non-Linear Television
  4.1 The Multicast Era (1996 – 2000)
    4.1.2 Video on Demand (VoD)
    4.1.3 The Significance of Timeshift Devices to AMTs
  4.2 The Matrix Era (2000 – present)
    4.2.1 Television in Modern Times
    4.2.2 The Significance of Metadata in Modern Television
    4.2.3 The Evolution of AMTs in the Non-Linear Era
    4.2.4 Modern AMTs and Their Commercial Importance
    4.2.5 From Catch-up to Matrix Media
    4.2.6 The Content Shift
    4.2.7 Licensing Rights in the Modern Television Landscape
  4.6 Conclusions on Non-Linear Flow
5 Conclusions


“Television is dead”... “Linear Television is obsolete”... “Audience ratings are useless”... Such statements are currently prevalent in Media Studies discussions of television. The Non-Linear era of television is now established, and streaming has become the most popular and widespread means of access to television programs.

With this research paper, I attempt to explore the changes in Audience Measurement Technologies and to explain how they have evolved over time. This will help me to better debate the opening statements above.


1 INTRODUCTION

This explorative analysis looks at the changes from Linear to Non-Linear television to answer the research question: “How can shifts in Televisual Flow and the implementation of Metadata explain the evolution of Audience Measurement Technologies?” The tools I will use to analyse this evolution throughout the eras of television are:

1. Program Flow
2. The relationship of power between Broadcasters and Viewers
3. Content and Time

These three tools will be applied to the development of Time Manipulation Technologies (and the different platforms implemented throughout the evolution of Television) to explain the evolution of Audience Measurement Technologies (AMTs). With these tools, I will explain the development of the concept of Television up to its transformation into a computer, and thereby the transformation of Broadcasting into Streaming. This paper starts with a literature review of the theory of Televisual Flow and Metadata, both of which are key to effectively analysing the evolution of AMTs. The first chapter of the analysis looks at AMTs in Linear Television and their evolution from the Golden Era of Television to the Multicast and Network eras, identifying key developments across these periods in the history of Television.

In the second chapter, I will explore the changes towards Non-Linear Television and the impact of Time-Shift technologies on the evolution of AMTs. In addition to the shifts in Televisual Flow and the implementation of Metadata, which are the central concepts of this research paper, I will also emphasize the importance of technological advancements throughout my analysis. Although the focus of this thesis is on Audience Measurement Technologies, the overall technological advancement of Television and Broadcasting forms an important part of the analysis, as it fostered the evolution and implementation of Metadata.


2 TELEVISUAL FLOW AND THE IMPLEMENTATION OF METADATA

The evolution of Flow across the main historical periods in Television is an important driver of the evolution of AMTs. Specifically, in the transition from Broadcast Flow to Digital Flow, the concept of Metadata was introduced and continued to evolve; this was a defining moment in the history of television. This chapter explores changes in Televisual Flow based on Raymond Williams’ and Klaus Bruhn-Jensen’s definitions of the layers of Flow and on the notion of Time itself. The analysis of Time and Flow will play a key role in explaining the transformation of Television from its original meaning as a device throughout its evolution. Following the analysis of Flow, I will focus on the implementation of Metadata in Digital Broadcasting. Although there were examples of Metadata within Broadcast-era Flow, it became of primary importance during the era of Digital Broadcasting and Non-Linear Television. This era of Television, starting from the Network Era, is characterized by digital transmission and was made possible by new technological advancements. I will relate the shifts in Flow to the implementation of Metadata to emphasize the various applications of Metadata as a key concept in Digital Broadcasting, and conclude the chapter by emphasizing the key changes in Metadata and Flow as they evolve from the broadcast to the digital era, and their shared relevance to AMTs.

Although many of the systems that employ metadata to measure audiences, specify qualifiers (such as Closed Captions or download permissions), and generate related content are heavily protected under Intellectual Property law, I will outline their basic functioning within the established theoretical frameworks of Flow and Metadata. Finally, I will outline the importance of Metadata and its relevance to AMTs to better understand the shifts that have occurred in both. I have also performed a field investigation on the Netflix and YouTube streaming platforms to understand their catalogue organization and content suggestion systems. Most importantly, through observation and empirical trial I analysed the channel flow, and the subsequent Viewer Flow that resulted from my viewing choices, and then verified the output of the algorithm in place to suggest relevant content and maintain a Viewer Flow based on the latest content accessed. This practical investigation relates to the different technical aspects discussed in this chapter.


2.1 Televisual Flow across the Eras

The notion of Flow in television has evolved over time and underwent a major transformation in the Digital Era. Arguably, Flow itself is what defines Television in its traditional sense (Corrigan, Williams, Uricchio, Marshall). Raymond Williams described “Flow” as “a single irresponsible flow of images and feelings,” designed to prevent the viewer from summoning up enough “energy to get out of the chair,” and instead “grabbing his attention” (2003, 91–92, 95) by promising the continuation of unending expectation and stimulation, “the reiterated process of exciting things to come” (Corrigan and Corrigan, 2012). Klaus Bruhn-Jensen expanded on that definition by adding the idea of different flows that converge into one, thus suggesting a differentiated concept of Flow.

Bruhn-Jensen distinguished between three types of Flow:

• Channel Flow: every television network or company plans a channel flow to keep the viewer watching for as long as possible; this is the category closest to Williams’ original concept, describing textual organisation on a macro level.

• Viewer Flow: every viewer creates her or his own viewer flow based on all presently available content. Here, the subjective experience of each viewer, also part of the original concept, is in focus.

• Super-Flow: the first two categories relate to everything that is available on all channels – “television’s Super-Flow” (Moe, 778). This third category brings forth the notion of individualised flows that converge into the Super-Flow of Broadcast television.

By this, Bruhn-Jensen’s updated definition acknowledges the existence of several channels and the use of time-altering devices, such as the remote control, bringing to light the notion of more than one Flow. Breaking the concept of Flow down into Viewer Flow (the Flow created by the spectator while watching), Channel Flow (which varies per broadcaster or channel), and Super-Flow (a compendium of all available flows) makes it possible to analyse the shifts in Flow across the Eras of Television.

Understanding the different layers of Flow provides a valuable analytical tool across both the Broadcast and Narrowcast eras, as these layers can be analysed individually.


During the Golden Era of Broadcasting, Channel Flow was the driving concept. Content providers such as television broadcasters had to establish a Flow with junctions (content that links two programs together within the Channel Flow, for example commercial breaks) and a pre-established schedule that viewers would then react upon by “staying tuned or zapping off”. Broadcasting is defined as “an organisational form of radio and television characterized by one-way, simultaneous distribution of content from a centre to the periphery” (Moe, 774). This meant the viewer could form part of an audience community and find themselves deeply immersed in a given channel’s Flow. The concept became socially and culturally significant as “broadcast radio and television became an important tool in the creation and maintenance of a common, national identity as viewers relocated cultural and social input directly into their living room (Williams 1975:26ff, Gripsrud 2002:270)” (Moe, 775). Broadcasting also evokes the fishing metaphor of casting a net wide enough to catch the largest number of fish; the same logic applies to viewers and listeners of mass media. Thus, in the Golden Era, the importance of broadcasting was rooted in the establishment of social identity. This relates closely to Johnson’s take on flow throughout the Golden Age, claiming that broadcasting “introduced a fundamentally different experience to the discrete activities of reading a book or watching a play by unifying different forms of communication into a singular continuous flow” (Johnson, 1).

Johnson believed junctions between programmes to be key to television’s communicative ethos: they act as the site where the broadcaster has the opportunity to communicate directly with the viewer, shaping the tone of address for a particular broadcaster and/or channel as well as communicating the structuring patterns of broadcasting to viewers. Junctions also serve as a key connection between AMTs and Flow, as the value of these junctions is determined by the ratings provided by AMTs; in Johnson’s words, “The junction communicates the temporal as the major organising feature of television flow” (2013). Here, Williams’ definition of Flow is very relevant: these junctions are the main way to capture the viewer’s attention in between blocks of programming, and they encourage the viewer to stay tuned for what is to come. Junctions “can help us to identify the ways in which broadcasters have altered the communicative ethos of broadcasting in response to the new experiences of television in the Digital era” (Johnson). Junctions, too, have transformed from Linear to Non-Linear television. Linear junctions served the purpose of constructing and explaining the value and experience of television to the public and to key decision-makers (e.g. regulators and politicians) (Johnson). By contrast, Non-Linear junctions are little more than transitions between episodes of a show, pre-selected by the viewer, and are therefore void of that value and experience.

The end of the Broadcasting Era saw a shift in Flow as well as technological revolutions that brought the invention of the remote control: a Time-Shift device that allowed viewers to take control of the Viewer Flow for the first time. The VCR was another revolutionary invention, allowing viewers to alter Televisual Flow by recording an available program and then watching it in a timeframe different from that of the original broadcast. Whenever the Channel Flow was not satisfactory, or when an excess of junctions caused frustration, the viewer could simply change channels and thereby create an individualised Viewer Flow (Narrowcast). The Era of Narrowcasting therefore brought about a fragmentation of what was once the mass audience, which “gained a greater degree of agency in arranging its own program sequence, in shaping its own patterns of interpenetration (zapping through advertisements, switching channels) and, thanks to the VCR, in defining its own course of program repetition and recycling” (Uricchio, 13). This concept would later evolve into Non-Linear broadcasting, as technology gave rise to Over The Top (OTT), or on-demand, delivery. It is significant because the manipulation of Time and the availability of content not tied to a Linear schedule would later become a key obstacle for AMTs to overcome in the transition from Linear to Non-Linear broadcasting.

Televisual Heterochronia, defined as “the sense of displacements in time or the vitiating of sequence” (Uricchio, 8), can be considered an interruption of the Flow of Television. As Uricchio states, “the medium’s particular form of heterochronia plays an important and potentially determining role in conceptually framing any given text”.

In order to frame a text, Televisual Heterochronia applies three key elements:

1. Sequence: “[...] the careful orchestration particularly relevant during the broadcast era programming (but residually present as well in its successor regimes), in which the program day addressed a changing constituency of viewers, and in which the program ‘line-up’ was designed to enhance the chances of continuous viewing.” (Uricchio, 7)


2. Interpenetration: “the practice of parsing out particular programs over time and over the broadcast schedule (e.g., weekly or daily series, where the program day and our lives are interpenetrated) and of fragmenting individual programs with advertisements and announcements of various sorts, effectively constructing a meta-text beyond the control of the individual text’s author.” Interpenetration “also refers to the practice of using program ‘bumpers’ and ‘hooks’ (displaced micro-program elements) to keep viewers watching. The effect, paradoxically, is both to rupture engagement in a particular program and to interconnect program elements into a larger whole” (Uricchio, 7). The texts presented in the channel flow between programs remind the viewer of temporality, and interpenetration establishes social context (Uricchio, 9-10).

3. Repetition: “the recycling of footage, programs and program units, whether in a single channel environment or across channels” (Uricchio, 8). This, in turn, “invariably takes place in a new cultural present, serving variously to reactivate the past of the primary text (recalling original impressions upon first seeing the program or, through the text, its fuller cultural moment), or to recast it through the knowledge that has since been acquired” (Uricchio, 10).

These three key elements presented by Uricchio appeal to each of the different layers of Televisual Flow mentioned by Bruhn-Jensen. The emphasis on Sequence is evident in the Channel Flow, as Sequence programming and Interpenetration operate at channel level. Repetition in this case also forms part of sequence programming: “On the programming front, new opportunities for the recycling of old broadcast (and film) texts proliferated as cable stations sought ways to fill air-time, and television’s economic logic increasingly turned to syndication, reruns and endless self-reference. Old classics found new contexts, whether through happenstance or programming strategies (from thematic packaging like the History Channel to Nick at Nite’s recasting of classics as camp)” (Uricchio, 13). In other words, it is through Televisual Heterochronia that Flow can be used as a frame to contextualize a text. “Flow offers a concise way to locate the changing contours of the televisual dispositif, that is, the historically specific constellation of technologies, logics and practices that constitute the medium” (Uricchio, 12). Although sequence programming may be rendered ineffective by breaking current events (such as news), these subsequently feed into the flow in the form of repetition. In other words, one can recognize an Era based on the content available on television, due to its characteristic Program Flow.

Content has changed, and Non-Linear Flow lacks the interruptions of the commercial model of Television: entire texts comprise blocks of flow, and the interruption comes only when a block is over. The focus has shifted from the macro flows (Super-Flow, Channel Flow) to the micro (Audience Flow), giving importance to the televisual text rather than the junctions (Johnson, 7). The literature review has emphasized the technological advancements and rapid changes that occurred during the formation of Flow. The development from Broadcasting to Narrowcasting, and from analogue to digital, increased the number of simultaneous flows available. Here, the device formerly known as “Television” is no longer confined to the physical television set but becomes a series of aesthetic and technological characteristics that bind the notion of Television to its texts and exist within a multidimensional Flow. This is the main characteristic of the Matrix Era.

The technology that put control of Flow into the viewer’s hands already existed at the end of the Network Era (DVR technology, the power to skip commercials in linear television); however, it was with the introduction of DVD box sets and streaming services that the concept of Flow truly changed. Textual continuity now lasted the whole duration of the program. This allowed for complex narratives that could draw the full attention of the viewer for prolonged periods; it also led to prolonged exposure to televisual texts through “binge consumption”. Binge consumption seems a natural outcome of an uninterrupted Flow of televisual texts in which the notion of passing time is blurred. For example, serial dramas have durations of between 45 minutes and 1 hour; an internal flow that ends in a cliffhanger, or a pivotal moment in the narrative, provides the means for a text to incite a binge-watch. The continuity of the platform flow, which prepares the next episode for consumption, also feeds into the blurring of externally passing time. Watching a season of 10 episodes in one sitting, for example, translates to roughly 7.5 hours of binge, while a normal workday consists of 8 hours. The mass availability of content blurs the passing of real time, as it is delivered in whole seasons and not in the episodic manner of Linear Television.


This behaviour of excessive media consumption is prevalent and encouraged on social media platforms such as Facebook and Twitter. Binge watching is encouraged by mainstream media through junctions promoting the availability of content through VoD (Video on Demand) and OTT (Over The Top) delivery, as an extra to what was broadcast on Linear channels. Interpenetration plays a role here as well; however, it has evolved into a multi-media notion, skipping from linear to non-linear streams within the dimensions of the Viewer Flow and Super-Flow.

2.2 Metadata across the Eras

As noted in reviewing the shifts in Flow, television production and broadcasting have undergone fundamental changes in moving from analogue to digital. According to the National Digital Information Infrastructure and Preservation Program, the analogue method of Broadcasting, which transmits sounds and pictures through continuous wavelike signals or pulses of varying intensity, is being replaced by digital capture and transmission (in which sounds and images are converted into groups of binary code). This transformation has brought about the need for methods of classifying the large number of available Flows and texts, which is achieved through Metadata. There were examples of Metadata during analogue broadcasting, although it was limited to a Closed Captions track and subsequently to Electronic Programming Guide data: “Program descriptions in Electronic Program Guides (EPG) traditionally constituted the only available metadata describing program content” (Gibbon et al.). Although this raw Metadata has potential, it was Datacasting that truly unleashed the power of Metadata; Datacasting is therefore key to outlining its importance in television. In order to capture the Metadata within broadcasts, each program is extracted: “The breakdown of program material into segments is crucial to rights management. Segmentation is not only vertical but also horizontal. Attributes must be logged for each component part. For example, music or narration for a program needs to be available as a stand-alone component, if only to allow editors to remove it for rebroadcast. Rights information needs to be applied to each of these components.” (Council on Library and Information Resources and National Digital Information Infrastructure and Preservation Program (U.S.))


Velden defines Metadata as ‘data about data’, which can also be understood in a more specific sense depending on the setting in which it plays a role: data about the context in which a file was produced, edited or stored. Metadata, such as EPG data or content-specific permissions, is of key importance in defining Flow at a given point in time – Framing. Metadata allows all of this information to be used by broadcasters, advertisers, audience measurement agencies, and electronic programming guide editors. Fusing broadcast signals with additional source Metadata, after editing and enhancement on their way to the viewer, further improves contextual coherence. “Metadata offer a narrative potential: a piece of evidence can tell something about itself through its metadata because contextual information is captured within the (digital) object itself”. Velden further describes Metadata as four-angled: Data, Consent, Intent and Genealogy.

• Data, the standard Metadata, corresponds in televisual application to “data about the file”; this is basic file information such as the date of creation.

• Consent indicates whether the piece of media has authorisation from all parties for distribution.

• Intent is described as ‘information about the media’s creator and the permissions for sharing’. This, according to Gregory, could be considered a materialisation of a ‘licensing system that recognizes intentionality’ (2012). And,

• Genealogy refers to data integrity and serves as a verifier to ensure that the data has not been tampered with.

For the purpose of this exploratory paper I will define Metadata by Data and Intent, because they are the most relevant to the subject of this paper. The ‘tag of intent’ says something about what kind of ‘use’ the producer envisions for a particular digital object. These notions account for the valuable producer data embedded in a transmission.
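Velden’s four angles can be illustrated with a small sketch. The following Python record type is purely illustrative — the field names and the hash-based Genealogy check are my own assumptions for the sake of the example, not an actual broadcast-metadata schema:

```python
from dataclasses import dataclass, field
import hashlib

@dataclass
class MediaMetadata:
    # Data: basic information about the file itself
    title: str
    created: str            # e.g. an ISO date of creation
    duration_min: int
    # Consent: whether all parties have authorised distribution
    distribution_cleared: bool
    # Intent: the 'use' the producer envisions for the object
    intended_use: str       # e.g. "broadcast-only", "vod", "download"
    # Genealogy: an integrity marker showing the data has not been altered
    checksum: str = field(default="")

    def seal(self, payload: bytes) -> None:
        """Record a hash of the media payload so later tampering is detectable."""
        self.checksum = hashlib.sha256(payload).hexdigest()

    def verify(self, payload: bytes) -> bool:
        """Genealogy check: does the payload still match the sealed hash?"""
        return self.checksum == hashlib.sha256(payload).hexdigest()
```

A producer would seal the object once, and any later consumer could run `verify` to confirm integrity — a toy version of the Genealogy angle.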

Metadata accounts for regional availability as well as the implementation of OTT services such as Video-On-Demand (VOD) or download-to-watch, and it limits both the amount of time that a text is available for re-watching and the devices that can access the media. Uricchio considers control to have “shifted to an independent sector composed of metadata programmers and filtering technologies (variously constructed as search engines in the case of IPTV and adaptive interfaces in the case of intelligent DVR-based systems such as TiVo)”. This marks a key shift: viewers can exercise their own agency over program sequence – and even textual production and distribution – like never before. In the Content Analysis Engine present in the infrastructure of many IPTV and modern Digital Broadcast platforms, Metadata is ingested and, at the same time, data about the audiences is collected. As defined by Gibbon, the Content Analysis Engine (CAE) collects broadcast content and Metadata, along with corresponding audience metrics data, which are then processed with a set of algorithms; the resulting metadata for each program is stored in an index. This provides valuable information for investigators, marketers and content producers alike. Through an analysis of the Metadata collected by the CAE within broadcasts, the televisual text can be given a historical context. This allows Metadata to be used as a reference point and, as of the Matrix Era, to become a two-way street through which viewers send and receive information. The interconnectedness of Flow and AMTs showcases the value of the earlier layers of Flow.
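Gibbon’s description of the CAE — per-program broadcast metadata merged with audience metrics and stored in an index — can be caricatured in a few lines. The data shapes below are hypothetical illustrations, not the engine’s real interfaces:

```python
def build_index(programs, audience_metrics):
    """Merge per-program metadata with audience metrics into a lookup index.

    programs: list of dicts, each with an 'id' plus descriptive metadata.
    audience_metrics: dict mapping a program id to its viewing figures.
    """
    index = {}
    for prog in programs:
        pid = prog["id"]
        index[pid] = {
            # Keep every descriptive field except the identifier itself
            "metadata": {k: v for k, v in prog.items() if k != "id"},
            # Attach whatever audience data exists for this program
            "audience": audience_metrics.get(pid, {}),
        }
    return index

# Hypothetical sample data
programs = [{"id": "ep1", "title": "Episode 1", "genre": "drama"}]
metrics = {"ep1": {"viewers": 60000, "reach_pct": 60.0}}
index = build_index(programs, metrics)
print(index["ep1"]["audience"]["viewers"])  # 60000
```

The point of the sketch is the pairing: once content metadata and audience measurement share one index, marketers and researchers can query both sides of the record at once.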

The importance of broadcasting data has reached a peak in the current media landscape, because radio and television broadcasting have been a major influence in shaping the political, social, cultural, and economic trends of the twentieth century. The shift from Broadcasting to Datacasting also increased the amount of Metadata available for a given transmission. The evolution of Flow provides social context and historical framing for the analysis, because Flow is a representation of Televisual Time and can therefore explain the characteristics of television over time. In order to understand the evolution of AMTs across time and the key changes from Broadcasting to Streaming, I will analyze the journey that led to Flow and Metadata defining contemporary Television.

2.3 Time in Television

The first Audience Measurement Technologies (AMTs) of the Broadcast Era focused on segments of televisual time: who is watching what, and when? This resulted in the allocation of value to frames of time in television. This is where the term “Prime-time”, the period when the greatest number of people are watching TV, originated; it was coined by the Wall Street Journal in 1947. By this, the close connection between the linear flow of time and the broadcast flow becomes evident.


This notion of Linear Time is especially important for commercials, as the frame of time would determine the projected number of viewers and therefore dictate its commercial value. The television day was divided into segments according to the time and the viewers most likely to tune in:

• “Saturday mornings (7 am – 12 pm), when children are predominantly the audience;

• Weekday daytime (9 am – 3 pm), when adult females are more frequent viewers than adult males;

• Weekend afternoons (12 – 6 pm), when sports programming attracts a largely male audience;

• Prime-time (8 – 11 pm), when adult males only slightly outnumber adult females” (Condry, 1989).

The number of projected viewers would also indirectly determine the load of junctions inserted into the programming, as large numbers of viewers were bound to generate a larger response to the commercials. Commercials, according to Comstock and Scharrer, are “comparatively infrequent when audiences are very small (such as Sunday mornings); at their most frequent when audiences are moderate in size which compensates for the necessarily lower prices that can be charged; and somewhat less frequent at prime time when prices are at their peak but television industry angst is high that too many commercials would reduce the audience’s incentives for viewing” (1999).

The lifeline of television content, or any broadcast content for that matter, is the amount of engagement and audience ratings it can generate. This key concept attaches value to the timeframe, so that broadcasters (or “Media Sellers”) can earn returns on the investment of producing the content in question. Media buyers, on the other hand, explore the options that best suit their customers looking to buy broadcast airtime for their product or service. In their most primitive form, audience panels consisted of television diaries from the Brazilian company IBOPE (now part of Nielsen) that chosen households would fill in. IBOPE would subsequently tally the results and calculate an average. Data was collected from segments created from demographics, which were then averaged to represent the total population, rather than from the total audience. Hence ratings were points, not absolute information about who was watching what, and when, among distinct members of the audience. Although the emphasis of the study did not look deeper into audience characteristics, the outcome determined the value of every time slot, and for the big three broadcasters every point counted. The number of ratings points a program had could define the future of its production, and without syndication no program was safe from the effects of bad ratings. This brings up an important question: how were ratings captured at all if the total number of views was not?

This was done through statistical analysis of a set “Universe” consisting of the total audience. The engagement of the audience with a particular medium at a particular time was then calculated as a percentage using statistical measures such as Reach. For example:

“Universe: 10 individuals.
For a single episode of Chhoti Maa: if, out of the above 10 people, 6 saw at least 1 minute of the programme, then
Reach: 6 out of 10
Therefore, reach = 60%” (Tam Media Research)

With the evolution of AMTs and the increase in the value of televisual time, more elements were added to the calculation of reach, such as gross, cumulative, and net reach (Tam Media Research); these will be elaborated on in depth in the analysis of the Network Era.
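The quoted reach calculation is simple enough to express directly. The following sketch reproduces the Tam Media Research definition (at least one minute viewed counts as reached); the panel data are hypothetical:

```python
def reach_percent(universe_size, minutes_viewed, threshold=1):
    """Reach: percentage of the universe that watched at least `threshold` minutes."""
    reached = sum(1 for m in minutes_viewed if m >= threshold)
    return 100.0 * reached / universe_size

# Hypothetical universe of 10 individuals; 6 watched at least 1 minute.
panel = [0, 5, 12, 0, 1, 30, 0, 2, 45, 0]
print(reach_percent(10, panel))  # 60.0
```

Variants such as gross, cumulative and net reach build on this same counting logic, differing in what is counted and over which programmes.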

2.4 Conclusions

This literature review has helped me define and understand the concepts of Flow and Metadata over time. The inclusion of Time as an important factor in the analysis of AMTs has a dual purpose: one is to frame content within a particular era of the Linear and Non-Linear applications of Television, and the other is to serve as a point of comparison between eras and between technological implementations. This analysis focuses on Television broadcasting in the United States, which means there may be differences from other markets with prominent TV consumer engagement, such as Europe. I chose to focus the analysis on the United States because it is the birthplace of many of the changes implemented to commercialize television, as well as of many changes in Flow and technological advancement. Early examples of AMTs in Linear Television Flow are key to understanding the present and future significance of Television as a medium, and therefore the evolution of AMTs.


In the next chapters, I will apply these concepts across the Linear and Non-Linear television Eras to understand and explain the evolution of Audience Measurement Technologies over time, with reference to the improvements and challenges brought by each new development. Technology will also play a key role in this explanation, and special reference will be made to the concepts of Content and Time and how they slightly change meaning and focus over time. The first chapter, focusing on Linear Television, will bring forth the early challenges faced by AMTs, such as poor accuracy and reliability, which carry on into the future. In this era of commercial television, I will analyze junctions and explain the dependency of advertisers and content producers on AMTs. Chapter two will examine the significance of Time-Shift technologies and the evolution of new television services such as Catch-Up and Over The Top (OTT). The relationship of power between Broadcasters and Viewers will shift, and new issues, such as licensing rights, will appear with the transition from traditional Broadcasting to Streaming. Audience Measurement Technologies will therefore evolve as they adapt to changing trends in the media landscape.

(18)

3 AUDIENCE MEASUREMENT TECHNOLOGIES IN LINEAR TELEVISION

The peak of Broadcast television, also known as the “Golden Era”, will be the starting point of this analysis. I will then analyse the evolution of AMTs across the Narrowcasting era and the relevant technological advancements that further pushed AMTs to develop in the Matrix Era of Television, where broadcasts become digitalised but continue to be bound by time. Time, and the control of Time, as mentioned in the earlier chapters, are key concepts in this analysis that clearly distinguish Linear Television from the Non-Linear Television of the subsequent chapter. Audience Panels, one of the earliest examples of AMTs, are another important recurring concept in Linear Television because they were the most widespread AMT at the time. Although the means of collecting data from Audience Panels evolved over time, they have been the main raw data source for AMTs since their inception; the Nielsen Company, inventor of the first AMT and therefore a key player in its evolution, still makes use of them today. Audience Panels, developed during the early Linear Broadcasting Era, will serve as the foundation for future developments of the Non-Linear and Matrix Eras.

3.1 The Broadcasting Era (1920s – 1950s)

Arthur C. Nielsen invented the first Television Metering device in 1936, 10 years before the “Golden Era” of Broadcasting began. In 1946, at the beginning of the “Golden Era”, there were roughly 10,000 television sets in the United States. This number increased exponentially to 10.5 million over the next four years. The early days of Television offered only two channels, NBC and CBS, which limited the duration of their transmissions to part of the day (broadcasting roughly from 07:00 to 23:00). This duopoly in broadcasting led the Federal Communications Commission to place a temporary halt on the issuance of broadcasting licences in 1949; this situation left a vacuum, allowing three big media conglomerates to establish themselves on the scene. NBC and CBS were names already known in radio; hence the move into Television to keep up with changing trends was a logical one. By 1950 the number of channels had increased to three: NBC, CBS, and now ABC, which would control the television market throughout the Network Era (1952–1980s).


Due to technological constraints and the limited number of channels on the air, the Golden Age of Broadcasting lasted from 1946 until the late 1950s. The influence of World War II was also notable: the shift away from focusing all industrial activity on the war brought considerable advances in technology and improvements in the quality of post-war life for American citizens. The focus of commercial Television on consumption and the televisual Flow played a great part in cementing the economic value of commercial television. Meanwhile, the available content was shifting from wartime coverage, which consisted mostly of timely updates on the on-going conflict and propaganda, to post-war entertainment aimed at getting more people to watch commercial Television. Raymond Williams's notion of flow is clear throughout the Golden Era. With only two channels at the beginning, the viewer did not have the option to control the Flow, and even with three channels available, the viewer had to comply and watch what was on air. Without the means to control the televisual flow, the Viewer could only decide whether to “stay tuned” or not. This is an important characteristic of the most traditional notion of Flow: the Flow was mostly in the hands of the broadcasters, who had complete dominance over the medium.

These were still early days for Television, and technology was still primitive by modern standards. Television broadcasts were in greyscale and Television Sets no larger than 10 inches; content was also considered crude, as the more elaborate programs were mostly theatrical plays, which were expensive to produce and to maintain on the air. This made advertising, and the Junctions reserved for commercial use, even more important: without commercials, there would be no funds to produce content for commercial Television. Sponsors financed the first instances of commercial Television. Through advertising agencies, brands (sponsors) would not only promote their products in the space they had paid for but would also decide on the content being broadcast in that span of time. The best example of this scenario is the Soap Opera, known by that name because detergent brands would usually purchase time frames that broadcast dramatic series aimed at a female demographic. This appealed to the target group of housewives and trickled down to their daughters. Sponsored spaces are a very important point in defining the Flow of that era, as programs were sponsored by brands that had a common target audience. This was the precursor of audience segmentation, which would be perfected


prizes for the contestants. This dynamic came to an abrupt end when it was discovered that a contest sponsor had rigged the results for a particular participant to win and, what is worse, they were caught in the act (Neuman, 1982). While outlining the advent of sponsorship, it is necessary to note that the value added to television was the particular linear time frame of the flow. Just as the soap operas had captured the audience segment of housewives, other blocks of linear time were aimed at different social segments, all within the greater flow of broadcast. Broadcasters tried to diversify their content in order to capture more diverse and niche segments without compromising the wider mainstream appeal. Measuring this was possible thanks to AMTs, which subsequently added value to these segmented time frames, making them a suitable investment for brands.

The 1950s were a turning point for Television, as Zenith Radio Corporation developed the first remote control intended for a television in 1950. This period also saw the introduction of colour television, which would later replace its greyscale ancestor. The primitive nature of the AMTs was sufficient during the Golden Age, as radio was the only other live broadcaster. Television grew so immensely that sponsors were pulling out of other media with decreasing popularity, such as radio and print, in order to get involved in Television. This increased the importance of AMTs, as the economic value of televisual time needed to be determined for advertising agencies to sell the televisual time they were promoting. Historically, this data was collected in the months of February, May, July and November. The data collected would determine the rates for advertising time within the 250 television markets in the United States. Television ratings would in turn serve to design more programming for the future based on successful formulas proven to have larger audience reach. This would also impact spin-offs and other textual characteristics of televisual programming:

“The necessary (but easy to meet) condition is that there are differences among viewer’s preferences for a genre or dimension of content. Competitors attempt to attract such viewers up to the point that gains in viewership level out. The competition then centres on other elements of content that differ in their attractiveness to various viewers. This conceptualization vanquishes the question of whether violence, sex, scandal or situation comedies promise better ratings because it rests on the economic principle of equilibrium in which competitors change their programming emphasis in response to who it is still possible to attract.” (Cornstock and Scharrer)

3.12 The Early Audience Flow

Returning to Bruhn Jensen’s definitions of flow, the great economic importance placed on Channel Flow allowed for the development of the Superflow. Here, I recall Bruhn Jensen’s notion of the Superflow as the overall Televisual Flow, and of Channel Flow as a segment of the Superflow defined by a television channel. By the late 1950s, use of the remote control was widespread and television moved toward the early days of Audience Flow. Colour Television gained popularity and, in the period between the 1960s and the 1970s, AMTs also experienced a key development. The addition of colour made the images presented in the Superflow all the more attractive, which began to increase the number of viewers but, more importantly, created a need for variety in televisual programming. The widespread adoption of colour television therefore pushed AMTs to evolve. The remote control device (RCD), introduced in the early 1950s, allowed viewers to change Channel Flow without having to get up from their seat. This new device also enabled the implementation of the People Meter. The People Meter was a new type of AMT, which incorporated demographic information; this was possible because all viewers had to “log on” and state both gender and age before being able to watch television. The audience panels were upgraded to these set-top boxes so that more information about their viewing habits could be recorded. Every person watching had to follow a sign-on process, and every time a family member (or even a guest) partook in watching television, they too had to sign on. This allowed for the collection of high-quality viewer data, since the time, the channel and the people in the room were all recorded.
Consider a hypothetical situation: the children are logged on and watching cartoons until Prime Time begins, when the mother logs on and changes the channel to the current soap opera; shortly after it ends, the father arrives home, logs on and changes the channel yet again to the news, while the mother puts the children to bed. Meanwhile, the next-door neighbour has dropped by and joins the watching; she logs on and joins the Audience Flow that began with the cartoons. Finally, the news is over and the father turns the television off, ending the session. This is an example of a situation that would cause television inheritance, since several Audience Flows are combined in a daily viewing session of the Superflow. The different audience members did not log out until the television was turned off; therefore their views counted toward the reach and the total audience of a given program, even if they were not actually watching. This brings about the issue of the Accuracy of AMTs; this lack of accuracy would become a key challenge faced by AMTs during the Network Era.
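The attribution logic in the hypothetical household above can be sketched in a few lines. This is an illustrative toy model, not the People Meter's actual implementation: a viewer counts toward every program whose time slot overlaps their logged-on session, which is exactly how a forgotten log-off "inherits" a viewer into later programs. Times are hours on an invented evening; all names are hypothetical.

```python
# Toy sketch of People Meter attribution and audience inheritance.
# A viewer is credited to every program that overlaps their session,
# so the neighbour, who never signs off, is counted for the news too.

programs = [
    ("Cartoons", 17, 19),
    ("Soap Opera", 19, 20),
    ("News", 20, 21),
]

# (viewer, sign_on, sign_off) -- the neighbour signs on at 18:00 and
# is only "signed off" when the set is switched off at 21:00.
sessions = [
    ("child_1", 17, 19),
    ("child_2", 17, 19),
    ("mother", 19, 20),
    ("father", 20, 21),
    ("neighbour", 18, 21),
]

def audience(program):
    """Viewers whose logged-on session overlaps the program's slot."""
    _, start, end = program
    return {v for v, on, off in sessions if on < end and off > start}

for prog in programs:
    print(prog[0], sorted(audience(prog)))
# Cartoons   ['child_1', 'child_2', 'neighbour']
# Soap Opera ['mother', 'neighbour']
# News       ['father', 'neighbour']
```

The neighbour inflates the news audience despite possibly not watching at all, which is the accuracy problem the paragraph describes.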

With the advent of satellite channels and the explosion of commercial television, the People Meter became less common, as the technologies used to measure audiences were integrated into the set-top boxes of television providers. It was also much more cumbersome to operate:

“Individuals in the people-meter sample make a commitment to do things that ordinary viewers do not do. When their TV set is turned on, a red light on a device that rests on it goes on. Each person watching then should press an assigned button on the remote control or on the unit on top of the TV. When one or more have pressed their buttons, a light flashes until an "OK" button is pressed to indicate that the individual buttons are registering correctly. This light flashes and demands response again when channels are changed and when the set stays tuned to the same station for 70 minutes to verify that a person is still watching. Each individual is supposed to push the button whenever he or she stops watching permanently, or even temporarily to answer the phone, use the facilities, or inspect the refrigerator. Household members are asked to undertake this commitment for 2 years. Thus the design can be described as a continuous measurement panel” (Milavsky). Modern iterations of the People Meter are disconnected from Television sets and can even be used on the go to measure televisual content that the panellist is exposed to outside the home, in places like restaurants, bars and hotels.

3.2 The Network Era (1952 – 1980s)

In the early 1970s, there were five major networks in the United States. Although this was already a significant increase in Channel Flow for the viewers of the time, there were still limited options in terms of Content. Audience Flows had great similarities, and continuously jumping from one channel to another with only five options proved cumbersome. The Network Era, the second period of Linear Television, provided a greater variety of Channel Flow than before. This rapid increase in Channel offering brought both challenges and opportunities for AMTs as, in this new scenario, People Meters lacked accuracy and were therefore not reliable. Although People Meters had somewhat evolved, and were now able to identify an audio snippet of whatever was playing in order to attribute the view to any given content (instead of requiring the earlier manual input), the lack of accuracy was a big handicap. For example, if a viewer surfing channels landed on any given channel long enough for the People Meter to register the snippet, this would count as a view. At the height of the Network Era, in the mid-1970s, the Superflow was flooded with a huge number of Channel Flows, offering variety so great that User Flows could be totally different even for Viewers living next door to each other.

The significant increase in channels during the Network Era brought about a new scenario in which the viewer could always have something to watch. By the late 1970s, the number of cable subscribers and the number of multi-television households had increased. This led to an increase of Channel Flows within the Superflow, as well as a rapid increase in the number of individual Audience Flows, partly through the drastic increase in available screens, but also because of the increase in available content; with more variety came more viewers. This was a drastic change and upgrade from the earlier duopoly setup, as the new television scene was characterised by a larger than ever before array of content and channels: “Yet neither fundamental growth, which occurred much earlier nor the maintenance of the pattern that marked the 1960’s and 1970’s could be said to characterise the period. Instead, it was the dissolution of that pattern before the twin forces of technology and viewer preferences. We look upon it as an era of transition because the eventual configuration of its elements is uncertain. The most significant aspect has been an enormous increase in the number of channels offered to viewers. This was a result of further diffusion of television technology – the increase in the number of stations, the rise in cable subscriptions and the swift adoption of the videocassette recorder (VCR). The seeming implications were that conclusions drawn from decades of data on television would have to be revised and viewing might increase because more people would find what they wanted.” (Cornstock and Scharrer, 11-12)

The late 1970s are a key point in the evolution of AMTs and televisual flow, as 1976 saw the arrival of what would become known as Time-Shift technologies, which allowed viewers to access programs at a chosen point in time, not just when they were broadcast. This directly challenged the “liveness” of television, as viewers could now control Time. These new technologies, combined with the increased number of screens, caused the traditional AMTs, and the balance of power that comes with control of the linear Televisual Flow, to shift once again. The household Superflow of the familial dynamic was no longer concentrated on the main television, and inheritance between different demographics ceased. Now, the paradigm had shifted to several Channel Flows at the same time within one household Superflow, and the Audience Flows hardly crossed (unless multiple viewers were sharing a television and room).

For example, if a household consisted of a mother, a father and two sons, with televisions in the master bedroom, the living room and the children’s room, there could be up to three different flows at the same time; while the Audience Flows of adults and children could combine, there could also be instances where the adults had different flows on different screens while the children shared an independent Audience Flow.

Just as this example is difficult to read and follow, it also proved a challenge for AMTs, as the passive People Meter would usually not be set up on all the televisions in the house, but rather on the main screen in the common area. There were still instances of collective viewing on the main television as a moment to strengthen family ties, but the views from secondary screens were not registered. That meant that only the views from the main television and the audience diary (depending on the commitment of the viewer) would count toward audience ratings. This inconsistency was certainly cause to doubt the accuracy and reliability of the era’s Audience Measurement Technologies. Audience-rating data was not made public but was accessible to television broadcasters, producers, executives and similar stakeholders; agencies selling commercial space also had partial access to this data. This means that the audiences themselves faced no consequence with regard to its precision or lack thereof. AMTs are essentially a lifeline for Television: they harvest data from viewers, but that data is used entirely by content producers and advertisers. In 1976 the first Direct Broadcast Satellite (DBS) was introduced, and in the late 1980s the Multi-Channel transition began. This period saw the deterioration of the dominance of the Big Three television networks (ABC, CBS and NBC) and the creation of a wide variety of cable television channels that catered specifically to niche groups. The Network Era brought the early trends that would later become key elements of Non-Linear Television, such as the introduction of Over The Top (OTT) Television, a service complementing time-shift devices.

3.3 The Multicast Era (1986 – 1996)

3.31 The Set-Top Box and the integration of AMTs

“Consumer acquisition of new media technologies – new media or new means delivering old media – in our view occurs within three laws. [...] The first law is that of functional equivalence, which requires that new technologies serve at least most of the functions of established technology. The rapidity of adoption will then be a function of costs and success in this dimension with greater convenience or superior performance prerequisites. [...] Technologies that fail this first requirement will be sharply circumscribed in adoption unless costs are trivial or non-existent unless the technology provides novel gratifications. [...] The second law is the Janus-like motives of satisfaction and dissatisfaction. Those viewers particularly pleased with what they are receiving will want the means to receive more of the same. Those not as pleased will be attracted to technology that offers new options. Cable benefits both the pleased and the dissatisfied – more of the kind of programming already available as well as options new to the local market – as do Direct Broadcast Satellite (DBS) systems and other means of delivery. The third law is affordability. All technological inventions with costs – in this instance colour, cable, the VCR, DBS, and initially television – are adopted more frequently by households of higher socio-economic status. Among households in lower socioeconomic status, rates of adoption will catch up as costs fall, as was the case with television, color and, currently the VCR.” (Cornstock and Scharrer, 12-13). Cornstock and Scharrer emphasize some of the motivations behind the adoption of new technologies, referring particularly to Television and the audience’s rate of acceptance and implementation of new consumer technologies.

Building upon the AMTs of the earlier Eras, the People Meter greatly improved by adding measurement capabilities based on sound bites, which made measurement more precise than manual input. To some extent, Set-Top Boxes intended for Direct Broadcast Satellite (DBS) systems replaced People Meters. Another impactful technological transformation in the Network Era was the implementation of the V-chip, which was included in all televisions larger than 13 inches. The V-chip utilised Closed Caption metadata to enforce parental guidance settings. Despite acceding to warning labels and automated zapping, the industry remained stoutly opposed to giving consumers the ability to block anything beyond the coded entertainment programs (such as news and sports) because of the possibility of audience loss (Albiniak & McConnell, 1998). This is a great example of the “indifference to paternal convenience on behalf of earnings” (Cornstock and Scharrer, 16-17). Furthermore, technological problems delayed the installation of the V-chip in new Sets; considering that sets, on average, were being replaced every 7 years, the comprehensive implementation of the V-chip system took a decade. After its full implementation in 1996, the V-chip helped collect audience data by determining whether a program was appropriate for the current parental setting. This was done using a sound bite to recognize the program playing, a method very similar to the technology used in People Meters to determine what was being watched and count it toward the program’s ratings.
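The parental check the V-chip enforces can be sketched as a simple comparison of the rating carried in the broadcast metadata against the threshold configured on the set. This is a hedged, simplified illustration, not the actual V-chip specification: the rating scale below is a reduced version of the TV Parental Guidelines, and the ordering logic is assumed for the sake of the example.

```python
# Simplified illustration of a V-chip-style parental check: the
# program's rating, carried as metadata in the broadcast, is compared
# against the threshold configured on the television set.
# The scale here is a reduced, hypothetical version of the
# TV Parental Guidelines (TV-Y7 etc. are omitted for brevity).
RATING_ORDER = ["TV-Y", "TV-G", "TV-PG", "TV-14", "TV-MA"]

def is_blocked(program_rating: str, parental_limit: str) -> bool:
    """Block the program if its rating exceeds the configured limit."""
    return RATING_ORDER.index(program_rating) > RATING_ORDER.index(parental_limit)

print(is_blocked("TV-MA", "TV-PG"))  # True  -> blocked
print(is_blocked("TV-G", "TV-PG"))   # False -> allowed
```

The same metadata lookup that drives this check is what, per the paragraph above, could be repurposed to log what a set was tuned to.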

3.4 The Multicast Era (1986 – 1996)

The Multicast Era, as this period of television is known, was the turning point that led to the development of Non-Linear television. By definition, commercial broadcasting exists for the purpose of financial profit; broadcasters even avoided placing parental ratings to ensure they could reach a large number of viewers and maintain a steady revenue stream through audience ratings. With the widespread implementation of Direct Broadcast Satellite (DBS) systems, their decoders and set-top boxes adopted AMTs, thus taking full advantage of the new Metadata stream within digital broadcasts. These systems arrived as a solution to the problem of multiple screens within a household and the earlier failure to measure a household Superflow with all its Audience Flows combined. Households with a higher socio-economic status would have multiple sets (as presented by Cornstock and Scharrer in the previous section), each with its own DBS system, which in turn had AMTs built in.


Here, the subscribers of a DBS service determined the total number of viewers; this was a huge improvement in the accuracy of AMTs which, during the Network Era, had struggled to remain relevant and accurate tools. The combination of Electronic Programming Guide (EPG) data and V-chip technologies now ensured that AMTs delivered detailed information about what was being watched and when. The demographic information was based on the audience data that the DBS service provider retained about all its subscribers. This subscriber data, together with the more traditional viewer data, was homogenised to provide Audience ratings that were not only more precise but also more widespread, as they were no longer limited to the data provided by the Nielsen Company or Audience Panels. Audience Panels also benefited from technological advancement, as they are now more targeted and accurate. The Nielsen Company’s data scientists run different processes in order to identify the best households to recruit for the panels, to ensure they represent the population being measured, accounting for characteristics such as age, gender and ethnicity (Panels | Ratings Academy | Nielsen). This is a key example of how digital technology played a role in the evolution of Audience Panels, bringing about improvement; digital technology has thereby helped AMTs become widespread. In the next chapter I will look more closely at the developments of this Era.

3.5 Conclusions on Linear Television

Audience Panels, the earliest example of an AMT in the Broadcast Era, served as the foundation for later developments despite their various drawbacks. The introduction of the remote control brought the first shift towards Audience Flow and, with developments such as colour television, viewership increased. AMTs developed as this new device enabled the implementation of the People Meter, a new type of AMT made possible because viewers had to “log on” and identify themselves with certain characteristics. Audience Panels were thus upgraded to the People Meter, a significant step in the development of AMTs. The Broadcast Era also brought the earliest examples of time-shift technology, as well as a turning point where the concept of Viewer Flow was introduced and the Viewer gained control of time. This was made possible first through the Remote Control and then through the VCR, the first Time-Shift Device, which together allowed Viewers to create their own Viewer Flows.


AMTs saw great development due to the revolutions of the Network Era. Subscription-based Television led to the development of new devices. Time-altering technologies also made great strides since the remote control and the VCR; traditional notions of Broad- and Narrowcasting were altered as the focus of Flow shifted from the Channel Flow, which included programming and commercial breaks (Junctions), to the Program Flow. This established Junctions as one of the main reasons viewers would stop watching Television. The fear of a reduced incentive to watch television was a driving force for Broadcasters to seek technologies that moved away from the traditional notions of Television and Channel Flow. This marked another significant shift, one focused more on Audience Flow. This power shift marked the beginning of Non-Linear Television, which appealed both to consumers who wanted more of the same, served by Over The Top services, and to those who wanted a change, served by streaming services. The Network Era was also the first step towards the prominence of Metadata: with the arrival of the Set-Top Box, Metadata became integrated. This would prove to be a step towards a challenging time for AMTs and the start of controversy around the data’s accuracy and reliability. Despite this being a key point in the history of AMTs due to the arrival of new technologies and shifts in flow, it was not until the Matrix Era that modern AMTs truly became important.


4. AUDIENCE MEASUREMENT TECHNOLOGIES IN NON-LINEAR TELEVISION

This Chapter will continue the analysis of how changes in Flow and the implementation of Metadata can explain the evolution of Audience Measurement Technologies, now focusing on Non-Linear Television. I will start by looking at the evolution of Non-Linear broadcasting and the changes AMTs underwent in order to remain relevant to changing times and trends. In this period, the concept of Catch-up television is born. In Non-Linear Television, Metadata becomes of central importance to analysing AMTs because of the shift from over-the-air Broadcasts to digital data Streams. Non-Linear Television brought and witnessed several changes in Flow, which I will look at throughout the analysis to explain the advancement of AMTs over time and technology. In this epoch of Television, Time-Shifting devices become key and Viewers continuously gain more control over the Flow. I will also make reference to Junctions as a central part of Channel Flow and assess new contemporary Junctions and their effects on Viewer behaviour. In this Era, Television Catch-Up services evolve into “Television everywhere” with the introduction of new technology. This era also witnessed the transformation of Flow from the Semi-Autonomous Audience Flow of the further developed and more prominent Time-Shift devices to the Fully Autonomous Audience Flow of streaming platforms. The evolution of Junctions in the Channel Flow will also be discussed, as well as the evolution of the commercial and the pop-up ad. I will conclude this chapter with an analysis of Time and Content and of how, as a consequence of the implementation of Metadata, they evolved over time. Altogether, this will emphasize the importance of AMTs in contemporary television and the role they play in the modern media landscape.

4.1 The Multicast Era (1996 - 2000)

In this period, Content becomes the most valuable part of the Televisual Flow as the technology of Non-Linear television evolves and later becomes a central part of Streaming, which will be explored later in the analysis. The importance of Time-Shift devices in changing the Televisual Flow becomes undeniable. Non-Linear television displays several key differences from its Linear counterpart while nevertheless building on its basic characteristics:


1) Content is organized in channels, which seek identification and recognition by viewers. Each channel has a name, a visual identity and a programming profile, formulated with the purpose of keeping the viewers watching.

2) Channels organize content in individual units, the TV programs and commercials. This organization aims to hold the viewer’s attention by interleaving different programming profiles. Programs are promoted and presented to viewers extensively through internal-promotions [6].

A key feature of these self-promotions is the positive self-reference, where the next episode or program is promised to be better than the last one, thus sustaining the “eternal to come” adage” (Abreu et al. 1). The possibility of accessing missed programs through catalogues of recorded television Broadcasts – On-Demand Content – is also an extension of the Linear Television Flow, because the programs available have already been broadcast in a Linear format and subsequently added to the On-Demand catalog. This follows the notion that most of the content available in Over The Top and On-Demand services has already been part of a Linear Broadcast. With the introduction of Set-Top Boxes and Electronic Programming Guides (EPGs) in the Network Era, television viewers had a much larger choice of channels and programs to watch than ever before. With digital transmissions and this interactive guide, viewers had all iterations of televisual content available in the program guide interface of the Set-Top Box. New functions were introduced that allowed the Viewer more control over the flow; for example, Viewers could set reminders for future programs and could navigate through hours of television shows spread across different channels. These developments signified a major turning point in the development of Audience Flow, even while Content from the Channel Flow remained bound by linear time.

4.12 Video on Demand (VoD)

VoD and Catch-up content is stored as soon as programs are broadcast: they are recorded and become part of a television provider’s Video on Demand catalogue for subscriber access. This recording, together with its available metadata from the Electronic Programming Guide (EPG), as part of the Superflow of Television, has increased the accuracy and reach of AMTs today.


The information available from broadcasts can be measured across variable ranges of time thanks to this implementation. Eventually, some platforms, such as Netflix and Amazon Prime, would come to produce original content adapted to the flow of VoD and Catch-up.

The emphasis on television liveness and its importance has not changed in the Non-Linear era; “Live television (whether viewed through digital terrestrial or online) is particularly valuable for commercial broadcasters because viewers cannot fast-forward through the advertising breaks” (Johnson, 11). Programming was still bound to the constraints of time, the only programs available to re-watch were those from the past, and live programming maintained its importance among viewers. Viewers not interested in re-watching live TV could instead access a catalogue of video content available for limited intervals of time, such as recently released movies, for an extra price. This service was known as Video on Demand (VoD). It was the first OTT (Over the Top Television) streaming service as we know it today, and it is now considered one of the two classifications of VoD:

1. “Transaction VoD (T-VoD): the most typical version of the service, where customers need to pay a given amount of money whenever they want to watch content from the catalogue. The rental time is usually 24 or 48 h, during which they can watch it several times.

2. Electronic sell-through VoD (EST-VoD): a version of the VoD service involving the payment of a one-time fee to access the purchased content without restrictions, usually on a specific platform [37]. Although this type of VoD is more frequent on OTT providers like Apple iTunes and Amazon Instant Video, it [is] also being offered by traditional Pay-TV operators, like Verizon’s FiOS TV” (Abreu et al. 6).

Transaction VoD was considered an added extra for content, such as movies, that was complementary to watching Linear Television. Electronic Sell-Through VoD was available for sporting events and other Pay-Per-View content through Season Passes, yearly subscriptions to sporting events or leagues like the NFL or NBA. There are also services such as Netflix or Amazon Prime that charge a recurring subscription fee to access a catalogue of content, both original and syndicated; this third model is known as Subscription VoD (SVoD). The availability of these catalogues is owed to Time-shift Devices.
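The access rules that distinguish these VoD models can be sketched as simple predicates. This is a minimal illustrative model, not any operator's implementation; the function names and the 48-hour rental window are assumptions for the sketch.

```python
from datetime import datetime, timedelta

def tvod_accessible(rented_at: datetime, now: datetime,
                    rental_hours: int = 48) -> bool:
    """Transaction VoD: access only within the paid rental window."""
    return now - rented_at <= timedelta(hours=rental_hours)

def est_accessible(purchased: bool) -> bool:
    """Electronic sell-through: a one-time purchase grants permanent access."""
    return purchased

def svod_accessible(subscription_active: bool) -> bool:
    """Subscription VoD: the whole catalogue is open while subscribed."""
    return subscription_active

rented = datetime(2019, 8, 1, 20, 0)
print(tvod_accessible(rented, datetime(2019, 8, 2, 20, 0)))  # True: within 48 h
print(tvod_accessible(rented, datetime(2019, 8, 5, 20, 0)))  # False: rental expired
```

The contrast the sketch makes visible is that only T-VoD ties access to broadcast-style linear time; EST and SVoD decouple access from the clock entirely.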


4.13 The significance of Timeshift Devices to AMTs

Time-shift devices have been available since the inception of the VCR in the 1970s; however, the arrival of the Personal Video Recorder (PVR) was the next big step forward in the development of Television. “Time-shift TV refers to the visualization of deferred TV content, i.e. linear-TV content that is recorded to be watched later [...]” (Abreu et al., 5). For Viewers who were not able to view their preferred content when it aired, Time-shift devices such as the VCR and, more recently, PVRs made it possible to watch deferred content at any point in time. Viewers were also capable of pausing a transmission or even fast-forwarding through the commercial breaks (even though this also counted as an ad view). This was especially useful for viewers who wanted to videotape an episode of a television show or a movie for deferred viewing. Time-shift Devices also saw a technological advance as the hardware evolved (for example from the VCR to the DVR), with increasingly user-friendly hardware. This was in line with the notion of Television becoming more controlled by the Audience. In the beginning, this time-shift service was only available on television screens: through an interactive TV guide, viewers could record ongoing broadcasts or schedule recordings of future broadcasts in advance and store them on the hard drive of the set-top box. This service was originally created as a complement to linear television and is an example of how Non-Linear Television builds on Linear Television and by nature maintains some elements of it.

Later, with the introduction of Smartphones and Tablets as Personal Video Recorders, this would change, as it gave the viewer even more control over content and time (for example, more options of what to watch and where to watch it). “Personal video recorder (PVR) [...] behaviour is similar to that of a videocassette recorder (VCR), however with larger storage capacity and nonlinear access. The user can start watching a recording when he wants, even if the program is still being recorded [...]” (Abreu et al., 6). Now viewers could navigate a personal collection of recordings acquired straight off the television in digital quality and, what is more, all stored on the internal hard drive of the PVR.
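The "watch while still recording" behaviour Abreu et al. describe is, technically, just concurrent read and append access to the same storage, which tape cannot offer. A minimal sketch (the file name and chunk contents are invented; a real PVR streams MPEG transport packets, not strings):

```python
import os
import tempfile

# A hard-drive recording can be read from any offset while the tuner is
# still appending to the end of the same file ("chase play"), unlike a
# VCR tape, where the single head position forces strictly linear access.

path = os.path.join(tempfile.mkdtemp(), "recording.ts")

recorder = open(path, "ab")   # the tuner appends incoming video chunks
player = open(path, "rb")     # the viewer reads behind the write head

recorder.write(b"chunk1-")    # first minutes of the broadcast arrive
recorder.flush()
print(player.read())          # viewer starts watching from the beginning

recorder.write(b"chunk2")     # recording continues while playback runs
recorder.flush()
print(player.read())          # playback catches up on the newly written data

recorder.close()
player.close()
```

Two independent file handles, one appending and one reading, are all that "nonlinear access" amounts to at the storage level.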

My focus is mainly on Transaction VoD, as it was the first Over The Top content made available by Television Service Providers to paying viewers. The terms are also interchangeable, as OTT can describe extra non-linear services provided by channels or streaming platforms like Amazon.

Transaction VoD services were first implemented by Linear Channels to extend their reach within the audience. At first, they were a video rental service offered by television providers; this service evolved to include extra content that was complementary to popular linear programming. VoD services were always intended to keep the televisual flow live, with Video on Demand offered as an extra that complemented what was already being broadcast. All of these characteristics, together with the increasing number of devices available to access content, have led to the growing popularity of OTT services; “consumer adoption of these services is surging, driven by the increase in broadband Internet access and the availability of the services in a multi-platform approach simplified by “bridging devices”, that are making the process of watching OTT content using personal computers (PCs), gaming consoles, smartphones and tablets on the big screen more straightforward and accessible” (Johnson, 6). The importance of this chapter lies in the shift to the Matrix Era, which saw Television completely morph and the gap between television and computer narrow. AMTs were now built into the Set-Top Box and no longer functioned as external measurement. With the advancements in technology, AMTs further integrated into Non-Linear broadcasting and OTT services.

4.2 The Matrix Era (2000 - present)

In the late 1990s and early 2000s, viewers increasingly had access to computer networks, the Internet, and digital technology. This access and technological advancement facilitated the Streaming of audiovisual content and gave Viewers new tools by which they could exercise control over the Flow. Another technological advancement that played a key role in the development of television was the introduction of Smartphones from the early 2000s onward. The Matrix Era was a notable turning point for television, as its essence left the confines of the TV set and became a service available on all screens, ranging from smartphones, tablets, and videogame consoles to car dashboards and many other iterations to come. “All of our screens are now TVs, and there is more TV to watch on them than ever” (Weiner).


4.21 Television in Modern times

Just as Televisions became Computers through the implementation of Set-Top Boxes, Broadcasting also morphed into Streaming. This was made possible by new technology through which Television signals were no longer captured over the air by antennas (in the case of Commercial Television providers, while public-access television still relied on traditional broadcasts), but were now Streamed over the Internet. This new technology also allowed these “signals” to be stored on the Set-Top Box's hard drive (and later in the TV service provider's Cloud, a new invention). At the infrastructure level, modern Television saw drastic changes; with regard to the relevance of Program Flow, however, not much has changed despite significant advancement in technology. The concept of Televisual Liveness is still relevant too. Nevertheless, the internal Flow of programs has been designed to be independent from the traditional Channel Flow and Superflow of the Matrix Era.

In modern times, Audience Flow has become the central concept and has given rise to Streaming. Programs still maintain their own established Flow and succeed at self-promotion, which leads to consecutive (or “binge”) watching. Modern Program Flow is complemented by the modern version of Channel Flow in the non-linear era; the only junction between programs is a countdown to the following show. There is no “commercial break” in the traditional sense, but in the main interface view, algorithms for Content suggestion and sorting are present to inspire the Viewer based on their preferences and habits, alongside curated selections in the case of a new release or seasonal special. This gives rise to a new level of interaction between Viewer and Broadcaster. According to Abreu, Channel Flow is established by Content. The commercials of Linear Channel Flow have been replaced by self-promotion of other programs from the channel. This still builds on Linear-Flow concepts, by which Content is pre-established and self-promotion is a modern Junction of the Channel Flow. Such emphasis on the “still to come” is another characteristic of Linear Television that still applies to Non-Linear Television despite the transition of traditional Broadcasting into Digital Streaming. As briefly mentioned in the Network Era of Linear Television, the introduction and subsequent widespread use of Set-Top Boxes was a key driver in the improvement and accuracy of AMTs.
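Sorting a catalogue "based on preferences and habits" can be illustrated with a deliberately naive rule: rank titles by how often their genres appear in the viewer's history. The titles, genres, and scoring rule are invented for illustration; no platform's actual recommendation algorithm is implied.

```python
from collections import Counter

# Habit profile: how often each genre appears in past viewing.
watch_history = ["drama", "drama", "crime", "comedy"]
preferences = Counter(watch_history)

# Hypothetical catalogue entries, each tagged with metadata genres.
catalogue = {
    "Title A": {"crime", "drama"},
    "Title B": {"comedy"},
    "Title C": {"documentary"},
}

def score(genres: set) -> int:
    """Sum of how often each of the title's genres was watched before."""
    return sum(preferences[g] for g in genres)

# The "main interface view": titles sorted by affinity with past habits.
ranked = sorted(catalogue, key=lambda t: score(catalogue[t]), reverse=True)
print(ranked)  # ['Title A', 'Title B', 'Title C']
```

Even this toy version shows why such interfaces depend on the same content metadata discussed throughout: without genre tags and a viewing history, there is nothing to sort by.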
