

Authenticity in Game Emulation:

Obsolescence and Open Issues

Universiteit van Amsterdam

Master’s Thesis in New Media and Digital Culture

Student: Giovanni Carta

Supervisor: mw. dr. Fiorella Foscarini

Second Reader: dhr. dr. J.A.A. Simons


Abstract

This thesis addresses the digital preservation of complex digital objects such as video games, which require specific preservation actions. Emulation is currently the accepted, though problematic, approach to game preservation. Drawing on research from Digital Archiving and Digital Preservation, as well as from Game Studies and New Media theories, this thesis tries to answer the question of how emulation can guarantee the authentic preservation of video games. It argues that emulation as a preservation strategy presents several issues that interfere with the correct retention of authentic gaming environments.


Table of Contents

Introduction
1 Obsolescence
2 Digital Preservation and Video Games
2.1 Digital objects
2.2 Video games as complex digital objects
2.2.1 Video game preservation
3 Emulation as a long-term preservation practice
3.1 Emulation and migration
3.2 Emulation strategies
3.3 Open issues
4 Authenticity and significant properties
4.1 Authenticity
4.2 Significant properties
4.2.1 Significant properties and users
5 Games, open issues and metadata
5.1 A metadata model approach
5.1.1 Effectiveness of metadata within emulation
Conclusion


Introduction

In 2009, Millenniata, an American digital start-up company, announced the sale of an optical medium called the Millennial Disc, or M-Disc (Oliver). Unlike standard DVDs, the M-Disc is made of inorganic, stone-like materials which are claimed to resist heat, humidity, scratches and other deterioration factors better. According to Millenniata, the M-Disc can last for one thousand years. Whatever the M-Disc’s effectiveness may be, it is appropriate to raise doubts about the actual long-term efficacy of this product. Specifically, can we be sure that our future technologies will still support these special DVDs in one thousand years, or even far fewer?

Digital devices and digital objects (see ch. 2.1) cannot be preserved like traditional, analogue items. Digital preservation, in theory, must protect the conditions that make a) devices and computers work; b) software applications and operating systems run; and c) files readable, usable and understandable. Digital preservation, though, is running a race against the clock.

Due to the vicious circle of obsolescence, new media devices and technologies get old in a short time. This incessant mechanism affects the medium as an informational carrier and the information itself, understood as software applications and file formats, but also, more generally, all the digital information produced by digital media.

This thesis addresses the digital preservation of complex digital objects such as video games (see ch. 2.2), which require specific preservation actions. Emulation is currently the accepted, though problematic, approach to game preservation. Drawing on research from Digital Archiving and Digital Preservation, as well as from Game Studies and New Media theories, this thesis tries to answer the question of how emulation can guarantee the authentic preservation of video games. Authenticity is an important concept for assessing the quality of preservation, in relation not only to emulation but also to other digital preservation approaches.

Chapter 1 focuses on the problem of (digital) obsolescence, a condition that greatly influences the practices of preservation. Obsolescence is considered here as a complex condition mostly driven by economic factors and physical deterioration. Chapter 2 analyses the importance of and difficulties in preserving video games. In Chapter 3, emulation as a long-term preservation strategy is discussed, including its drawbacks and limitations. Chapter 4 examines the notion of authenticity in relation to digital fields such as digital archiving and digital preservation. The concept of significant properties is introduced here in order to better delineate the meaning of authenticity for video games. In Chapter 5, further issues related to obsolescence and authenticity concerning video games are presented. In addition, the need for a metadata model that supports emulation is discussed.


1 Obsolescence

New media are actually old media. It is rather self-evident: the universe itself, along with inanimate elements and living creatures, is influenced by the relentless effects of time. New media, as part of this global scenario, are no exception. Yet although ageing is not peculiar to them alone, new media are affected by a particular kind of degradation. Digital devices age fast and their functionalities decay in a short time, not only because of their implicit materiality, but also because of their relationship with capitalist and consumerist processes. These objects, in fact, might be considered old even when they are not.

Jonathan Sterne argues that new media are no longer new, as their newness is not related to old media (18). Rather, new media “are new primarily with respect to themselves” (19). Sterne’s point is that behind this distorted idea of newness lies a practice of planned obsolescence that works cyclically in order to ensure a constant replacement of devices. Industry and the market themselves decide what is new and what is old by designing, producing and commercialising products with the sole purpose of replacing their predecessors, regardless of their actual state of decay.

Although there is no univocal definition of planned obsolescence1, several authors (Boradkar; Hertz and Parikka; Slade) trace the origins of the term to Bernard London, an estate agent who wrote a pamphlet called Ending the Depression Through Planned Obsolescence in 1932. Given that people had been “using everything that they own longer than was their custom before the depression” (n.pag.), London conceived an economic strategy that consisted in assigning an expiry date to goods produced in the United States. Consumers would eventually return expired commodities to the government.

Even though planned obsolescence can be understood as a practice that aims to “artificially limit the durability of a manufactured good in order to stimulate repetitive consumption” (Slade 5), scholars within new media studies, such as Sterne, Garnet Hertz and Jussi Parikka, argue that this process is defined by a set of industrial and marketing strategies which make new media less desirable and usable after every cycle. Sterne himself observes that “Until they are 'obsoleted,' many computers show no significant signs of wearing out” (26), while Hertz and Parikka note that obsolescence can be identified as a form of “micropolitical level of design” which supports “black box” devices (426). An example of this particular industrial model can be found in those digital devices whose internal components are not easily upgradable or replaceable. Users are therefore either forced or persuaded to buy a brand new replacement instead of keeping the old one.

1 It should be noted that planned obsolescence is an issue that can be tackled from many angles and can therefore be interpreted from different perspectives. While according to social critics such as Giles Slade goods are “made to break” because of planned obsolescence, new media scholars such as Sterne and Parikka focus on the cyclical aspect of technological production, which is partly unrelated to the issue of physical durability.



Obsolescence also affects operating systems, computer programs and file formats. Neither hardware nor software is exempt from this problem. In 2013, software engineer Keith Bare and computer scientist Michael Dillo worked with Carnegie Mellon University and new media artist Cory Arcangel to extract a number of images from floppy disks belonging to Andy Warhol (D’Agostino). The American artist had created a series of digital pieces using a prototype of the Amiga 1000 in the second half of the eighties. These works, though, were never made public until Warhol’s digital equipment was restored and analysed. Among the many difficulties encountered, Bare and Dillo had to deal with file formats that were no longer supported. After successfully copying the files onto a new computer equipped with Amiga emulators, the two computer scientists were not able to open all of them. While some of the files were easily converted to a newer and legible format, others were apparently unreadable. Warhol had used a Commodore program called GraphiCraft, which mainly worked with .pic and .iff files. Although Bare and Dillo emulated the program used by Warhol, some of his files refused to open and could not be converted. Only after a hacking session did Bare successfully convert the unreadable .pic and .iff files into a newer format (Detailed Technical Report). Thirty years in the world of digital media is a long time, and Bare and Dillo had to work not as computer scientists, but rather as digital archaeologists. The example above demonstrates that successful operating systems and popular file formats can become obsolete in a matter of years. Technology companies systematically replace computer programs with newer versions and stop supporting older releases, while proprietary file formats usually require specific applications in order to be opened, as their encoding schemes are neither free nor standardised. Hence it is very complicated to access digital files and objects without full knowledge of how they work(ed).

In the absence of the right software that interprets the implicit structure of a format, each bitstream is actually a mere sequence of 0s and 1s which “can represent almost anything” (Rothenberg, Ensuring the Longevity 43). Bit sequences can be “interpreted as numbers, characters, images and the like. The meaning of the patterns is not always obvious. Numbers can be stored in a baffling variety of formats. An integer might [for example] be represented by 8, 16, 32 or 64 bits. The bits could be read from left to right or from right to left. Negative numbers could be encoded according to either of two conventions, called one’s complement and two’s complement. Other variations include binary-coded decimal and floating-point numbers” (Hayes 411). It is therefore necessary to preserve the digital framework that allows bits to be interpreted (encoded) correctly.
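
To make this point concrete, the short sketch below (in Python, my own illustration rather than part of the cited discussion) interprets one and the same four-byte sequence under several of the conventions Hayes mentions; the byte values are chosen purely for illustration.

import struct

# The same four bytes mean nothing by themselves: their value depends
# entirely on the convention used to decode them.
raw = b"\x80\x00\x00\x01"

print(struct.unpack(">I", raw)[0])  # big-endian unsigned 32-bit integer: 2147483649
print(struct.unpack("<I", raw)[0])  # little-endian unsigned 32-bit integer: 16777344
print(struct.unpack(">i", raw)[0])  # big-endian two's-complement integer: -2147483647
print(struct.unpack(">f", raw)[0])  # big-endian IEEE 754 float: a tiny negative subnormal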


As Rothenberg observes, “[d]igital media suffer from physical decay and obsolescence” (Avoiding Technological Quicksand 15). Independently of how they are produced or designed, digital media theoretically require constant maintenance and proper conditions to keep working. Therefore they are not free from the risks of ageing, electro-mechanical failures, disasters and attacks. Each digital machine, whether a smartphone, a video game console, a router in the Internet network or a cloud data server, is affected by the law of physical degradation. This is all the more so if we keep in mind that digital devices usually have to function and be efficient twenty-four hours a day, seven days a week. As Bruce Sterling likes to point out, new media, like the universe itself, are subject to entropy, which is all about “delamination, disintegration, deterioration, degeneration, decomposition, and doddering decline” (Digital Decay 14).

Component failures are common problems related to new media’s materiality. Devices usually have to perform constant work in variable conditions, and their inner components, made up of different and fragile microelements, have to work simultaneously and under the pressure of various factors such as changes in temperature, power surges and electromechanical wear. A single malfunction or variation in the proper functioning of a component can cause data loss or make it impossible to use the machine or the software installed on it. Computer chips “have limited physical lifetimes” (Rothenberg, Avoiding Technological Quicksand 13), and the incessant production of heat due to the conduction of electrons within circuits may cause severe repercussions for the future use of hardware. Regarding this matter, Parikka points out that in Finland decommissioned paper factories “are being reused as server farms partly because of their proximity to water, which acts as a cooling mechanism” (106).

All magnetic and optical data storage media suffer physical degradation because of diverse factors. Media degradation can be the result of hardware put “under conditions of high temperatures”, humidity, “contact with magnetic materials”, catastrophes and accidents, or “wear as a result of excessive use” (Ross and Gow IV). Storage media are also affected by so-called bit rot (also known as bit decay), the slow and relentless degradation of stored data. Once the decay has started to appear, especially on magnetic disks and EPROM chips, it is impossible to stop it. “[U]nused programs or features will often stop working after sufficient time has passed, even if 'nothing has changed'” (Raymond n.pag.).

One common misconception about new media regards their supposed immateriality. This assumption has been driven by several factors, such as an inadequate comprehension of the inner mechanisms of digital devices and the emerging idea of the cloud2. As Andrew Blum points out, the Internet is built upon a concrete interconnection of structures, but “the cloud asks us to believe that our data is an abstraction, not a physical reality” (n.pag.).

2 The cloud is understood here as a “large pool of easily usable and accessible virtualized resources (such as hardware, development platforms and/or services)” [italics mine] (Vaquero et al. PAG). The wrong idea of immateriality probably lies in the virtual nature of the cloud itself, which is perceived as virtual, but not concrete, by users.

New media can be functioning devices, but also e-waste, discarded hardware and toxic materials. In 2005, over three hundred million electronic devices were dumped by Americans (Greenemeier n.pag.). As a predictable consequence of planned obsolescence, nearly two hundred million of the discarded devices were still functioning. This trend, which is common to the majority of developed countries, appears to continue, as a recent United Nations report has shown (Kees et al.).

Understanding that new media are digital matter can be useful in order not to forget that they are a combination of software and hardware. Of course these notions can be considered independently on a practical footing, but the distinction can easily be challenged. This is not only because, from a materialist perspective, software itself is the result of tangible interrelations between technological design and human expertise (see Latour; Kirschenbaum), but also because software is a hardware inscription and bits are “material entities” (Blanchette 2; see also Kirschenbaum; Sterling, Delete Our Cultural Heritage). New media depend on matter and information (bits). But matter – digital or not – decays anyway.

In conclusion, we should consider obsolescence as a twofold concept that represents a double menace: new media decay is caused by planned cycles of obsolescence and by the relentless erosion of materials. Digital culture is inevitably governed by the laws of matter and techno-capitalism. It should be highlighted that both living and lifeless things have obsolescence inscribed into their existence; they are, therefore, planned with obsolescence. Rather than denying the obviousness of planned obsolescence, we must recognise that ongoing development and ongoing decay proceed at the same pace.



2 Digital Preservation and Video Games

In 1996, the then Intel Corporation chairman, Andrew Grove (who recently passed away), stated that “[d]igital information is forever. It doesn’t deteriorate and requires little in the way of material media” (qtd. in McKenna 14). Ironically, only one year before this assertion, digital preservationist Jeff Rothenberg had addressed the problem of digital obsolescence in one of the most influential articles in the field, claiming that “[t]he content of and historical value of thousands of [digital] records, databases and personal documents may be irretrievably lost to future generations if we do not take steps to preserve them now” (Ensuring the Longevity 42).

One may well have come across some digital preservationists’ “rants” somewhere. Digital archivists and preservationists often give examples comparing traditional media’s lifespan with new media’s; for instance: “We can read a thousand-year-old manuscript, yet archivists cannot decipher some materials that are less than 20 years old” (Gould and Varlamoff 46). These kinds of assertions may sound like truisms within digital preservation, but they undoubtedly describe the current situation rather well. Computer designer Danny Hillis reminds us, for example, that after the shutdown of the PDP-10 belonging to the MIT Artificial Intelligence Lab, “there was no place to put files except onto mag[netic] tapes that are by now unreadable” (qtd. in Brand, The Clock of the Long Now 84). This caused the loss of historically valuable data like “the early correspondence of the founders of artificial intelligence”, including Marvin Minsky and John McCarthy (84). Although well-reported cases of digital data loss are not numerous (Rothenberg, Avoiding Technological Quicksand; Ross and Gow), scholars’ and preservationists’ warnings should not be underestimated.

According to Kathleen Fitzpatrick, digital preservation is at an early stage of its development. Therefore, whereas we have centuries of experience in preserving traditional media such as books and other kinds of printed texts and manuscripts, we still have to refine our techniques for the conservation of digital information (123). However, the lack of adequate specialist knowledge is only one aspect of the issue.

Persistent obsolescence and ongoing technological development set a pace that is difficult to keep up with. Digital preservation must constantly adapt to this velocity of change by trying to understand how obsolescence can be tackled in the most effective way. This task is extremely demanding given that, as previously mentioned, digital information is concurrently both material and logical. New media require a completely different approach from traditional artefacts. One does not expect an object from the past (such as a vase or a hand tool) to be functioning or usable today, but it is sensible to argue that digital records, software applications and video games should retain their functionalities over time. We ask, as Doron Swade puts it, for the “functional intactness” of software, because only through a process of interactivity can digital information genuinely be understood and analysed (199). In the absence of this parameter, the risk is a loss in terms of both inherent informational content and evidential and cultural memory.

Every day, often unwittingly, we produce and interact with an incalculable amount of data. From instant messaging to emails, from ID magnetic badge swiping to online banking, bits are everywhere, out of human sight yet extremely ubiquitous. Nearly four billion gigabytes (3.7 exabytes) per month were generated worldwide by mobile networks in 2015, according to a Cisco white paper (n.pag.), whereas the global daily production of data was estimated to be 2.5 exabytes in 2012 (Wall n.pag.); over five hundred titles, counting both indie and major companies’ games, were released for the PlayStation 4 by the end of 2015 (Arikó n.pag.), while more than two million mobile apps are available on the Google Play store as of April 2016 (Number of Available Android Applications n.pag.). Although this is only a partial overview of the incredible amount of data in circulation, it is a good example of how much information will likely become obsolete. However, the actual challenge for digital preservation is currently not merely to save the gigantic amount of data produced every year by private users and companies, since “[t]here is more room to store stuff than there is stuff to store” (Brand, Escaping the Digital 46-47; see also Hedstrom 192), but to figure out what kind of techniques should be enhanced in order to allow data to be accessed in the future. As digital information is inherently diverse, various approaches should be considered.

2.1 Digital objects

Before examining migration and emulation preservation strategies, it is essential to introduce a concept, digital object, that will help us to address the main question of this thesis. For the purpose of this work, digital object must be understood as a coherent entity that is made up of bitstreams. Of course, this is just a methodological abstraction. As part of vast and varied networks, digital information is not so easily containable. Nonetheless, the term object draws attention to both the particular, material manifestation of bitstreams and the cohesive nature of the latter.

Archivist Kenneth Thibodeau distinguishes between three layers within a digital object: the physical layer, the logical layer and the conceptual layer. On a physical level, a digital object is represented by “an inscription of signs on a medium” (6). Bits are material inscriptions on the surface of a storage device that have no meaning until a process of encoding translates them into significant information. Hence, a digital object is also a logical object, because its medium inscriptions can be recognised by software as meaningful and interpretable information (7). Physical and logical layers are independent from each other because their governing rules are simply different. A physical object is the result of a material manifestation, whereas a logical object is a “unit” that is interpreted by a set of rules managed by an application. It should be mentioned that Thibodeau’s viewpoint is limited to digital documents such as PDF and word processing files. However, software applications themselves may be considered digital objects whose logical layer is recognised according to the logic of another application (see Thibodeau 7). For example, a computer program is a logical object according to the logic of an operating system, while a file is a logical object in accordance with the logic of a computer program. The third layer is the conceptual object, that is, the actual and meaningful representation of a digital object. The qualities and attributes of conceptual objects are contained within the logical layer. Given that the same digital object can be encoded differently depending on the application, there can be more than one logical layer representing the same conceptual object. We should add that the conceptual object is what a digital object is supposed to be at a representational (and functional) level. For this reason, the conceptual layer should remain unaltered over time. This aspect is crucial because, as we will see later, the same digital object can be represented differently according to different strategies of preservation. Although they are conceptually independent of each other, the physical and logical objects constituting the conceptual layer should be preserved in order to ensure that the integrity and authenticity of the object are not altered.
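
As a minimal illustration of this distinction (my own sketch in Python, not drawn from Thibodeau), the same conceptual object can be carried by different logical encodings, each producing a different physical bitstream:

conceptual = "Pong"                             # the conceptual object: what a reader perceives

logical_utf8 = conceptual.encode("utf-8")       # one logical encoding of that object
logical_utf16 = conceptual.encode("utf-16-le")  # another logical encoding of the same object

print(logical_utf8)   # b'Pong'
print(logical_utf16)  # b'P\x00o\x00n\x00g\x00', a different bitstream on the medium

# Both bitstreams resolve to the same conceptual object, but only as long as
# the rules for interpreting each logical layer are preserved.
assert logical_utf8.decode("utf-8") == logical_utf16.decode("utf-16-le") == conceptual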

2.2 Video games as complex digital objects

Technically, each digital object possesses a connotative set of properties, but it is possible to define its general characteristics by placing it within a class or category (see Giaretta 31-39). Our research will focus on the class of complex digital objects, more specifically on video games.

Gaming environments and complex digital objects such as FITS files (Giaretta 31), simulations, software art and virtual worlds (Delve and Anderson XXXVIII) are peculiar because of their problematic structure. When it comes to preserving these objects, “layers” of scale and detail must be taken into account (Delve and Anderson XXXVI). The issue of scale refers to the great amounts of data that need to be preserved (this is especially true of huge gaming environments and virtual worlds with users all over the world and constant daily traffic). The major difficulty here is to trace all the data produced and to decide how to preserve them. Another difficulty that preservationists have to face is related, for example, to the fact that data produced by users may be owned by companies. The issue of detail concerns “the problem of drilling down through many layers of technical items which have various levels of interconnectedness, both within digital objects themselves and also with their technical environments” (Delve and Anderson XXXVII).

Video games3 are not only a challenging case study per se, but can also be used as a prime example of several issues and contradictions which lie at the core of the new media world. Moreover, digital games are becoming so complex that they can be employed as a sort of reference point for investigating digital preservation. Henry Lowood notes that video games are inherently linked to other types of digital media: “There are all kinds of layers to the software, relating to other kinds of software. […] So, one argument [that] can be made for tackling [video games] digital preservation is–to put it bluntly–if you can solve that problem, you can probably solve any other digital preservation problem” (qtd. in Enis 44).

2.2.1 Video game preservation

Primarily, games are, like many digital objects, the result of a long process involving creativity, strategic planning and industrial production. They circulate as material commodities; hence, they are not exempt from issues of bit rot and physical degradation. Storage formats such as floppy disks, cartridges, CDs, DVDs and hard drives are susceptible to data loss. Although there is no agreement on what the lifespan of each of these devices4 is, EPROM5 cartridges seem to be the most vulnerable to bit rot (Monnens et al.; Gooding and Terras; Newman, Best Before).

Nowadays, an increasing number of games are released and distributed through specific online platforms such as Steam, Xbox Live Marketplace, Nintendo eShop and PlayStation Now. Nonetheless, these games still require storage, peripheral devices and computing power provided by specific hardware configurations, which in turn require maintenance and preservation actions6.

3 As it is used here, the term (video) game is interchangeable with both digital game and gaming environment.

4 For instance, figures regarding the longevity of CDs vary. In 1995, a CD was expected to last thirty years (Rothenberg, Ensuring the Longevity 44), while nearly ten years later the projected life cycle was about “5 years for low quality products to several decades for high quality products” (Webb and National Library of Australia 113). According to physicist John D. Cressler, a burned CD may retain data for twenty to one hundred years (439), while Bollacker puts its maximum lifespan at twenty years. Obviously, these predictions need to be taken with a grain of salt because they cannot consider all the variables (e.g. temperature, humidity, material quality, manufacturing defects) that influence media longevity (see Gladney 212-13).

5 Erasable Programmable Read Only Memory.

6 It is undeniable: matter is indissolubly related to video games. Since their first appearance, digital games have embodied an exemplary case of collision between hardware and software. Whether they are distributed through online platforms or sold in physical copies, games depend on material devices and, more generally, on matter. It is essential to stress this point because emulation occurs on a software level, where it is not possible to replace original peripheral devices and original hardware (e.g. consoles, monitors, joysticks, joypads, light guns, coin-op cabinets).


Furthermore, companies may decide to interrupt their service or remove a title whenever they deem it appropriate. Thus, the game “would only be available on systems whose owners had purchased and downloaded the title” (Monnens in Monnens et al. 145). The same issue applies to virtual worlds (Monnens et al.; Newman; McDonough et al.; Bartle) and, more generally, to all data whose existence mainly depends on cloud services. As long as the company decides to stay on the market, data are “protected” and accessible (at least in theory). But what if the company goes bankrupt or ceases to operate? How would it be possible to trace the story of unavailable digital objects?

In a nutshell: on the one hand, video games are threatened by the degradation and ageing of storage technologies; on the other hand, the digital industry has the power to make its own products “disappear”. More precisely, game companies (implicitly) complicate digital preservation through a series of practices (see Newman, Best Before; Monnens in Monnens et al.; Gooding and Terras). Game studies scholar James Newman, in his book Best Before, notes that these practices create the conditions for the production of technological obsolescence: “[M]uch of the work of the games industry, in its broadest sense, is diametrically opposed to the project of game history, heritage and preservation” (Best Before 9). Backwards compatibility is just one of the many examples of such a strategy. A hardware or software system is backwards compatible when it can work with the features of its previous versions7 (see W3C n.pag.). As historian of computing Paul Ceruzzi shows, this capability is actually a specific kind of emulation. In the early sixties, IBM started to install a program in its mainframe computers which “would allow the processor to understand instructions written for an earlier IBM computer. In this way IBM salesmen could convince a customer to go with the new technology without fear of suddenly rendering an investment in applications software obsolete. Larry Moss of IBM called this ability emulation” (149; see also Rothenberg, Avoiding Technological Quicksand 25; Slade 188). In fact, a hardware emulator for one of the first mainframe computers was sold by IBM itself already in the late fifties (Bollacker 110). Despite its wide implementation in game consoles and operating systems, backwards compatibility seems, however, to be flawed8 (Newman, Best Before 57-58) or incomplete (Monnens in Monnens et al. 143; Guttenbrunner 46). The reason is that, because of different hardware architectures and software protocols, older versions are not necessarily easy to emulate. For instance, with the advent of Windows 98, which was partially based on DOS, users had to run slowdown applications in order to play native DOS games: the retro-compatibility feature of Windows 98 could not emulate old games at the original speed because the newer architectures worked differently from their predecessors. Furthermore, as video game preservationist Mark Guttenbrunner shows, backwards compatibility is “a commercial strategy and not done with preservation in mind”; therefore, “usually only the previous generation of games is supported” (46).

7 “There are two kinds of backwards compatibility: the obvious one of a version of a specification with previous versions of the same, and another one of new technologies with earlier ones.” (W3C n.pag.).

8 Recently, for example, XBOX 360 and Playstation 4 users have reported problems with low frame rates and gameplay while playing old titles on their new consoles (Klepek n.pag.; Leadbetter n.pag.).


The existence of backwards compatibility, which should somehow extend the lifespan of digital objects, in fact exposes a contradiction in new media. Digital objects are repeatedly absorbed by the next layer of newness. This process of incorporation does not work in order to preserve prior layers: it is not meant to preserve anything, but rather to withdraw and slowly erase previous layers of newness. It is not by chance that Sean Cubitt states: “[t]he digital realm is an avant-garde to the extent that it is driven by perpetual innovation and perpetual destruction” (n.pag.). As soon as a game ceases to be a profitable product, companies and the industry are no longer interested in supporting it. It may be surprising to learn that, for instance, “a few companies do maintain archives of their own games” (Gooding and Terras 23), and many of them firmly oppose emulators and amateur preservation practices. In this respect, the comments made in 1999 by the then Nintendo spokesperson, Beth Llewelyn, are exemplary: “Emulators are illegal, and they continue to support counterfeiting and piracy” (qtd. in Kahney n.pag.). To date, Nintendo’s stance has not changed (n.pag.).

Besides being intellectual property owned by their companies, video games are an expression of our contemporary culture. They are, as Lev Manovich puts it, “cultural software” (7; see also Guttenbrunner, Becker and Rauber; Newman, Best Before) because they are influenced by the cultural knowledge of our age. Considering that digital objects and software are present in all facets of our everyday life, preserving these artefacts means allowing future historians and scholars to study the past from a privileged point of view. Richard Bartle summarises this point by considering that scholars may want to analyse a digital object a) to understand and learn from our culture; b) to learn more about our technology (e.g. how it was made, how it worked, how it was designed); and c) to appreciate it from an aesthetic and recreational perspective (Bartle 13-17).


3 Emulation as a long-term preservation practice

Emulation is a technique “in which one computer system reproduces the behavior of another system” (Rothenberg, An experiment using emulation VI). More specifically, emulation is a digital process that aims at simulating either specific software applications or hardware architectures (or even both). Emulation is usually done with an emulator, which recreates the “emulated system” within an “emulating system” or host system (see Rothenberg, Using Emulation).

Although the two are conceptually similar, emulation should not be confused with virtualisation, which creates a virtual machine that primarily (but not entirely) relies on the hardware of the host machine on which it is installed. Emulation, by contrast, can recreate an emulated system whose hardware (and software, of course) may be entirely different from that of the host system.
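
As a purely illustrative sketch of what “reproducing the behaviour of another system” involves (my own toy example in Python, not tied to any real console or emulator), the host program below performs the fetch-decode-execute cycle of an invented two-register guest machine:

def run(program):
    # Emulated state of the invented guest machine.
    regs = {"A": 0, "B": 0}
    pc = 0                              # emulated program counter
    while pc < len(program):
        op, *args = program[pc]         # fetch and decode one guest instruction
        if op == "LOAD":                # LOAD register, value
            regs[args[0]] = args[1]
        elif op == "ADD":               # ADD destination, source
            regs[args[0]] += regs[args[1]]
        elif op == "PRINT":             # PRINT register
            print(regs[args[0]])
        pc += 1                         # execute, then advance
    return regs

# A "ROM" for the guest machine, expressed as a list of instructions.
rom = [("LOAD", "A", 2), ("LOAD", "B", 3), ("ADD", "A", "B"), ("PRINT", "A")]
run(rom)                                # prints 5

A real emulator performs the same work for a complete CPU instruction set, memory map, video and sound hardware, which is precisely why accuracy and performance become issues.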

3.1 Emulation and migration

Since the late nineties, emulation has been considered a valuable long-term digital preservation strategy (see in particular Granger; Rothenberg, Avoiding Technological Quicksand; Ross and Gow; Webb; Van der Hoeven and Van Wijngaarden; Gladney; Guttenbrunner). In addition, emulation has often been associated with migration, which is also regarded as a preservation practice (see in particular Hedstrom; Lee et al., The State of Art; Webb; Gladney; Reis; Giaretta). Although emulation and migration can be combined in order to provide a mixed approach (see Lorie; Van der Hoeven and Van Wijngaarden), they are two completely different strategies.

Digital migration is the process of transferring digital objects from one software/hardware configuration to another. “In this context, one should distinguish between content migration, which transforms data from a source format into a target format, and media migration from one digital medium to another (either digital or non-digital) medium” (Reis 89).

While media migration is usually an easy and trustworthy operation (Gladney 9), content migration is liable to produce faulty results due, for instance, to the huge variety of proprietary formats. These can be regarded as black boxes, given that they are not compatible with every platform or software. A loss of authenticity and integrity therefore has to be taken into account. In addition, the same format may have different versions, which require multiple software configurations in order to be migrated and to remain accessible (Reis 89). Content migration is usually very effective when the process involves a target configuration model that implements the whole set of characteristics of its previous versions (Reis 89).
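
The distinction can be pictured with a small sketch (my own illustration in Python; file paths and formats are placeholders): media migration copies the bitstream unchanged to a new carrier, whereas content migration re-encodes it into a target format and therefore alters the bitstream itself.

import hashlib, shutil

def sha256(path):
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Media migration: byte-for-byte copy to a new carrier; a checksum
# comparison confirms that the bitstream is unchanged.
shutil.copyfile("manual.txt", "/mnt/new_carrier/manual.txt")
assert sha256("manual.txt") == sha256("/mnt/new_carrier/manual.txt")

# Content migration: the object is re-encoded into a target format (here a
# legacy Latin-1 text file rewritten as UTF-8). The resulting bitstream
# differs, so fidelity has to be argued rather than simply checksummed.
with open("manual.txt", "r", encoding="latin-1") as src:
    text = src.read()
with open("manual_utf8.txt", "w", encoding="utf-8") as dst:
    dst.write(text)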


The major difference between migration and emulation lies in the way digital objects are treated in the process. Within migration, the digital object is somehow changed in order to be recognised by a system. Emulation, instead of modifying the object, changes the environment in which the object works:

Migration is aimed at the digital object itself, and aims to change the object in such a way that software and hardware developments will not affect its availability. […] Emulation does not focus on the digital object, but on the hard- and software environment in which the object is rendered. It aims at (re)creating an environment in which the digital object can be rendered in its authentic form. (van der Hoeven and van Wijngaarden 1)

Migration seems to be an effective solution when a) it is not possible to plan an alternative preservation approach; b) it is used to preserve large amounts of the same kind of data (Granger n.pag.; see also Rothenberg, Avoiding Technological Quicksand 13-14); c) the object to be preserved is not complex; or d) the authenticity and integrity of the object are not fundamental (Reis 93; Rothenberg, Using Emulation 32).

Due to their multifaceted nature, complex digital objects cannot be easily migrated without a significant modification of the original bitstream and its functionalities. This applies to specific complex objects such as video games. Pinchbeck et al.’s position on this matter is very clear: “migration is of limited value in digital preservation generally, and of extremely limited value in the preservation of games […] the notion of migrating games, even without considering the general problems with migration, is clearly not feasible” (3). It should be added that migration of games is mainly done by companies through a technique known as porting. To ensure that a game runs on different platforms, it is necessary to recompile its original source code. This process can lead to the making of new and different versions of the same game (see Newman, Ports and Patches), or it can be employed solely to run the game on platforms other than the one on which it was originally created. In both cases there would be several digital objects, each with its specific bitstream. That said, porting is a good way to migrate games whose platforms are obsolete, and the process is neither error-prone nor infeasible (compare Rothenberg, Avoiding Technological Quicksand 13 and 16). Nonetheless, porting as a migration strategy cannot be regarded as a viable option by the “digital preservation community” due to “the nature of practicalities involved” (Anderson et al. 114; see also Pinchbeck et al. 3). Moreover, porting is primarily a commercial strategy pursued by game companies and is not intended to be a preservation approach.


3.2 Emulation strategies

Rothenberg, one of the first proponents of emulation as a preservation strategy, suggests an approach based on data encapsulation, a technique that serves to put together data and the information necessary to read them9 (Avoiding Technological Quicksand 17). The logical layer of an object and its application software (reader) have to be encapsulated. Furthermore, since Rothenberg’s strategy focuses on digital records, an operating system that manages the application software should be included. To finalise the process of encapsulation, it is necessary to add both information about the emulator that will run the application software – which may vary depending on the specific hardware configuration – and metadata about the document itself (Avoiding Technological Quicksand 19). The purpose of encapsulation is basically to have a toolbox containing the original bitstreams, a detailed description of the system configuration and emulator, and descriptive data about the digital object to be preserved. Although it is possible to emulate software applications and operating systems, emulation of hardware is to be preferred, since it is overly complicated to detail the behaviour of old computer programs (Rothenberg, Using Emulation 25).

Rothenberg discusses various emulation techniques. The layered emulation approach (also called chaining, see Slats 34) aims at creating an emulator chain for a platform P0. When P0 is becoming obsolete, an emulator EP0 should be written in order to run the current system on a newer one (H0). Once the host platform becomes old, the emulator EP0 is itself emulated by a newer emulator EP1 that runs on a newer host platform H1, and so forth. This approach leads to a chain of emulators running on an up-to-date platform in order to mimic the original platform P0, where an application software can read the digital object (Rothenberg, Using Emulation 39).
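
Returning for a moment to the encapsulation idea above, Rothenberg’s “toolbox” can be pictured as a small package in which the bitstream travels together with a description of its environment. The sketch below is my own illustration in Python; its field names are invented rather than taken from any existing standard.

import json

# A hypothetical encapsulation package: original bitstream reference,
# environment description, emulator information and descriptive metadata.
package = {
    "object": {
        "filename": "game.rom",
        "sha256": "<checksum of the original bitstream>",
    },
    "environment": {
        "application": "original game code",
        "operating_system": "console firmware",
        "hardware": "8-bit home console, PAL revision",
    },
    "emulator": {
        "name": "hypothetical-emulator",
        "version": "1.0",
        "configuration": {"region": "PAL", "video": "50 Hz"},
    },
    "metadata": {
        "title": "Example Game",
        "publisher": "Example Publisher",
        "year": 1987,
    },
}

with open("package.json", "w") as f:
    json.dump(package, f, indent=2)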

The emulator specification approach focuses on writing specifications of computer platforms using a standard programming language. At a later stage, it will be possible to select a particular specification in order to write a platform-oriented emulator (Rothenberg, Using Emulation 40). In a different approach, called specification interpreter, a sort of super-emulator selects a specification and runs an emulation process on a future host machine. Jacqueline Slats discusses a similar approach called rehosting, where an emulator is constantly ported to newer platforms without using chaining (35).

9 A clear definition of encapsulation can be found in the UNESCO Guidelines for the Preservation of Digital Heritage drawn by Colin Webb and the National Library of Australia: “Encapsulation is a widely adopted means of binding together data and the means of providing access to it, preferably in a ‘wrapper’ that describes what it is in a way that can be understood by a wide range of technologies (such as an XML document). Because it is often impractical and unnecessary to encapsulate the actual means of access such as software and hardware, encapsulation usually bundles metadata describing or linking to the correct tools” (127).


Emulators can also run on a virtual machine that mimics the original hardware configuration. As soon as the host system becomes old, the virtual machine can be ported (or even emulated) to a newer system. It is therefore not necessary to migrate old emulators or to rewrite them every time a platform becomes obsolete. Raymond Lorie, an engineer at IBM, proposes an approach based on a universal virtual computer (UVC). The UVC is essentially a software application that loads archived objects that are encapsulated with a compatible program. Both the virtual machine and the program are written in the same simple programming language. The UVC is written in such a way that it is independent of the host platform. In addition, given its relative simplicity, the UVC can easily be emulated on future platforms.

The modular emulation approach aims at emulating specific hardware components of a given computer configuration. A module is a set of software instructions that replaces an original hardware device (Rothenberg, Using Emulation 44). This approach is discussed by Jeffrey van der Hoeven and Hilde van Wijngaarden, who define modular emulation as “Emulation of a hardware environment by emulating the components of the hardware architecture as individual emulators and interconnecting them in order to create a full emulation process” (10). Similarly to the previous approaches, modular emulation seeks to mimic a hardware configuration instead of emulating applications or operating systems. This kind of emulation is called a full emulation process because hardware platforms are emulated by a so-called surrogate, namely an emulator that reproduces the original configuration of each single device (5). The modular emulation proposed by van der Hoeven and van Wijngaarden is based on a Universal Virtual Machine (UVM), an advanced version of the UVC. Both virtual machines are substantially independent of the system in which they are installed. The UVM provides virtual support hardware for a modular emulator. For each platform, a modular emulator can load a specific library composed of different modules (e.g. CPU, memory, sound). Within the UVM environment, the emulator can work with the original application software (and operating system) and the original bitstream of the digital object. In theory, thanks to this approach, each obsolete platform can be emulated according to a specific set of modules loaded in the UVM. In addition, the UVM, which is programmed using a standard language, can be installed on future platforms, thus avoiding the need to write new emulators and module libraries.

From a technical point of view, modular emulation is the best approach to preserve and emulate video games. The emulator and its module library could guarantee an accurate emulation of inner components with an emphasis on long-term preservation. The great practicability of the virtual machine is indeed a huge advantage. However, as shown by Guttenbrunner, the majority of game emulators implemented so far are not based on the UVC concept (84). Therefore, they are themselves, like any other digital objects, at risk of obsolescence and need to be preserved. A notable exception is Dioscuri, a modular emulator that is based on a virtual machine written in Java, and whose development is supervised by van der Hoeven. Dioscuri, though, has been tested primarily with simple objects within simple operating systems (MS-DOS and MS Windows 3.x), and its efficacy still has to be proven. Furthermore, UVC-based emulation is not free from drawbacks. The main disadvantage regards the several layers of computer architecture, both real and emulated, involved in the process. Think, for example, of a computer that has to work with an operating system in which a virtual machine has to be installed along with a modular emulator, its libraries and the native environment (an application software and an additional operating system) of a digital object. The entire burden falls on the host machine’s computing power, which, at present, may not be enough to sustain the process.
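
A skeletal sketch of the modular idea (my own illustration in Python; the class and library names are invented and do not correspond to any real emulator framework) shows how per-component modules can be interconnected into one emulated machine, with a different module library loaded for each platform:

class Module:
    def tick(self):                      # advance the component by one step
        raise NotImplementedError

class CPU(Module):
    def __init__(self, memory):
        self.memory, self.pc = memory, 0
    def tick(self):
        _ = self.memory.read(self.pc)    # fetch; decode/execute omitted here
        self.pc += 1

class Memory(Module):
    def __init__(self, size):
        self.cells = [0] * size
    def read(self, addr):
        return self.cells[addr % len(self.cells)]
    def tick(self):
        pass                             # RAM does no work of its own per cycle

class Sound(Module):
    def tick(self):
        pass                             # a real module would mix audio here

def build_machine(library):
    """Interconnect the modules of one platform's library into a machine."""
    memory = library["memory"](1024)
    return [memory, library["cpu"](memory), library["sound"]()]

# One module library per obsolete platform; swapping the library emulates another machine.
platform_x_library = {"cpu": CPU, "memory": Memory, "sound": Sound}
machine = build_machine(platform_x_library)
for _ in range(3):                       # run a few emulated cycles
    for module in machine:
        module.tick()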

3.3 Open issues

The emulation approaches described here offer the opportunity for some considerations. Early advocates of emulation as a digital preservation practice were mainly interested in simple digital objects, such as records and documents. This is strange considering that, since the end of the nineties, a huge community of amateurs and aficionados has been writing and distributing emulators on the web with considerable results. Perhaps an explanation can be found in the lack of interest in emulation itself as a preservation practice. The attention to video games, which are often considered too trivial to be preserved, is therefore something relatively recent.

The reality is fairly different from emulation proponents’ views. Several emulation frameworks have emerged over the years10. However, video game emulators are mainly developed by “a grass-roots movement led by a few dedicated programmers” (McDonough et al. 62). The Multiple Arcade Machine Emulator (MAME), which was initially released in 1997, is probably the most important video game emulator: “MAME main purpose is to be a reference to the inner workings of the emulated machines. This is done both for educational purposes and for preservation purposes, in order to prevent historical software from disappearing forever once the hardware it runs on stops working” (n.pag.). Most importantly, MAME has recently been released under an open source licence. Thanks to this feature, more people may potentially contribute to the project, while MAME can be easily ported to future platforms. It is, indeed, a good example of scalability (see Guttenbrunner 59) because the emulator’s development can be monitored by a bigger community and it does not depend on proprietary modules. Hopefully, it will therefore be possible to contain the problem of recursion, that is, the situation in which “there is no longer compatible hardware to run the emulator itself” (Bollacker 110). MAME can be considered an exception, because most accessible emulators are not developed according to the basic principles of long-term preservation.

10 Among them are the KEEP (Keeping Emulation Environments Portable) project and the Internet Archive framework. KEEP is an EU-funded project, started in 2009 and ended in 2012, that principally focused on complex digital objects, while the Internet Archive website has a few sections devoted to video game emulation.



A further problematic issue in the approaches discussed above is the lack of discussion about metadata, that is, data about data. Metadata are indispensable not only because they provide valuable details about the provenance and history of an object, but also because they can supply technical information about its original platform and environment. The processes of digital preservation can undoubtedly benefit from a smart use of metadata. One of the most sensible critiques of Rothenberg’s approach was made by David Bearman, who argued that an analysis of the metadata involved in the process of emulation would be necessary:

Rothenberg’s proposal does not even try to define the elements of metadata specifications that would be required for the almost unimaginably complex task of emulating proprietary application software of another era, running on, and in conjunction with, application interface programs from numerous sources, on operating systems that are obsolete, and in hardware environments that are proprietary and obsolete. (N.pag.)

Although it is not possible to give an extended overview of metadata here, it may be worth taking a quick look at their purposes and uses. Among the many goals of capturing data about data are “identifying items uniquely”, “grouping items into collections within a repository”, “recording authenticity evidence” and “helping protect item integrity against improper change and unintentional corruption” (Gladney 129). Each discipline, though, has its own priorities, and may therefore use metadata differently. A digital artefact displayed in a museum might require a set of metadata that emphasises its historical authenticity, while a video game selected for emulation should be encapsulated with technical information about the original platform. The choice to give priority to specific metadata depends on the purpose of the object. The OAIS reference model, for example, distinguishes between three macro-categories of metadata (Giaretta 17). The first is that of understandability, which helps us to interpret and understand digital objects; the second concerns “origins, context and restrictions”, which are fundamental to knowing the provenance of an object, its past uses and the people who modified it; the third deals with the relationship between metadata and data, that is, how objects are linked with their metadata (Giaretta 17).

According to NISO, metadata can be divided into three major categories: descriptive metadata, structural metadata and administrative metadata (1). Descriptive metadata provide identifying details about a digital object. They give basic but essential information including title, authors, keywords and language. These metadata facilitate the search process within digital archives and allow for the unique identification of objects. Structural metadata describe the internal structure of a given object, its dependencies and its relationships with other objects. They are necessary to analyse those digital objects that are made of elements which can be treated individually: think, for example, of a PDF file page or, more interestingly, the music in a game. Administrative (or technical) metadata are used to give technical details regarding an object, such as the application software and hardware required to run it, copyright information and technical features. They are essential to manage and preserve digital objects as they may, for instance, provide crucial information about emulators and their execution. To sum up, metadata perform three general tasks: a) to identify the content of a digital object; b) to identify the global history of a digital object; c) to identify how digital objects and their components are linked to one another and to other objects.
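
By way of illustration only (an invented record for a hypothetical title, not an existing standard or an actual game), the three NISO categories could be populated as follows:

game_metadata = {
    "descriptive": {                    # identification and discovery
        "title": "Example Quest",
        "developer": "Example Studio",
        "year": 1992,
        "language": "en",
        "keywords": ["platformer", "16-bit"],
    },
    "structural": {                     # how the parts of the object relate
        "components": ["game.rom", "soundtrack/track01.mod", "manual.pdf"],
        "depends_on": ["console firmware image"],
    },
    "administrative": {                 # technical and rights information
        "original_platform": "16-bit home console, PAL region",
        "recommended_emulator": {"name": "hypothetical-emulator", "version": "1.0"},
        "checksums": {"game.rom": "<sha256>"},
        "rights": "copyright held by the publisher; preservation copy only",
    },
}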

Each digital object displays its own specificities, and metadata should be able to describe the version of a game as well as the version of a module applied in emulation. The issue of metadata for video games is fundamental, yet rather overlooked. As pointed out by Jerome McDonough, “generating adequate descriptions of the games we wish to preserve” is probably more crucial than developing a preservation strategy. In fact, “[p]reservation of computer games is in many ways a knowledge management problem, and without adequate metadata, managing the knowledge necessary to keep a game accessible and understandable is an insurmountable task” (186). Nonetheless, the majority of video game emulators do not support metadata (see Guttenbrunner, Becker and Rauber 87), and even descriptive metadata are sometimes difficult to provide. Gooding and Terras show how difficult it is to obtain trustworthy information about games (33). Online resources usually rely on contributions by amateurs, while game companies, which could provide genuine information, both descriptive and technical, show little interest.

Another interesting point, which is somehow related to the variety of uses of emulation, is that authenticity is not widely discussed, almost as if it were taken for granted. As many authors argue, authenticity is the principal advantage of preserving digital objects through emulation (see Rothenberg, Avoiding Technological Quicksand and Using Emulation; Slats; van der Hoeven and van Wijngaarden; Gladney; Reis). The major reason is that emulation usually works with original objects, without the need to migrate them. A change in the structure of the original bitstream may indeed cause a loss of authenticity. In this regard, Slats argues that authenticity can be defined as the sum of authentication and integrity, which are concepts related to digital archiving. The term integrity indicates the capability of an object to retain its original structure over time. For instance, suppose a given digital object A¹ has been migrated to a newer storage medium. The copy A² and the original A¹ share the same sequence of bits; therefore A² has integrity because it is exactly alike A¹ (Gladney 98). On the other hand, the process of authentication ensures that a digital record has been verified by someone who has the authority to accomplish this role. In other words, authentication warrants that the record “does not result from any manipulation, substitution, or falsification” (Duranti 7). However, these criteria, which are inherently variable, are not sufficient to develop an authentic emulation of complex digital objects. Because of the highly technical context in which digital objects and emulation are involved, authenticity is in fact “difficult to define precisely” (Slats 11). Furthermore, it is not simple to define stable parameters for authenticity, which may vary from purpose to purpose and from discipline to discipline. “For example, if the only meaningful aspect of a text document is considered to be the sequence of words contained in its body, then a conversion that loses or corrupts the document’s original layout, structure, pagination, fonts, spelling, punctuation, footnotes, cross-references, citations, etc. may be acceptable” (Rothenberg, Using Emulation 15). Authenticity within gaming environments cannot rely only on integrity and authentication. Even though from a technical point of view the original bitstream remains unchanged, the original conceptual object may be modified by the process of emulation itself (see Guttenbrunner and Rauber on rendering 14). A flawed reproduction of the original platform changes the look and feel of the original object because it modifies the pivotal relationship between software and hardware.

Guttenbrunner shows that it is possible to select some criteria in order to evaluate accessible emulators for preserving console video games. Significant properties of digital objects are chosen as part of the evaluation process. According to Guttenbrunner, the authentic look and feel of a game is defined by a) sound, b) graphics, c) network support, d) interactivity, e) input, and f) support for additional items (54-58). However, to truly evaluate the process of emulation it is necessary to compare the “characteristics and properties of the original system and the digital object. Then these same properties are extracted from the emulation environment” (Guttenbrunner and Rauber 19). These characteristics also depend on the type of video game that is to be emulated and analysed. For example, a Massively Multiplayer Online Role-Playing Game (MMORPG), which is fundamentally played through the Internet, has some traits that are not traceable in many other games. MMORPGs are unique in their own way because they are structured as virtual worlds (which cannot be preserved through emulation). These games are certainly a borderline case, given that they present a specific problem in digital preservation (see McDonough et al.; Bartle). Nonetheless, it is sensible to argue that “parameters of authenticity” in game emulation can be selected case by case and according to a predefined goal.
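
The kind of comparison described by Guttenbrunner and Rauber can be sketched schematically as follows (my own illustration in Python; the recorded property values are invented placeholders, not measurements):

PROPERTIES = ["sound", "graphics", "network support",
              "interactivity", "input", "additional items"]

original = {                             # properties observed on the original system
    "sound": "3 channels, mono",
    "graphics": "256x224, 50 Hz (PAL)",
    "network support": "none",
    "interactivity": "response within one frame",
    "input": "two controllers, light gun",
    "additional items": "memory card",
}

emulated = {                             # the same properties in the emulation environment
    "sound": "3 channels, mono",
    "graphics": "256x224, 60 Hz",        # refresh rate deviates from the original
    "network support": "none",
    "interactivity": "response within one frame",
    "input": "keyboard mapping only",    # original peripherals are not supported
    "additional items": "memory card",
}

for prop in PROPERTIES:
    status = "matches" if original[prop] == emulated[prop] else "DEVIATES"
    print(f"{prop:18} {status:9} original={original[prop]!r} emulated={emulated[prop]!r}")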


Authenticity and significant properties

Authenticity is a recurring notion within digital archiving and digital preservation. It is not simple to give a clear definition of the term authenticity, though, as every discipline provides a different interpretation of it. However, it can be argued that authenticity is central to assessing correct parameters for the quality of digital object preservation. The authenticity of games within emulated environments is a crucial aspect and can be defined as the capacity to reproduce the gameplay and performance of a given object as closely as possible to the original. In other words, an authentic emulation should retain the whole set of aesthetic, functional and interactive characteristics of the game.

Unlike "analogue artefacts", digital objects, which are dynamic by their very nature, are influenced by social and technological processes. Applying the concept of authenticity can therefore be a very tricky task. Firstly, digital objects may differ greatly from one another. A digital record, for example, has characteristics which can hardly be found in a software application. Furthermore, objects may involve different technical properties within the same category. The current generation of digital games presents a degree of technological complexity that is not present in early games. In addition, to make things even more complicated, every reproduction or rendering of a specific object is potentially unique in its own way, by virtue of the fact that it has to be processed through a variable combination of software and hardware. Furthermore, objects can be altered by digital preservation processes (migration, for example) or by unauthorised edits. Every change in the object or in its metadata may modify its integrity and authenticity. Finally, new or unoriginal digital environments may change the functionalities and look and feel of objects. These are determining factors in establishing the authenticity of complex digital objects such as video games, whose interactivity and rendering are essential for their correct execution.

The continuous process of preservation, which tries to overcome the problem of obsolescence, forces us to question the concept of authenticity, as it modifies environments and objects in order to allow (future) users to visualise, read, use and edit them. “The authenticity of digital resources is threatened whenever they are exchanged between users, systems or applications, or any time technological obsolescence requires an updating or replacing of the hardware or software used to store, process, or communicate them” (Giaretta 207).

The paradox of the concept of authenticity lies in the false reasoning that digital objects may retain their functionalities and inherent properties throughout time because of their supposed immateriality, or that they may be subjected only to light modifications that leave them recognisable after all. No object actually remains the same object, let alone digital records and software applications, which are compelled to face obsolescence much earlier than their "analogue" counterparts.

Authenticity

In the field of digital archiving, Luciana Duranti distinguishes between the concepts of reliability and authenticity. She argues that reliability alludes to the "trustworthiness of the records as evidence" (6). In Duranti's reasoning, a record is understood as a document produced during a formal procedure, and can be considered reliable when it shows a certain degree of evidence in relation to its content. For instance, a driver's licence is reliable as long as it states that its owner has the right to drive a vehicle. Hence, a reliable document must adhere in its form and content to the socio-juridical system in which it has been created. Depending on the system or the rules of record production, a document can be reliable to a greater or lesser degree. Rules of record creation are extremely important in creating the conditions for a highly reliable document. A licence may be considered unreliable if issued without the stamp of a recognised authority.

On the other hand, a document is proven to be authentic when it has not been manipulated, substituted or counterfeited, or, in other words, when it is the document that it claims to be (7). While reliability refers to the socio-juridical system that makes the creation of a document possible (at a given time), authenticity is primarily related to the ongoing process of transmission of documents and their copies (over time). The authenticity of a document therefore has to be regularly monitored and, where necessary, updated because, according to Duranti, it may change over time.

Although this interpretation of authenticity appears to be particularly effective in relation to digital documents that are subject to modifications and transfers, it is hardly applicable to complex digital objects such as video games and software applications in general. A non-manipulated object is certainly the premise for every preservation process, but one should also take into account the software/hardware environment on which that given object is executed.

Archival scholar Heather MacNeil and digital culture expert Bonnie Mak argue that the concept of authenticity is related to a kind of trust built "on centuries of social, cultural, and political negotiations among archives and libraries, their governing bodies, and the public" (27). Nevertheless, this trust can be challenged, as authenticity has to be considered a dynamic concept. The authors argue that authenticity is a factor that relies on specific contexts and times in history. The authenticity of an artwork, for example, does not remain fixed in time, but changes because of interactions with the social environment and preservation practices. Despite restoration work, a piece of art still remains authentic because it "does not exist outside of the discursive framework of conservation theory and practice" (33). In addition, MacNeil and Mak point out that different versions of the same artwork are not "corruptions" (36), but spontaneous creations resulting from the interaction between the artist and other prominent figures such as critics, reviewers and editors. "The notion of an authentic record in evidence law (in the sense of a trustworthy statement of facts), like the notion of an authentic art work or literary text, is shaped to a considerable degree within a specific social and institutional framework" (43). Therefore, we should say that the authenticity of artefacts is unavoidably influenced by the social and institutional environment in which the objects are placed.

For these reasons, it is important to understand that preserving the original object through (exact) copies of the original bitstream is not enough to achieve a high presumption of authenticity. Maintaining the original bitstream is only a first step towards avoiding the issues related to the obsolescence of the physical object (see Gladney 218). Both digital archivists and digital preservationists agree, as mentioned earlier, that every digital object is part of a web of social and technological relations. In the area of digital preservation, Rothenberg argues that saving the bitstream of a digital object is not sufficient to preserve its authenticity. It is equally necessary to save the conditions that make it possible to read that object. Therefore, it is also necessary to preserve the metadata referring to the object in order to make it usable. Finally, one should save the bitstreams that can be used as an "interpreter" for rendering the digital object (Preserving Authentic Digital Information). Rothenberg also points out that "preservation implies meaningful usability" (54). Every object has to be preserved in such a way that users are able to (re)use and read it.
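Rothenberg's three requirements (the original bitstream, the metadata that makes it usable, and the bitstreams acting as an "interpreter" for rendering it) can be imagined as a single preservation package that travels through time together. The sketch below is only an illustration of this grouping; the class and field names are hypothetical and do not correspond to any format proposed by Rothenberg.

```python
from dataclasses import dataclass, field

@dataclass
class PreservationPackage:
    """Illustrative grouping of what must be kept together for later rendering."""
    original_bitstream: str                      # path or identifier of the game's bits
    descriptive_metadata: dict = field(default_factory=dict)
    interpreter_bitstreams: list = field(default_factory=list)  # e.g. emulator, BIOS images

# Hypothetical example for a PC-DOS title.
package = PreservationPackage(
    original_bitstream="wolf3d_disk_image.img",
    descriptive_metadata={"title": "Wolfenstein 3D", "platform": "PC-DOS", "year": 1992},
    interpreter_bitstreams=["dos_environment_image.bin", "x86_emulator_binary"],
)
print(package.descriptive_metadata["title"])
```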

According to Rothenberg, authenticity may be defined in relation to different approaches. The first one, called "originality strategy", focuses on keeping the preserved object as faithful to the original as possible.

The second approach, unlike the previous one, relies on the history and provenance of an object. Digital archivists, for example, are interested, as we have seen, in defining the authenticity of an object primarily by checking who accessed it, how it has been modified and whether it has been replaced.

The reports of the InterPARES project, an international multidisciplinary research project on the preservation of digital records, state that a document can be considered uncorrupted if its presumed message is unchanged. "This implies that its physical integrity, such as the proper number of bit strings, may be compromised, provided that the articulation of the content and any required annotations and elements of documentary form remain the same" (MacNeil 45). Similarly, a digital object can be considered authentic even if it lacks some features, as long as its meaning is unchanged.


The third approach is called "suitability strategy" and it focuses on defining the authenticity of an object depending on "a range of purposes or uses" (Rothenberg, Preserving Authentic Digital Information 57). "One extreme is that a given range of expected uses might imply the need for a digital informational entity to retain as much as possible of the function, form, appearance, look and feel that the entity presented to its author" (61), or perhaps to its readership, or even "regardless of the capabilities that the original authors or readers of those entities may have had" (62). Rothenberg does not mention that this third strategy actually encompasses the previous two. If a use is established by a user or a community of users, the originality and provenance of objects have to be taken into account.

Significant properties

Depending on uses, users and the characteristics of digital objects, it is possible to identify some significant properties. "Significant properties are those properties of digital objects that affect their quality, usability, rendering, and behaviour" (Hedstrom and Lee 218). Due to obsolescence and other socio-technical reasons, digital objects cannot be maintained in their original form and environment. Therefore, some essential characteristics have to be defined in order to preserve the "core" of an object. In this way, the object would still remain that object even when it does not retain its full, original features. Needless to say, this implies that it may be impossible to establish "perfect" authenticity, given that some parts of a given object might not be reproduced. The question is: to what extent can a digital object involved in a preservation process retain its authentic behaviour and performance?

“Significant properties are essential to define, as they play a major role in determining authenticity” (McDonough et al. 14). However, defining which properties are essential in order to establish an object’s authenticity is not an easy task.

Video games, in particular, can be described by using a number of criteria that help to define significant characteristics (see Guttenbrunner 55-58):

• Music and sound effects.

• Graphics.

• Interactivity.

• Input.

• Network support.


From a technical standpoint, multiple layers can be identified to define the essential properties of digital games (Decker et al.). Input and output layers focus on the peripheral devices (e.g. joypads, joysticks, keyboards, controllers), visual displays (e.g. monitors, screens, LED indicators) and audio technology that make the interactive experience possible. The hardware layer addresses the CPU, RAM and all those components that form the hardware configuration of a platform. The firmware and support software layers describe the "built-in instruction functionality for a particular system" (3) and the software environments (e.g. operating system, dependencies, libraries, drivers) respectively. The experience layer "deals with the actual representation of the game" and its quality (4).
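The layer model of Decker et al. can also be restated as a simple description template. The concrete values filled in below are hypothetical and are only meant to show how the layers separate concerns; they are not taken from the cited paper.

```python
# Hypothetical layered description of a game platform, following the
# input/output, hardware, firmware, support-software and experience layers.
platform_description = {
    "input_output": {
        "input_devices": ["keyboard", "two-button joystick"],
        "display": "15 kHz CRT monitor",
        "audio": "mono speaker",
    },
    "hardware": {"cpu": "8-bit CPU at 1.79 MHz", "ram": "2 KB"},
    "firmware": "built-in boot ROM",
    "support_software": ["operating system", "drivers", "libraries"],
    "experience": {"target_frame_rate": 60, "expected_input_latency": "low"},
}

# Print each layer and the details recorded for it.
for layer, details in platform_description.items():
    print(layer, "->", details)
```

Keeping the layers separate makes it easier to state which of them an emulator reproduces faithfully and which it only approximates.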

Although these parameters can likely be used to describe video games, significant properties may vary depending on the type of object and its level of complexity. Let us consider an example to illustrate this issue. Zork is a famous text-based game released for various platforms in the eighties. Games similar to Zork do not use 2D or 3D graphics, but only ASCII characters; users thus interact by typing commands. Wolfenstein 3D is a pioneering first-person shooter originally released for the PC-DOS platform in 1992. Users play in an early three-dimensional environment as a fictitious American spy escaping from a Nazi prison in Germany. The significant properties of Zork are completely different from those of Wolfenstein 3D. Zork is a simple and restrained game that does not use graphics effects, while Wolfenstein 3D is a frenetic game that is built on a specially created engine. A list of significant properties for each game has been compiled in Table 1 to show the differences and issues.

Significant properties – Zork (PC-DOS):

• Text recognisability and readability

• Keyboard as input device

• Original font

• White text on a black background

Significant properties – Wolfenstein 3D (PC-DOS):

• Graphics engine (e.g. FPS, colours, virtual environment)

• Keyboard as primary input device

• Original fluidity of controls

• Sounds and music correctly executed

Table 1. Significant properties of Zork and Wolfenstein 3D
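The entries of Table 1 could likewise be expressed in machine-readable form, so that an evaluator (or, later, a metadata model) can check them against an emulated rendering. The sketch below simply re-encodes the table; the pass/fail values are invented for the sake of the example and would have to come from an actual evaluation.

```python
# Significant properties from Table 1, re-encoded as checklists.
significant_properties = {
    "Zork (PC-DOS)": [
        "Text recognisability and readability",
        "Keyboard as input device",
        "Original font",
        "White text on a black background",
    ],
    "Wolfenstein 3D (PC-DOS)": [
        "Graphics engine (e.g. FPS, colours, virtual environment)",
        "Keyboard as primary input device",
        "Original fluidity of controls",
        "Sounds and music correctly executed",
    ],
}

# Hypothetical evaluation of an emulated rendering of Zork: each property is
# marked True or False by a human tester or an automated check.
evaluation = {prop: True for prop in significant_properties["Zork (PC-DOS)"]}
evaluation["Original font"] = False  # e.g. the emulator substitutes a different font

preserved = sum(evaluation.values())
print(f"{preserved}/{len(evaluation)} significant properties preserved")
```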

These properties have been selected according to the objects' characteristics, but it is not straightforward to establish what is and what is not significant. From an objective point of view, every characteristic should ideally be included in a list of significant properties. However, one may argue that retaining, for instance, the original font of Zork is not essential as long as the game is correctly executed. The same reasoning may be applied to Wolfenstein 3D: why does the soundtrack have to be executed without errors if this does not affect the original gameplay? Technically, it can be argued that every property is essential, since it contributes to the singularity of
