
Known occupational exposure to health risks:

How the Dutch armed forces handled expertise regarding chromium-6 in the face of social amplification of risk.

Master thesis

Name: Colin Crooks

Student Number: s1384856

First reader: Dr. M.L.A. Dückers
Second reader: Dr. S.L. Kuipers
Capstone Public Safety & Health
Master Crisis & Security Management
June 10, 2018, The Hague

Leiden University


Abstract

This thesis concerns the 2014 crisis, or scandal, over the exposure of employees of the Dutch armed forces to chromium-6. The motivation for this research is media coverage claiming that knowledge of the risk and of the exposures existed at the time the exposures occurred. One possible explanation is that a persistently poor safety culture prevented organizational improvement over time. Safety culture theory is mostly applied commercially in industrial settings and healthcare, which provides an opportunity to test whether it can also be applied in a governmental context. The research question is: “Why was the scandal of occupational exposure of maintenance personnel with the Dutch armed forces to chromium-6 not addressed while there was expertise on the risk of exposure and knowledge of exposures?” This question is answered using theory from the fields of safety culture, social amplification of risk and agenda setting. The expectation is that between 2000 and June 2014, a continuously poor safety culture led to the amplification of risk through media attention, and that this amplification of risk between June 2014 and June 8, 2018 produced pressure for legislative and organizational change, which in turn drove a change in safety culture. The methods consist of qualitative content analysis of internal documents of the Dutch armed forces, media coverage, debate transcripts and documents from advisory bodies.

The results show that in the first period, the Dutch armed forces tried to improve safety after political attention to exposures in 1998 and 2001. Safety culture improved from a pathological culture towards a more calculative one, but improvements were either not made or made at a slow pace. The effects of improvements diminished, resulting in an alternation between a calculative and a reactive safety culture in which the armed forces had to react to warnings from the labour inspection up until 2012. The second period is characterized by amplification of risk, visible in the increased volume of media coverage and the repetition of articles. This resulted in pressure for change produced by the Health Council, politicians, media coverage and research by the RIVM. The available data cannot establish whether safety culture has improved since, but there are new plans for improvement. The answer to the research question is that employees were knowingly exposed to chromium-6 while upper management had knowledge of the effects. Management did not consider reducing exposure a safety priority, and when plans for improvement were made, middle management did not consider it a priority either. Only when the legal exposure limit was periodically lowered did the Dutch armed forces make a limited effort to lower exposure.


Foreword

Before you lies the master's thesis ‘Known occupational exposure to health risks’. The research on the Dutch armed forces was conducted through qualitative analysis of documents from the Dutch armed forces and media articles. This thesis was written as part of the master's programme Crisis & Security Management at Leiden University.

This thesis was supervised by Dr. Michel Dückers from the Netherlands Institute for Health Services Research (NIVEL) as part of the Public Safety and Health capstone, set up by Dr. Sanneke Kuipers. I would like to thank Dr. Michel Dückers for his guidance and feedback during the writing process, especially his comments after the first draft, which helped me to refine the final thesis. I would like to thank Dr. Sanneke Kuipers for setting up the capstone, which allowed for the exchange of ideas with fellow students and easy access to expertise.

I would also like to thank Dr. Jeroen Devilee from the National Institute for Public Health and the Environment (RIVM) and his colleagues for their suggestions and ideas on the theoretical framework for this thesis. Discussing the thesis with them gave me the inspiration to take it to the next level.

Lastly, I would like to thank my parents and close friends, who gave me the motivation to conduct my research when morale was low. Their encouragement and our discussions were essential to my writing and completing this thesis.

Colin Crooks


Table of contents

Abstract
Foreword
Chapter 1: Introduction
1.1 Introduction
1.2 Research question
1.3 Relevance
1.4 Reader's guide
Chapter 2: Theoretical framework
2.1 Slow-burning crises and scandals
2.2 Safety culture
2.4 The social amplification of risk as agenda setting
2.5 Theoretical model and hypotheses
Chapter 3: Research design & methods
3.1 Research design
3.2 Research methods
Chapter 4: Case description
4.1 Policy and developments on chromium-6 paint between 2000 and June 2014
4.2 Background on events and developments after June 2014 until June 2018
Chapter 5: Analysis
5.1 Safety culture before the crisis
5.2 Social amplification of risk, pressures for change and safety culture during and in the aftermath of the crisis
5.2.1 Social amplification of risk
5.2.2 Legislative pressure and pressure for organizational change
5.2.3 Safety culture
Chapter 6: Conclusion
6.1 Answer to research question
6.2 Discussion
6.3 Limitations and suggestions for further research


Chapter 1: Introduction

1.1 Introduction

For decades, the Dutch railways and the Dutch armed forces have made use of paint containing hexavalent chromium (chromium-6), a form of the metal chromium meant to prevent corrosion on, for example, tanks, planes or trains. The substance is not hazardous in its solid state in applied paint, but it poses a serious health risk when the paint is sanded down or sprayed onto objects without proper protection for one's skin and respiratory system (RIVM, 2017). In 2014, a report came out that since the 1980s, technical maintenance employees working on tanks and later airplanes in NATO workplaces of the Dutch armed forces had been exposed to this substance, even though efforts had been made to ban it from the workplace once it was discovered that exposure could cause cancer and other diseases (Drost.nl, 2018). When a large number of employees who had worked in these workplaces reported health problems and attributed the cause to exposure to chromium-6 from paint, the media were quick to pick up the reports and turn them into a scandal.

When approximately 2,100 suspected victims went public, this led to a great deal of media attention and political pressure on the Minister of Defence, Jeanine Hennis-Plasschaert (Bhikhie, 2015). The political problem was that Hennis, as Minister of Defence, was responsible for this crisis because the Dutch armed forces fell under her leadership. Hennis had to endure harsh debates and take responsibility by offering solutions to the crisis. In short, Hennis suspended the use of paint containing chromium-6 on several air force bases because they were not equipped for its safe use. Additionally, she commissioned a large investigation by the Dutch National Institute for Public Health and the Environment (RIVM), under the guidance and support of a commission consisting of representatives of all involved interests (Informatiepuntchroom6.nl, 2015; Volkskrant.nl, 2014): law firms representing victims, labour unions, health researchers and representatives on behalf of the employer. The large study would be conducted in smaller parts, one part being an assessment of what is known in science about the health effects of chromium-6, in order to establish whether victims have a condition possibly caused by chromium-6. Hennis came under fire from the opposition, who argued that starting a large investigation was causing a delay (Stoop, 2014). Hennis also created a compensation scheme for victims who are sick, without an admission of guilt from the government (Defensie.nl, 2015b).


Interestingly, the Dutch armed forces were aware of the levels of chromium-6 found in their workplaces, and that these exceeded the legal exposure limit (Drost.nl, 2018). The armed forces either did not address this problem or failed to address it in a satisfactory manner. Legislation exists that limits the allowed concentration of chromium-6 in the workplace and makes the use of protective gear mandatory, but for some reason signals of bad working conditions and violations of the allowed limits were not acted upon.

Furthermore, the recently completed literature study by the RIVM on the effects of chromium-6 on human health shows that there is considerable scientific knowledge of, and consensus on, those effects (RIVM, 2017). It is striking that the harmful health effects of chromium-6 have been known for some time and that there is broad consensus on them within the scientific community. Nevertheless, regardless of this knowledge, and of new supporting knowledge becoming available over time, there was no real attention to enforcing legal exposure limits and the mandatory use of protective clothing. The Dutch armed forces thus not only failed to enforce the legal exposure limit; they were possibly also aware of the negative health effects and risks and still failed to protect employees.

1.2 Research question

The research question for this thesis is:

“Why was the scandal of occupational exposure of maintenance personnel with the Dutch armed forces to chromium-6 not addressed while there was expertise on the risk of exposure and knowledge of exposures?”

This question stems from the observation that there is overwhelming expert knowledge on the health effects of chromium-6, yet nobody acted upon new information to pursue stricter enforcement of the legal exposure limit in force at the time. Expertise is normally used by policy makers to give substance to policy or to fill in details such as new limits or requirements, which raises the question whether this expert knowledge did not reach the policy makers or was simply ignored because of the safety culture of the Dutch armed forces.

The explanatory research question attempts to explain a possible change in safety culture surrounding the amplification of the risk of exposure in the media. Knowledge utilization by the Dutch armed forces on safety risks, as determined by safety culture, could differ between the period before the scandal of inaction came to light and the period right after. Media attention possibly forced political agenda setting, which demanded solutions to the issue at hand. Drawing a comparison between the two periods may provide insight into how information usage and the overall safety culture differed. The period following the date the victims went public in the media (June 2014 up to the present) possibly shows a sudden usage of expert knowledge or advice by the Dutch armed forces, whereas before the scandal (between 2000 and June 2014) they possibly ignored such knowledge and took the risk of exposure to chromium-6 for granted. This thesis seeks to examine if and why there was such a difference in knowledge utilization, through safety culture, between before and after the scandal came to light.

1.3 Relevance

On a societal level, research into this topic not only gives insight into how knowledge is utilized within governmental organizations such as the Dutch armed forces, but also contributes to understanding how expert knowledge can contribute to crisis prevention. For example, as will be further explained in the theory, Pidgeon (1991, p. 131) explains how the utilization of knowledge and overall information interactions can be improved by improving organizational culture, which prevents organizational failure. A better safety culture will thus come with better utilization of knowledge on risks, because a better safety culture ensures continuous improvement of safety, which requires knowledge of which risks to look out for. Crises can then potentially be prevented, and human lives saved, by spotting hazards that have not been accounted for thus far. By looking at why expert knowledge regarding these dangers was or was not utilized, we may be able to make recommendations on how, for example, a change in safety culture can help prevent future crises and the public outrage that follows them. In the long term, better signalling and monitoring will contribute to fewer victims of long-term crises. Crises that might fly under the radar of political figures and policy makers may be signalled at a much earlier stage and addressed internally by the organization.

This research is particularly relevant in the academic field because most crises result from sudden events, like earthquakes or explosions. This case, by contrast, could possibly be classified as a slow-burning crisis (Connolly, 2015, p. 53; 't Hart & Boin, 2001, p. 33; Rosenthal, Boin & Comfort, 2001, p. 8). Comparable examples are asbestos and smoking: for a long time it was not known that the use of these products was causing a significant number of deaths, until it was eventually discovered that their properties had harmful health effects. Such crises are not only hard to notice; in this case, it is also strange that expert knowledge on the risk was available, yet somehow it did not trigger stricter enforcement or improvement, even though there was information that exposures were happening.

The habits and behaviours that surround being informed about risks and dangers can be found in the field of safety culture. Safety culture theory and the commercial programs constructed from it are mostly aimed at private commercial organizations. For example, Patrick Hudson and his team of researchers developed the ‘Hearts and Minds’ program for Shell to improve safety culture in oil and gas companies (Hudson, Parker, & van der Graaf, 2002). Struben & Wagner (2006) adapted Hudson's framework for use in hospitals regarding patient safety, which was developed into the commercial IZEP method. More recent papers look to expand the application of the underlying framework by Westrum to patient safety, as Ashcroft, Morecroft, Parker & Noyce (2005) apply the framework to safety within pharmacies. Research into safety cultures has thus far focussed on industrial manufacturing, healthcare organizations and oil and gas companies, which allow little margin for error. The question is whether Hudson's model is also applicable to governmental organizations, which are not engaged in pure engineering but do handle hazards and risks like industrial companies. This would address the gap in research regarding the applicability of safety culture frameworks to different kinds of organizations. Furthermore, it would contribute to the use of different methods, as research on safety culture relies mostly on surveys and interviews; this thesis looks to construct safety culture from document analysis.

Further research will also contribute to preventing slow-burning crises in the future, or at least to being able to signal that something is going on which needs to be addressed before it becomes a crisis with long-term effects.

1.4 Reader's guide

In the following chapters of this thesis, I will describe in the theoretical framework the theory I use to study the utilization of expertise in this crisis and the commotion that followed. I first draw on basic theory regarding crises and scandals, after which I discuss how safety culture is an organizational determinant of knowledge utilization and how crises and a lack of understanding of risk come with the amplification or attenuation of risk in media and political attention. After that, I explain how the answer to the research question will be found, in the form of the research design and methods. Following the methods, I give a detailed description and account of how the crisis unfolded, followed by an analysis from the perspective of the theoretical framework. The conclusion summarizes the most important findings, gives suggestions for further research, and offers recommendations for practitioners.


Chapter 2: Theoretical framework

2.1 Slow-burning crises and scandals

To discuss the usage of knowledge in policy making in its function of preventing and solving crises, I must first explain what crises are, how they can spread over time, and how the media scandalize crises. Following the more traditional definition by Rosenthal, Charles & ‘t Hart (1989), I define a crisis as: “a serious threat to the basic structures or the fundamental values and norms of a system, which under time pressure and highly uncertain circumstances necessitates making critical decisions” (as quoted in Rosenthal, Boin & Comfort, 2001, pp. 6-7). It consists of three properties: threat, uncertainty and urgency. Threat consists of potential physical harm, destruction or other types of harm; it does not necessarily have to lead to destruction, as it can also lead to the social destabilization of neighbourhoods (Rosenthal, Boin & Comfort, 2001, p. 7). The high degree of uncertainty relates to the ability to understand what is going on outside of routine situations: crises disturb regular routines, which confuses policy makers and politicians because rules of thumb and procedures no longer work. Lastly, urgency, or a sense of urgency, means that crises are crucial moments for decision making (Rosenthal et al., 2001, pp. 7-8). Decisions must be made in a short time frame and under pressure, for example under the threat of further deaths.

Crises may differ in the way they unfold. Most crises happen in an instant, such as explosions or natural disasters. However, there are also crises that unfold over the course of years. An example is an infection in a hospital that takes hundreds of lives over the course of years; such crises have creeping or slow-burning characteristics (Connolly, 2015, p. 53; 't Hart & Boin, 2001, p. 33; Rosenthal et al., 2001, p. 8). The problem is that these crises are not seen as crises at the time because people do not know what the problem is. Threats to public health from pollution or the usage of asbestos in construction are two of the countless examples that 't Hart and Boin (2001, p. 33) name. Only when the problem is discovered and defined do they become crises, and because the problem is so big, it also takes years to fully resolve the crisis. As the media tire of a crisis and focus on new ones, its position on the agenda has ups and downs, depending on new developments ('t Hart & Boin, 2001, p. 34).

While different typologies and causes of crises can be found, some crises qualify as scandals. This means that the events of a crisis get so much attention from the media that they provoke a response from the public (Boswell, 2009, p. 168). The media look to unlock an emotional response from the public by largely ignoring expert knowledge and by dumbing down the political debate to make it look more like drama.


The media are likely to scandalize damages discovered as a consequence of political decision making, because such damages are framed as a breach of the public's trust in politicians. The public has certain expectations of those who represent them in the Dutch Lower House, and this breach of trust provokes a strong emotional reaction. While Boswell finds that the media like to dramatize events rather than focus on expert knowledge, the media do make use of expert knowledge to expose the damage that was done, or to take a deeper look into the scientific basis on which the government made its decisions (Boswell, 2009, p. 172).

2.2 Safety culture

Something which may play a role in the difference between the two periods is the safety culture within the Dutch armed forces. The concept of safety culture was coined and gained traction after the Chernobyl disaster. Failures of sub-systems within organizations are often said to be the cause of a bigger system failing. This was the case with Chernobyl, where inquiries found that a bad safety culture was one of the causes of a nuclear explosion that cost 30 people their lives and affected many more in Europe in the form of an increased risk of cancer (Wiegmann, Zhang, von Thaden & Gibbons, 2004, p. 119). Safety culture is mostly studied from the perspective of high-risk organizations, such as oil, energy, aviation and space companies, and more recently health care.

While there are, as with all well-covered concepts, countless different conceptualizations of safety culture, I choose to use the definition of Wiegmann et al., who combine several other definitions:

“Safety culture is defined as the set of beliefs, norms, attitudes, roles, and social and technical practices that are concerned with minimizing the exposure of employees, managers, customers, and members of the public to conditions considered dangerous or injurious.” (Wiegmann et al., 2004, p. 122).

Some background on the cultural causes of organizational failure is given by Turner (1976), who created a model detailing the causes of accidents and disasters, divided into different stages. One of these is the incubation stage, in which the foundations are laid for organizations' failures of foresight. Turner (1976, pp. 388-391) names the rigidity of beliefs within an organization as one of the causes, because the culture within the organization feeds blindness to problems. Related causes are that organizations tend not to listen to warnings from outside their organization, fail to comply with regulation, even when it is out of date, and gravely underestimate the chance of disasters happening.


Pidgeon (1991, p. 131) additionally argues that Turner's model focusses on the informational difficulties in determining safety problems and the failure of individuals to find solutions to safety problems, which are often ill-defined. In this way, organizational culture, that is, the behaviour, norms, values and other institutions, could contribute to preventing future risks, disasters and crises by facilitating better information interactions, or the utilization of knowledge.

Wiegmann et al. (2004, p. 128) find that there is no one-size-fits-all toolbox for assessing the safety culture of an organization. While a wide range of models is used by organizations in different sectors, I will go through some relevant models to guarantee that the models used here have enough theoretical support from other scholars. A big name in this field is Patrick Hudson, who devised a way to classify the safety culture of an organization. He divides cultures into four components: corporate values, beliefs, problem-solving methods and working practices (Hudson, 2001, p. 14). By taking Westrum's model and adding two more types of culture to it (reactive and proactive), Hudson constructed the evolutionary ladder of safety cultures. This ladder goes from pathological to generative, with reactive, bureaucratic or calculative, and proactive in between (see figure 1).


Hudson, Parker & van der Graaf (2002, p. 3) describe a pathological culture as one driven by a desire not to get caught and to blame workers when something goes wrong; the organization is not interested in looking out for safety. In reactive cultures, only an incident can trigger safety improvement. Incidents produce pressure from regulators, which in turn pressures the organization to change; however, there is still a belief that employees are at fault for incidents. In a calculative or bureaucratic culture, safety is looked at from a managerial view by collecting a lot of data. Safety is calculated and incorporated in cost-benefit analyses, and the organization forces itself, under pressure of regulation, to make procedures that guard safety, also protecting the organization's reputation. The proactive culture is characterized by a feeling within the organization that there are still uncertainties or chances that something might happen, and employees are also involved in safety management. The generative culture sees participation in safety improvement at every level, combined with a permanent unease that a crisis or safety incident may still happen (Hudson et al., 2002, p. 3).

Hudson et al. (2002, p. 5) explain that safety culture can be assessed along eighteen dimensions, which can be categorized under leadership & commitment, policy and strategic objectives, organization, hazard management, planning and procedures, implementation & monitoring, and audit and management review. By using statements or typical characteristics for every rung of the ladder of safety cultures for every dimension, one can determine on which rung the organization sits. I chose not to use all eighteen dimensions for this thesis, as doing so does not fit the short timespan in which this thesis was written, and the assessment method is normally applied by large organizations over a longer period. However, I selected dimensions from almost all categories, which provides an indication of the safety culture level in organizational terms. While Hudson et al. (2002, p. 6) give typical personal statements for every type of safety culture from the perspectives of management, supervisors and workforce, these are not given for every dimension and do not help in this case, because this thesis relies on document analysis and personal statements are not likely to be prevalent in documents. Therefore, looking at the broader organization and management will bring more benefit than the search for personal statements.

To give a broad overview of safety culture, Hudson et al. (2002, p. 2) explain their adaptation of Reason's characteristics of a safety culture, adapted to four major dimensions. They characterize safety cultures as being informed, which shows in the employees' knowledge of all factors that determine safety within the organization; both the workforce and management are up to date on safety. Second, the culture encourages the reporting of incidents.


Third, the culture is just, which means that there is trust between employees and rewards for safety information instead of a blaming culture. Fourth, the culture is flexible: it adapts together with the organization. Lastly, the culture promotes learning, which means that there is willingness and competence to learn lessons when something goes wrong, and a willingness to reform. Hudson, Parker & van der Graaf also add worriedness to the list of cultural characteristics: the culture should make people uneasy, because incidents can happen at any time, even if the organization has a good safety history.

Various authors, including Hudson, developed the existing theory on the typology of safety cultures into the well-known ‘Hearts and Minds’ method of the Energy Institute, made known by Shell. Part of this method is the ‘know your culture’ brochure, which uses Parker, Lawrie & Hudson's (2006) descriptions of the five safety cultures along eighteen dimensions of an organization, from the perspective of management and workforce. People are normally asked to score their experience of the organization's safety culture based on the descriptions of the different dimensions (Energy Institute, 2006). The scores, for example when more characteristics of a reactive culture than of a generative culture are found, show that the organization is perceived to have a lower-level safety culture, with less trust and less information.

Of the eighteen dimensions, I make use of those for which I estimate that data can be found through document analysis; it is not possible to measure all eighteen. While the Hearts and Minds toolbox recommends looking at nine specific dimensions depending on whether the individual works as an employee or as a supervisor, I selected dimensions from both categories, and both more abstract and more concrete dimensions, to make sure an overall view of the culture is obtained. To clarify, the dimensions were selected based on my estimation of what is potentially measurable from documents. This is a departure from Parker et al. (2006), who used interviews to construct their descriptions of the different cultures on every dimension. The table below shows which of the eighteen dimensions are used, to which element of the management system they belong, and whether the dimension applies to management or supervisors.


Table 1. Chosen dimensions for specific management system elements, divided over management and supervision, from the safety culture framework by Hudson, Parker & Van der Graaf (2002)

Management system element (Hudson, Parker & Van der Graaf, 2002, p. 4) | Dimension (Hudson, Parker & Van der Graaf, 2002, p. 4) | Management or supervisor (Energy Institute, 2006)
Policy & Strategic objective | Causes of accidents in the eyes of management | Management
Organization & responsibilities | Workers' interest in competency / training | Supervision
Hazards & Effects management | Worksite safety management techniques | Supervision
Implementing & Monitoring | Incidents/accidents reporting analysis | Management
Implementing & Monitoring | Who checks safety on a daily basis? | Supervision
Audit | Audits and reviews | Management
Leadership & Commitment | Management interest in communicating HSE (safety) issues with workforce | Management

The choice was made not to use the other two management system elements: planning & procedures, and management review. For the latter, there is only one dimension in the category, which revolves around benchmarking, trends and statistics. I chose not to include it because it is hard to measure or to find in documents, and a focus on statistics is already part of the more calculative safety culture, as it can be found in other descriptions. For planning and procedures, the argument is that procedures and planning are to some extent also part of monitoring and of the extent to which reports are made and investigated.

The specific dimension of commitment of the workforce to safety was excluded because it can also be seen from the interest of workers in safety training and by looking at who checks safety. The same goes for the dimension of hazard and unsafe acts reports, as incidents and unsafe acts overlap substantially, and the reporting of hazards can also be part of worksite job safety techniques, such as keeping track of which hazardous materials are being used. Because of this overlap, these aspects of safety culture are also addressed in the analysis to some extent. The dimensions named in the table above try to cover most of the management system elements, while also maintaining a good division between management and supervision and incorporating possible overlap from other dimensions.

For the chosen dimensions, each of the five types of safety culture comes with a description of what signs to look for regarding that dimension (Parker et al., 2006, pp. 557-559). For the causes of accidents in the eyes of management, this means that in a pathological safety culture, blame is placed on individuals and accidents are accepted as part of the job, whereas in a generative safety culture, management accepts its own possible responsibility and looks at the overall system and its interactions to address the cause of the accident. Workers' interest in competency training ranges from training being seen as a necessary evil, compulsory by law, with acceptance of harsh working conditions, to a setting where attitude is considered as important as knowledge and skill and safety training is incorporated in the overall development and needs of the workforce. Worksite safety management techniques range from no techniques being applied at all to job safety analysis being done in an open setting where workers and supervisors warn others about safety issues. Incident or accident reporting analysis ranges from incidents not being reported and investigations happening only after serious accidents, and then in limited form, to in-depth analysis of accidents and systematic change based on information from earlier incidents. Who checks safety on a daily basis ranges from people looking out only for themselves to all personnel looking out for themselves and others, making inspections by supervisors unnecessary. Audits and reviews range from merely complying with inspection requirements, with safety audits only after big accidents, to a fully functioning audit system that continuously looks for problems in behaviour instead of hardware and systems.
The management's interest in communicating safety issues with the workforce ranges from not being interested, telling the workforce not to cause problems, to a two-way process in which management receives more information than it sends out (Parker et al., 2006, pp. 557-559). The methods chapter contains a more detailed operationalization, including a schematic overview of the levels between the pathological and generative safety culture.

2.4 The social amplification of risk as agenda setting

As the safety culture of an organization indicates how it does or does not process information, the bigger question is whether a change in safety culture is a cause or an outcome of a crisis coming to light. For example, a continuously bad culture may lead to increased public awareness and outrage in the form of risk amplification through media coverage, but this attention for the risk may also be the cause of a changing culture within an organization.

This increase in public attention, the so-called amplification of risk, is embodied by the Social Amplification of Risk Framework (SARF). Kasperson et al. (1988) are the architects of this framework, which explains how attention to small risks can lead to strong public outrage and even societal or economic impacts. It seeks to show that hazards interact with all kinds of psychological, social, institutional and cultural processes, and that these can amplify or attenuate the public reaction to the risks themselves. Kasperson et al. (1988, p. 179) explain that this framework is needed because risks or incidents can lead to major secondary societal impact. Social amplification is described as a corrective mechanism which improves the estimation of existing risks, and thus increases safety overall. However, the effect can also work the other way and attenuate attention for risks, as happened with the risks attached to smoking.

To define risk, I use the definition of Rosa (2003), who defines risk as: "a situation or an event where something of human value (including humans themselves) is at stake and where the outcome is uncertain" (Rosa, 2003, p. 56). Central to this definition are the danger to something of value, the possibility of an outcome occurring, and the uncertainty that surrounds activities. For example, people worry about the uncertainty that they might be harmed as the result of engaging in an activity, in this case being exposed to a carcinogenic metal after working with paint.

Within this theory, I am particularly interested in the amplification of risk and the impact of the sudden increase in media and public attention that is part of the amplification process. According to Kasperson et al. (1988, p. 177), this happens in two stages: the transfer of information on the risk, and the response from society to the risk. Different kinds of individual and social amplification stations, like the media, networks, and individual scientists, have a role in sending and receiving signals about risk. The amplification of risk by these stations comes with commotion within society, which leads to further secondary societal impact: new social or economic consequences that can even increase or decrease the risk in question. This triggers a change in attitude towards the risk or even social action in the form of secondary impacts. These final impacts include regulatory change, political pressure, changes in the monitoring of risks and regulation, organizational change, changes in the feedback mechanisms that increase or lower the risk itself, and changes in training and qualifications (Kasperson et al., 1988, p. 182).

The amplification of risk works through receiving information, which consists of evaluating and constructing knowledge, and transmitting information, adding new information to the message and thus amplifying the risk signal going to other 'stations' (Kasperson et al., 1988, p. 181). The nature of the messages and their content also plays a role in how they are perceived by others. Kasperson et al. (1988, p. 184) argue that the information that goes through the amplification process may influence the degree of amplification through its volume, the repetition of stories and many other factors. Media outlets compete for the attention of people, and large amounts of media attention shape not only the issue but also the agenda by repeating stories. As parties battle for media attention, it becomes much harder to reassure people. The volume of coverage is what creates the most attention for a subject.

Additional viewpoints on this role of media coverage come from Vasterman, IJzermans and Dirkzwager (2005), who see that SARF is closely linked with theory on agenda setting. The biggest amplification station in the process of social amplification is the media. By paying attention to the risk, the media produce new information, which is transmitted to the public, which consumes the stories on risks and reacts to them. Leiss (2003, p. 367) describes media coverage as the uncontrollable factor from a public policy perspective. He explains how the risk signal is interpreted by the public and translated into the societal impact that the event has. Two of the factors that media coverage influences are the perception of managerial incompetence regarding the handling of the risk, and the perceived future exposure to similar risks. Leiss thus suggests that media coverage leads to societal impact indirectly, through its influence on public concern and perceptions. The underlying mechanism is that when the public believes a risk is not handled well, it begs the question to what extent other risks are handled any better. This instils more fear and uncertainty in the public, which in turn shapes society's reaction to the incident. It should be mentioned, however, that there is still some debate on the position of media coverage within SARF and its relation to public opinion. Breakwell & Barnett (2003, p. 90) find, using a method of layering, that for some cases there was a relationship between media coverage and public awareness. This was the case for coverage of mad cow disease, but not for salmonella. This has two possible explanations: either the influence that media have on public awareness differs between the national and the local level, or the relation between media coverage and public concern is not a simple one.

Leiss' model of the role of media coverage in risk amplification makes use of conceptualizations by Burns et al. (1993, p. 621), who recognize that the media's strength lies in the fact that their coverage shapes the risk signals that people perceive, and that the public's behaviour in turn leads to societal impact. In measuring this, the volume and repetition of coverage thus matter in reaching the public, but messages about perceived managerial incompetence and future risks also influence the societal impact through the forming of public opinion.

An interesting aspect of the social amplification of risk framework is that Kasperson et al. (1988) do not view the framework as a full theory on its own; they recommend the use of complementary theories to explain phenomena. One such complementary theory is John Kingdon's multiple streams theory, which helps to further illustrate how media attention drives public attention and leads to the secondary impacts in the SARF framework. Kingdon's multiple streams model consists of three streams: problems, policy and politics. He recognizes the different kinds of groups engaged in the agenda setting process, naming as examples academics, consultants, the media, and the public itself (Kingdon, 1984, p. 45). When it comes to the media, Kingdon recognizes that public attention to a problem is closely linked with media attention. Both citizens and legislators keep an eye on the media, and in this way the media affect politics by influencing public opinion (Kingdon, 1984, p. 58). Politicians are thus driven by public opinion in that they must act to keep their support for upcoming elections.

Problems rise on the agenda because of focussing events in the form of crises, disasters, or personal experiences of policy makers (Kingdon, 1984, pp. 94-98). A single crisis may only signal the problem, and it may take more crises to arrive at a solution, so it is not certain that one crisis leads to action from policy makers.

Kingdon describes how policy emerges from different policy communities out of the policy primeval soup. This is a process of natural selection with interaction between policy communities, in which different parties of academics and researchers are connected in finding solutions for problems (Kingdon, 1984, pp. 116-123). The solutions that specialists distil from the primeval soup can only be coupled to problems by policy entrepreneurs under the right political conditions, which come from the political stream. This stream consists of campaigns, election results and ideological developments, but most important of all, public mood. The national mood can promote issues to a higher position on the agenda, but it can also make issues disappear from the agenda very quickly (Kingdon, 1984, p. 147). Eventually, a window of opportunity opens which allows for the coupling of the three streams of policy, problems and politics (Kingdon, 1984, pp. 168-173).

From the perspective of Kingdon, the social amplification of risk framework can be interpreted as follows: problems are put on the agenda in the form of media attention following a focussing event, which, together with the political stream in the form of a worsening public mood, allows solutions such as the secondary impacts from the SARF framework to emerge. Political figures at the head of the government can thus be pressured into making new legislation or enacting organizational change, but this pressure can also come from governmental actors or experts targeting the organization or politicians, as experts can argue for certain solutions. In the end, politicians and a wide range of other actors such as inspections or experts can pressure ministers, state secretaries, and organizations for new legislation and organizational change. These two factors are also described in the SARF framework as two of the secondary societal impacts. Kingdon (2014) understands these changes as being part of the solutions stream. One important clarification is that Kasperson et al. (1988, p. 182) do not link political pressure to the other impacts in SARF, whereas Kingdon shows with his theory that actors have a role in pressuring political figures, including politicians pressuring people within the administration for organizational change and a change in legislation. This is also supported by Gowda (2003), who uses Kingdon's agenda setting theory to explain how different actors can pressure legislators and politicians into enacting legislative change.

2.5 Theoretical model and hypotheses

Now that both theories have been explained, it is necessary to examine the theoretical model and hypotheses that follow from the theoretical background. The theoretical expectations find their basis in the model in this paragraph (see figure 2). The expectation is that the conditions that impeded information usage, in this case a bad safety culture, would lead to public outrage or the amplification of risk, as the risk is not being addressed by the Dutch armed forces. This inaction means that hazards are not addressed, and that crises originating from this will lead to public attention and outrage through increased media attention to the problem. On the other hand, there is the possibility of a feedback loop: the risk amplification framework explains that the amplification of risk leads to increased attention to the problem, producing regulatory pressure and pressure for organizational change, resulting in an improvement of safety culture. It could thus also be the case that problems regarding the risk at hand and the underlying systematic problems are addressed through a change in the safety culture as a form of organizational change.

Figure 2. Simplified theoretical model in overview: safety culture leads to the social amplification of risk, which produces pressure for legislative and organizational change.

While countless other confounding and interacting effects are conceivable, these cannot all be incorporated into this thesis. As said earlier, this thesis aims to make a comparison between two time periods: one from 2000 until the crisis in June 2014, and one from the crisis up until the present day. For the first period, the question is whether the safety culture was already bad or drastically improved in the run-up to the crisis, to see if it could be a possible cause of what happens in the second period. For the second period, the question is whether the possible amplification of risk and media attention for the crisis resulted in a higher-level safety culture through the pressure produced. In short, the first period is characterized by a bad safety culture leading to risk amplification, and the second period by risk amplification possibly leading to a process of safety culture improvement or organizational change.

The first hypothesis following from these expectations is that there is a lower-level culture before the crisis, causing the amplification of risk. Considering that amplification of risk and change in safety culture pose a 'chicken or the egg' question, I expect the first period to be characterized by a bad safety culture within the Dutch armed forces, causing the amplification of risks. Information is not used, people do not care for safety in the period leading up to the crisis, and they are not prompted to improve the organization and its culture because they do not know that there is a problem. This brings me to hypothesis 1:

H1: A low-level safety culture before the crisis leads to an amplification of risks in media and public in the form of increased attention to safety risks within the Dutch armed forces.


An important consideration is that if the culture was already changing in the period leading up to the crisis, it is not certain that a bad safety culture contributed to the crisis happening. This may also have consequences for the second expectation or hypothesis, which concerns the feedback loop. The second expectation is that the amplification of risk through the media and public, and the overall attention, leads to a change or improvement of safety culture within the Dutch armed forces through the pressure produced by amplification. According to SARF, some of the countless secondary societal impacts occur in the form of organizational change and changes in risk monitoring and regulation. While SARF also sees political pressure as one of these impacts, Kingdon explains that politicians are pressured by different actors, such as other politicians, experts, and the media, to make changes in regulation and organizational changes. While there could be some overlap between the end of period one and the beginning of period two, the first period is characterized by safety culture leading to amplification, and the second by amplification leading to pressure for regulation and organizational change, which results in a change in safety culture. The second hypothesis is:

H2: The amplification of risk leads to secondary societal impact in the form of regulatory pressure and pressure for organizational change towards a higher-level safety culture within the Dutch armed forces.


Chapter 3: Research design & methods

3.1 Research design

For this thesis, I employ a multiple case comparison design. The multiple cases do not consist of multiple crises, but of two time periods surrounding the crisis in question. The crisis came to light in June 2014, when the media picked up the reports of victims who went public after feeling unheard by the government, and Dagblad de Limburger (2014a) reported a story on former employees of a workshop being concerned about their health after measurements had found chromium-6 in the workplace. It led to questions being asked in the Dutch Lower House on July 11, 2014 (Rijksoverheid.nl, 2014f). While this crisis is one case in itself, I make the comparison between the two time periods because the media painted a picture of the government not addressing the exposure of employees before the crisis and questioned what it did during the start of the crisis. It is interesting to see if there was a safety culture which allowed for the open usage of expert knowledge to address potential safety risks before the crisis, which would disprove the standpoint of the media. The second period, after June 2014, would then be characterized by knowledge usage by the organization.

The goal is not necessarily to find a causal link between social amplification of risk and safety culture, but rather to compare two time periods. The thesis takes a deductive approach by taking existing theory on safety culture and the amplification of risk and applying it to the case of the Dutch armed forces, to see if what the theory tells us happened in this case. Another point to consider is the internal and external validity of this design. While it is feasible, it is necessary to make sure that the method measures what it is meant to measure. The problem here is that there are other factors which cannot be controlled for and might have influenced what happened. It is important to search for these possible variables and to name them. When it comes to external validity, this research cannot be generalized to all crisis situations and all fields of policy making on every governmental level.

It would be wrong to assume that this specific crisis or organization is representative of all other governmental organizations or crises. However, this thesis may be able to make strategic recommendations on the prevention of crises with the help of safety cultures to assure the utilization of knowledge on risks. The unit of observation is documents on media coverage and the organization, and the unit of analysis is the organization in the two different timeframes. I take multiple observations of one case but use these to draw conclusions on the organizational safety culture and amplification of risk in two different timeframes, thus drawing a comparison.

As I will illustrate in the methods below, the internal validity is safeguarded by the usage of existing frameworks, which have been applied before to other cases and have proven their face validity; an example is Hudson's framework on safety culture, from which the commercially used 'Hearts and minds' method was developed, as discussed in the theoretical framework. However, one limitation is the external validity of the design and methods used. While I make a comparison between two time periods, it is hard to generalize the theory to all organizations confronted with a crisis. Safety culture theory has shown itself to be adaptable to other organizations, but to generalize results, causality needs to be proven. This thesis cannot rule out intervening variables, as proving a causal link requires many cases for statistical significance. Therefore, the generalizability is limited to this case only or, at best, to public defence organizations. This does not mean that the potential results are useless, because the thesis provides insight for new research to further prove safety culture theory to be applicable to public defence organizations, in addition to hospitals and oil and gas companies.

3.2 Research methods

For the methods, I utilize content analysis. At first, the ambition was to conduct semi-structured interviews with all parties involved in the crisis. This would be logical, as research on organizational culture often employs semi-structured interviews because of its focus on human behaviours and norms. Earlier mentioned examples are Parker et al. (2006), who made use of semi-structured interviews to create their framework, and Struben & Wagner (2006), who use focus groups as part of their IZEP method. There is a large palette of actors to call upon, such as former employees who were exposed to hazards, the lawyers representing them, labour unions and representatives of the Dutch armed forces. However, while these actors might want to talk, other actors such as government employees or representatives speaking for the Dutch government may not, because they have different interests. The problem is that the interviews must be balanced over all parties and not just give attention to people in the opposition, as this creates bias. Other sources of bias may be that parties have an interest in not saying everything because of ongoing court cases and investigations. Separate from the risk of self-incrimination, there is also hindsight bias: parties are likely to say that they would have done things differently if the same crisis were to develop again. This does not help in assessing the underlying mechanics leading up to the crisis or in assessing the root cause, and would only help parties use this thesis to plead their case.

For these reasons, I have chosen to employ content analysis. What helps is that there are public documents giving some detail on the crisis and on what happened before, during, and after it. While these do not provide a detailed inside look into the Dutch armed forces and the ministry, they do sketch a general picture of what happened. To avoid further bias and to guarantee some verifiability, I have chosen to use four sources of information to reach triangulation between sources:

1. Debate transcripts and accompanying letters to the Dutch Lower House by the Minister of Defence and the Minister of Social Affairs and Employment, who is also responsible for good working conditions (including the questions they were asked and their answers).

2. Media coverage between January 1, 2000 and June 8, 2018. These sources are found through the LexisNexis system, searching for chromium and the case of the Dutch armed forces. This provides a view of the increase or decrease in media attention, but also of the content of what the media reported as the cause of the crisis. Journalists have their own sources and can find facts or viewpoints that could be overlooked in regular research.

3. Internal documents from the Dutch armed forces released publicly by the defence ministry following the crisis, which date back to 1985.1 I only make use of the documents from 2000 until the most recent ones. It is not certain to what extent these documents are usable, because they have been censored to some extent. It is also in the interest of the government not to release documents proving its responsibility for the crisis, but the documents may contain information regarding the procedures and standards around safety and perhaps the processing of cases of exposure. The documents also include letters to the Dutch Lower House, reports by the Dutch labour inspection, Municipal Health Services and more of the documents mentioned in the next category.

4. Public reports and advice from the labour inspection, the Dutch Health Council, which advises the government regarding risks, and the Socio-economic Council. These reports help in reconstructing the timeline of what happened before and after the crisis, but also provide context and the causes that were seen before the crisis, or show whether the services even noticed that there was a problem. Moreover, the Dutch labour inspection visited the workplaces involved in the crisis, and some of the published reports may give more insight into what was going on. These reports can also be found in the internal documents from the Dutch armed forces, from which they are also partly sourced. Lastly, documents from the RIVM give more insight into the context in which the research was done and what its results are so far.

1 To guarantee the replicability and transparency of this thesis, a document containing an overview of all published documents and the full archive itself can be found at https://www.rijksoverheid.nl/onderwerpen/chroomverf/documenten/publicaties/2014/10/30/al

One complication is that the issue of chromium-6 goes back to before the 1980s. The problem is that there are close to no publicly available sources for that period. Moreover, researching two large timeframes requires a significant amount of time and would jeopardize the feasibility of this thesis. To solve this, the first period to be researched is limited to the year 2000 up until the crisis in June 2014. This choice was made because the earlier years provide less material, due to archiving regulations, a possible lack of digitized documents, and the fact that most likely less attention was given to the problem, while the period from 2000 onwards can still give insight into the regulation that was made up to the crisis and possible signs of high levels of chromium-6. The second period runs from June 2014, the moment of the victims going public in the media and of political debate, up to this day. Some of the research into the crisis is still ongoing, so even if media attention has decreased, the crisis is still formally being addressed. On June 4, 2018, the research on the use of paint containing chromium-6 at Prepositioned Organizational Material Storage (POMS) sites was completed. However, research is still being conducted on the use of paint containing chromium-6 and Chemical Agent Resistant Coating across all sites of the Dutch armed forces (Rijksoverheid.nl, 2018b, p. 3).
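The division into the two timeframes can be made explicit with a small sketch. The boundary dates follow the thesis (January 2000, June 2014, June 8, 2018); the function and constant names are my own illustration, not part of the actual analysis procedure:

```python
from datetime import date

# Boundary dates taken from the thesis; names are illustrative.
PERIOD_START = date(2000, 1, 1)
CRISIS_START = date(2014, 6, 1)   # victims go public via Dagblad de Limburger
STUDY_END = date(2018, 6, 8)

def assign_period(doc_date):
    """Assign a document to period 1 (pre-crisis) or period 2 (crisis and after)."""
    if PERIOD_START <= doc_date < CRISIS_START:
        return 1
    if CRISIS_START <= doc_date <= STUDY_END:
        return 2
    return None  # outside the studied timeframes
```

A rule like this makes the comparison replicable: every document is assigned to exactly one period, or excluded, based solely on its date.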

The analysis of the documents is done using the qualitative analysis software package Atlas.ti. Together with the operationalized theory in a coding scheme, Atlas.ti provides a systematic overview of all codes, and individual quotes for each code can be traced back to the original documents. The programme allows for systematic data management, which comes in handy with a large amount of data over time and multiple sources requiring triangulation. Working this way also provides a good amount of reliability, because the coding scheme allows the thesis to be replicated, measuring the same phenomenon and giving the same results every time. However, some of the documents cannot be analyzed with Atlas.ti, as the textual information in older internal documents cannot be recognized by the software and thus cannot be sorted schematically. These documents are therefore analyzed manually and sorted with Microsoft Word. Quotes are used in the analysis and translated from Dutch to English.

When it comes to operationalizing the theory in this thesis, both theories have already been operationalized to some extent in the theoretical framework, but the explanation below further operationalizes the concepts to the level of observable codes in the coding scheme found below.

Something that has to be mentioned about the safety culture theory by Hudson and his 'Hearts and minds' method is that the given scores are subjective and do not represent objective measurements of the safety culture (energyinst.org, 2006). This shows that safety culture is also a question of interpretation. However, this warning is based on the subjective perception of the people filling out the scorecards, the employees of the organization. Parker et al. (2006, p. 556) recognize that cultures consist of concrete aspects as well as abstract ones. Examples of concrete aspects are the auditing of safety within organizations; the more abstract views of cultures come from perceptions of the workforce. While perceptions from individuals cannot be used as objective measurements, it is possible to estimate culture based on documents, which contain views of the workforce and management, but also the more concrete aspects.

For the measurement of the level of safety culture, I have chosen to use the earlier mentioned dimensions of safety culture and operationalized them using the description by Parker et al. (2006) of what each of the five levels of safety culture looks like when assessing an organization's safety culture. Every sub-code or indicator in the coding scheme below is a characteristic of one of the chosen dimensions at the level of one of the five kinds of safety culture.

When it comes to the amplification of risk, it is hard to measure commotion or public mood. However, as described in the theoretical framework, there is a wide range of intervening factors which help in determining the public concern or outrage within the SARF framework. One of these is media coverage, which is said to be influential in determining the message that gets put out, but also the message that people perceive from the media and the societal impact following from their concern. Since public outrage or mood cannot be measured easily, a measurement of media coverage and perceptions may serve as a proxy for public concern, as most of the scholars in the theoretical framework see the media as a determining party of public opinion. Media coverage can be measured through the number of news articles published over time on a particular subject, and the repetition of stories (Kasperson et al., 1988, p. 184). Additionally, there are measures for the perception of future risk and perceived managerial incompetence, which will be measured from the existing media articles. These measures are aimed more at public concern, as Leiss (2003) sees that coverage influences these aspects of risk signals. They are defined by Burns et al. (1993, p. 614) as: "The degree to which people are at risk of experiencing harm from future hazards of this type" and "the degree to which the public believes that similar risks are managed incompetently".
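As a sketch of how the quantitative side of this measurement could work, article counts per year can be derived directly from the publication dates of the LexisNexis hits. The sample data below is purely illustrative, not taken from the actual search results:

```python
from collections import Counter
from datetime import date

# Illustrative stand-in for a LexisNexis export: (publication date, outlet).
articles = [
    (date(2014, 6, 12), "Dagblad de Limburger"),
    (date(2014, 6, 14), "Dagblad de Limburger"),
    (date(2014, 7, 2), "NRC Handelsblad"),
    (date(2015, 3, 1), "de Volkskrant"),
]

def coverage_volume_per_year(articles):
    """Count articles per calendar year as a proxy for the volume of coverage."""
    return Counter(pub_date.year for pub_date, _outlet in articles)

volume = coverage_volume_per_year(articles)
# volume.items() can then be plotted over time to visualize a peak in coverage
```

Because `Counter` returns zero for years without articles, a plot over the full 2000-2018 range directly shows the rise and fall of attention that the SARF framework predicts.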

Lastly, there are the two types of pressure explained in SARF. While SARF does not explicitly name these as political pressures, Gowda (2003) finds that actors put pressure on politicians for legislative change, making use of Kingdon's agenda setting theory. I choose to form an operationalization separate from SARF itself, because SARF formulates these results as secondary impacts together with political and social pressure, but does not name the first two as part of the larger process of pressuring that both Kingdon and Gowda describe. For this reason, I take from SARF the focus on organizational and regulatory change, but incorporate from Kingdon and Gowda that different actors are involved in pressuring, for example, politicians or the governmental administration. These pressures for legislative and organizational change are thus operationalized as regulatory pressure by inspections, experts or other actors, and pressure for organizational change by politicians, inspections, experts or other actors.

In the following coding scheme, the concepts, their dimensions and their indicators / sub-codes are shown. Regarding the volume of media coverage, there is nothing to be coded for this dimension of SARF. This dimension will therefore be measured quantitatively, allowing for visualization and some insight into the ups and downs of media coverage, while the coding measurements are obtained from the other dimensions. The table below covers both the operationalization of the theory and the coding scheme. For codes, the dimension shows the main code between parentheses, after which there are sub-codes for each dimension. For example, when management attributes the cause of an accident in a way that matches the description of a pathological organization, the code used is (Cause of accident-Pathological). Similarly, an observation of managerial incompetence as part of public opinion is coded as (Public perception-managerial incompetence).


Table 2. Operationalization of theory and coding scheme

Concept: Safety culture. Source for all dimensions: Parker et al. (2006, pp. 557-559).

Dimension (main code): Causes of accidents in the eyes of management (Cause of accident)
Sub-codes / indicators:
- The individuals directly involved; accidents are perceived as part of the job (Pathological)
- People involved in accidents are removed; accidents are seen as bad luck (Reactive)
- Poor machinery, maintenance and individuals; attempts to reduce exposure, but management still focuses on individuals (Calculative)
- The whole system; management admits and takes some of the blame (Proactive)
- Management accepts possible responsibility after assessment; broad view of systems and individuals (Generative)

Dimension (main code): Workers' interest in competency / training (Interest in training)
Sub-codes / indicators:
- Training only when a legal requirement; acceptance of a harsh working environment (Pathological)
- Training is aimed at attitude change after accidents (Reactive)
- Standard training is given and knowledge is tested; some on-the-job transfer of training (Calculative)
- Leadership sees the importance of training; employees are proud to show their skills; the workforce itself identifies training needs (Proactive)
- The workforce shows needs for training and ways of getting it; training is aimed more at attitude and becomes a continuous process (Generative)

Dimension (main code): Worksite safety management techniques (Worksite safety techniques)
Sub-codes / indicators:
- No techniques; people watch out for themselves (Pathological)
- Use of a hazard management technique after accidents, but no systematic use of it (Reactive)
- Commercial techniques are used, and quotas are used to prove they are working (Calculative)
- Job safety analysis and observation are accepted by the workforce (Proactive)
- Job safety analysis is a continuous process; people tell each other about hazards (Generative)

Dimension (main code): Incidents / accidents reporting analysis (Reporting analysis)
Sub-codes / indicators:
- No reporting for lots of incidents; investigations after serious incidents do not include human factors (Pathological)
- Informal reporting system, focussed on proving an investigation took place; focus on finding guilty parties but no follow-up (Reactive)
- Procedures produce a lot of data and points of action, but opportunities are often not taken or are restricted to the workforce (Calculative)
- Trained investigators who follow up; reports are sent company-wide to learn lessons (Proactive)
- Investigations show understanding of how accidents happen, based on information from past incidents; follow-up checks whether change is permanent (Generative)

Dimension (main code): Who checks safety on a daily basis? (Daily safety check)
Sub-codes / indicators:
- No formal system; people take care of themselves (Pathological)
- External inspection after major incidents; site checks by management and supervisors (Reactive)
- Regular checks by management, not daily and only to the extent of compliance with procedures (Calculative)
- Teams are encouraged to check themselves; managers walk around and engage employees in dialogue (Proactive)
- Everybody is engaged in checking for hazards, also for others; inspections are unnecessary (Generative)

Dimension (main code): Audits and reviews (Audits and reviews)
Sub-codes / indicators:
- Compliance with inspection requirements; unstructured safety audits after accidents (Pathological)
- Unscheduled audits are seen as punishment, but also as inescapable after accidents (Reactive)
- Regular, scheduled, structured audits in high-hazard areas; external audits are unwelcome (Calculative)
- Extensive audit program welcoming outside help; audits are seen as positive (Proactive)
- Continuous search for problems, less focussed on hardware systems and more on behaviours (Generative)
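The code labels used in the scheme combine a dimension's main code with the safety-culture level of the matching indicator. Purely as an illustrative sketch of this labelling convention (the helper function below is hypothetical and not part of the actual coding procedure), the scheme can be expressed as:

```python
# Main codes for the safety culture dimensions, taken from the coding scheme
DIMENSIONS = [
    "Cause of accident",
    "Interest in training",
    "Worksite safety techniques",
    "Reporting analysis",
    "Daily safety check",
    "Audits and reviews",
]

# The five levels of safety culture, from least to most developed
LEVELS = ["Pathological", "Reactive", "Calculative", "Proactive", "Generative"]

def code_label(dimension: str, level: str) -> str:
    """Build the code used to tag a document fragment,
    e.g. 'Cause of accident-Pathological'."""
    if dimension not in DIMENSIONS or level not in LEVELS:
        raise ValueError("unknown dimension or level")
    return f"{dimension}-{level}"

print(code_label("Cause of accident", "Pathological"))
```

Each coded fragment thus maps a concrete observation in the documents to one dimension and one level of safety culture.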
