
Collective Knowledge for Industrial Disaster Prevention

Summary and Keywords

Since the 1990s there has been increasing interest in knowledge, knowledge management, and the knowledge economy, driven by recognition of knowledge's economic value. Processes of globalization and developments in information and communications technologies have triggered transformations in the ways in which knowledge is shared, produced, and used, to the extent that the 21st century was forecast to be the knowledge century. Organizational learning has also been accepted as critical for organizational performance. A key question that has emerged is how knowledge can be "captured" by organizations. This focus on knowledge and learning demands an engagement with what knowledge means, where it comes from, and how it is affected by and used in different contexts. An inclusive definition is to say that knowledge is acquired theoretical, practical, embodied, and intuitive understandings of a situation. Knowledge is also socially, geographically, and organizationally located, and it is specialized, so it is important to examine knowledge in less abstract terms. The specific case engaged with in this article is knowledge in hazardous industry and its role in industrial disaster prevention.

In hazardous industries such as oil and gas production, learning and expertise are identified as critical ingredients for disaster prevention. Conversely, a lack of expertise or a failure to learn has been implicated in disaster causation. The knowledge needs for major accident risk management are unique. Trial-and-error learning is dangerously inefficient because disasters must be prevented before they occur. The temporal, geographical, and social scale of decisions in complex sociotechnical systems means that this cannot be a question of an individual's expertise alone; major accident risk management requires that knowledge be shared across a much larger group of people. Put another way, in this context knowledge needs to be collective. Incident reporting systems are a common solution, and organizations and industries as a whole put substantial effort into gathering information about past small failures and their causes in an attempt to learn how to prevent more serious events. However, these systems often fall short of their stated goals. This is because knowledge is not collective by virtue of being collected and stored. Rather, collective knowing is done in the context of social groups, and it relies on processes of sensemaking.

Keywords: collective knowledge, organizational safety, learning, disasters, risk management, expertise, incident reporting, sensemaking

Defining Knowledge

The nature of knowledge has been a perennial question, particularly within the social sciences and humanities, but since the 1990s its study, specifically in organizational contexts, has grown rapidly in response to an increasing recognition of knowledge as a resource in a competitive marketplace (Drucker, 1993; Quinn, 1992). Processes of globalization and developments in information and communications technologies have triggered transformations in the ways in which knowledge is shared, produced, and used, to the extent that the 21st century was forecast to be the knowledge century (for a critique, see Dzisah & Etzkowitz, 2012). Transformations of work in the move toward the knowledge economy have also been of interest (Drucker, 1993; Hardt & Negri, 2000), as have shifts in vocational training toward the tertiary sector (Spence & Carter, 2011). Organizational learning has also been accepted as critical for organizational performance (Argyris, 1992; Argyris & Schon, 1996). A key question that has emerged is how knowledge can be "captured" by organizations.

This focus on knowledge and learning demands an engagement with what knowledge means, where it comes from, and how it is affected by and used in different contexts. Some scholars have been critical of the direction and progress of the field in this respect. Spender (1996), for example, argued that despite a long-standing program of research on organizational knowledge, the theorization of such knowledge, its acquisition, storage, and application was inadequate. Since then, a wealth of productive research has been published, particularly within an interpretivist tradition (Klein, 1998; Knorr-Cetina, 1999; Turnbull, 2000; Wenger, 1998). There have also been great developments in the conceptualization of knowledge in organizations, which have worked to draw a distinction between more individual and more collective theories of knowing (Hecker, 2012). These critiques alert us to the importance of being clear about what knowledge is (and, in our case, what makes knowledge collective) as a foundation for more specific inquiry.

The term knowledge is used in different ways. There are two issues of particular importance: what counts as knowledge, and the extent to which knowledge is connected to individual knowers and social groups. The Oxford English Dictionary (OED) offers just shy of 15,000 words in its definition of knowledge, but common features include that knowledge means understanding of a situation or state of affairs grasped through experience, intuition, and the senses. These definitions encourage us to recognize different varieties of knowledge, including the formal, informal, explicit, tacit, scientific, and local (Barth, 1995; Turnbull, 2000). There is also an acknowledgment that knowledge can be acquired through instruction, study, or practice (i.e., that it is learned), and that it is a matter of competence or expertise in a subject or skill. We can say that knowledge is acquired theoretical, practical, embodied, and intuitive understandings of a situation. As anthropologist Fredrik Barth (1995, p. 66) says, knowledge is "what people employ to interpret and act on the world." In recognizing different varieties of knowledge there is also an appreciation of knowledge as something that is done (i.e., knowing), and of knowledge as socially, geographically, and organizationally located, and specialized. This is the idea that knowledge is not objective, universal, nor merely "local," but that its production and use are always at the intersection of places, peoples, and their performances (Turnbull, 2000, p. 19). In keeping with this, scholarship within the interpretivist tradition has captured knowledge as messy, located in space and time (Ingold, 2000; Turnbull, 2000), and specific to small social groups (Knorr-Cetina, 1999; Wenger, 1998).

The OED also gives definitions of knowledge as objective truth or information. These are not the definitions used here; however, it is important to engage with these different conceptualizations of knowledge, as they imply that some things are knowledge while others are not. They also influence how knowledge sharing is conceptualized and the extent to which knowledge is connected to social groups. The definition of knowledge as objective truth has resonance with the idea of knowledge as "justified belief" commonly held within Western philosophical traditions (Gettier, 1963; Goldman, 1979). Here, claims only count as knowledge if sound reasons are given in support. This definition includes some types of knowledge more readily than others. For example, it is difficult to provide sound reasons for kinds of understanding that are not easily put into words, including informal, practical, and tacit knowledges. The idea of objective truth is equally suggestive of methods for knowledge creation in the sciences, though knowledge resides beyond laboratories and, indeed, the "objective," universal practice of science has been critiqued (Knorr-Cetina, 1999; Latour & Woolgar, 1979; Lynch, 1984). As such, a more inclusive definition is preferred.

A consequence of speaking of knowledge as opposed to knowing is that there is the potential to lose sight of the connection between knowledge and knowers. The conceptualization of knowledge as information falls prey to this trap. Knowledge management is an established scholarly and practitioner field, and at its core it engages with the relationships between data, information, and knowledge, which are often conceptualized in terms of a hierarchy (Braganza, 2004). Knowledge management emerged at the dawn of the information age, driven by the possibilities that developments in information technology introduced for organizations (Grover & Davenport, 2001). The idea of the "global brain" also emerged, implying that technologies like the Web fundamentally change the nature of knowledge (Bloom, 2000; Russell, 1983). This link to information technology is evident in the ways in which knowledge is conceptualized in this context. Gathering and managing knowledge is treated like gathering and managing any other information technology requirement (Braganza, 2004). This conceptualization of knowledge informs the use of technological solutions like repositories and databases for knowledge management, because knowledge is thought to be easily captured, stored, and transferred in its most basic form as data. We also see this in early organizational memory scholarship, where memory or knowledge is theorized as information that can be stored and retrieved (Walsh & Ungson, 1991). A challenge for organizations is moving past a concept of knowledge as information that can be collected and stored toward a concept of knowing as something that is possessed and done by individuals and groups.
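To make this conceptualization concrete, the following minimal sketch captures the "knowledge as information" model that this article goes on to critique. It is an illustration only; the class and method names are invented for the example, not drawn from the literature discussed here.

```python
# A minimal sketch of the "knowledge as information" model: a repository
# that treats knowledge as records to be captured, stored, and retrieved
# like any other data. All names here are illustrative assumptions.

from typing import Dict, Optional

class KnowledgeRepository:
    """Models knowledge as key-value records, detached from knowers."""

    def __init__(self) -> None:
        self._records: Dict[str, str] = {}

    def capture(self, topic: str, content: str) -> None:
        # Storage is assumed to equal retention of knowledge.
        self._records[topic] = content

    def retrieve(self, topic: str) -> Optional[str]:
        # Retrieval is assumed to equal transfer of knowledge; the model
        # leaves no role for interpretation, practice, or social context.
        return self._records.get(topic)

repo = KnowledgeRepository()
repo.capture("valve isolation", "Isolate, depressurize, then inspect seals.")
print(repo.retrieve("valve isolation"))
```

Everything the interpretivist critique emphasizes, such as who wrote the record, who reads it, and the shared background needed to act on it, sits outside this model, which is precisely the limitation discussed in the sections that follow.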

Knowing in Social Groups

In the approach to knowledge preferred here, we can examine knowledge in all its varieties as connected to social groups as well as individual knowers. This capacity to examine knowledge at the level of the collective is particularly important in the context of hazardous industry, as the temporal, geographical, and social scale of decisions in complex sociotechnical systems means that this cannot be a question of an individual's expertise alone; major accident risk management requires that knowledge be shared across a much larger group of people. The key contention of sociologies of knowledge has been that knowledge production, even of the most seemingly objective knowledge, occurs in a social context and is made by social actors. In order for knowledge practice to function effectively, particularly in the natural sciences, knowers are socialized or educated into the knowledge framework and its concepts, laws, and theories. This education prepares the knower for membership in a specific knowledge community and thus limits disagreements over fundamentals (Kuhn, 1962, p. 11). Miller and Fox (2001, p. 675) express a similar idea: "We can 'see' only those parts of the objective world that can be apprehended through the words and categories located in our cultural subjectivity. We cannot see facts if we are not pre-equipped with categories that enable them to come into view." This connection between knowledge and social groups can be traced back to classical studies in the sociology of knowledge, which emphasized that thought is a collective experience contingent on common social structures and social locations that foster common experiences and thus collective thinking (Mannheim, 1952, p. 291).

The connection between knowledge and social groups is articulated in different ways. The producers and consumers of knowledge are conceived of, categorized, and referred to in multiple ways in the literature. Terms such as "intelligentsia" and the "scientific community" each refer to particular types of knowledge. One widely used term is "epistemic communities." It refers to any group, regardless of size, locality, or means of interaction, that has shared beliefs, values, and ideas that form the basis of its interaction, together with shared norms of inquiry and notions of validity (Haas, 1992, p. 3). The concept is useful because it fixes the focus on the social aspects of knowledge while remaining neutral about the type of knowledge and the particular beliefs, values, and methods of the community. Epistemic communities can be examined at different levels. For example, we could speak of "the scientific method"; however, scholars have also shown how individual laboratories have their own "epistemic cultures" (Knorr-Cetina, 1999).

Another concept used to capture the social location of knowledge is "communities of practice" (Lave & Wenger, 1991; Wenger, 1998). This concept particularly emphasizes the ways in which knowledge is made and remade in small social groups that work to form agreement over knowledge in a given context. This model reasons that while people may be employed by a large organization, and may have been educated within a particular system and its curriculum, they work within and for much smaller groups (Wenger, 1998, p. 6). In cases like the scientific community, it is readily apparent that a group of people is working to build, refine, and debate knowledge. However, the social location of knowledge is not specific to scientific knowledge workers. In call centers, people build knowledge of how to perform their roles in that setting (Wenger, 1998). In an engineering context, too, knowledge sharing and design decisions are specific to colocated work groups (Hayes, 2015; Maslen, 2015). The uniqueness of knowledge in small social groups is due to the different types of knowledge at work. Formal knowledges (rules, procedures, methods) are visible, but they do not capture all knowledge. Much understanding is tacit, informal, intuitive, practical, and embodied. This is the type of understanding that Duguid (2005, p. 111) had in mind when he wrote that "codified knowledge … rests on an uncodifiable substrate that tells us how to use the code."

This awareness that knowledge is socially located is established in the sociological literature. However, the social location of knowledge is not uniformly engaged with. In "professional" knowledge cultures, formal knowledges and qualifications are given greater importance than mastery of practical skills (Lam, 2000, p. 505). As such, professional systems geared toward specialization, codification, and individual learning are inherently weak in the development and sharing of some knowledge types, particularly tacit knowledges, because they do not support the conditions these knowledges require: collective learning, strong social networks, and shared experience. These findings about the ways in which different organizational environments can fail to support some types of knowledge, coupled with failures of knowledge management systems (McDermott, 2000), have encouraged engagement with what is needed to make knowledge "collective" in these environments.

Contrary to the notion of databases as "bins" in which knowledge can be unambiguously stored and retrieved (Walsh & Ungson, 1991), there have been moves to conceptualize knowledge in databases as constituted and maintained through "ongoing processes of collective action and interaction" (Hecker, 2012, p. 429). The notion of collective knowledge is contested but can be conceptualized in terms of three types: knowledge shared by individuals; specialized knowledges that are distributed and coordinated; and knowledges embedded in artifacts such as documents, databases, rules, organizing principles, and products and technologies. This last type of collective knowledge is the most problematic. Hecker highlights that numerous in-depth studies have shown that use of these artifacts requires "social practices of negotiation and transformation to create a collective understanding and to bridge local and situated epistemic contexts" (Hecker, 2012, p. 429). From this perspective, collective knowledge is not an object but a practice, an "organisational remembering" (Feldman & Feldman, 2006, p. 862).

This is not necessarily an easy fit for organizations practiced at thinking about formalized solutions to knowledge management. Organizational learning is also inherently difficult. Tacit knowledges are particularly context specific and embedded, and they tend to be "sticky" (not quickly or easily transferred), dependent on experience and on doing with others (Lam, 1997, p. 974). This tells us to pay attention to the interactions between individuals in smaller groups and the agreements they form, the stories they tell, and the sense that they make. In the knowledge economy, knowledge, and specifically collective knowledge, is important. People also change jobs more frequently, raising the question of how organizations "keep" knowledge. However, in some types of organizations the stakes are higher than competitive advantage. In the case of hazardous industry, collective knowledge is an issue of disaster prevention.

Learning and Organizational Safety

In hazardous industries, disasters are mercifully rare and yet the potential is ever present. In practice and research, learning in order to prevent disaster is a dominant theme. Researchers have studied aspects of learning in the context of disaster prevention, including the importance of learning from incident investigations (Carroll, Rudolph, Hatakenaka, Wiederhold, & Boldrini, 2001), how best to present data for learning (Chevreau, Wybo, & Cauchois, 2006), and the role of team leaders in group learning (Edmondson, 2003). Conversely, a failure to learn has been implicated in disaster causation (Pidgeon & O'Leary, 2000). In an industry context, the term "lessons learned" is commonly used in presentations and publications about disasters, signifying both this emphasis on learning and a perception that experience of an event results in an enduring lesson that permanently changes how risks are understood and managed. However, what this learning looks like is not agreed upon, nor is it straightforward.

A common immediate response to disaster is an effort to identify who is at fault and assign blame (Dekker, 2012; Maslen & Hayes, 2014). This response implies that it is the understanding and decisions of individuals that are found lacking. For example, at Deepwater Horizon, an identified cause of the blowout was a "faulty mental model" held by the personnel running tests (Hopkins, 2012). The workers employed on the platform simply did not have the disciplinary background to imagine possible causes of their test results. If analyses stopped at this point, we might conclude that disasters occur due to individuals, not organizations. However, failures within complex sociotechnical systems can take decades to incubate, and decisions are distributed over many actors in an organization operating in different areas and at different levels. The most common model of organizational disasters is James Reason's (1997, 2000) Swiss cheese model. In this way of thinking about accidents, there is a range of defenses in place that are functionally designed to prevent any given hazard progressing to disaster. In practice, these defenses are imperfect (like holes in Swiss cheese). The defenses or barriers in place ensure that failure of any individual measure is not catastrophic. A disaster only occurs when multiple barriers fail, or, put another way, when the holes in the cheese line up to provide an accident trajectory through all the defenses. Analyzing a disaster from this perspective involves an engagement with its technical causes, but also with organizational factors such as budgets, safety priorities, and management styles. A significant body of analysis has been produced taking an organizational view of disaster causation, including studies of the loss of the space shuttles Challenger (Vaughan, 1996) and Columbia (Starbuck & Farjoun, 2005), the accidental shootdown of U.S. Black Hawk helicopters (Snook, 2000), and the Deepwater Horizon blowout (Hopkins, 2012).
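The logic of the Swiss cheese model can be made concrete with a toy calculation. The sketch below is an illustration under an assumption of independent barrier failures, not Reason's own formalization, and the probabilities are invented for the example.

```python
# Toy sketch of the Swiss cheese model: a hazard becomes a disaster only
# when every imperfect defense fails at once (the holes "line up").
# Failure probabilities are invented; independence is a simplifying assumption.

import math

def accident_trajectory_prob(failure_probs):
    """Joint probability that a hazard breaches every defense,
    assuming independent barrier failures."""
    return math.prod(failure_probs)

intact = [0.1, 0.1, 0.1, 0.1]  # four imperfect but maintained barriers
eroded = [0.1, 0.1, 0.1, 0.9]  # one barrier routinely bypassed

print(f"intact system: {accident_trajectory_prob(intact):.4f}")  # 0.0001
print(f"eroded system: {accident_trajectory_prob(eroded):.4f}")  # 0.0009
```

The independence assumption is exactly what organizational analysis challenges: shared budgets, priorities, and management styles can degrade several barriers at once, making the holes far more likely to line up than the toy arithmetic suggests.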

This shift toward looking at disasters as organizational necessarily changes the way that we think about decisions, learning, and knowledge for disaster prevention. At the most basic level, we can say that knowledge needs to be conceptualized at the level of social groups rather than individuals. However, the collective status of knowledge in this context has only recently been engaged with (Maslen & Hayes, 2016). Instead, there has been debate over whether knowledge should be "in" individuals (as expertise) or embedded "in" organizations via rules and procedures. Reason's Swiss cheese model particularly directs our attention to system weaknesses that can fail to prevent an accident. Another way that organizational safety has been viewed is in terms of normal operations in so-called High Reliability Organizations (Weick et al., 1999; Weick & Sutcliffe, 2001). High reliability organizations have the ability to detect and respond to system variations due to their state of mindfulness about their operations. This mindfulness is fostered by five qualities, including deference to expertise, whereby responsibility for decisions is driven by knowledge rather than by position in the organizational hierarchy (Weick et al., 1999). This emphasis on expertise, as opposed to systems of work, raises questions about the status and location of knowledge for disaster prevention.

Over the last 50 years, research in the social sciences has moved from understanding expertise in terms of "logical" knowledge and ability to understanding it more in terms of wisdom and competence (Collins & Evans, 2007). A key influence in this shift was Polanyi's (1958, 1966) concept of "tacit" knowledge, which captures understandings that can be difficult to put into words, not completely conscious, or just plain unnoticed. This tacit knowledge has been identified as vital to the decision making of experts who work in challenging, critical, and extremely time-pressured contexts (Klein, 1998). Much expert knowledge has been found to be acquired through experience and drawn on through pattern matching and mental simulation (Klein, 1998, p. 157). In other words, expert knowledge is located in practices, not in rules, procedures, and books (Collins & Evans, 2007, p. 23). As Nyiri explained: "One becomes an expert not simply by absorbing explicit knowledge of the type found in textbooks, but through experience, that is, through repeated trials, 'failing, succeeding, wasting time and effort, … [and] getting a feel for a problem'" (Nyiri, 1988, p. 20).

This gap between procedures and practice is observable among Californian electricity grid transmission staff who, in the face of electricity restructuring, had to operate in non-routine ways in order to literally keep the lights on (Roe & Schulman, 2008). With changes to the system that were unforeseen and uncontrollable, maintaining a constant electricity supply was not possible by following procedures. Instead, staff relied on a “real time” responsiveness to the countless variables they were continuously managing. As one dispatcher put it: “It’s the massive amount of multitasking, you’ve got to be analysing what’s moving, how fast it can move, you’ve got to have a good overall picture of what’s going on, all this simultaneously” (Roe & Schulman, 2008, p. 37). In other hazardous industry settings, there is an acknowledgment that safe operations are critically reliant on the practices and expertise of companies and their personnel. For example, statesmen of the Australian high-pressure gas pipeline industry claim its excellent safety record is due, in part, to the safety values and expertise of its professional staff (Bonar & Tuft, 2009). As such, there are concerns about future major accident risk management as the industry has an ageing workforce and is facing the imminent retirement of many of its key experts (Maslen, 2014). This recognition of the significance of expertise does not necessarily translate into robust approaches to support its development (Maslen, 2015).

Enthusiasm for expertise at the level of the individual is not uniformly shared. When it comes to preventing rare, catastrophic events, learning from experience can be misleading and dangerously inefficient (Hopkins, 2012, pp. 111, 16). Experience may also lead to system defenses eroding if barriers are routinely bypassed without consequence (Snook, 2000; Starbuck & Farjoun, 2005; Vaughan, 1996). The distribution of judgments across time and place also makes reliance on individual expertise problematic. As such, there is a view that knowledge needs to be embedded in formal systems. Instead of viewing organizational learning as the culmination of individual knowledge, it can also be viewed in terms of adjustments to organizational structure, procedures, performance indicators, and resourcing priorities in response to incidents (Hopkins, 2012). Similarly, there has also been a focus in both practice and research on rules. Hale and Borys (2013a, 2013b) conclude that the knowledge and learning needs of hazardous contexts require that expertise among personnel be balanced with explicit tools such as "good" rules and procedures.

Difficult to resolve here is the tension between tacit and formalized, and individual and organizational forms of knowledge. If good decisions rely on expertise and expertise relies on tacit knowledge, then this raises questions about the value of formal systems of knowledge and the extent to which knowledge can be embedded in formal structures in this way.

Incident Reporting Systems for Capturing Knowledge

Incident reporting systems are a common way in which hazardous industry organizations seek to capture knowledge, specifically with a view to disaster prevention. In these organizational contexts, trial-and-error learning is dangerously inefficient because every accident must be prevented before it occurs. For this reason, resources are not only committed to investigating major industrial disasters. Companies and industries as a whole put substantial effort into gathering and analyzing information about small failures in an attempt to learn about the state of the system and how more serious events might be prevented (Weick, Sutcliffe, & Obstfeld, 1999). In this way, incident reporting systems have come to be seen by many organizations as a key learning tool, or even as a repository of knowledge and experience. However, research on incident reporting has shown how these systems are limited both by the information they capture and by the way they are managed (Drupsteen & Guldenmund, 2014). This invites critical engagement with the understanding of knowledge implicit in these systems, with knowledge needs for disaster prevention, and with the extent to which incident reporting systems facilitate knowledge in the sense of collective understanding.

While incident reporting systems are diverse, they are rarely designed with learning or knowledge for disaster prevention in mind. Instead, they are often designed for other functions like repairs, the consequences of which are powerfully demonstrated in the domestic gas pipeline failure at San Bruno, California, in 2010 (Hayes & Hopkins, 2014). Within these systems, process safety data is often diluted by personal safety data (Hayes, 2009). Incident reports often lack analysis of organizational causes (Dien, Dechy, & Guillaume, 2012; Jacobsson, Sales, & Mushtaq, 2009). There can also be a lack of clarity over the types of failures that should be recorded (Hopkins, 2009). In response to these issues, scholars typically focus on the development and communication of warning signals (Körvers & Sonnemans, 2008) and on ways to support the reporting process (Dekker, 2012; Phimister, Oktem, Kleindorfer, & Kunreuther, 2003; Wahlström, 2011).

A more foundational issue is the status of incident reporting systems as knowledge. While incidents and incident reporting systems are typically discussed in terms of a lack of, or a call for, double-loop learning (Argyris, 2004), we can also view these systems as built around a particular appreciation of what knowledge is. Incident reporting systems assume that knowledge is information that can be stored and retrieved as data, separable from individual knowers and social groups. As part of the tradition of knowledge management, these systems are based on the logic that organizations have data and need systems to harness it toward robust decisions (Davenport, Harris, De Long, & Jacobson, 2001).

However, this conceptualization of knowledge is misguided. Data in databases does not in itself constitute knowledge. Rather, in Hecker's language, databases are collective artifacts that transfer, translate, transform, and distribute knowledge, and, critically but all too often overlooked, they must do so between people. Hecker explains: "Artifactual knowledge is in an essential sense not self-contained and detached from the collective as it requires complex processes of collective sense-making to reinterpret, re-contextualize and re-appropriate its content and meaning" (Hecker, 2012, p. 429). We could say that information stored in a database is collected information, but it is not collective knowledge. Accordingly, it is critical to conceptualize embedded knowledge in the context of the other types of collective knowledge, shared knowledge and complementary knowledge, as they are interdependent. Shared knowledge refers to the common social rules, conventions, and tacit knowledge within a group or organization that facilitate decision making and communication across social groups, often passed off as just "the way things are done here" (Schein, 1992, pp. 8–9). Complementary knowledge refers to knowledge that is specialized and distributed across individuals but coordinated through social practices that allow it to add up to more than the sum of its parts. Critically, both of these forms of collective knowledge emphasize its social location and practice in groups.

In this context, databases, manuals, and employee newsletters act as boundary-spanning objects, providing a mechanism for disparate pieces of knowledge to be connected (mediation) or for specific pieces of knowledge to be shared (synchronization). Mediation refers to "the combination and consolidation of complementary knowledge distributed across individuals in time and space by the use of knowledge artifacts," which works to "structure knowledge exchange and collaboration" and "channel knowledge flows and coordinate social interaction" (Hecker, 2012, p. 434). The more complementary knowledge is dispersed in organizations, the more important infrastructure for supporting mediation becomes. In hazardous industry, the predominance of large, multinational organizations, coupled with the relevance of lessons across industries, means that such infrastructure is essential as a key facilitation mechanism for mediation. For example, incident reporting systems can mediate between workers in the field who are aware of specific incidents (or hazards) and process safety experts who are able to understand the meaning of this information when considered in the context of organizational safety performance, as sketched below.
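To illustrate, here is a hypothetical sketch of mediation through a reporting system: reports that look minor at any single site become a systemic signal once consolidated for a specialist reader. The report categories, sites, and review threshold are all invented for the example.

```python
# Hypothetical sketch of mediation: field crews at different sites file
# individually minor reports; consolidation lets a process safety
# specialist read them as a systemic signal. All data are illustrative.

from collections import Counter
from dataclasses import dataclass

@dataclass
class IncidentReport:
    site: str
    category: str  # e.g., "corrosion", "third-party strike"
    description: str

reports = [
    IncidentReport("Site A", "corrosion", "Pitting found during routine dig."),
    IncidentReport("Site B", "corrosion", "Coating damage near river crossing."),
    IncidentReport("Site C", "third-party strike", "Excavator near easement."),
    IncidentReport("Site D", "corrosion", "Wall-thickness loss at a weld."),
]

# No single site sees a trend; the consolidated view can.
by_category = Counter(r.category for r in reports)
for category, count in by_category.items():
    if count >= 3:  # illustrative review threshold
        print(f"Review flagged: {count} '{category}' reports across sites")
```

Even here, the count only becomes knowledge when a specialist interprets it against an understanding of the system, which is the article's point about the interdependence of embedded and shared knowledge.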

Synchronization refers to the formation of collective knowledge within a specialist group. This idea has links to the arguments about the need for expertise among workers (Weick et al., 1999). Again, given the realities of large, multinational organizations and the broad-reaching sources of lessons, knowledge-embedding artifacts such as incident reporting systems have the potential to align and extend knowledge across a group of workers in the same specialization. Critically, however, the use of knowledge artifacts to support synchronization does not negate the value of informal knowledge sharing in face-to-face work groups. Hayes (2013, p. 117) found that shift managers in a nuclear power station maintained a local file of past decisions for their specialist reference despite the comprehensive and generally well-used incident reporting system. We can view this group of shift managers as a community of practice, collecting and synchronizing their own stories that they have identified as beyond the scope of formal, coordinated efforts at knowledge management in their organization.

Making Knowledge Collective for Industrial Disaster Prevention

Collective knowledge encompasses shared understanding within a group of people; networks of complementary knowledge that can be shared across social groups; and, when engaged in the context of these broader social relationships and conventions, knowledge embedded in artifacts (Hecker, 2012). There is a strong focus here on the more tacit, informal, practical, and embodied forms of knowledge that are given increasing importance in the knowledge and expertise literatures (Barth, 1995; Ingold, 2000; Klein, 1998; Knorr-Cetina, 1999; Turnbull, 2000). Yet these forms of knowledge are met with discomfort within hazardous industry, given the unique challenges posed by the scale of what could go wrong and the distribution of decisions temporally, socially, organizationally, and geographically. This discomfort is understandable. However, common solutions to this challenge that seek to make knowledge explicit and formalized (either via updates to rules and procedures, or via storage of information in databases) fall short both of the type of understanding we could call collective knowledge and of their stated goals. Given this, it is useful to examine knowledge more inclusively for clues about how to make knowledge collective. This requires an engagement with the definition of knowledge as understanding, as well as with the ways in which knowledge is socially located.

The concept of legitimate peripheral participation sheds light on the development of shared knowledge within work groups and the engagement of this understanding in interpreting knowledge artifacts. Legitimate peripheral participation captures the idea that learning is situated in particular activities, contexts, and cultures and relies on social participation and collaboration in these authentic settings (Lave & Wenger, 1991; Wenger, 1998). In keeping with this, building expertise and a safety imagination (Pidgeon & O'Leary, 2000) among junior engineers in hazardous industry has been shown to rely on mentoring and experience under the guidance of others (Maslen, 2014). Development of this specialized form of shared knowledge requires strong social networks and extended periods of time working together. While engineering professionals complete formal tertiary education, and there are procedures and standards that guide their work, informal mentoring is critical in building understanding of how this formal knowledge and its artifacts are operationalized. Where this mentoring is limited, things can go wrong, as evidenced in the case of a pipeline strike where a junior engineer working in a remote area was unable to correctly interpret the standard without this guidance (Maslen, 2015).

No one individual can know everything, and so workers specialize and, through social coordination, knowledge sets can combine. This is the idea that part of the collective knowledge landscape is complementary knowledge. This type of knowledge is clearly visible in the case of hazardous industry, where workers in different specializations may know their aspect of the system intimately and must combine their knowledges to design a "safe" facility or to maintain an overarching understanding of the state of a system. Wegner (1986) gives an example of aircraft mechanics asked about an aircraft's safety to illustrate the way in which knowledge can be distributed across a team. He explains that each might recall and provide different safety knowledge: "Betty might note an unexplained bit of oil on the runway, while Veronica remembers that a hydraulic indicator light was not functioning" (Wegner, 1986, p. 197). While independently these observations may not raise alarm bells, when considered in combination (that is, as complementary) they indicate the aircraft may have an oil leak. Critically, this coordination of complementary knowledge relies on a foundational shared knowledge, highlighting the ways in which forms of collective knowledge are interlinked, as well as the significance of a full spectrum of knowledge types.
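Wegner's example can be coded up as a small sketch of complementary knowledge: no individual's cues match the hazard pattern, but the pooled cues do. The cue names and the pattern are illustrative assumptions, not drawn from Wegner.

```python
# Hypothetical coding of Wegner's aircraft example: cues held by different
# team members are individually unalarming, but the pooled set completes a
# hazard pattern. Cue names and the pattern are illustrative assumptions.

observations = {
    "Betty": {"oil_on_runway"},
    "Veronica": {"hydraulic_indicator_fault"},
}

LEAK_PATTERN = {"oil_on_runway", "hydraulic_indicator_fault"}

# No individual's cues contain the full pattern ...
for person, cues in observations.items():
    print(f"{person} sees the pattern alone: {LEAK_PATTERN <= cues}")

# ... but the socially pooled cues do.
pooled = set().union(*observations.values())
print(f"Pooled cues match the pattern: {LEAK_PATTERN <= pooled}")
```

The pooling step stands in for the social coordination the text describes; without the shared knowledge of what the combined cues mean, the union of sets would remain mere collected information.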

The notion of sensemaking is another conceptual approach useful in our consideration of collective knowledge for industrial disaster prevention. Knowledge embedded in artifacts is the most problematic form of collective knowledge, to the extent that without surrounding social practices the information such artifacts contain is not collective knowledge at all. One way this is expressed is to say that knowledge embedded in artifacts requires processes of sensemaking to bring it to life. At its most basic, the concept of sensemaking refers to the sense we make of situations. More specifically, it directs our attention to the ways in which our understandings and interpretations of phenomena are a matter of identity. It also reasons that our understandings and interpretations are retrospective, are ongoing, depend on socialization, build on extracted cues, and are less a matter of accuracy and more a matter of sufficiency (Weick, 1995). Whether we are talking about building an appreciation of how rules and procedures are to be interpreted in a particular work group context, or building understanding out of information mediated and synchronized with the aid of an incident reporting system, this understanding comes through informal processes.

One social practice engaged in sensemaking is storytelling. As argued by Adorisio (2014, p. 466), “Corporate narratives, working life stories, anecdotes and instructions become the tapestry that arises from interaction at different organizational levels, becoming the frame within which organizational members negotiate their organizational everyday life.” In this sense, stories come to be the lens through which workers interpret their working worlds, whether this means formalized knowledge captured in rules and procedures, the data that they contribute to or retrieve from incident reporting databases, or everyday decisions formed more intuitively. Stories are drawn upon in decision making among experts in critical contexts because they are a powerful tool in pattern matching and mental simulation (Klein, 1998). They convert experiences into memorable, meaningful lessons by drawing out the significance and implications of actions and events. This significance of stories holds for sensemaking of major accident risk. Through stories, workers come to understand the potential consequences of their decisions (Hayes & Maslen, 2015; Størseth & Tinmannsvik, 2012). When faced with relatively minor operating anomalies, they are able to draw connections to past major events and so dig deeper into the state of the system (Macrae, 2009).

A key feature of these processes and practices through which knowledge becomes collective is their informality and their tacitness, to the extent that there are serious questions about whether collective knowledge is controllable through managerial intervention (Hecker, 2012, p. 440). It is these features of knowledge and their challenges that efforts to embed lessons in formal systems, rules, procedures, and incident reporting databases aim to avoid. Where organizations are large and disparate, there is a need for artifacts to mediate and synchronize the "right" knowledge. Given the stakes, this is all the more important when it comes to disaster prevention. However, it is critical to recognize that the different varieties of knowledge are interdependent. In other words, incident reporting systems cannot operate independently from the informal systems that facilitate knowledge sharing; rather, they are vital tools to enhance them. Equally, while rules, procedures, and standards economize on individual cognition and aid knowledge sharing and memory, this externalization alone is not sufficient to make them collective knowledge.

Discussion of the Literature

Beneath discussions about accident causation and prevention are assumptions about knowledge, human learning, and mechanisms for organizational memory. High reliability researchers have been investigating why some companies that operate complex technologies appear to have a better-than-average safety record. Among other things, findings have shown that such organizations actively seek out opportunities to learn about their organizations and their technologies and display deference to expertise (Weick et al., 1999). The lens of learning in particular has triggered substantial inquiry, with researchers examining the importance of learning from incident investigations (Carroll et al., 2001), how best to present data for learning (Chevreau et al., 2006), the role of team leaders in group learning (Edmondson, 2003), and reasons why organizations fail to learn (Pidgeon & O'Leary, 2000). Most recently, Taleb's book The Black Swan (2007) has captured the interest of scholars working in industrial risk management as another way to examine the connection between knowledge and disaster prevention (Aven, 2013, 2014; Hayes & Hopkins, 2014; Murphy & Conner, 2012, 2014; Paté-Cornell, 2012). Black swan events are unexpected and have extreme outcomes; most important, the concept implies that they cannot be predicted by those in control on the basis of their available knowledge. The word "availability" is most important here, for this literature examines cases where important information does exist but is not engaged to prevent disastrous outcomes. This emphasizes the importance of building an appreciation of how knowledge is best developed and shared to prevent disaster.

There have been debates over whether disaster prevention is a matter of expertise among individual staff (Roe & Schulman, 2008; Weick et al., 1999) or an embedding of lessons in formal systems such as rules, procedures, and databases (Hale & Borys, 2013a, 2013b; Hopkins, 2012). The interest in formalized solutions stems from the unique demands for knowledge in the case of hazardous industry. Trial-and-error learning is insufficient because disasters must be prevented before they occur. Failures within complex sociotechnical systems can also take decades to incubate, and decisions are distributed over many actors in an organization operating in different areas and at different levels. The difficulty is that knowledge researchers have demonstrated that formalized knowledge does not represent all, or even most, of what guides action (Barth, 1995; Turnbull, 2000). Instead, formal knowledge relies on informal, practical, embodied, tacit, and local knowledges for its interpretation (Duguid, 2005; Knorr-Cetina, 1999; Turnbull, 2000). In the case of hazardous industry, we can see the limitations of formal systems of knowledge management manifest in the failure of incident reporting systems to deliver on their stated goals.

These challenges for knowledge in the case of disaster prevention may seem insurmountable, yet there remain outstanding questions about how knowledge can be effectively supported in this context. To progress this, we need a critical conversation about the nature of knowledge, the knowledge needs of major accident risk management, and measures to facilitate knowledge development and knowledge sharing. To date, the research has lost sight of these deeper epistemic questions. The concept of collective knowledge has recently been introduced to the organizational safety literature toward this end (Maslen & Hayes, 2016). This approach to understanding knowledge highlights the intersections between different knowledge types, including more informal and tacit knowledges, expert knowledges, and formalized knowledges including tools such as databases (Hecker, 2012). There are many fertile areas for future research, including managerial interventions to support collective knowledge in hazardous industry, the relationship between the knowledge needs of individuals and groups, incident reporting arrangements in the context of collective knowledge, and additional ways in which epistemic contexts are bridged across departments and professional groups.

Further Reading

Feldman, R., & Feldman, S. (2006). What links the chain: An essay on organizational remembering as practice. Organization, 13(6), 861–887.

Hayes, J. (2013). Operational decision-making in high-hazard organizations: Drawing a line in the sand. Farnham, U.K.: Ashgate.

Hecker, A. (2012). Knowledge beyond the individual? Making sense of a notion of collective knowledge in organization theory. Organization Studies, 33(3), 423–445.

Klein, G. (1998). Sources of power: How people make decisions. Cambridge, MA: MIT Press.

Knorr-Cetina, K. (1999). Epistemic cultures: How the sciences make knowledge. Cambridge, MA: Harvard University Press.

Lam, A. (2000). Tacit knowledge, organizational learning and societal institutions: An integrated framework. Organization Studies, 21(3), 487–513.

Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. New York: Cambridge University Press.

Maslen, S., & Hayes, J. (2016). Preventing black swans: Incident reporting systems as collective knowledge management. Journal of Risk Research, 19(10), 1246–1260.

McDermott, R. (2000). Why information technology inspired but cannot deliver knowledge management. In E. L. Lesser, M. A. Fontaine, & J. A. Slusher (Eds.), Knowledge and communities (pp. 21–36). Woburn, MA: Butterworth-Heinemann.

Spender, J.-C. (1996). Organizational knowledge, learning and memory: Three concepts in search of a theory. Journal of Organizational Change Management, 9(1), 63–78.

Weick, K. (1995). Sensemaking in organizations. Thousand Oaks, CA: SAGE.

Weick, K., Sutcliffe, K., & Obstfeld, D. (1999). Organizing for high reliability: Processes of collective mindfulness. In R. I. Sutton & B. M. Staw (Eds.), Research in organizational behavior (pp. 81–123). Stamford, CT: JAI.

References

Adorisio, A. (2014). Organizational remembering as narrative: "Storying" the past in banking. Organization, 21(4), 463–476.

Argyris, C. (1992). On organizational learning. Oxford: Blackwell.

Argyris, C. (2004). Reasons and rationalisations: The limits to organizational knowledge. Oxford: Oxford University Press.

Argyris, C., & Schon, D. (1996). Organizational learning II: Theory, method and practice. Reading, MA: Addison-Wesley.

Aven, T. (2013). On the meaning of a black swan in a risk context. Safety Science, 57, 44–51.

Aven, T. (2014). The concept of antifragility and its implications for the practice of risk analysis. Risk Analysis, 35(3), 476–483.

Barth, F. (1995). Other knowledge and other ways of knowing. Journal of Anthropological Research, 51(1), 65–68.

Bloom, H. (2000). Global brain: The evolution of mass mind from the Big Bang to the 21st century. New York: John Wiley.

Bonar, C., & Tuft, P. (2009). Experience with the Australian Pipeline Incident Database. APIA Convention Conference Paper, Cairns, Australia.

Braganza, A. (2004). Rethinking the data–information–knowledge hierarchy: Towards a case-based model. International Journal of Information Management, 24(4), 347–356.

Carroll, J. S., Rudolph, J., Hatakenaka, S., Wiederhold, T., & Boldrini, M. (2001). Learning in the context of incident investigation: Team diagnoses and organizational decisions at four nuclear power plants. In E. Salas & G. Klein (Eds.), Linking expertise and naturalistic decision making (pp. 349–366). Mahwah, NJ: Lawrence Erlbaum Associates.

Chevreau, F. R., Wybo, J. L., & Cauchois, D. (2006). Organizing learning processes on risks by using the bow-tie representation. Journal of Hazardous Materials, 130(3), 276–283.

Collins, H., & Evans, R. (2007). Rethinking expertise. Chicago: University of Chicago Press.

Davenport, T., Harris, J., De Long, D., & Jacobson, A. (2001). Data to knowledge to results: Building an analytic capability. California Management Review, 43(2), 117–138.

Dekker, S. (2012). Just culture: Balancing safety and accountability (2d ed.). Aldershot, U.K.: Ashgate.

Dien, Y., Dechy, N., & Guillaume, E. (2012). Accident investigation: From searching direct causes to finding in-depth causes—Problem of analysis or/and of analyst? Safety Science, 50(6), 1398–1407.

Drucker, P. (1993). Post-capitalist society. New York: HarperCollins.

Drupsteen, L., & Guldenmund, F. (2014). What is learning? A review of the safety literature to define learning from incidents, accidents and disasters. Journal of Contingencies and Crisis Management, 22(2), 81–96.

Duguid, P. (2005). "The art of knowing": Social and tacit dimensions of knowledge and the limits of the community of practice. The Information Society, 21(2), 109–118.

Dzisah, J., & Etzkowitz, H. (Eds.). (2012). The age of knowledge: The dynamics of universities, knowledge & society (Studies in critical social science). Leiden, The Netherlands: Brill.

Edmondson, A. C. (2003). Speaking up in the operating room: How team leaders promote learning in interdisciplinary action teams. Journal of Management Studies, 40(6), 1419–1452.

Feldman, R., & Feldman, S. (2006). What links the chain: An essay on organizational remembering as practice. Organization, 13(6), 861–887.

Gettier, E. L. (1963). Is justified true belief knowledge? Analysis, 23(6), 121–123.

Goldman, A. I. (1979). What is justified belief? In G. S. Pappas (Ed.), Justification and knowledge: New studies in epistemology (pp. 1–23). Dordrecht, The Netherlands: Springer.

Grover, V., & Davenport, T. (2001). General perspectives on knowledge management: Fostering a research agenda. Journal of Management Information Systems, 18(1), 5–21.

Haas, P. (1992). Introduction: Epistemic communities and international policy coordination. International Organization, 46(1), 1–35.

Hale, A., & Borys, D. (2013a). Working to rule, or working safely? Part 1: A state of the art review. Safety Science, 55, 207–221.

Hale, A., & Borys, D. (2013b). Working to rule or working safely? Part 2: The management of safety rules and procedures. Safety Science, 55, 222–231.

Hardt, M., & Negri, A. (2000). Empire. Cambridge, MA: Harvard University Press.

Hayes, J. (2009). Incident reporting: A nuclear industry case study. In A. Hopkins (Ed.), Learning from high reliability organisations (pp. 117–134). Sydney, Australia: CCH.

Hayes, J. (2013). Operational decision-making in high-hazard organizations: Drawing a line in the sand. Farnham, U.K.: Ashgate.

Hayes, J. (2015). Investigating design office dynamics that support safe design. Safety Science, 78, 25–34.

Hayes, J., & Hopkins, A. (2014). Nightmare pipeline failures: Fantasy planning, black swans and integrity management. Sydney, Australia: CCH.

Hayes, J., & Maslen, S. (2015). Knowing stories that matter: Learning for effective safety decision-making. Journal of Risk Research, 18(6), 714–726.

Hecker, A. (2012). Knowledge beyond the individual? Making sense of a notion of collective knowledge in organization theory. Organization Studies, 33(3), 423–445.

Hopkins, A. (2009). Identifying and responding to warnings. In A. Hopkins (Ed.), Learning from high reliability organisations (pp. 33–58). Sydney, Australia: CCH.

Hopkins, A. (2012). Disastrous decisions: The human causes of the Gulf of Mexico blowout. Sydney, Australia: CCH.

Ingold, T. (2000). The perception of the environment: Essays on livelihood, dwelling, and skill. London: Routledge.

Jacobsson, A., Sales, J., & Mushtaq, F. (2009). A sequential method to identify underlying causes from industrial accidents reported to the MARS database. Journal of Loss Prevention in the Process Industries, 22(2), 197–203.

Klein, G. (1998). Sources of power: How people make decisions. Cambridge, MA: MIT Press.

Knorr-Cetina, K. (1999). Epistemic cultures: How the sciences make knowledge. Cambridge, MA: Harvard University Press.

Körvers, P. M. W., & Sonnemans, P. J. M. (2008). Accidents: A discrepancy between indicators and facts! Safety Science, 46, 1067–1077.

Kuhn, T. S. (1962). The structure of scientific revolutions. Chicago: University of Chicago Press.

Lam, A. (1997). Embedded firms, embedded knowledge: Problems of collaboration and knowledge transfer in global cooperative ventures. Organization Studies, 18(6), 973–996.

Lam, A. (2000). Tacit knowledge, organizational learning and societal institutions: An integrated framework. Organization Studies, 21(3), 487–513.

Latour, B., & Woolgar, S. (1979). Laboratory life: The construction of scientific facts. Princeton, NJ: Princeton University Press.

Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. New York: Cambridge University Press.

Lynch, M. (1984). Art and artifact in laboratory science: A study of shop work and shop talk in a research laboratory. London: Routledge.

Macrae, C. (2009). From risk to resilience: Assessing flight safety incidents in airlines. In A. Hopkins (Ed.), Learning from high reliability organisations (pp. 95–115). Sydney, Australia: CCH.

Mannheim, K. (1952). The problem of generations. In Essays on the sociology of knowledge. London: Routledge.

Maslen, S. (2014). Learning to prevent disaster: An investigation into methods for building safety knowledge among new engineers to the Australian gas pipeline industry. Safety Science, 64, 82–89.

Maslen, S. (2015). Organisational factors for learning in the Australian gas pipeline industry. Journal of Risk Research, 18(7), 896–909.

Maslen, S., & Hayes, J. (2014). Experts under the microscope: The Wivenhoe Dam case. Environment, Systems and Decisions, 34(2), 183–193.

Maslen, S., & Hayes, J. (2016). Preventing black swans: Incident reporting systems as collective knowledge management. Journal of Risk Research, 19(10), 1246–1260.

McDermott, R. (2000). Why information technology inspired but cannot deliver knowledge management. In E. L. Lesser, M. A. Fontaine, & J. A. Slusher (Eds.), Knowledge and communities (pp. 21–36). Woburn, MA: Butterworth-Heinemann.

Miller, H. T., & Fox, C. J. (2001). The epistemic community. Administration and Society, 32(6), 668–685.

Murphy, J. F., & Conner, J. (2012). Beware of the black swan: The limitations of risk analysis for predicting the extreme impact of rare process safety incidents. Process Safety Progress, 31(4), 330–333.

Murphy, J. F., & Conner, J. (2014). Black swans, white swans, and 50 shades of grey: Remembering the lessons learned from catastrophic process safety incidents. Process Safety Progress, 33(2), 110–114.

Nyiri, J. C. (1988). Tradition and practical knowledge. In J. C. Nyiri & B. Smith (Eds.), Practical knowledge: Outlines of a theory of traditions and skills (pp. 17–52). London: Croom Helm.

Paté-Cornell, E. (2012). On "black swans" and "perfect storms": Risk analysis and management when statistics are not enough. Risk Analysis, 32(11), 1823–1833.

Phimister, J. R., Oktem, U., Kleindorfer, P. R., & Kunreuther, H. (2003). Near-miss incident management in the chemical process industry. Risk Analysis, 23, 445–459.

Pidgeon, N., & O'Leary, M. (2000). Man-made disasters: Why technology and organizations (sometimes) fail. Safety Science, 34(1–3), 15–30.

Polanyi, M. (1958). Personal knowledge: Towards a postcritical philosophy. London: Routledge and Kegan Paul.

Polanyi, M. (1966). The tacit dimension. Chicago: University of Chicago Press.

Quinn, J. (1992). Intelligent enterprise: A knowledge and service based paradigm for industry. New York: Free Press.

Reason, J. (1997). Managing the risks of organizational accidents. Aldershot, U.K.: Ashgate.

Reason, J. (2000). Human error: Models and management. BMJ, 320(7237), 768–770.

Roe, E., & Schulman, P. R. (2008). High reliability management: Operating on the edge. Stanford, CA: Stanford University Press.

Russell, P. (1983). The global brain. Los Angeles: Jeremy P. Tarcher.

Schein, E. (1992). Organisational culture and leadership (2d ed.). San Francisco: Jossey-Bass.

Snook, S. A. (2000). Friendly fire: The accidental shootdown of U.S. Black Hawks over Northern Iraq. Princeton, NJ: Princeton University Press.

Spence, C., & Carter, D. (2011). Accounting for the general intellect: Immaterial labour and the social factory. Critical Perspectives on Accounting, 22(3), 304–315.

Starbuck, W. H., & Farjoun, M. (Eds.). (2005). Organization at the limit: Lessons from the Columbia disaster. Oxford: Blackwell.

Størseth, F., & Tinmannsvik, R. K. (2012). The critical re-action: Learning from accidents. Safety Science, 50(10), 1977–1982.

Taleb, N. N. (2007). The black swan: The impact of the highly improbable. New York: Random House.

Turnbull, D. (2000). Masons, tricksters and cartographers: Comparative studies in the sociology of scientific and indigenous knowledges. Amsterdam: Harwood Academic Publishers.

Vaughan, D. (1996). The Challenger launch decision: Risky technology, culture and deviance at NASA. Chicago: University of Chicago Press.

Wahlström, B. (2011). Organisational learning—Reflections from the nuclear industry. Safety Science, 49, 65–74.

Walsh, J., & Ungson, G. (1991). Organizational memory. Academy of Management Review, 16(1), 57–91.

Wegner, D. (1986). Transactive memory: A contemporary analysis of the group mind. In B. Mullen & G. Goethals (Eds.), Theories of group behavior (pp. 185–208). New York: Springer-Verlag.

Weick, K. (1995). Sensemaking in organizations. Thousand Oaks, CA: SAGE.

Weick, K., Sutcliffe, K., & Obstfeld, D. (1999). Organizing for high reliability: Processes of collective mindfulness. In R. I. Sutton & B. M. Staw (Eds.), Research in organizational behavior (pp. 81–123). Stamford, CT: JAI.

Wenger, E. (1998). Communities of practice: Learning, meaning, and identity. Cambridge, U.K.: Cambridge University Press.