Absent Information in Integrative Environmental and Health Risk Communication

Summary and Keywords

Communication is typically understood in terms of what is communicated. However, what is intentionally or unintentionally left out of the communication process matters greatly in many fields, notably in communication about environmental and health risks. The question is not only about the absolute lack of information. The rapidly increasing amount and variability of available data require actors to identify, collect, and interpret relevant information and to screen out irrelevant or misleading messages that may lead to unjustified scares or hopes and other unwanted consequences. The ideal of balanced, integrative, and careful risk communication is only rarely realized in real-life risk communication, which is shaped by competition and interaction between actors emphasizing some risks, downplaying others, and leaving many kinds of information aside, as well as by personal factors such as emotions and values, prompting different types of responses. Consequently, risk communication is strongly influenced by the characteristics of the risks themselves, the kinds of knowledge about them and the related uncertainties, and the psychological and sociocultural factors shaping the cognitive and emotive responses of those engaged in communication. The physical, economic, and cultural contexts also play a large role. The various roles and factors of absent information in integrative environmental and health risk communication are illustrated by two examples. First, health and environmental risks from chemicals represent an intensively studied and widely debated field that involves many types of absent information, ranging from purposeful nondisclosure aimed at protecting public safety or commercial interests to genuinely unknown risks caused by long-term and cumulative effects of multiple chemicals. Second, light pollution represents an emerging environmental and health issue that has gained only limited public attention even though it is associated with a radical global environmental change that is very easy to observe. In both cases, integrative communication essentially involves a multidimensional comparison of risks, including the uncertainties and benefits associated with them, and the options available to reduce or avoid them. Public debate and reflection on the adequacy of risk information, and on the needs and opportunities to gain and apply relevant information, is a key issue of risk management. The notion of absent information underlines that even the most widely debated risk issues may fall into oblivion and re-emerge in an altered form or under different framings. A typology of absent information based on frameworks of risk communication can help one recognize its reasons, implications, and remediation.

Keywords: absent information, chemicals, environmental communication, light pollution, risk communication

Introduction

Relevant information, and therefore implicitly also absent information, is crucial in risk communication. Various types of absent information—from purposeful nondisclosures to accidental omissions—influence risk communication processes. Some level of absent information or not knowing is always involved in risk communication, because the concept of risk denotes, by standard and commonly approved definitions, the possibility of an adverse phenomenon (Oxford English Dictionary) that has not yet happened, and the “effect of uncertainty on objectives” (International Organization for Standardization [ISO], 2009).

Since the early days of the Enlightenment, the fundamental solution to the problems caused by absent information has been to produce more or better information through scientific methods, involving theoretical reasoning and empirical information from observations, notably experiments and tests. The Enlightenment tradition underlies the current debates on evidence-based policymaking (Pawson, Wong, & Owen, 2011). The ideal of evidence-based decision-making and policymaking, however, originates largely in medical care and health communication (Vahabi, 2007). It assumes that health-related and other societal problems can be better solved if decision-makers are provided with reliable and relevant advice based on scientific studies. This approach favors facts over values but also certain types of facts over others. In particular, quantitative and formal assessments and indicators based on measurable variables tend to be emphasized in communication processes, while other types of knowledge may remain unnoticed.

Contemporary societies are strongly shaped by communication and information technologies and rapidly increasing volumes of digital data. Therefore, the question of absent risk information is also a question about the proper ways to sort things out from the information flow (Bowker & Star, 1999). It has been estimated that the amount of data that humankind creates and copies to digital storage doubles every two years (International Data Corporation [IDC], 2014). Millions of scientific papers alone are published each year, and an increasing share of this knowledge is available through open-access publication channels (Björk, Laakso, Welling, & Paetau, 2014).

Ironically, the increasing amount and improving availability of science-based information mean that many of the research papers published today sink into oblivion without any direct and discernible influence on scientific, public, or policy debates. Even science-based reports specifically aimed at the general public or policymakers often fail to gain wider attention. For example, a study showed that 31% of the policy reports published online by the World Bank were never downloaded during a five-year period (Doemeland & Trevino, 2014). Information in such reports may be delivered through other channels such as press releases, news items, or direct interaction between providers and users. Nevertheless, the absence of information in our information age is an issue of increasing importance. All human activities—ranging from practical everyday choices to the long-term management of global risks—involve the need to find relevant information and discard the non-relevant.

Communication processes always involve a potential for surprises. Information may not be used, or it may be used by unexpected actors in an unexpected manner (Lyytimäki, Tapio, Varho, & Söderman, 2013). The scientific studies and results that gain wider publicity may not be those considered most relevant by experts in the field, as shown by several examples of chemical risks causing only marginal health effects at the population level but gaining considerable public attention (Bradbury, 1989; Peters & Slovic, 1996; Ropeik, 2010). For example, relatively minor risks related to certain chemical compounds in cosmetics or environmental contaminants in otherwise healthy food, such as fish, have been brought up as cases of unjustified alarmism (Mazur, 2004; Assmuth, 2011). Media reporting—and, increasingly, social media debates—highlighting individual risks may frighten consumers and make them more responsive to marketing messages promoting alternative products, even if reliable information on the risks related to such alternative products is absent.

Environmental and health risk communication provides an important case of the roles that absent information plays in public debates. These are fields characterized by the need for integrative knowledge that gives a balanced, reliable, and timely overall picture of relevant risks and benefits while avoiding the creation of unjustified public scares, hasty management decisions, or ill-founded policy strategies (Mazur, 2004; Assmuth & Lyytimäki, 2015). Such knowledge creation and communication is essentially about the balance between inclusion and exclusion of knowledge. In practice, the ideal of balanced and integrative risk communication is rare in real-life communication contexts, which are characterized by competition and interaction between various actors emphasizing some risks and their attributes, downplaying others, and leaving many kinds of information unnoticed (Kasperson, Kasperson, Pidgeon, & Slovic, 2003; Gross, 2010). Therefore, understanding the different forms and roles of absent information is vital for both integrative and more narrowly focused risk communication.

In the following, the key characteristics of environmental and health risk communication are introduced and a general framework of absent information is described. Insights from the debates (and non-debates) on the environmental and health risks of chemicals and light pollution are used to illustrate the forms of absent information and the factors that cause information to be intentionally or unintentionally left unnoticed. Finally, key lessons for integrative environmental and health risk communication are summarized.

Key Characteristics of Risk Communication

Risks have many faces. They can be sudden and isolated events or slowly evolving and cumulative processes. They can be caused by humans and technology, other organisms, or nature as a whole, and they can be taken voluntarily or involuntarily. Some risks occur locally, while others are systemic and manifest themselves even on a global level. Risks may hit individuals, populations, or whole communities, and they affect health, safety, wealth, or other valued entities. Risk as loss of opportunity carries with it the notion that risks can be worth taking, that is, that there is a possibility of offsetting benefits. Risks are closely connected with trust, because trust is a basic social mechanism that enables dealing with uncertainty and ignorance.

Risks are formally defined as functions of the probability and consequence of adverse events. In the case of harmful agents such as chemicals or other environmental stressors, risks are defined as functions of dose and response. Their probabilities and consequences can be easy to assess or unpredictable at least in quantitative terms; they may even be hard to assess in retrospect, given multiple and indirect consequences or responses. Risks are further distinguished from hazards, defined by the inherent harmful properties of the agent without consideration of probability. Even more profoundly, risks can be ambiguous, that is, not objectively and crisply definable but subject to varying and shifting perspectives and values, and influenced by cultural factors and mindsets.
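As a compact illustration of these formal definitions, the expressions below restate them in notation. This is an explanatory sketch only; the symbols (R, p, C, d, f, g) are introduced here for illustration and are not taken from the cited sources.

```latex
% Illustrative notation only; the symbols R, p, C, d, f, g are assumptions of
% this sketch rather than notation used in the cited sources.
% Risk as a function of the probability p and the consequence C of an adverse
% event, often operationalized as an expected loss:
\[ R = f(p, C), \qquad \text{commonly } R \approx p \cdot C. \]
% For a harmful agent such as a chemical, risk as a function of the dose d via
% a dose-response relationship g:
\[ R = g(d). \]
% A hazard, in contrast, is characterized by the agent's inherent harmful
% properties alone, without reference to the probability of exposure.
```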

Reflecting this diversity of risks, the diversity of risk communication is great (Cho, Reimer, & McComas, 2015). Studies of risk communication address the presence and absence of information from various perspectives, such as psychological aspects of risk perception and cognitive processing of risk information (Tversky & Kahneman, 1974), social amplification and attenuation of risk (Kasperson, Kasperson, Pidgeon, & Slovic, 2003), or discourses and rhetorical aspects of risk representations in different sociocultural contexts and historical settings (Mazur, 2006). Together they can generate a rich understanding of the different forms and roles of absent information in risk communication.

On a general level, risk communication can be defined as an act of conveying or transmitting information about risks between different parties. These acts can vary greatly, including purposeful and non-purposeful modes of communication. There is also great variation in the contents being communicated, both in the substantive matter and in other, circumstantial concerns. This variation is influenced by the risks themselves, by those engaged in communication, and by the situation, including the channels and broader contexts. Specifically, risk communication in many of its forms addresses both the uncertainty of risk—by assessing the quality of available evidence—and its ambiguity—by assessing the limits of that evidence.

Risk communication includes all of the diverse forms of press, online, broadcast, social media, interpersonal, organizational, and other types of communication that make up the social debate about risks. Core elements of risk communication as a dynamic process involving multiple parties can be seen from these definitions:

  • Risk communication is an interactive process of exchanging information and opinion among individuals, groups, and institutions (National Research Council [NRC], 1989, p. 21; emphasizing interaction and including opinion-building).

  • Risk communication is a process of exchanging information among interested parties about the nature, magnitude, significance, or control of a risk (Covello, 1992, p. 359; emphasizing information exchange).

  • Risk communication is the process of informing people about potential hazards to their person, property, or community (Environmental Protection Agency [EPA], 2007, p.1; emphasizing the provision of information).

It is remarkable that these authoritative definitions proceed from more to less inclusive, suggesting a pervasive variability of viewpoints. Yet, Fischhoff (1995) already summed up two decades of research and practice, characterizing the progress by the following developmental stages of increasing consideration of qualitative aspects, context, interaction, deliberation, participation, and trust-building: (1) “All we have to do is get the numbers right”; (2) “All we have to do is tell them the numbers”; (3) “All we have to do is explain what we mean by the numbers”; (4) “All we have to do is show them that they’ve accepted similar risks in the past”; (5) “All we have to do is show them that it’s a good deal for them”; (6) “All we have to do is treat them nicely”; (7) “All we have to do is make them partners.”

Theories of risk communication have been summarized by Sheppard, Janoske, and Liu (2012), focusing on risks of terrorism but treating them also more generally. In this connection, risk communication was divided into action in the preparedness, response, and recovery phases, a division reflecting the emphasis on emergency-type risks but applicable more broadly. The cross-cutting theories discussed include (a) the Crisis and Emergency Risk Communication (CERC) model; (b) the Situational Theory of Publics; (c) the Heuristic-Systematic Model of message processing; and (d) the Deliberative Process Model, including divergent views. These may be regarded as corresponding to the broadening scope of communication described by Fischhoff (1995).

Reciprocal interaction has been emphasized by the research community as the key success factor of risk communication (Callon, 1999; Aven & Renn, 2010). The International Risk Governance Council (IRGC, 2012) similarly characterized risk communication as a deliberate two-way process of knowledge exchange and negotiation. This differs from the understanding of science-based risk communication as a one-way, linear dissemination of expert knowledge to an ignorant or misguided receiver (see also the emphasis on informing people in EPA, 2007). Risk communication is also increasingly seen as an integral part of risk management and assessment, rather than one end of a continuum running from risk assessment, which produces the knowledge, to risk management, which focuses on implementation. The two approaches to risk communication are summarized in Figure 1.


Figure 1. Risk communication perceived as a one-way information translation process (above) and as a reciprocal learning process (below).

The motivations behind risk communication activities are manifold. They can include regulatory and legal requirements regarding community right-to-know, minimization of reputation damage and economic losses in cases of accident or misconduct, or the need to prevent accidents from happening by building a good safety culture. The International Risk Governance Council (IRGC, 2012) identified two main spheres and functions of risk communication from the perspective of risk management. First, internal risk communication enables risk assessors and risk managers to develop a common understanding of their tasks and responsibilities. Second, external risk communication empowers stakeholders and civil society to understand the risk and the rationale for risk management.

Focusing on health and environmental risk communication, Aven and Renn (2010, p. 160) identified the following major challenges for risk communication:

  • It needs to explain the concept of probability and stochastic effects.

  • It needs to cope with different time scales and provide an understanding of synergistic effects.

  • It should improve the credibility of the agencies and institutions that provide risk information.

  • It should be able to properly address the diversity of stakeholders and intercultural differences.

Both one-way and reciprocal risk communication strategies have weaknesses and strengths. In some cases one-way communication can be adequate, but in other cases broad-based inclusion of stakeholders is essential. Rapid and efficient one-way informing is often a priority in acute cases of crisis communication involving imminent health or environmental risks. However, integrative risk communication inherently calls for comprehensive frameworks capable of supporting multi-actor societal deliberation during risk assessment and management (Assmuth & Hildén, 2008). In particular, long-term interaction involving careful listening to local concerns is needed to build trust and social acceptance. Trust is vital because it enables social stability and makes interaction between the different parties of communication possible. In any case, risk communication processes must be tailored to the characteristics of specific risks as well as to the historical setting and physical context of risk management. The right choice of risk communication strategy depends on what kind of risk information is available or absent.

Forms of Absent Information

Absent information has various forms and roles in integrative environmental and health risk communication. Figure 2 presents a general-level typology of absent information, highlighting six different types of non-recognition, unawareness, and missing information relevant for risk communication. The borderlines between these types overlap and shift because risk issues and their socioeconomic and ecological contexts change over time. Furthermore, the knowledge, values, and attitudes influencing recognition and non-recognition change as well.

The typology builds on the notions of uncertainty, ambiguity, and ignorance. Uncertainty refers to risks whose outcomes are known in kind but whose probabilities are not reliably known, while ambiguity refers to risks that are not crisply definable and are subject to varying interpretations and valuations. Ignorance refers to the absence of knowledge about the risk issue itself (Hildén, 1997; Gross, 2010).

There are different levels and types of knowledge and non-knowledge (Gross, 2010; 2016). Known unknowns refer to active and informed ignorance where actors are aware of their lack of knowledge. This kind of ignorance can also be awareness about what should remain unknown, as the knowledge is considered irrelevant or unimportant, too costly to obtain, or dangerous to be known by a certain actor.

The issues that people do not know they do not know can be described as latent non-knowledge or unknown unknowns (Kerwin, 1993; Gaudet, 2013). This kind of total absence of information poses challenges to risk communication because it can be detected only in retrospect. Corresponding definitions of uncertainty include statistical or aleatory, epistemic, and ontological uncertainty, with the latter corresponding to unknown unknowns (Paté-Cornell, 1996). The focus here, however, is on the less commonly analyzed social construction, deconstruction, or lack of ignorance, which introduces important additional dimensions to these categories of uncertainty and to their roles in risk communication.


Figure 2. Typology of absent information in risk communication. The framework distinguishes between the reason for the absence of information (intentional or inadvertent) and the access to information (some have access vs. nobody has access). The typology is based on earlier categorizations of non-recognition and unawareness (Gross, 2010; Lyytimäki et al., 2011, 2012).
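For readers who find it helpful to see the typology laid out explicitly, the following sketch encodes its two dimensions and the six resulting types as a small data structure. The class names, enum values, and paraphrased examples (drawn from Table 2 below) are illustrative assumptions of this sketch, not definitions from the cited frameworks.

```python
from dataclasses import dataclass
from enum import Enum


class Reason(Enum):
    """Why the information is absent (intentional vs. inadvertent in Figure 2)."""
    DELIBERATE = "deliberate"
    UNINTENTIONAL = "unintentional"


class Mode(Enum):
    """How the information is absent. Nondisclosure and inattention presuppose
    that somebody has access to the information; unawareness means nobody does."""
    NONDISCLOSURE = "nondisclosure"  # the information exists but does not flow
    INATTENTION = "inattention"      # the information exists but is not acquired or noticed
    UNAWARENESS = "unawareness"      # the information does not (yet) exist for anyone


@dataclass(frozen=True)
class AbsentInformation:
    reason: Reason
    mode: Mode
    example: str  # illustrative case paraphrased from Table 2


# The six types discussed in the following sections.
TYPOLOGY = [
    AbsentInformation(Reason.DELIBERATE, Mode.NONDISCLOSURE,
                      "Tobacco industry withholding information on compounds in tobacco smoke"),
    AbsentInformation(Reason.UNINTENTIONAL, Mode.NONDISCLOSURE,
                      "Amateur astronomers unable to convince the public about light pollution"),
    AbsentInformation(Reason.DELIBERATE, Mode.INATTENTION,
                      "Initial discarding of data showing stratospheric ozone depletion"),
    AbsentInformation(Reason.UNINTENTIONAL, Mode.INATTENTION,
                      "Ecological effects of pharmaceuticals unnoticed by medical research"),
    AbsentInformation(Reason.DELIBERATE, Mode.UNAWARENESS,
                      "Effects of non-priority chemicals left uninvestigated"),
    AbsentInformation(Reason.UNINTENTIONAL, Mode.UNAWARENESS,
                      "Unknown unknowns, recognizable only in retrospect"),
]

if __name__ == "__main__":
    for item in TYPOLOGY:
        print(f"{item.reason.value} {item.mode.value}: {item.example}")
```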

Deliberate Nondisclosure

Deliberate nondisclosure occurs when an actor possessing the information purposefully restricts the information flow. A key motivation for the deliberate nondisclosure of risk information is the perceived need to avoid unwanted consequences of delivering the information. An actor may aim to keep the issue out of a specific public or policy arena, to prevent discussion during a certain period, or to prevent discussion altogether. Nondisclosure may be achieved by refusing to communicate the information or by indirect means, such as labeling calls for information as ridiculous or diverting the discussion off track by downplaying the importance of the issue and highlighting other issues or framings. Nondisclosure can even be achieved through apparent and superficial openness, for example, by hiding information within a great amount of other data. Deliberate use of overly technical language to present the information can also be an efficient technique for nondisclosure.

Motivations for deliberate nondisclosure are highly variable. It can involve the purposeful omission of a certain risk considered unpleasant, in order to secure personal or collective interests. Commercial interests also frequently motivate the withholding of information. Deliberate nondisclosure can likewise reflect apparently benevolent attempts to secure the public good or to spare a fellow citizen from stressful information; that is, the motives can be altruistic. For example, access to information on publicly funded research can be deliberately constrained in order to avoid misuse of the data or the creation of confusion or misunderstandings, and information on personal health risks can be discomforting and even distracting for patients and their caretakers. However, even allegedly altruistic attempts to withhold risk information can violate the principles of transparency and right-to-know. Moreover, any attempt to restrict open discussion may increase public suspicion and erode trust toward risk communicators.

Deliberate Inattention

Deliberate inattention occurs when a person or organization chooses not to obtain certain information it knows to exist. The information may be publicly and freely available, or it may be behind paywalls or institutional barriers. The information may be considered useless in the current situation, or acquiring it may be considered too troublesome or costly in relation to the potential benefits. The actor may also lack the specific expertise needed to acquire or interpret the information.

The reason for such conscious rejection may also be that acquiring, preserving, and disseminating the given piece of information is considered the responsibility of some other actor. In contemporary sector-based public administrations and private organizations this is a common situation. In some cases an authority or civil servant is obliged by law to focus on information related to a certain risk issue or only on narrowly defined impacts. Inattention also depends on the risk and the situation in question. Notably, in acute emergency situations attention needs to be restricted to a few salient facts.

Deliberate inattention is a necessity because of the abundant data available in information-intensive societies. It is therefore a growing need for those engaged in risk communication, professionals and lay persons alike. At the same time, it is also a dangerous strategy for risk communication. Because of the increasing specialization of knowledge production, and too few resources devoted to interdisciplinary data mining, advances in areas or disciplines outside the actors’ own core expertise are prone to remain unnoticed (Daughton, 2001; Lyytimäki, Assmuth, & Hildén, 2011). A risk communicator may therefore believe that all relevant information is already possessed, even though in reality the information is erroneous, insufficient, inaccurate, outdated, or misunderstood. Such false awareness or overconfidence is a common phenomenon in all human activities, including research. Researchers can even be particularly blinded by their knowledge and unable (or unwilling) to consider other facts and arguments. It has been demonstrated that experts are often overconfident about the certainty and relevance of their judgments (Tversky & Kahneman, 1974), and the resulting bias may be particularly common and strong in risk communication, notably in areas such as health that have high stakes and requirements for authoritative knowledge.

Deliberate Unawareness

Deliberate unawareness is a situation where an actor knows, or at least suspects, that information on a certain risk issue is inadequate or nonexistent but considers that generating new knowledge to fill the gaps is not necessary or possible, and hence that the state of unawareness is acceptable. The most obvious reason is that the issue is not considered interesting or important, and investing resources in knowledge creation or acquisition is therefore not seen as necessary.

On a personal level, actors tend to avoid the cognitive dissonance resulting from incompatible pieces of information. A common way to deal with information that is inconsistent with one’s personal beliefs is to set it aside or even to dismiss and oppose it by denying its truthfulness or relevance. With environmental and health risks, this is a particularly important type of inattention, as the fear and other emotional stress associated with risk information easily cause denial (Norgaard, 2006; Gross & Bleicher, 2013). A related phenomenon is reticence and numbness in the face of risk information. This can occur both with risks one is exposed to and with risks affecting a multitude of people (or other exposed entities such as nonhuman organisms); the sheer number of those exposed and the resultant magnitude of the risk can cause numbing and, paradoxically, result in inaction (Slovic, 2007).

The lack of perceived opportunities to act on the information may also lead to an unwillingness to acquire the information in the first place (Wynne, 1996). Motivation for investing resources in following a risk debate is prone to decrease, especially if clear and practical advice on avoiding the risks is perceived to be lacking. Such self-perpetuating and systemic inattention can take place on both the personal and collective levels, even as a response of a research community focusing on some risks at the cost of others.

In other cases, creating new information can be considered socially or politically unacceptable or unethical (Gross, 2016). For example, creating and delivering information that could be used to manufacture weapons of mass destruction or to aid terrorist attacks can be criticized as too dangerous even when it can produce other beneficial insights (as in nuclear research). Likewise, creating and openly sharing information on easily transmissible diseases may help create new vaccines, but there is a risk that it may also be used to create new biological weapons. Strong and partially conflicting arguments exist for deliberate unawareness, deliberate nondisclosure, and the open sharing of such information.

Unintentional Nondisclosure

A lack of resources for active communication, or a lack of interest from the media or the target groups of communication, may lead to nondisclosure even when the actor is highly motivated to share the risk information. Unintentional nondisclosure often results from difficulties of communication across disciplinary or organizational borders or societal sectors. Local or lay knowledge can be omitted from expert debates because of a lack of the technical vocabulary required for convincing communication, and expert knowledge can fail to reach lay arenas because of the professional expressions commonly used by experts. The resources needed for disclosure may be linguistic, but they may also be social, cultural, or financial, or a combination of these.

The courage to speak up about controversial or uncertain risk issues is highly relevant for risk communication. Because of the positive connotations widely attached to a certain issue, a spiral of silence preventing people from expressing their concerns may emerge (Noelle-Neumann, 1991). For example, civil servants may fear being stigmatized as biased, and scientists may fear losing trustworthiness. Veritable cultures of silence can thus develop, especially in totalitarian contexts, as shown, for example, by the nondisclosure of (and inattention to or disbelief in) information on the Holocaust both during and even after it. However, explicit or implicit norms and practices that impede disclosure easily develop also in more open organizational settings, because members are sensitive to undesirable disclosure. Such unintentional (self-)censorship is to a considerable extent unconscious, which makes it more difficult to identify and deal with.

Focusing on the personal level, Geiger and Swim (2016) use the term “pluralistic ignorance” to describe a situation where people decline to share information because they wrongly assume that their peers are more doubtful about the information than they actually are. The fear of losing competence and prestige among their peers leads people to remain silent even about risk issues important to themselves. A further and more lasting effect may be that people, as well as organizations, not only refuse to share information but are even reluctant to acquire it in the first place.

Unintentional Inattention

Unintentional inattention refers to a situation where relevant information exists but remains unnoticed by the actors who could potentially benefit from it, as well as by others such as patients, clients, and even whole communities. Relevant information may go unnoticed because of culturally shaped blind spots, disciplinary boundaries, organizational or personal routines, or institutionalized practices of information retrieval. Unintentional inattention often results from heuristics and routines that guide the acquisition and communication of information without being actively reflected on by the actor following them. Importantly, the so-called echo chambers of social media can create filters that strongly influence the information received by an individual.

Unintentional inattention may also result from forgetfulness. Existing information may be erased from short-term or long-term, individual or collective, and informal or institutional memory (Lyytimäki, Assmuth, & Hildén, 2011). The key function of risk communication in such cases is to remind people about the missing information, or at least about the possibility of missing information even when it has not yet been identified (or about the certainty that some information is being missed). In such cases the actor may try to restore the information by asking other actors for advice, by seeking information from relevant archives or other sources, or by reproducing it through practical experiments or theoretical reasoning.

Unintentional Unawareness

Unintentional unawareness refers to unknown unknowns that can be recognized only in retrospect. Gross (2010) uses the term “nescience” to describe such a complete lack of any knowledge. It can also involve a total lack of knowledge on how such knowledge could be created or obtained.

Unintentional unawareness arises from inherent limitations in human conceptual frameworks and worldviews combined with limitations of the methods of knowledge generation. Issues characterized by unintentional unawareness are largely outside the scope of risk communication simply because there is nothing to communicate about. However, some elements of unintentional unawareness are always present in risk communication, due to the inherent limits of knowledge and human capacity. Moreover, the retrospective recognition of unintentional unawareness can lead to more active forms of unawareness and knowledge, providing important lessons for risk communication. In other words, this form of unawareness overlaps and interacts with the others both in the temporal dimension and in substantive terms.

Applications in Concrete Cases: Chemical Risks and Light Pollution

Two examples illustrate the communication challenges related to the different forms of absent information (Table 1). First, health and environmental risks from chemicals represent an extensively studied field that has prompted intensive media debates and occasionally high levels of public and policy concern. It involves many types of absent information, ranging from secrets purposefully kept in order to guarantee public safety or commercial interests to unknown risks caused by the long-term effects of multiple chemical compounds. The corporate communication of large chemical firms has been accused of purposeful nondisclosure of risk information, especially since the early 1960s and the seminal book Silent Spring by Rachel Carson. The author and other environmentalists since her have, in turn, been accused of exaggerating the risks and paying little attention to the benefits (Kroll, 2001). On the other hand, chemical enterprises have been considered to lack not just the willingness and incentive to disclose information, due, for example, to business secrecy and consumer scrutiny, but also the resources for nondisclosure. More recently, this has been observed specifically under the REACH legislation of the European Union (EU), which shifted more of the responsibility for detailed information to industry and other enterprises (Assmuth, Hildén, & Benighaus, 2010).

Table 1. Key characteristics of the selected risk communication issues.

  • Type of stressor. Chemicals: multiple types of chemicals causing different effects. Light pollution: a single stressor (artificial light) causing different effects.

  • Level of scientific knowledge on risks to humans and ecosystems. Chemicals: high for certain individual priority chemicals, low on cumulative effects. Light pollution: generally low, with some evidence of adverse health and ecological effects.

  • Volume of media coverage. Chemicals: variable, occasionally high regarding individual chemical risks. Light pollution: low.

  • Level of public awareness. Chemicals: generally low, high among special interest groups. Light pollution: generally very low, high among amateur astronomers regarding effects on the night sky.

  • Key management strategies. Chemicals: several statutory and voluntary frameworks already implemented, mainly on the national and international levels. Light pollution: mainly voluntary frameworks, some municipal ordinances and national-level laws.

Second, light pollution represents an emerging environmental and health issue that has so far gained only limited public attention even though it is a profound and global environmental change that is easy to observe even without any technical devices. It is also a risk issue characterized by a relatively low level of scientific research and high uncertainty regarding potential health and ecological impacts.

Examples of different forms of absent information in communication related to chemical risks and light pollution are presented in Table 2, applying the generalized typology introduced in Figure 2. Chemical risks are caused by different kinds of substances and mixtures of substances, some of them reactive and degradable, others persistent and even bio-accumulating. Some cause acute effects and others have long-term impacts, remaining in the environment permanently or being phased out slowly (Assmuth, 2011). Environmental chemicals are natural, man-made, or man-modified (semi-natural). Man-made chemicals can be released into the environment by sudden events or long-term processes, they involve various routes and durations of exposure, and they cause various kinds of direct and indirect effects in humans, other organisms, and whole communities and ecosystems. These effects range from local to global. Risk assessment, management, and communication related to chemical risks typically deal with abstract information, because many of the effects are outside the limits of human sensory capabilities. Such risks can be experienced directly in immediate poisoning cases or accidents, while the recognition of low-level chronic exposure typically requires sophisticated technical measurements or tests (Assmuth, Hildén, & Craye, 2010).

Table 2. Examples of different types of absent information related to chemical risks and light pollution (based on Mazur, 2004; Lyytimäki, Assmuth, & Hildén, 2011; Lyytimäki, Assmuth, & Tapio, 2012; Harremoës et al., 2013).

  • Deliberate nondisclosure. Chemicals: information on the health effects of chemical compounds in tobacco smoke withheld by the tobacco industry. Light pollution: framing of artificial lighting as solely beneficial.

  • Unintentional nondisclosure. Chemicals: difficulties of oversensitive people in communicating the importance of unscented environments. Light pollution: inability of amateur astronomers to convince the general public about the harmful impacts of light pollution.

  • Deliberate inattention. Chemicals: initial discarding of the data showing depletion of the ozone layer caused by the accumulation of chlorofluorocarbons (CFCs) in the upper atmosphere. Light pollution: ecological and health effects of light pollution not recognized by lighting engineers.

  • Unintentional inattention. Chemicals: ecological effects of pharmaceuticals in wastewater not recognized by medical research. Light pollution: inattention to information showing that reducing nighttime street lighting does not increase crime or accidents.

  • Deliberate unawareness. Chemicals: environmental effects of widely used chemicals not currently considered priority substances. Light pollution: effects of light pollution in general considered not relevant.

  • Unintentional unawareness. Known only in retrospect.

Risks associated with dramatic chemical accidents have strongly influenced public risk framings. In some cases, they have become symbols of chemical risks in general. For example, the EU Directive (2012/18/EU) on the prevention of, preparedness for, and response to industrial accidents is known as “the Seveso Directive,” highlighting the legacy of a single industrial accident that occurred in Seveso, Italy, in 1976. More generally, the attention given to industrial and other chemical accidents has focused risk assessment, management, and communication on clearly defined events and the acute effects of single stressors (Lyytimäki & Assmuth, 2015). The debates related to certain persistent organic compounds (e.g., DDT, PCBs, CFCs, dioxins, and furans) and heavy metals, while stressing the need for a precautionary approach, have likewise typically focused on single risk agents (Harremoës et al., 2013). The long-term cumulative effects of multiple chemicals have often received little attention. Such cumulative chemical risks caused by multiple stressors are particularly challenging for risk communication, partly because of a lack of frameworks and actors that would collate the relevant pieces of disparate information (Assmuth & Hildén, 2008; Lyytimäki, Assmuth, & Hildén, 2009).

Light pollution can be defined as nighttime outdoor artificial light that alters the natural cycles of light and dark or is perceived as harmful by humans (Rich & Longcore, 2006). Light pollution provides an especially interesting case for risk communication because it is an emerging environmental issue challenging deeply rooted positive sociocultural connotations and valuations of artificial lighting. The illumination of outdoor spaces has been commonly perceived as a symbol of progress, well-being, and even safety. Light pollution therefore illustrates, in complex ways, the epistemic and sensory as well as sociocultural aspects of non-knowledge. Some forms of light pollution, such as sky-glow brightening the night sky, light clutter formed by several light sources, and glare caused by too powerful or poorly directed light sources, are easy to sense without any technical instruments. Other ecologically relevant forms of light pollution escape human sensory abilities because the human eye is incapable of sensing low intensities of light, polarization of light or ultraviolet light.

Only limited scientific and practical attention has been paid to light pollution, compared to many other forms of environmental pollution (Lyytimäki, Assmuth, & Tapio, 2012). Therefore, it is likely that our knowledge covers only the most obvious ecological and health effects and risks related to light pollution, and that unintentional inattention is a core challenge for risk communication. Information is lacking on many ecological and health impacts of light pollution, even though existing studies indicate that various kinds of effects are possible, with some of them potentially of broad and great importance for many organisms and communities (Gaston, Visser, & Hölker, 2015). In particular, long-term effects of low intensity light and the interplay between light and other environmental stressors are still poorly known. The existing knowledge thus already indicates that the level of unintentional unawareness related to the ecological and health effects of light pollution is high.

The effects of light pollution on celestial observations are well known. Therefore, communication related to the potential negative aspects of light has so far focused largely on astronomy. Amateur and professional astronomers have a strong motivation to communicate about the issue, given that the possibilities for high-quality celestial observations decrease with increasing levels of artificial light in the sky. However, such a risk is easily perceived as minor by other target groups of communication, especially when compared with the perceived benefits of artificial illumination (Lyytimäki, 2013). The issue may be of high concern for stargazers, but deliberate inattention is the likely reaction of other groups, also because they fail to recognize the broader importance of astronomical effects as an indicator or sentinel of effects on other human groups and on ecosystems.

Deliberate inattention can be, and regularly is, maintained whether or not it is well justified, sometimes as an evasive strategy and sometimes as a more unconscious reaction. Risk communicators thus often face the difficult task of disproving at least some predispositions in order to overcome deliberate inattention. People surrounded by constant artificial nighttime lighting may never have personally experienced natural darkness, leading to a shifting baseline of the nocturnal environment (Lyytimäki, 2013). The concept of shifting baseline syndrome usually refers to changing human perceptions of the condition of a certain system due to a loss of experience of past conditions (Kahn & Friedman, 1995; Pauly, 1995). A similar syndrome can, however, arise as people forget, get used to, or are numbed by changes they have experienced, even when the system has changed considerably compared to earlier states. In this case, intensively illuminated environments may be considered natural, while natural darkness can be experienced as unnatural and unpleasant, even dangerous.

Implications for Risk Communication Strategies

Recognizing the Many Faces of Unawareness

Awareness of the different forms of absent information is highly important for communicators operating in fields such as environmental and health risk assessment and management, as well as for others engaged in such activities. It should be stressed that, in addition to professional communicators, most if not all actors and groups, including researchers and other experts, policymakers and practitioners, other employees in enterprises, and civil society actors, are subject to and often actively engage in communication. Therefore, they are likely to benefit from a better understanding of unawareness and its long-term implications for power relations and the position of different actors and stakeholders.

Absent information can play many roles in risk communication, depending on the characteristics of the risks themselves, the level of knowledge, the actors involved, and the context of communication (or non-communication) (Gross, 2010; Lyytimäki, Assmuth, & Hildén, 2011). Absent information can refer both to various characteristics of the information itself and to the processes of communication in their multiple forms, directions, stages, channels, and settings. It can refer to a reluctance to acknowledge certain kinds of information as valid, and this reluctance can be conscious and deliberate or unconscious and non-deliberate. The case of deliberate inattention in particular raises the difficult issue of how to address potential overconfidence in the adequacy of current knowledge.

To be successful, risk communication needs to be able to identify different forms and causes of absent information and find relevant strategies to tackle them under different communication and interaction contexts. A key difficulty for risk communication is that often many forms of absence of information are present simultaneously and interact with each other in dynamic processes.

Comparisons of Risks and Benefits

Comprehensive and reliable risk assessment is often seen as the fundamental basis for successful risk communication, especially in the context of one-way risk communication from experts to those exposed to risks (Fischhoff, 1995; Lyytimäki & Assmuth, 2015). The need to assess risks appropriately before communicating the information to larger audiences can be used as a justification for provisional deliberate nondisclosure. In cases of negligible or nonexistent risks the nondisclosure may be permanent. For example, health authorities may reason that disseminating information suggesting health risks related to light pollution or certain chemical compounds might cause unnecessary worry or even panic among the public (Mazur, 2004). Deliberate nondisclosure requires, however, that risk communicators have meaningful methods to compare risks with other risks and with the benefits associated with them. Such assessments are often difficult due to the multidimensionality of risk issues. In particular, the various indirect adverse or beneficial consequences of alternative actions are often incommensurate, which precludes reducing comparisons to simple numerical estimates of risks, costs, or benefits (Assmuth & Hildén, 2008; Briggs, 2008). This calls for integrative risk communication in which the lack of knowledge and awareness of different aspects of the risks is articulated and addressed.

In the long term, reassessments may challenge the results of original assessments, even in cases where risks had been considered negligible with a high degree of confidence. This increases the communication challenges. Thus, the best long-term option for risk communication is usually to communicate not only the results and details of risk assessments but also the limitations, framings, and purposes of the assessments and of risk comparisons more generally (Gross & Bleicher, 2013). This can be a natural component of the follow-up of risk assessment and of the updated advice that is often needed and also routinely implemented, for example, in healthcare contexts.

Changing Patterns of Media Use

Delivering just the right amount and type of information and specifically the right level of detail remains a fundamental challenge for risk communication. When risks are reassessed and introduced—sometimes under different framings—into news and social media debates, the public easily gets confused. In the case of chemical risks, experts often feel that the media exaggerates risks and presents the issues in an excessively vague and inherently sensationalistic manner, lacking the necessary details as well as the larger context (Lyytimäki, Assmuth, & Hildén, 2009). However, calls for more detailed information are challenged by the “quantity of coverage” theory suggesting that increasing the amount of news coverage is likely to increase uncertainty regardless of the level of detail in the environmental news stories (Mazur, 2006). The “quantity of coverage” theory was developed based on empirical results from newspaper coverage of global environmental issues.

More personalized media use and increasing use of various social media applications are changing the dynamics and actor roles of risk communication (Wendling, Radisch, & Jacobzone, 2013). However, the core question is not only about technical barriers and possibilities of information delivery but also about fundamental ethical and moral considerations related, for example, to the use and ownership of the personal health data that are collected from digital devices, applications, and platforms (Lupton, 2014). Justifications for deliberate nondisclosure of information may be seen very differently by different actors and should therefore be carefully and critically evaluated (Choma, Hanoch, Gummerum, & Hodson, 2013). This calls for risk communication strategies capable of involving different types of stakeholders and interaction processes for framing and negotiation of the salient issues and thus for co-creation of common understanding about them.

New knowledge unavoidably produces new uncertainties and new areas of absent information (Gross, 2010). The question is also about limited human cognitive capabilities. Individuals and organizations must be very selective when adopting new information. This is a great challenge, as a plethora of online information on risks, much of it unsubstantiated, is available. This information may improve risk awareness, but it may also cause confusion and, worse, create new risks (for example, through the avoidance of vaccination due to information highlighting potential personal-level health risks). Analysis of absent information can help judge whether all relevant information is likely to be possessed and communicated, or whether the current information is, in fact, irrelevant, erroneous, incomplete, or outdated. A better understanding of the different forms and implications of absent information is increasingly important in societies characterized by abundant risk information.

Further reading

Beck, U., & Wehling, P. (2012). The politics of non-knowing: An emerging area of social and political conflict in reflexive modernity. In F. D. Rubio & P. Baert (Eds.), The politics of knowing (pp. 33–57). London: Routledge.

Cox, R., & Pezzullo, P. C. (2016). Environmental communication and the public sphere (4th ed.). Los Angeles: SAGE.

Fischhoff, B. (2012). Risk analysis and human behavior. London: Earthscan.

Gross, M. (2007). The unknown in process: Dynamic connections of ignorance, non-knowledge and related concepts. Current Sociology, 55(5), 742–759.

Gross, M., & McGoey, L. (2015). Routledge international handbook of ignorance studies. London: Routledge.

International Risk Governance Council. (2006). White paper on risk governance: Towards an integrative approach. Geneva, Switzerland: International Risk Governance Council.

International Risk Governance Council. (2009). Risk governance deficits: An analysis and illustration of the most common deficits in risk governance. Geneva, Switzerland: International Risk Governance Council.

Lundgren, R. E., & McMakin, A. (2013). Risk communication: A handbook for communicating environmental, safety, and health risks (5th ed.). Hoboken, NJ: Wiley.

Renn, O. (2008). Risk governance: Coping with uncertainty in a complex world. London: Earthscan.

References

Assmuth, T. (2011). Policy and science implications of the framing and qualities of uncertainty in risks: Toxic and beneficial fish from the Baltic Sea. AMBIO, 40, 158–169.

Assmuth, T., & Hildén, M. (2008). The significance of information frameworks in integrated risk assessment and management. Environmental Science and Policy, 11, 71–86.

Assmuth, T., Hildén, M., & Benighaus, C. (2010). Integrated risk assessment and risk governance as socio-political phenomena: A synthetic view of the challenges. Science of the Total Environment, 408, 3943–3953.

Assmuth, T., Hildén, M., & Craye, M. (2010). REACH and beyond: Roadblocks and shortcuts en route to integrated risk assessment and management of chemicals. Science of the Total Environment, 408, 3954–3963.

Assmuth, T., & Lyytimäki, J. (2015). Co-constructing inclusive knowledge within converging fields: Environmental governance and health care. Environmental Science and Policy, 51, 338–350.

Aven, T., & Renn, O. (2010). Risk management and governance. Heidelberg, Germany: Springer.

Björk, B.-C., Laakso, M., Welling, P., & Paetau, P. (2014). Anatomy of Green Open Access. Journal of the Association for Information Science and Technology, 65(2), 237–250.

Bowker, G., & Star, S. L. (1999). Sorting things out: Classification and its consequences. Cambridge, MA: MIT.

Bradbury, J. A. (1989). The policy implications of differing concepts of risk. Science, Technology and Human Values, 14, 380–399.

Briggs, D. J. (2008). A framework for integrated environmental health impact assessment of systemic risks. Environmental Health, 7, 61.

Callon, M. (1999). The role of lay people in the production and dissemination of scientific knowledge. Science, Technology & Society, 4, 81–94.

Cho, H., Reimer, T., & McComas, K. (Eds.). (2015). The SAGE handbook of risk communication. Thousand Oaks, CA: SAGE.

Choma, B., Hanoch, Y., Gummerum, M., & Hodson, G. (2013). Relations between risk perceptions and socio-political ideology are domain- and ideology-dependent. Personality and Individual Differences, 54(1), 29–34.

Covello, V. T. (1992). Risk communication: An emerging area of health communication research. In S. A. Deetz (Ed.), Communication yearbook 15 (pp. 359–373). Newbury Park, CA: SAGE.

Daughton, C. G. (2001). Literature forensics? Door to what was known but now forgotten. Environmental Forensics, 2, 277–282.

Doemeland, D., & Trevino, J. (2014). Which World Bank reports are widely read? Policy Research Working Paper WPS 6851. Washington, DC: World Bank Group.

Environmental Protection Agency. (2007). Risk communication in action: The risk communication workbook. Cincinnati, OH: Environmental Protection Agency.

Fischhoff, B. (1995). Risk perception and communication unplugged: Twenty years of process. Risk Analysis, 15(2), 137–145.

Gaston, K. J., Visser, M. E., & Hölker, F. (2015). The biological impacts of artificial light at night: The research challenge. Philosophical Transactions of the Royal Society B, 370, 20140133.

Gaudet, J. (2013). It takes two to tango: Knowledge mobilization and ignorance mobilization in science research and innovation. Prometheus, 31(3), 169–187.

Geiger, N., & Swim, J. K. (2016). Climate of silence: Pluralistic ignorance as a barrier to climate change discussion. Journal of Environmental Psychology, 47, 79–90.

Gross, M. (2010). Ignorance and surprise: Science, society, and ecological design. Cambridge, MA: MIT.

Gross, M. (2016). Risk as zombie category: Ulrich Beck’s unfinished project of the “non-knowledge” society. Security Dialogue, 47(5), 386–402.

Gross, M., & Bleicher, A. (2013). It’s always dark in front of the pickaxe: Organizing ignorance in the long term remediation of contaminated land. Time & Society, 22(3), 316–334.

Harremoës, P., Gee, D., MacGarvin, M., Stirling, A., Keys, J., Wynne, B., & Guedes Vaz, S. (2013). The precautionary principle in the 20th century: Late lessons from early warnings. London: Earthscan.

Hildén, M. (1997). Risk, uncertainty, indeterminacy and ignorance in fisheries management—an analysis of management advice. Monographs of Boreal Environmental Research 5. Helsinki, Finland: Finnish Environment Institute.

International Data Corporation. (2014, April). The digital universe of opportunities: Rich data and the increasing value of the Internet of Things. EMC Digital Universe with Research & Analysis by IDC.

International Organization for Standardization. (2009). ISO Guide 73:2009—Risk management—Vocabulary. Geneva, Switzerland: International Organization for Standardization.

International Risk Governance Council. (2012). An introduction to the IRGC Risk Governance Framework. Geneva, Switzerland: International Risk Governance Council.

Kahn, P. H., & Friedman, B. (1995). Environmental views and values of children in an inner-city black community. Child Development, 66, 1403–1417.

Kasperson, J., Kasperson, R., Pidgeon, N., & Slovic, P. (2003). The social amplification of risk: Assessing fifteen years of research and theory. In N. Pidgeon, R. Kasperson, & P. Slovic (Eds.), The social amplification of risk (pp. 13–46). Cambridge, U.K.: Cambridge University Press.

Kerwin, A. (1993). None too solid: Medical ignorance. Knowledge: Creation, Diffusion, Utilization, 15(2), 166–185.

Kroll, G. (2001). The “Silent Springs” of Rachel Carson: Mass media and the origins of modern environmentalism. Public Understanding of Science, 10(4), 403–420.

Lupton, D. (2014). Critical perspectives on digital health technologies. Sociology Compass, 8(12), 1344–1359.

Lyytimäki, J. (2013). Nature’s nocturnal services: Light pollution as a non-recognised challenge for ecosystem services research and management. Ecosystem Services, 3, e44–e48.

Lyytimäki, J., & Assmuth, T. (2015). Down with the flow: Public debates shaping the risk framing of artificial groundwater recharge. GeoJournal, 80(1), 113–127.

Lyytimäki, J., Assmuth, T., & Hildén, M. (2009). Communicating chemical risks for social learning: Findings from an expert opinion survey. Applied Environmental Education and Communication, 8(3–4), 174–183.

Lyytimäki, J., Assmuth, T., & Hildén, M. (2011). Unrecognized, concealed or forgotten—The case of absent information in risk communication. Journal of Risk Research, 14(6), 757–773.

Lyytimäki, J., Assmuth, T., & Tapio, P. (2012). Unawareness in environmental protection: The case of light pollution from traffic. Land Use Policy, 29(3), 598–604.

Lyytimäki, J., Tapio, P., Varho, V., & Söderman, T. (2013). The use, non-use and misuse of indicators in sustainability assessment and communication. International Journal of Sustainable Development and World Ecology, 20(5), 385–393.

Mazur, A. (2004). True warnings and false alarms: Evaluating fears about technology, 1948–1971. Washington, DC: Resources for the Future.

Mazur, A. (2006). Risk perception and news coverage across nations. Risk Management, 8, 149–174.

National Research Council. (1989). Improving risk communication. Washington, DC: National Academy Press.

Noelle-Neumann, E. (1991). The theory of public opinion: The concept of the spiral of silence. In J. A. Andersson (Ed.), Communication yearbook 14 (pp. 256–287). Newbury Park, CA: SAGE.

Norgaard, K. M. (2006). “We don’t really want to know”: Environmental justice and socially organized denial of global warming in Norway. Organization and Environment, 19(3), 347–370.

Paté-Cornell, M. E. (1996). Uncertainties in risk analysis: Six levels of treatment. Reliability Engineering & System Safety, 54, 95–111.

Pauly, D. (1995). Anecdotes and the shifting baseline syndrome of fisheries. Trends in Ecology and Evolution, 10, 430.

Pawson, R., Wong, G., & Owen, L. (2011). Known knowns, known unknowns, unknown unknowns: The predicament of evidence-based policy. American Journal of Evaluation, 32(4), 518–546.

Peters, E., & Slovic, P. (1996). The role of affect and worldviews as orienting dispositions in the perception and acceptance of nuclear power. Journal of Applied Social Psychology, 26(16), 1427–1453.

Rich, C., & Longcore, T. (2006). Ecological consequences of artificial night lighting. Washington, DC: Island Press.

Ropeik, D. (2010). How risky is it, really? Why our fears don’t always match the facts. New York: McGraw-Hill.

Sheppard, B., Janoske, M., & Liu, B. (2012). Understanding risk communication theory: A guide for emergency managers and communicators. Report to Human Factors/Behavioral Sciences Division, Science and Technology Directorate, U.S. Department of Homeland Security. College Park, MD: National Consortium for the Study of Terrorism and Responses to Terrorism (START).

Slovic, P. (2007). “If I look at the mass I will never act”: Psychic numbing and genocide. Judgment and Decision Making, 2(2), 79–95.

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 1124–1131.

Vahabi, M. (2007). The impact of health communication on health-related decision making: A review of evidence. Health Education, 107, 27–41.

Wendling, C., Radisch, J., & Jacobzone, S. (2013). The use of social media in risk and crisis communication. OECD Working Papers on Public Governance 24. Paris: OECD Publishing.

Wynne, B. (1996). May the sheep safely graze? A reflexive view of the expert-lay knowledge divide. In S. Lash, B. Szerszyski, & B. Wynne (Eds.), Risk, environment and modernity: Towards a new ecology (pp. 44–83). London: SAGE.