Lay Risk Management
Summary and Keywords
How do individuals relate to risk in everyday life? Poorly, judging by the very influential works within psychology that focus upon the heuristics and biases inherent to lay responses to risk and uncertainty. The point of departure for such research is that risks are calculable, and, as lay responses often under- or overestimate statistical probabilities, they are more or less irrational. This approach has been criticized for failing to appreciate that risks are managed in relation to a multitude of other values and needs, which are often difficult to calculate instrumentally. Thus, real-life risk management is far too complex to allow simple categorizations of rational or irrational.
A developing strand of research within sociology and other disciplines concerned with sociocultural aspects transcends the rational/irrational dichotomy when theorizing risk management in everyday life. The realization that factors such as emotion, trust, scientific knowledge, and intuition are functional and inseparable parts of lay risk management has been conceptualized in different ways: as, for example, bricolage, in-between strategies, and emotion-risk assemblage. The common task of this strand is to account for the complexity and social embeddedness of lay risk management, often by probing deep into the lifeworld using qualitative methods. Lay risk management is structured by the need to “get on” with life, while at the same time being surrounded by sometimes challenging risk messages.
This perspective on risk and everyday life thus holds potentially important lessons for risk communicators. For risk communication to be effective, it needs to understand the complexity of lay risk management and the interpretative resources that are available to people in their lifeworld. It needs to connect to and be made compatible with those resources, and it needs to leave room for agency so that people can get on with their lives while at the same time incorporating the risk message. It also becomes important to understand and acknowledge the meaning people attribute to various practices and how this is related to self-identity. When this is not the case, risk messages will likely be ignored or substantially modified. In essence, communicating risk requires groundwork to figure out how and why people relate to the risks in question in their specific context.
Risk and the Limits of Rationality in Everyday Life
Risk communication typically rests on the assumption that people are rational consequentialists to some degree, that they decide how to act in the face of risk based on a structured assessment of alternatives and possible outcomes. Indeed, if this were not at all true, there would be little point in communicating risks. Correspondingly, risk communication would be a fairly uncomplicated task if the intended audience were made up of individuals who paid close attention to risk information, understood and evaluated it correctly, and then acted upon it to minimize the risk. This is not the case, as psychological research into risk perception has demonstrated over and over (e.g., Fischhoff, Slovic, & Lichtenstein, 1982; Sjöberg, 2002; Slovic, 1987; Tversky & Kahneman, 1975). Lay people make errors in their risk-related decisions, in the sense that their perceptions and actions do not always mirror the expert knowledge available, and they under- or overestimate probabilities. For instance, lay people tend to neglect factors such as dose and exposure when thinking about chemical risks. Contact with toxic or carcinogenic substances is often considered harmful, regardless of the extent of such contact (Kraus, Malmfors, & Slovic, 1992). Apart from the risk of making bad decisions, such errors can make it difficult to get on with life, as they generate unnecessary stress and anxiety. Errors may also have wider societal implications. When environmentally conscious individuals try to take everyday action in the face of climate change, they sometimes have problems distinguishing between effective and ineffective strategies—avoiding spray cans while neglecting energy conservation, for example. This has been related to their “flawed mental models,” which make it difficult to find the most effective strategies (Bostrom, Morgan, Fischhoff, & Read, 1994, p. 969).
The fact that people make errors in the face of risk has prompted research into the more general mental models or strategies (shortcuts) that people apply when processing information and the biases (errors) these result in. Such strategies are referred to as cognitive heuristics. One example is the availability heuristic, which makes familiar phenomena seem more likely than less familiar ones. As a consequence, people who have been in a certain kind of accident will perceive such accidents as more likely, compared to how other people perceive them (cf. Olofsson, 2009, p. 45; Renn, 1998, p. 58). Another example is the affect heuristic, which makes us underestimate risks related to phenomena that evoke positive emotions, and vice versa. A related bias, probability neglect, occurs when the consequences of a risk—if realized—appear so frightening that any considerations of probability are blocked out. Many people avoided air travel and instead got into their cars in the weeks following the September 11 attacks, despite the microscopic risk of falling victim to a terror attack and the much higher risk of a fatal car accident (see Slovic & Peters, 2006; Sunstein, 2003).
Most of the “errors” people make can be related to emotions in a broad sense. Risks evoke emotional responses in us, and this sometimes leads us to make poor decisions in the face of risk. But there is also support for the positive role of emotions in lay risk management (Loewenstein, Weber, Hsee, & Welch, 2001; Zinn, 2008). Emotions can help us navigate and act more quickly than the slow cognitive process of structured assessment of alternatives and outcomes allows. The main problem with the division into cognitive and emotional—or rational and irrational—responses to risk is the presumption that the emotional components can be isolated in lay responses to risk, and hence, that some ideal type of rationality can be achieved. A developing strand of risk research transcends this rational/irrational dichotomy and contends that emotions should be seen as functional and inseparable parts of lay risk management. In this article, as in much risk research and in general discussion, the concepts of affect, feelings, and emotions are used interchangeably. They all denote individual and/or social phenomenological sensations. It is sometimes productive to make distinctions between them (e.g., emotions can be seen as social displays of feelings, while affect can be seen as almost non-conscious intensities), but with Lupton (2013), I refer to this constellation as “emotion,” for the sake of brevity.
The purpose of this article is to discuss some insights on how risks are experienced and dealt with in everyday life, insights that might be important to consider in risk communication. It draws mainly on qualitative research into the lifeworld—the world in which people subjectively experience themselves to be living (cf. Berger & Luckmann, 1967; Brante, Andersen, & Korsnes, 1998). There is limited engagement with the literature on practical risk message design or risk communication (handbooks, etc.) in the following, as the focus is on sociocultural insights and what we can learn from them about risk and everyday life.
To conceptualize what people do with risk information, this article employs the term strategy in a broad sense, as including everything individuals do in the face of risk that constitutes a pattern in a stream of decisions (Mintzberg, 1978). This use does not presuppose a strategic-rational actor; strategies may be more or less consciously applied. In fact, such processes in the lifeworld quickly blur any categorization into rational and irrational decision making, simply because they involve far more aspects than those directly linked to the risk in question. Instrumental rationality is rarely available in the many decisions of everyday life. Individuals do not have time or resources (nor the inclination) to engage in formal methods of risk analysis and management. One equally important point is that the lived experience of risk in everyday life has more to do with experiencing uncertainty and coping with worry than it has to do with assessing probabilities. Rather than cognitive rationality, resources like confidence, intuition, and trust are what actually enable decision making in everyday life (Alaszewski & Coxon, 2009; Zinn, 2008).
Horlick-Jones (2005) and Horlick-Jones, Walls, and Kitzinger (2007) describe the practical risk reasoning that lay people engage in as a bricolage-like activity that makes use of the interpretive resources at hand. These include technical risk knowledge, but also the emotional components mentioned above and considerations of accountability and of what is morally acceptable (including value commitments). People often listen to and actively seek out information about risks, but they may also make a conscious decision not to. This can only be understood if we consider how lay risk management is embedded in the lifeworld; risks are experienced and dealt with alongside all other aspects of everyday life, including other risks. Hence, social context influences lay risk management alongside individual considerations of both physical and emotional well-being, judgments of whether or not a source can be trusted, and so on (cf. Alaszewski, 2005).
Several dramatic examples of what this could mean for risk communicators are provided by the modern public health policies of the Western world. Often aimed at influencing lifestyles by communicating risk knowledge to risk groups, they assume that people are rational in the sense that they will act to maximize their health. Health decisions, however, are influenced by a variety of other factors. Information campaigns related to the HIV/AIDS epidemic, for instance, illustrate that even when medical knowledge and treatment are available and trusted, people may refrain from seeking help due to the risk of social stigma (Alaszewski & Horlick-Jones, 2003; Wallman, 2000). This may seem irrational given the severity of HIV infection, but only if we decide that physical health is more important than social inclusion. This may not always be the case, and hence it is important to understand how people make such judgments. In any case, fighting the social stigma related to HIV/AIDS quickly becomes a prerequisite for effective risk communication. Risk communicators need to understand how people interpret and value risk information in relation to other aspects, such as values and morality, within their community.
The following two sections discuss some important insights on the subject. First, we consider the social embeddedness of lay risk management and the individual need to get on with life. Then we move on to the role of emotions as interpretative resources in this process. This is followed by a concluding discussion of the literature that also summarizes the possible implications for risk communication.
Social Embeddedness and the Need to Get on With Life
What people perceive as risky, and how they act based upon their perceptions, varies across time, space, and context. As common values lead to common fears, there is a particular “risk portfolio” tied to each form of social life, usually mirrored in how societies are organized (cf. Douglas & Wildavsky, 1982). Within a society, different strata of the population relate differently to risk. Ulrich Beck (1992) suggested that the grand risks of late modernity (e.g., climate change) will eventually erode class structures, as they affect everyone. Thus far, however, available evidence suggests that both risk perceptions and exposure to risk vary with factors such as ethnicity, gender, social class, and sexuality (cf. Dewilde, 2008; Pintelon, Cantillon, Van den Bosch, & Whelan, 2011; Vandecasteele, 2010). Studies have demonstrated that white men are the group most likely to underestimate their exposure to risk (cf. Finucane, Slovic, Mertz, Flynn, & Satterfield, 2000, on the white male effect). People who underestimate the statistical risk of death from a given phenomenon are typically better educated and have a higher income, compared to the population in general. Correspondingly, people who overestimate the same risk tend to be less well off (cf. Bastide, Moatti, & Fagnani, 1989). Why is this? The short answer from both quantitative and qualitative research is that different groups are differently situated in relation to risks (Flynn, Slovic, & Mertz, 1994; Tulloch & Lupton, 2003). Risk perceptions and the ability to deal with a risk vary with differences in power, status, alienation, trust, etc. between groups. On the basis of such variation, Wall (2010) suggests that “sense-making of risk” is preferable to “risk perception” as a concept for understanding risk in the lifeworld. Her study shows that people with a similar perception of risk (they identify the same risks) still differ in their interpretations of these risks and in the actions they take in the face of them.
This is because people have different structures of meaning informing their sense-making; they have different experiences, they differ in their orientation towards the individual or the collective, they differ in their attachment to place, in social class, etc. Such variation is important to consider in risk communication, but it is not further considered here. The focus is instead on some general dynamics of lay risk management.
One ambitious attempt to account for how social complexity increases or decreases public perceptions of risk is the Social Amplification of Risk Framework (SARF; see Kasperson et al., 1988). The main idea is that a piece of communicated risk information or news of a risk event will travel from sender to receiver via societal filters (e.g., the media, experts, institutions, personal networks) that amplify, attenuate, or otherwise process the message in some way, much like how a signal travels through a stereo system. SARF helps explain why technical risks that are seen as relatively small by the experts can cause strong public reactions, while risks the experts view as substantial are sometimes ignored. The technical information about the risk interacts with other social phenomena; risks are not communicated in a vacuum. This interaction also causes “ripple effects” in a society; risk messages evoke behavioral responses in a population that are not directly linked to the risk itself (e.g., social disorder or the decline of businesses). SARF is focused on the broader social workings of risk. It does consider the role of individuals, but they are treated as stations or filters along the signal chain. If we expand the stereo system metaphor, we can say that lay risk management begins after the loudspeaker. Here, the risk message will continue to resonate among other social phenomena and personal aspects of human existence as people try to make sense of the risk message and accommodate it into their lifeworld.
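The signal metaphor at the heart of SARF can be made concrete with a short sketch. The toy model below is purely illustrative and everything about it is invented for the example (the station names and gain values are hypothetical, not part of the framework): a message's intensity is simply multiplied by the gain of each societal station it passes through, with gains above 1 amplifying and gains below 1 attenuating the signal.

```python
# Toy sketch of the SARF signal metaphor. All station names and gain
# values are hypothetical; real-world "stations" do far more than scale
# a one-dimensional intensity.

def transmit(initial_intensity, stations):
    """Pass a risk signal through a chain of (name, gain) stations."""
    intensity = initial_intensity
    for _name, gain in stations:
        intensity *= gain  # gain > 1 amplifies, gain < 1 attenuates
    return intensity

# A hypothetical chain: intense media coverage amplifies the message,
# expert reassurance attenuates it, personal networks amplify it mildly.
chain = [("media", 3.0), ("experts", 0.5), ("personal network", 1.2)]
final = transmit(1.0, chain)  # roughly 1.8: net amplification
```

The point of the metaphor is precisely that this kind of gain model stops at the loudspeaker: what happens after it, in the lifeworld, cannot be captured by scaling a signal.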
The complexity and abundance of available risk information in late modern societies is, in itself, a factor that structures everyday risk management. To a large extent, we depend on expertise to tell us both what is risky and how to act in the face of risk. This makes the question of trust particularly important to consider in risk communication, and we shall return to how trust functions as a resource in lay risk management further down. Here, just a few words on trust as a general aspect of the relationship between experts and lay people—as evident in SARF above, even very “small” technical risks can have a substantial impact. Yet, as Brian Wynne (1996) points out, much risk research operates with a clear demarcation between real and socially constructed risks. Anthony Giddens (1991), for instance, emphasizes how the modern focus on risk has little or nothing to do with the actual occurrence of lethal dangers. While this might be true, it rests on the assumption that (natural) science can provide us with a true and relevant list of the dangers we face. This ignores the fact that expert knowledge is culturally and socially situated. More importantly for risk communicators, even if science is the dominant knowledge paradigm in the Western world, it cannot be assumed that the general public uncritically views science as a source of relevant risk knowledge. Rather, there is a considerable amount of ambivalence involved (Lidskog, 1996). Scientific practice is seen as one social practice among others, and at times, people will question experts and the advice they bring. This is why trust is such an important aspect of risk communication, and its importance is only further accentuated by the enormous increase in risk information sources that came with the Internet.
The constant flow of risk information in modern societies both facilitates and challenges our need to uphold what Giddens (1991, p. 38) refers to as “ontological security,” that is, our basic sense of order and safety in life. Such a sense is necessary for taking action in the lifeworld—for what we can refer to as “getting on with life.” However, we depend on various expert systems in this process, allowing us to act even when we have little or no direct control over the outcome. Boarding a plane, eating processed foods, or consuming nuclear energy are examples of actions made possible only by expert systems. But the various expert systems also provide us with information about possible negative outcomes, to such an extent that it becomes impossible to consider all of this information, or even to act upon all the information we do consider. This is not only a problem of information overload, however. Risk information typically centers on the knowable and predictable—the calculable risk—and therefore, by definition, leaves out the aspect of uncertainty and the worry it gives rise to (Alaszewski, 2005). Living with abstract probabilities of adverse outcomes might be possible in itself, but living with high levels of worry is indeed difficult. Some information will therefore be blocked out, some will be renegotiated before it is implemented in the lifeworld, and some can be implemented directly. This challenge is at the heart of everyday risk management. For risk communication to “get through,” a qualitative understanding of this process is needed.
In 2011, the Swedish Chemicals Agency provided an illustrative example of risk communication that is likely to be blocked out, simply because it is very difficult to implement in everyday life. The agency published an information booklet on chemicals in the everyday life of children (Kemikalieinspektionen, 2011, p. 4). It states that the intention of the booklet is to inform parents on where the chemical risks are so that they can be avoided. By doing so, the agency hopes to reduce anxiety and enable parents to enjoy parenthood. Then follows a list of where the risks are, here sorted by the subheadings (and substantially shortened): bedding, mattresses, clothes and other textiles, shoes, shampoo, soap, bubble bath, toothpaste, bath crayons, bath bombs, wet wipes, lotions, hair dyes, plastic, feeding bottles, plastic toys, crayons, pencils, [. . .] sand boxes and wooden play equipment, mosquito and tick agents, sunscreen. The list goes on, but apparently there are risks almost everywhere, and avoiding them all would be very complicated. It is therefore not evident that this kind of risk message fulfills the stated aim of reducing unnecessary anxiety. It can be argued that life with children can hardly go on if all these risks are to be avoided (Löfmarck, 2014). Risk communication thus needs to leave some room for agency.
Risks are perceived and dealt with in the lifeworld, alongside the other commitments, risks, and values in it. For instance, new mothers are often exposed to extensive risk communication (e.g., dietary recommendations) that they are supposed to make sense of and heed, while at the same time adjusting to their new role as mothers and getting on with life in general. There is a substantial amount of social pressure involved here. Mothers are often held “infinitely responsible” for their children’s welfare by society, and they often come to internalize the general risk consciousness that is communicated by agents and institutions in their society (Lupton, 2008; see also Knaak, 2010, on how public health discourse frames personal decisions on breastfeeding as an issue of moral and social responsibility). Under such circumstances, there is very little room for structured assessments of alternatives and possible outcomes. Rather, risk management becomes a psychologically and socially risky endeavor in itself. Being perceived as a poor or careless mother can lead to social exclusion—a form of “ripple effect” in the lifeworld. As evident from qualitative studies of parenting, most mothers view it as neither possible nor desirable to try to protect children against all possible risks, but it is still a laborious process to decide which risks are acceptable and which are not (Löfmarck, 2014; Smyth, 2012). To keep stress levels in check, while at the same time being perceived by those around them as morally accountable individuals, mothers need to develop strategies for dealing with risk information. Such strategies typically involve some degree of simplification and filtering. One such basic strategy described by Löfmarck (2014) is called comparison and consists of evaluating one’s own risk-managing practice by comparing it to that of others (including other parents, one’s own parents, parents in general, parents of the past and/or in other countries, etc.).
It is often very difficult to relate to abstract probabilities, but by considering how others have dealt with a particular risk, the risk becomes more concrete. This strategy may result in a modification of the risk management practice, or a practice may be found acceptable in light of how others manage the risk in question. Comparison is not to be viewed as a heuristic or “shortcut,” as it typically draws on a wide set of sources and involves reflection, emotion, and discussion. It does, however, resemble the decision-making heuristic called imitate the majority, described by Boyd and Richerson (2005; see also Gigerenzer, 2008). This heuristic consists of looking at how a majority of individuals within a reference group deal with a certain problem, and then simply doing the same. It has proven effective (performing as well as or better than probabilistic calculation) in relatively stable situations where information retrieval is costly or time consuming.
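Reduced to its bare mechanics, imitate the majority is just a frequency count over a reference group. The sketch below is a minimal illustration only, with an invented scenario and data; it captures the rule itself, not the reflection and discussion that the comparison strategy involves.

```python
from collections import Counter

def imitate_the_majority(observed_choices):
    """Adopt the most common choice observed in a reference group."""
    # most_common(1) returns [(choice, count)] for the top-ranked choice.
    return Counter(observed_choices).most_common(1)[0][0]

# Hypothetical example: a parent deciding how to respond to a food-risk
# report looks at what other parents in the network actually do.
observed = ["serve", "avoid", "serve", "serve", "avoid"]
decision = imitate_the_majority(observed)  # "serve"
```

Its appeal mirrors the conditions named above: it requires no probabilities at all, only cheap observation of what others do, which is why it performs well precisely when information retrieval is costly.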
Considerations of what is morally or socially acceptable can make people pay attention to and heed risk information, even when they are not particularly concerned with the risk itself. Indeed, risk communication is a central component of the public health policies of late modern societies, policies often aimed at getting populations to regulate themselves and each other by fostering risk awareness. This governmental perspective on risk communication is not further elaborated here (but see, for instance, Petersen & Lupton, 1996). However, risk communication may also be ignored or rejected because it threatens or challenges the individual’s self-identity. In a study of smokers and how they relate to risk information, Gjernes (2010) found that the respondents were provoked by health communicators’ lack of understanding of what smoking represented in their everyday life. Smoking is a bad habit, but it is also a social practice with different meanings in different settings (a stress reliever at work, a social pleasure at a dinner party, etc.). Risk communication that does not acknowledge the value or meaning of a risky activity might alienate the intended recipients. In a study of young people’s interpretations of health and health-related risks, Spencer (2013) found a strong call among the respondents for health promotion efforts that challenge the negative and problem-based approach to young people’s health that currently dominates. For young people to relate to health communication, it needs to connect with their lived experience of being young and also acknowledge the positive aspects. In this sense, there is always groundwork to be done using qualitative research methods to facilitate effective risk communication, that is, to probe the deeper meaning people attribute to various risky activities. Of course, this is true for both positive and negative meanings.
The “dispassionate tone” in many writings on risk has been addressed as a problem by some critics, because it leaves out aspects like fear and loss, which are fundamental to the lay experiences of risk (Hallowell, 2006; Lupton, 2013; Wilkinson, 2006).
Thus far we can deduce two things. First, risk information needs to be compatible with the circumstances of the lifeworld. If it is not, it will somehow be made compatible by the recipient or perhaps disregarded, as life must go on and identities upheld. Second, the use of comparisons and imitations tells us that accounts of how others relate to risk might be a valuable component in risk communication. There has been an increased interest in using stories in risk communication. For instance, vivid examples of real accidents can create images and elicit strong emotions, almost like first-hand experience (see Ricketts, Shanteau, McSpadden, & Fernandez-Medina, 2010, on this use of narratives in risk communication). Appealing to fear in risk communication is problematic for several reasons—we shall return to this shortly—but it would be worth exploring how positive examples can help to personalize the risk message. A fundamental difference between scientific (expert) risk knowledge and lay risk knowledge is that the former is typically formulated in terms of probabilities within a population, while lay risk knowledge is always personalized in some way—people are interested in their own risk status (Alaszewski, 2005). Stories and narratives can help in this process of personalization.
Emotions as Interpretative Resources in Lay Risk Management
Modern psychological research into risk perceptions recognizes the practical importance of emotion in lay risk management. For instance, Finucane states that as “emotion and affect may be used as a source of information in judgment and decision making, risk communicators need to be cognizant of the affective content of the messages they are imparting” (Finucane, 2008, p. 985). This does not mean that such research has abandoned the idea of at least theoretically separating the rational from the irrational—the “affective content” is still seen as possible to isolate. Within sociology, Zinn (2008) has suggested that, since lay risk management strategies based on trust, hope, faith, intuition, etc., can be described as neither irrational nor fully rational, we should label them “in-between strategies.” If we take intuition as an example, it represents a kind of acquired experience that almost unconsciously helps us decide how to act. Our feelings put us in touch with our intuition (“this feels right to me”), and among other things, they help us determine what kind of trust it is reasonable to entertain (“I have a positive attitude towards this person or this organization”). Zinn contends that such in-between strategies seem effective “. . . not only in dealing with too little knowledge, but can also help with the overload of knowledge when heightened complexity prevents or distorts ‘rational’ decision making” (Zinn, 2008, p. 8).
However, a basic problem with the notion of in-between strategies is that it requires fixed points of rationality and irrationality (for the strategies to lie in between them). Relying on intuition when making decisions is not typically seen as rational, but to the extent that intuition rests on acquired experience, it would be irrational to disregard it. If relying on hope or faith puts me at risk, while it helps me get on with life, it may be “rational” in relation to the latter aim. A more fundamental objection is raised by Lupton (2013), who contends that emotion and risk cannot be even theoretically separated: risks create emotions and emotions create risk, constituting an emotion-risk assemblage. Put differently, emotions play a large part in why we come to perceive something as risky (“this is dear to me”), and risk perceptions create emotions (“this is scary”). This also holds for scientific judgments of risk—experts are not free of emotional involvement. Thus, rather than trying to separate risk from emotion (and the rational from the irrational, etc.), it might be more productive to view emotions as functional and inseparable parts of lay risk management, along with technical risk knowledge and all other components that can facilitate decision making. The notion of bricolage (Horlick-Jones et al., 2007) fits precisely because it underlines the fact that people will make use of the resources at hand when making sense of risk and acting upon it in everyday life. For risk communicators and others who are concerned with lay risk management, it is thus essential to ponder how their efforts fit and interact with the interpretative resources of the target audience.
The importance of understanding the interpretative resources at hand can be illustrated by trust. We can find striking differences in generalized and interpersonal trust between countries. For instance, Sweden is a high-trust society, while Poland is a low-trust society (Sztompka, 1999; Trägårdh, 2009; see also World Values Survey Wave 6: 2010–2014). There are historical explanations for this that we will not dwell on here, but these differences are mirrored in how people relate to risk. Swedes tend to trust official recommendations and regulations (e.g., regarding food production), while Poles are more cautious (Löfmarck, 2014). While trust is, in this sense, something that develops under certain social circumstances, there is also individual room for making conscious decisions to trust or not to trust (e.g., expert advice), or to trust that risks are being managed by someone else (e.g., by the state) in everyday life. In this sense, trusting can be seen as a risk management strategy. As Wynne (1996) points out, trust can coexist with ambivalence or even distrust. Sometimes people have to act as if they trust a social actor because they are socially dependent on that actor (see also Brown & Meyer, 2015, on forced options to trust). Our everyday risk management is embedded in a myriad of needs and practices, and trust is sometimes what facilitates choice and action, enabling us to go on with life in confidence. Trust may come in many forms (conditional, blind, forced, strategic etc.), but it always reduces complexity, freeing up resources for other activities.
In a comparative qualitative study of how Swedish and Polish mothers handle food risks, Löfmarck (2014) finds that Swedish mothers have two advantages related to trust. First, they have a basic level of trust in the government and its risk regulation, making it easier to trust when needed. Second, they seem to assume that the Swedish state is a bit “overprotective.” The notion of overprotection means that the mothers can assume there are some safety margins when it comes to risk regulation, e.g., that maximum consumption amounts are set below the actual safe level. This makes it possible for them to deal with recommendations in a more flexible manner—there is more room for agency. Some have warned of the related risk of public complacency (Wang & Kapucu, 2008), that people put too much faith in the authorities and take few risk-reducing measures themselves. In the above study, however, the mothers still paid close attention to risk information and acted upon it, but their overall risk management was characterized by reflexivity and critical thinking. When communicating risk to a Swedish (or other high-trust) audience, it is arguably important to be aware of this notion of overprotection and consider what it might mean for the risk message in question.
Since emotional resources (including trust) are central to lay risk management, it is no surprise that they are also central to risk communication. Above, we discussed the importance of acknowledging meaning and identities among the intended audience. However, linking emotion and risk in risk communications is sometimes done in ways that are ethically questionable. Lupton (2015) points to “the pedagogy of disgust” that is fairly common in health promotion—for instance, ad campaigns showing overweight individuals eating junk food, internal organs surrounded by fat, or dying smokers in a hospital bed. Such messages are intended to elicit disgust to persuade the target audience to change their behavior in some way. Disgust can easily be combined with emotions like fear or regret in such campaigns. Lupton points to how such tactics challenge human dignity and risk marginalizing already disadvantaged individuals or groups, as they are presented as inferior in some way. There is also research showing that fear messages can trigger threat avoidance behavior that, in itself, can be damaging to individual health (Hastings, Stead, & Webb, 2004). This relates to the question of effectiveness. It is clear that risk messages that elicit strong emotions receive more attention. Less is known about the long-term effect on behavior, but there is an obvious risk that people will somehow try to avoid messages that are too disturbing (cf. Brown & Richardson, 2012, on attentional disengagement). Hastings et al. (2004) also point out that fear messages can be particularly ineffective with people who have low self-efficacy, and therefore risk increasing societal health inequity. In turn, we might ask what such pedagogies do to the basic trust that needs to exist between risk communicators and their audiences. Eliciting fear or disgust might be effective if the sender is already trusted, but it might hinder the establishment of new trustful relationships.
Discussion of the Literature
Today, risk research is an interdisciplinary field. Risk is an integral part of human life, and each scientific discipline concerned with uncertainty of some kind has developed ways of analyzing risk. It was within the field of psychology, however, that research into lay perceptions of risk took off in the early 1970s. The focus then was on the heuristics that people apply when relating to risk and the biases that result from them—that is, the errors people make. There was a clear assumption in this research that some ideal type of rationality could be achieved, in principle, if only it were possible to understand and possibly correct or compensate for the errors. Among other things, this would enhance the possibilities for effective risk communication. Emotion or affect was (and still often is) particularly implicated as a problem in lay approaches to risk, and was often contrasted with the cognitive-rational approach that is the hallmark of scientific activity. The fields of sociology, anthropology, and cultural theory have historically been more concerned with the broad social workings of risk—for instance, how societies come to view a phenomenon as risky, or how risk drives social change. It is from such perspectives that these disciplines have eventually taken an interest in the individual’s relationship to risk, what it means to live in a society occupied with risk, or how processes of individualization make it important to handle risk in everyday life.
From a sociocultural perspective, lay risk management is seen as structured by the need to “get on with life,” while at the same time being surrounded by sometimes challenging risk messages. This is why there is less concern with the errors people make and more with the strategies they develop for maintaining a sense of order and safety, for being perceived as morally accountable, for keeping stress levels in check, and so on. The main critique launched against risk perception research by sociocultural theorists is that it fails to appreciate how risks are managed in relation to a multitude of other values and needs, which are difficult or impossible to calculate instrumentally. From this follow three things: first, a call for more qualitative research into everyday risk management, using methods that can generate an in-depth understanding of how risks are experienced and dealt with; second, the recognition that everyday risk management is too complex to allow for simple categorizations into rational or irrational; third, the insistence that emotions are an inseparable part of risk and lay risk management. The notion of bricolage has been suggested to capture the fact that people make use of the interpretive resources at hand when dealing with risk, including technical risk knowledge, but also confidence, intuition, trust, and so on.
What does this all mean for risk communicators? The heuristics and biases approach to risk has identified several general mistakes people make in the face of risk (e.g., underestimating risks related to phenomena that evoke positive emotions), and such knowledge is of course very valuable for planning communicative efforts. For instance, given the tendency to block out considerations of probabilities (probability neglect) following scary events, it might be a good idea to launch information campaigns in relation to them—for instance, pointing out, in the aftermath of an air disaster, that airplanes are still safer than cars. There is, however, a good chance that people would still refrain from boarding airplanes, for reasons that have little or nothing to do with probabilities. Perhaps they do not wish to worry their families, or perhaps they simply do not want to deal with their own anxiety. Effective risk communication needs to take lay risk management at face value, considering and adjusting to its inherent complexity and the tools (interpretative resources) that are available to people in their lifeworld. There is a need to leave room for agency when communicating risk, to acknowledge identities, and to understand lay risk management as embedded in a social practice. When this is not the case, risk messages will likely be ignored or substantially modified. Communicating risk requires groundwork to figure out how and why people relate to the risks in question in their specific context.
Alaszewski, A., & Coxon, K. (2009). Uncertainty in everyday life: Risk, worry, and trust. Health, Risk & Society, 11(3), 201–207.
Alaszewski, A., & Horlick-Jones, T. (2003). How can doctors communicate information about risk more effectively? Commentary: Communicating risk in the United Kingdom. British Medical Journal, 327(7417), 728.
Alaszewski, A. (2005). Risk communication: Identifying the importance of social context. Health, Risk & Society, 7, 101–105.
Bastide, S., Moatti, J. P., & Fagnani, F. (1989). Risk perception and social acceptability of technologies: The French case. Risk Analysis, 9(2), 215–223.
Beck, U. (1992). Risk society: Towards a new modernity. London: SAGE.
Berger, P., & Luckmann, T. (1967). The social construction of reality. New York: Anchor Books.
Bostrom, A., Morgan, M. G., Fischhoff, B., & Read, D. (1994). What do people know about global climate change? 1. Mental models. Risk Analysis, 14(6), 959–970.
Boyd, R., & Richerson, P. J. (2005). The origin and evolution of cultures. New York: Oxford University Press.
Brante, T., Andersen, H., & Korsnes, O. (1998). Sociologiskt lexikon. Universitetsförlaget.
Brown, P. R., & Meyer, S. E. (2015). Dependency, trust, and choice? Examining agency and “forced options” within secondary-healthcare contexts. Current Sociology, 63(5), 729–745.
Brown, S. L., & Richardson, M. (2012). The effect of distressing imagery on attention to and persuasiveness of an antialcohol message: A gaze-tracking approach. Health Education & Behavior, 39, 8–17.
Dewilde, C. (2008). Individual and institutional determinants of multidimensional poverty: A European comparison. Social Indicators Research, 86(2), 233–256.
Douglas, M., & Wildavsky, A. (1982). Risk and culture: An essay on the selection of technological and environmental dangers. Berkeley: University of California Press.
Finucane, M. L. (2008). Emotion, affect, and risk communication with older adults: Challenges and opportunities. Journal of Risk Research, 11(8), 983–997.
Finucane, M. L., Slovic, P., Mertz, C. K., Flynn, J., & Satterfield, T. A. (2000). Gender, race, and perceived risk: The “white male” effect. Health, Risk & Society, 2(2), 159–172.
Fischhoff, B., Slovic, P., & Lichtenstein, S. (1982). Lay foibles and expert fables in judgments about risk. The American Statistician, 36(3b), 240–255.
Flynn, J., Slovic, P., & Mertz, C. K. (1994). Gender, race, and perception of environmental health risks. Risk Analysis, 14(6), 1101–1108.
Giddens, A. (1991). Modernity and self-identity: Self and society in the late modern age. Stanford, CA: Stanford University Press.
Gigerenzer, G. (2008). Why heuristics work. Perspectives on Psychological Science, 3, 20–29.
Gjernes, T. (2010). Facing resistance to health advice. Health, Risk & Society, 12(5), 471–489.
Hallowell, N. (2006). Varieties of suffering: Living with the risk of ovarian cancer. Health, Risk & Society, 8(1), 9–26.
Hastings, G., Stead, M., & Webb, J. (2004). Fear appeals in social marketing: Strategic and ethical reasons for concern. Psychology & Marketing, 21(11), 961–986.
Horlick-Jones, T. (2005). Informal logics of risk: Contingency and modes of practical reasoning. Journal of Risk Research, 8(3), 253–272.
Horlick-Jones, T., Walls, J., & Kitzinger, J. (2007). Bricolage in action: Learning about, making sense of, and discussing, issues about genetically modified crops and food. Health, Risk & Society, 9(1), 83–103.
Kasperson, R. E., Renn, O., Slovic, P., Brown, H. S., Emel, J., Goble, R., et al. (1988). The social amplification of risk: A conceptual framework. Risk Analysis, 8(2), 177–187.
Kemikalieinspektionen. (2011). Kemikalier i barns vardag. Retrieved from https://www.kemi.se/global/broschyrer/kemikalier-i-barns-vardag.pdf.
Knaak, S. J. (2010). Contextualising risk, constructing choice: Breastfeeding and good mothering in risk society. Health, Risk & Society, 12(4), 345–355.
Kraus, N., Malmfors, T., & Slovic, P. (1992). Intuitive toxicology: Expert and lay judgments of chemical risks. Risk Analysis, 12(2), 215–232.
Lidskog, R. (1996). In science we trust? On the relation between scientific knowledge, risk consciousness, and public trust. Acta Sociologica, 39(1), 31–56.
Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127(2), 267.
Lupton, D. (2008). “You feel so responsible”: Australian mothers’ concepts and experiences related to promoting the health and development of their young children. In H. Zoller & M. Dutta (Eds.), Emerging perspectives in health communication: Meaning, culture, and power (pp. 113–128). New York: Routledge.
Lupton, D. (2013). Risk and emotion: Towards an alternative theoretical perspective. Health, Risk & Society, 15(8), 634–647.
Lupton, D. (2015). The pedagogy of disgust: The ethical, moral, and political implications of using disgust in public health campaigns. Critical Public Health, 25(1), 4–14.
Löfmarck, E. (2014). Den hand som föder dig: En studie av risk, mat och moderskap i Sverige och Polen. Acta Universitatis Upsaliensis. Studia Sociologica Upsaliensia 61. Uppsala: Uppsala universitet.
Mintzberg, H. (1978). Patterns in strategy formation. Management Science, 24(9), 934–948.
Olofsson, A. (2009). Individ och risk. In A. Olofsson & S. Öhman (Eds.), Risker i det moderna samhället: Samhällsvetenskapliga perspektiv. Lund, Sweden: Studentlitteratur.
Petersen, A., & Lupton, D. (1996). The new public health: Health and self in the age of risk. London: SAGE.
Pintelon, O., Cantillon, B., Van den Bosch, K., & Whelan, C. (2011). The social stratification of “old” and “new” social risks. Centrum voor Sociaal Beleid Herman Deleeck (CSB), Working Paper, No. 11/04.
Renn, O. (1998). Three decades of risk research: Accomplishments and new challenges. Journal of Risk Research, 1(1), 49–71.
Ricketts, M., Shanteau, J., McSpadden, B., & Fernandez-Medina, K. M. (2010). Using stories to battle unintentional injuries: Narratives in safety and health communication. Social Science & Medicine, 70(9), 1441–1449.
Sjöberg, L. (2002). Factors in risk perception. Risk Analysis, 20(1), 1–12.
Slovic, P. (1987). Perception of risk. Science, 236, 280–285.
Slovic, P., & Peters, E. (2006). Risk perception and affect. Current Directions in Psychological Science, 15(6), 322–325.
Smyth, L. (2012). The demands of motherhood: Agents, roles, and recognitions. Basingstoke, U.K.: Palgrave Macmillan.
Spencer, G. (2013). The “healthy self” and “risky” young other: Young people’s interpretations of health and health-related risks. Health, Risk & Society, 15(5), 449–462.
Sunstein, C. R. (2003). Terrorism and probability neglect. Journal of Risk and Uncertainty, 26(2), 121–136.
Sztompka, P. (1999). Trust: A sociological theory. Cambridge, U.K.: Cambridge University Press.
Trägårdh, L. (2009). Tillit i det moderna Sverige: Den dumme svensken och andra mysterier. Stockholm, Sweden: SNS Förlag.
Tulloch, J., & Lupton, D. (2003). Risk and everyday life. London: SAGE.
Tversky, A., & Kahneman, D. (1975). Judgment under uncertainty: Heuristics and biases. In Utility, probability, and human decision making (pp. 141–162). Dordrecht, The Netherlands: Springer.
Vandecasteele, L. (2010). Poverty trajectories after risky life course events in different European welfare regimes. European Societies, 12(2), 257–278.
Wall, E. (2010). Riskförståelse: Teoretiska och empiriska perspektiv. Östersund, Sweden: Mittuniversitetet.
Wallman, S. (2000). Risk, STD, and HIV infection in Kampala. Health, Risk & Society, 2(2), 189–203.
Wang, X., & Kapucu, N. (2008). Public complacency under repeated emergency threats: Some empirical evidence. Journal of Public Administration Research and Theory, 18(1), 57–78.
Wilkinson, I. (2006). Health, risk, and social suffering. Health, Risk & Society, 8(1), 1–8.
Wynne, B. (1996). May the sheep safely graze? A reflexive view of the expert-lay knowledge divide. In S. Lash, B. Szerszynski, & B. Wynne (Eds.), Risk, environment and modernity: Towards a new ecology (pp. 44–83). London: SAGE.
Zinn, J. (2008). Heading into the unknown: Everyday strategies for managing risk and uncertainty. Health, Risk & Society, 10, 439–450.