

NAS ... Thomas Dietz ... Bringing values and deliberation to science communication

Burgess COMMENTARY

Peter Burgess


Bringing values and deliberation to science communication

Abstract

Decisions always involve both facts and values, whereas most science communication focuses only on facts. If science communication is intended to inform decisions, it must be competent with regard to both facts and values. Public participation inevitably involves both facts and values. Research on public participation suggests that linking scientific analysis to public deliberation in an iterative process can help decision making deal effectively with both facts and values. Thus, linked analysis and deliberation can be an effective tool for science communication. However, challenges remain in conducting such processes at the national and global scales, in enhancing trust, and in reconciling diverse values.

Humans learn both through direct experience and by observing and engaging in conversations with other humans. Our ability to learn from others, social learning, is a defining characteristic of our species (1). Human history is the coevolution of our ability to govern ourselves, to shape ecosystems, and to learn from our actions and those of others. The process is not always successful. In a recent study of societies under severe stress, Butzer and Endfield (2) found that less than half were able to avoid breakdown. Adaptive social learning is not an easy challenge to meet.

In the 21st century, the scale of human activity will expand substantially (3–5), as will the power of our technology. Social learning is the basis both for the unprecedented scale of human activity and for the power of our technologies. If we are to avoid serious adverse consequences from these changes, we must accelerate social learning for sustainability and for governing technology (6). Our growing capabilities in nanotechnology, biotechnology, information technology, cognitive technology, and robotics (NBIC) will be a special challenge. They add to the already daunting problems of sustainability and the long-standing issues of violent conflict and poverty. Without continuous and effective social learning, we are ill equipped as individuals, as a nation, and as a global society to make sound decisions about these complex matters. We need social learning about facts so that our beliefs about how the world works are well aligned with reality. We also need social learning around values to think through the emerging implications of major social transformations.

During the 17th century, science began to take its modern form as a systematic way to learn about the world (7–9). The rules of science have proven to be a highly effective way for scientists to communicate with each other and to build cumulative understanding. Science is an example of social learning at its best. However, most people are not trained in science. And even those of us who are cannot easily read the literature outside of our specialties. Thus, we all must rely on science communication for information about issues on which we make decisions. In decision making, science communication is a substitute for the social learning that takes place within a scientific community. We rely on science communication to inform us about the facts we need to know to make decisions. That alone is a substantial challenge. In making decisions, we have a further challenge: We have to assess both the facts and our values and bring them together to make decisions.

The term “values” is often used quite informally.
However, values are a well-developed and well-researched concept in the social sciences and are at the core of much of our understanding of environmental concern (10, 11). Values are defined as “(a) concepts or beliefs, (b) about desirable end states or behaviors, (c) that transcend specific situations, (d) guide selection or evaluation of behavior and events, and (e) are ordered by relative importance” (ref. 12, p. 551). Values underpin more specific preferences for one course of action over another. Our preferences depend on what we believe about how actions will affect things we value.

Science communication usually focuses on facts, not on values. That is appropriate for many contexts. However, decisions always involve values, and there is rarely complete agreement about values on the part of interested and affected parties. Public participation has been proposed as a mode of science communication that can, at least in principle, lead to better decisions by addressing both facts and values.

What is public participation? Any form of democratic input into decision making, including voting, expressing opinions in surveys, holding demonstrations, or other modalities of attempting to bring about social change, can be thought of as public participation. However, the literature I draw on uses the term in a narrower sense. Public participation is “organized processes adopted by elected officials, government agencies, or other public- or private-sector organizations to engage the public in environmental assessment, planning, decision making, management, monitoring, and evaluation” (ref. 13, p. 11). A line of theory stretching from Dewey to Habermas, among others, argues that public deliberation is the essence of democracy and that public participation processes can improve the ability of democracies to deal with serious challenges (14–22). Is public participation up to that challenge? What can we learn from experience with participation that may help us improve science communication intended to inform decision making? Before addressing these questions, I will suggest criteria for a good decision and then examine why it is difficult to make good decisions about many contemporary problems, using climate change as an example. Research on public participation raises ideas that can inform science communication in the service of decision making. It also makes clear that issues of scale, trust, and the integration of values with facts require special attention.

Why Is It Hard to Make Good Decisions?

Three characteristics of good decisions make it clear why science is important to decision making (17). First, a good decision must be factually competent. The beliefs used in making decisions should accurately reflect our understanding of how the world works. Here, the role of science is obvious: Science is our best guide to developing factual understanding. My interest in a decision depends, in part, on my beliefs about the facts: What will happen if one decision is taken instead of another? However, my interest also depends on what I value. Thus, a second criterion for good decisions is that they must be value-competent. We know values differ substantially across individuals and vary to some degree within an individual over time. We also know that most people have some degree of flexibility in the values they deploy in making a decision.
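To make the interplay of facts and values concrete, here is a minimal decision-science sketch (an illustration of mine, not from the paper; all option names and numbers are hypothetical). Two parties accept the same predicted outcomes, the “facts,” but weight the valued attributes differently, and so arrive at different preferences:

```python
# Hypothetical multi-attribute scoring: shared facts, divergent values.

# Predicted outcomes of two policy options on three valued attributes,
# each on a common 0-1 scale. Both parties accept these as the facts.
OUTCOMES = {
    "option_a": {"economy": 0.8, "ecosystems": 0.3, "equity": 0.5},
    "option_b": {"economy": 0.5, "ecosystems": 0.8, "equity": 0.6},
}

# Value weights differ across individuals (each set sums to 1).
VALUE_WEIGHTS = {
    "party_1": {"economy": 0.6, "ecosystems": 0.2, "equity": 0.2},
    "party_2": {"economy": 0.2, "ecosystems": 0.5, "equity": 0.3},
}

def preference(weights):
    """Return the option with the highest value-weighted outcome score."""
    def score(option):
        return sum(weights[attr] * OUTCOMES[option][attr] for attr in weights)
    return max(OUTCOMES, key=score)

for party, weights in VALUE_WEIGHTS.items():
    print(party, "prefers", preference(weights))
# party_1 prefers option_a; party_2 prefers option_b. Identical facts,
# different values, different decisions.
```

Better information about outcomes would sharpen the scores, but no amount of factual accuracy collapses the difference in weights; that is the sense in which facts alone cannot settle a decision.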
Science can help us achieve value competence by informing us about what values people bring to a decision and how the decision process itself facilitates or impedes cooperation or conflict. Third, good decision making must be adaptive. We have to acknowledge that our understanding of facts is based on uncertain knowledge and that values will evolve over time. Thus, our decisions must allow us to shift our strategy as our understanding and values change. Science can help us assess uncertainty about facts and values, properly take account of uncertainty in weighing alternatives, and monitor change over time.

In 1923, Dewey (20) identified the importance of scientific information for sound public decision making and raised concerns about the public’s access to that information, defining the public as all those interested in or affected by a decision. Dewey’s concerns persist. We cannot assume that science will be adequately deployed in making decisions, for at least four reasons. First, we have learned that humans have trouble thinking about uncertainties, nonlinear systems, and complex adaptive systems, all of which are involved in problems like climate change, managing NBIC technologies, and other emerging challenges (23–25). Second, science and technology pervade nearly all critical societal decisions to an extent that was not true in the past. The future of health, national security, the economy, and the environment all rest on how we deal with emerging knowledge and new technologies; thus, the need for scientific understanding is ubiquitous. Third, science and technological development are becoming global, and their transfer around the world is nearly instantaneous. At the same time, the power of our technology is unprecedented because of its biospheric scope and its ability to shape matter at the molecular level and to reshape, or even to create, living processes. This scope and power mean that no individual, or even any organization, can fully grasp all the implications as we intervene, intentionally and unintentionally, in coupled human and natural systems. Even our best efforts provide no certainty of a desirable outcome. Decisions based on inept handling of science could have dire consequences, and those consequences may occur across the globe from where the decision was made. Finally, and regrettably, as science has become more important to critical societal decisions, we have seen increasing efforts to politicize science and to promote beliefs that support entrenched interests despite their scientific inaccuracy (26, 27).

The public learns about science in three ways: mass media, organized education, and the processes labeled as public participation. Media coverage of science and formal science education often focus on science that, however important and beautiful, is rather distant from decision making. In contrast, public participation is a mode of science communication focused on using science to inform decisions. The science engaged in public participation is nearly always characterized by substantial uncertainty, especially when general principles and literature have to be applied to the particular, often local, circumstances around which a decision must be made. In contrast, other forms of science communication need not pay much attention to uncertainty, or can even embrace it as the frontier of knowledge.
Delaying decisions until certainty increases will often have substantial consequences, and we cannot assume that the knowledge we need will become significantly more certain in a time frame that is realistic for decision making. The problem of uncertainty is so pervasive that Rosa (28) has argued that science applied to decision making requires careful thinking about epistemology. Adaptive risk management, sometimes called adaptive risk governance, is a response to the challenges of uncertainty we face in dealing with environment, sustainability, and technology (29–31). Recent studies from the US National Academy of Sciences call for adaptive risk management as the best way to cope with “America’s climate choices” (32–36). Similar arguments are made for nearly all domains of environmental policy, sustainability, energy policy, and policy around NBICs (37–41). Although the character of adaptive risk management will differ across applications, the core idea is that decisions should take explicit account of uncertainty, facilitate social learning, maintain some flexibility, and revisit the decision periodically.

Uncertainty about facts is challenging. However, we also must cope with value uncertainty, because different people bring different values to a decision-making process. On issues on which there is value consensus, it may be possible to reduce decision making to a largely technical exercise in deploying tools like benefit–cost analysis. However, for climate change, energy policy, sustainability, and the governance of NBICs, we cannot assume value consensus. For example, the implications of our choices in these areas will play out over decades and centuries; thus, we must decide the degree to which we weight (i.e., discount) the more distant future relative to the near term. In addition, some of the future events we are concerned about have a relatively low probability of happening but would have catastrophic effects if they did happen. Dealing with the future is clearly a value question, and it is evident that people, including scientists who analyze long-term decisions, differ about the proper way to proceed (42–45).
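The discounting question can be made concrete with a small, hypothetical calculation (mine, not the paper’s), using standard exponential discounting, PV = damage / (1 + r)^t, where the choice of the rate r is a value judgment rather than a factual finding:

```python
# Hypothetical numbers: how the discount rate, a value choice, changes
# the present weight of a distant, low-probability catastrophe.

def present_value(amount: float, years: int, rate: float) -> float:
    """Standard exponential discounting: amount / (1 + rate)**years."""
    return amount / (1.0 + rate) ** years

damage = 1_000_000_000.0   # loss if the catastrophe occurs
probability = 0.01         # low probability of occurrence
years = 100                # a century from now
expected_damage = probability * damage  # 10 million in expectation

for rate in (0.01, 0.03, 0.07):
    pv = present_value(expected_damage, years, rate)
    print(f"discount rate {rate:.0%}: present value = {pv:,.0f}")

# 1% -> ~3.7 million; 3% -> ~0.52 million; 7% -> ~11.5 thousand.
# The factual inputs never change; the discount rate alone moves the
# answer by more than two orders of magnitude.
```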
To make things even more complex, the value implications of some decisions are hard to assess and will not be clear to most people when they first hear about an emerging issue. The scale of our interventions in the biosphere is unprecedented in human history. NBICs make possible changes in human and animal life we have never before contemplated. Most people have never discussed and worked through the implications for their values of decisions about these very complicated issues. Clearly, public deliberation is essential both to clarify what values are at play and to try to reach consensus, or at least to delineate the lines of disagreement. However, having informed public deliberation around matters with substantial scientific content was challenging when Dewey raised the issue, and it is even more challenging today (20). The idea of adaptive risk management is appealing, but implementing it will require careful thought about how to engage both uncertain facts and uncertain values, and how to learn as we move forward.

Why Scientifically Informed Deliberation Is Difficult

In scientific discourse, we expect reasoned and balanced arguments and a willingness to shift from a currently held position when evidence refuting it accumulates. We know from historians, philosophers, and sociologists of science that the evolution of science is a bit bumpier than this ideal image. However, we hold to this ideal as a model for how scientists should behave. At the core of this model is the norm that we should let our beliefs about the world be shaped by evidence and by the ongoing deliberation of the scientific community. We are very cautious about letting our values have too much influence on how we assess evidence, and thus on our beliefs. We know that it is hard to adhere to this norm, but it is central to our identity as scientists.

This careful and cautious process is not how most people, including scientists in their role as members of the public, deal with most decisions they have to make. Most individuals do not need information about climate change, nanotechnology risks, or other scientific and technological issues to make day-to-day decisions. Such large concerns seem remote from the pressing matters of our everyday lives. If we encounter them, we handle them quickly and with little reflection. For example, most people probably first hear about climate change through a casual conversation or through a media account. Not much is at stake for the average member of the public, certainly not in the short term, and we are all busy. Thus, for most people, climate change and other technological issues are incorporated into cognition quickly, using shortcuts, contextual cues, and fast mental processes (46, 47). Rather than carefully weighing the strength of the evidence underpinning an assertion about climate change, most listeners will parse the information almost instantaneously, calling on their values and general beliefs as a guide (48). Once an initial impression is formed, people tend to accumulate more and more evidence that is consistent with their prior beliefs. They may be skeptical or unaware of information incongruent with prior beliefs and values. Over time, this process of biased assimilation of information can lead to a set of beliefs that are strongly held, elaborate, and quite divergent from the scientific consensus (49–52). In policy systems, this can lead to groups whose beliefs are increasingly divergent from one another, although increasingly homogeneous within each group. That, in turn, makes it difficult to develop sound policy (53). We also know that in social networks, even relatively modest preferences to associate with similar people and avoid dissimilar people can have substantial effects on how network structure evolves (54).
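The dynamics of biased assimilation can be illustrated with a toy model (my sketch, not a model from the cited studies). It uses a bounded-confidence rule: evidence too far from the current belief is dismissed as unreliable, so an identical stream of evidence moves one agent while leaving another unmoved.

```python
# Toy bounded-confidence model of biased assimilation. Beliefs and
# evidence are degrees of belief in a proposition, on a 0-1 scale.

def update_open(belief: float, evidence: float) -> float:
    """Weigh all evidence: move 20% of the way toward it."""
    return belief + 0.2 * (evidence - belief)

def update_biased(belief: float, evidence: float, tolerance: float = 0.3) -> float:
    """Dismiss evidence farther than `tolerance` from the current belief."""
    if abs(evidence - belief) <= tolerance:
        return update_open(belief, evidence)
    return belief  # incongruent evidence is ignored

# Eight reports, all pointing to roughly the same conclusion (~0.7).
evidence_stream = [0.72, 0.65, 0.70, 0.68, 0.74, 0.69, 0.71, 0.66]

open_minded = biased = 0.2   # both agents start out skeptical
for e in evidence_stream:
    open_minded = update_open(open_minded, e)
    biased = update_biased(biased, e)

print(f"open-minded agent: {open_minded:.2f}")  # ~0.61, approaching 0.7
print(f"biased agent:      {biased:.2f}")       # 0.20, unmoved
```

In this caricature the biased agent never encounters evidence it considers credible, so its belief is frozen; add homophily, so that agents also choose conversation partners with similar beliefs, and the group-level divergence described above follows.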
The hope for public participation as a mode of science communication is that, at its best, participation can enhance mutual understanding of facts, including their uncertainty, and of values, including value differences. Good participation practice acknowledges that members of the public will have different positions because of both different beliefs and different values. It accepts that we must address both facts and values in making good decisions. In pursuit of this goal, many studies have argued for linking scientific analysis with public deliberation, in what has become known as an analytic deliberative process (33, 36, 55–59). In an analytic deliberative process, scientific analysis informs and is informed by public deliberation about the issues. The research agenda in support of a decision is shaped both by the views of the scientific community and by the information the public believes it needs to make informed decisions. In turn, public discussion engages science to build trust in scientific results and to clarify the nature of uncertainty and how best to deal with it. The goal of analytic deliberative processes is to provide a sound way of incorporating our best understanding of uncertain facts and diverse values into public decision making. Analytic deliberation is a mode of science communication in which the communication is ongoing and involves not just information moving from the scientific community to the public, but also information moving from the public to the scientific community.

The logic of linking scientific analysis and public deliberation in an iterative process is compelling. But how well do such processes actually work? A very substantial empirical literature has examined the performance of public participation, especially as it has been applied to environmental assessment and decision making. That literature is our key source of knowledge about using analytic deliberative processes to undergird adaptive risk management and social learning around the important challenges of the 21st century. It can provide useful insights for guiding efforts at science communication, where both facts and values matter.

What Do We Know About Public Participation?

In this brief discussion, it is not possible to review the vast and complex literature on public participation thoroughly. Nor is it necessary, because the goal here is to summarize what is known about public participation as background for thinking about science communication that is intended to inform decision making. The National Research Council report entitled Public Participation in Environmental Assessment and Decision Making (PPEADM) provides a recent and thorough review of the literature on public participation in the United States (13). PPEADM argues that participatory processes have three goals: improving the quality of decisions, enhancing the legitimacy of the decision-making process, and advancing the capacity of the participants for future decision making (13). PPEADM concludes that, “When done well, public participation improves the quality and legitimacy of decisions and builds the capacity of all involved to engage in the policy process” (ref. 13, p. 226). This conclusion is based on a review of roughly 1,000 empirical studies, reports of practitioner experience, and theoretical analyses spanning the social sciences. It is by far the most extensive analysis available of what we know about public participation processes (see also ref. 60).

What do we mean by public participation that is “done well”? The report offers 15 design principles (Table 1) and guidance on how to diagnose a situation in order to apply the principles. Looking at the design principles, it is clear that there are many ways to implement public participation processes. The report emphasizes that what will work and what will fail is very sensitive to context. The conclusion that it is possible to have successful participation processes does not imply that all such processes are successful. Indeed, given the immense diversity of the kinds of processes that fall under PPEADM’s definition of public participation, it is probably not meaningful to attempt to estimate a success rate.
Rather, the report uses the extensive available literature to elucidate what leads to success and, conversely, what contributes to less-than-ideal outcomes.

Table 1. Design principles for public participation (table not reproduced here)

Good public participation practice requires iteration in which the questions addressed by the science are shaped, in part, by public concerns. It also requires a process in which science can influence the beliefs deployed in public deliberation. Science communication usually engages only facts, not values. However, science communication in the service of decision making must attend to values as well.

There are a number of careful examinations of the kind of epistemology we need for making decisions under uncertainty. They lead to several taxonomies of expertise (28, 61). For this discussion, it is useful to distinguish three types of expertise: scientific, community, and political (59, 62). Scientific expertise is knowledge grounded in the rules of science and is the “gold standard” for factual understanding. The evolving evidence indicates that linking analysis and deliberation in an iterative process is a powerful way to ensure that the best science is trusted and used in public deliberation. Community expertise is what most members of the public develop in their day-to-day lives. It can contribute to factual understanding by helping to ground abstract knowledge in the local context, a role often labeled “traditional ecological knowledge” (63, 64). In addition, however, it is expertise on what the public cares about: expertise about values. For example, in one deliberation experiment on carbon sequestration, members of the public expressed strong concerns about how effective a policy would be when implemented, an issue that scientific assessments of carbon sequestration might miss (65). Finally, political expertise is also grounded in the community rather than in scientific discourse, but it is shaped by regular interaction among those involved in politics, an engagement that is not typical of most members of the public. It is expertise not only in values but in what might work and what might not, given the stance of other political actors and the capacities of local organizations and institutions. Political expertise understands a history of trust or mistrust, and norms about how to proceed in making a decision, including norms about what is and what is not on the public agenda. It is the kind of expertise carried by members of the policy system. Several pools of literature study this kind of expertise, notably the sophisticated literature on the commons (66–68) and on the advocacy coalition framework, as well as other literature on policy networks (53, 69, 70).

For effective public participation, and for science communication intended to inform decisions, we have to meld these forms of expertise into an alloy that is better at informing decisions than any one form of expertise acting alone. Dialogue with those who carry political and community expertise can help scientists understand the constraints on decision making, the local context to which scientific analysis must be applied, and the issues of concern to those who will influence a decision and those who will be affected by it. As the pioneering report entitled Understanding Risk: Informing Decisions in a Democratic Society put it, effective linkage of analysis and deliberation helps “get the science right” and “get the right science” (55).
A process that engages multiple kinds of expertise can help build both trust in science and a more nuanced understanding of scientific uncertainty. It can help clarify which conflicts are about differences in values, which are about differences in interests, and which are about different understandings of the facts. Because it emphasizes an iterative process, it encourages careful reflection on values rather than the fast mapping of an issue onto a set of preexisting positions. The model of science communication that emerges from the public participation literature is very much communication as a multiway interaction, not communication as simply passing factual knowledge from scientists to others. This is not to say that all efforts at deliberation succeed in meeting this ideal. Rather, the point made by PPEADM is that the literature shows a much higher probability of success when communication is interactive rather than one-way.

Several challenges face deliberative processes. One is the problem of scale. Most research has focused on processes at the local to regional level, whereas many of the emerging challenges require decisions at the national or global scale, or across scales (68). Further, when we move from the local to the national and global scales, adaptive risk management must avoid the myopia of considering only one problem at a time, such as addressing climate change while ignoring poverty, or vice versa (39). Maintaining trust, especially trust in science, is a second challenge. Finding ways to deal more effectively with value differences is a third major issue.

Challenges of Scale, Trust, and Values

Scale. Although there are rich research traditions on the role of science in national environmental policy processes (53, 71), most work on public participation concerns local to regional processes. The majority of evidence about public participation at the national scale comes from policy of narrow scope, such as regulatory negotiation, where the participants are usually professionals with substantial political and scientific expertise (72). There are some experiments in the United States with processes like deliberative polling (73–75) and some national deliberative processes in other industrial nations (76). Using the Web and other interactive media for participation could allow for national or even global-scale links between analysis and deliberation. However, there is little systematic research on the strengths and weaknesses of Web-based approaches (77). Overall, we have little experience in applying the lessons from local and regional public participation processes to problems that require national, or even international, deliberation.

As we move to the national and global scales, we face a special challenge. At larger scales, it becomes increasingly artificial to consider only one issue at a time. Sophisticated analyses of climate change need to engage other changes in the biosphere. We also have to consider how climate change, and our actions to cope with it, affects and is affected by other global challenges, such as the emergence of NBIC technologies, changing global demographic and economic patterns, and the millennia-old problems of violence and poverty. Rosa et al. (39) have suggested that adaptive risk management, although not wholly adequate for cross-domain thinking, nonetheless offers insights that can get us started.

Trust. Trust is a complicated topic; there are several forms of trust, each with its own dynamics (52, 78).
Betrayal of trust is a very serious matter for most people, eliciting strong behavioral responses and reactions that can be detected even in brain functioning. For most large-scale policies, we are asking members of the public to trust large organizations (e.g., governments) and institutions (e.g., science) with which they have limited direct experience. As a result, the degree of public trust entrained by a policy proposal may be shaped largely by fast cognition and contextual cues, rather than by careful weighing of evidence.

Trust in science is presumed in most science communication. Not long ago there was considerable bipartisanship in climate concern. That bipartisanship has faded, both across the general public and among US political elites, as a result of a concerted campaign to erode trust in the science of climate change (26, 79). Raising issues of scientific uncertainty is a conscious strategy to influence societal decision making. It has played out around a number of scientific and technological issues, going back at least to 20th-century debates about the health risks of lead in gasoline and of smoking (27, 80–82). Believing that there is scientific disagreement is strongly linked to rejecting the need for policy on climate change (83). Scientific information is always, to some degree, vulnerable to concerns about uncertainty because scientists are trained to focus on uncertainty, whereas the public, at least when using shortcuts and fast cognitive processing, equates uncertainty with a lack of sufficient understanding to warrant action.

When an issue has become politicized, more information from the media and informal sources may enhance polarization through biased assimilation. This may be especially true of new media that facilitate, or even encourage, viewing sources aligned with prior positions (84). The result is that a substantial fraction of the population can hold views that are quite incongruous with the scientific consensus (85). Those who hold these views filter information based on their values and general beliefs. They may view scientific evidence contradicting their current beliefs with great skepticism. In these circumstances, people may see media coverage of science and educational materials as political rather than scientific. They are applying a political frame and see all actions, including communication, as political. This politicization does not bode well for public decision making on issues with substantial scientific content. We have not been very successful in efforts to counter ideological frames applied to science. Indeed, it is plausible that things are getting worse, in the sense that more and more domains of science are being interpreted by some segments of the public as statements about values rather than statements about facts. For example, Gauchat (86) shows that although overall trust in science in the United States has not changed much over the past 35 years, self-identified conservatives have gone from having the highest levels of trust in science to the lowest. This finding parallels McCright and Dunlap’s analysis of the evolution of polarization around climate change (26, 79, 87).

Values. PPEADM argues that effective participation enhances the capacity of those involved, both the public and the agencies. The idea that effective deliberative processes can lead to change in the individuals deliberating goes back to Dewey (19, 20) and is a key point for Habermas (21, 22).
Emergent environmental and technological challenges, including climate change and many new technologies, raise value questions that are hard to relate to the day-to-day experiences of most people. We hope linked analysis and deliberation will improve the ability of the public to handle uncertain scientific information. We also hope that deliberative processes can lead to an evolution of values in the face of emerging and highly complex issues. However, at present, we know relatively little about value change. Indeed, most standard definitions of values note that they are relatively stable over the life course (10). A substantial body of evidence from around the world indicates that two dimensions of values are nearly universal: the distinction between self-interest and altruism, and the distinction between openness to change and traditionalism (10, 88, 89). Cultures and individuals differ in the weight they give to each of these. However, both dimensions are strong predictors of risk perceptions and of concern with environmental issues (90–94). Theoretical arguments have long suggested that deliberative processes might make people more altruistic by encouraging them to see the point of view of others, and there is some evidence in support of the argument (95–97). Value change is probably a long-term process. In the short run, deliberation seems more likely to shift beliefs and to encourage participants to think through issues to which they have not previously given much thought, including issues about policy implementation, as noted above (65, 74).

People change beliefs about facts because we hold to norms that tell us beliefs should change with new evidence: a norm that comes from science. Changes in beliefs can come from awareness of scientific evidence that comes from trusted sources. Of course, beliefs based on faith rather than evidence are less amenable to change. We can hope that effective science communication, including linked analysis and deliberation, will lead to consensus on beliefs that are well aligned with science. However, for values, there is no correct position on which we can converge. Policy analysis tools, such as benefit–cost analysis, assume agreement both on the values we assign to decision outcomes and on the appropriate process for reconciling value differences to reach a decision. However, people may differ not only in what they value but in how they believe value differences should be resolved. Some people may want to use a specific logic for tradeoffs, such as benefit–cost analysis; some believe the key principle is to limit intrusive government; and still others give priority to the intrinsic worth of other species.
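A small, hypothetical example (mine, not the paper’s) shows why the rule for reconciling value differences is itself consequential: applied to the very same individual valuations, a benefit–cost sum and rules that emphasize breadth of support or protection of the worst-off can select different options.

```python
# Hypothetical valuations: the net value each of three parties assigns
# to each option, in arbitrary units.
VALUATIONS = {
    "option_a": [9, -2, -2],   # one big winner, two mild losers
    "option_b": [2, 1, 1],     # modest gains for everyone
}

def total_net_benefit(option: str) -> float:
    """Benefit-cost logic: sum all gains and losses."""
    return sum(VALUATIONS[option])

def majority_support(option: str) -> int:
    """Voting logic: count the parties that come out ahead."""
    return sum(1 for v in VALUATIONS[option] if v > 0)

def worst_off(option: str) -> float:
    """Protective logic: judge an option by the party hurt the most."""
    return min(VALUATIONS[option])

print(max(VALUATIONS, key=total_net_benefit))  # option_a (sum 5 vs 4)
print(max(VALUATIONS, key=majority_support))   # option_b (3 winners vs 1)
print(max(VALUATIONS, key=worst_off))          # option_b (nobody loses)
```

Each rule is internally coherent; choosing among them is a value judgment that no further factual research can make on our behalf.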
Science can claim a special role in informing us about what we should believe about the facts of how the world works. Science can also inform us about what people value and the decision rules they consider appropriate. But science cannot tell us what we should care about: Science has no privilege with regard to values. However, continuing research on how values influence and are influenced by decision-making processes can help us hone better processes for identifying and coping with the diversity of values engaged by complex societal decisions.

Lessons for Science Communication

The literature on public participation, juxtaposed with the literature on values and beliefs in decision making, helps us understand the difficulties of science communication in support of decision making. I offer some conjectures about the lessons that emerge from that understanding.

Acknowledge the Importance of Values. If your beliefs indicate that something of value to you will be affected by a decision, then you have an interest in that decision. Better scientific understanding might clarify the likelihood of various outcomes, perhaps solidifying your interest or perhaps reducing your concerns. Science can also clarify whose interests are harmed by a course of action and who benefits. Then, finding compromises and compensation could make some courses of action more acceptable than they would be otherwise. If there are value differences on an issue, simply clarifying the facts will not always lead to a consensus decision.

Further, there is a tendency to avoid discussion of values. I may assume that the values I hold are universal, and thus that the only reason people disagree with me is that they have different beliefs about the facts. This is a comfortable view because it means that debate can be conducted on the relatively safe ground of conflict about facts. We can avoid the much more dangerous arena in which I have to argue that your values are wrong, or consider that my values might be seen as unethical by you. When we acknowledge value differences, we are also accepting that our differences will be more difficult to reconcile than if they were based solely on different beliefs about the facts.

Our reluctance to debate values may lead us astray. It is a form of cognitive bias to think that disagreements are mostly about facts. It is a comfortable bias because it leads us to believe we can resolve disagreements with better information about facts. Certainly, we do not want decisions made on the basis of incorrect factual beliefs. We need to identify concerns that can be addressed by providing scientific information in a way that facilitates adaptive change in beliefs: social learning. However, we make a serious mistake if we assume such fact-based processes will resolve conflicts based on value differences. We are likely to be much more effective if we focus our attention on identifying value differences and designing processes that allow articulation of, and reflection on, values in light of the decisions that must be taken. The decision sciences provide many helpful tools that allow individuals and groups to clarify their values, and there is some evidence that reflection about and articulation of value positions can reduce conflict and allow a more effective search for compromises (98–101).

Use Approaches That Enhance Trust. In the long run, it is likely that the accumulation of scientific evidence will lead to shifts in beliefs, even in the face of campaigns to highlight uncertainty and to encourage views of science as politically motivated. However, delaying action on an issue like climate change has substantial consequences. What can we do to enhance trust in science? One step, fully in line with the norms of science, is to encourage open and transparent processes for reporting scientific results. Major climate assessment activities, such as the Intergovernmental Panel on Climate Change (IPCC) and the US National Climate Assessment, are making major efforts to provide traceable accounts of how they reached their conclusions (102). There is a long scientific history of sharing data and algorithms, and most journals now make open access to data a requirement for publication. These processes will continue to evolve and will gradually have positive effects on trust in science.
Linking scientific analysis to public deliberation is another key step. The public will certainly have more trust in science, and will be less paralyzed by uncertainties, if it has some input into what questions are addressed by research and is engaged with researchers from early in a process leading to a decision.

On a cautionary note, trust is not well served when scientists confuse competencies. Scientists are experts on the facts, on how the world works. Most major assessment processes, like those of the IPCC, the US National Climate Assessment, or the US National Academy of Sciences, are careful to provide conclusions and recommendations that are “policy relevant but not policy prescriptive.” They recognize that policy decisions must always involve values, not just facts, and that a scientific body is neither authorized, nor particularly well constituted, to make value judgments for the larger society. Scientists are also members of the public interested in and affected by decisions. Thus, it is natural for scientists to have strong preferences about what should be done. After all, they have often thought very carefully about the implications of decisions for things people value. As citizens, scientists certainly have a right, and perhaps even an obligation, to make value-based arguments. However, nonscientists are often unaware of the struggle to keep values from influencing scientific assessments of facts. Thus, if value-based arguments are not carefully differentiated from conclusions based on science, they can erode trust in science. Our tendency to argue about facts when values are at stake may lead us to expect that factual arguments are often value arguments in disguise, and thus make us suspicious of facts that are difficult to reconcile with our values. To maintain trust, scientists and science communicators must be very careful to clarify which statements are grounded in science and to differentiate them from statements grounded in both facts and values. When scientists make arguments about what we should do, they should make clear that their views are grounded in both their understanding of the facts and their values.

Toward Better Decisions. The challenges of global environmental change, sustainability, NBIC, and the interaction of these emerging issues with traditional human travails, such as violence and poverty, are formidable. To deal with them successfully, we need to make decisions that are competent about facts, that are competent about values, and that allow for social learning as we go forward in the face of uncertainty. There is an emerging consensus that adaptive risk management is a reasonable way to frame decision making. Further, we have reason to believe that linking scientific analysis and public deliberation so that they inform each other can enhance our competence about facts and values and allow us to learn as we proceed. Perhaps for the first time, social learning and processes for making good decisions are supported by a body of science. This science includes not only research about the biophysical and social world but also research about how to make decisions and govern ourselves and the ecosystems we affect. The test we face is to develop and deploy the science of decision making and science communication at a pace that will allow us to make sound decisions even as the scope and power of our actions transform the world around us.
Acknowledgments

D. Bidwell, A. Henry, L. Kalof, A. McCright, O. Renn, E. Rosa, and P. Stern offered many insights on these issues. M. Charters, R. Kelly, and C. Leshko improved the paper through their close reading. This work was supported by Michigan AgBio Research, by the Michigan State University Center for Systems Integration and Sustainability, and by the National Oceanic and Atmospheric Administration's Climate Program Office through the Great Lakes Integrated Sciences and Assessments Center.

Footnotes

E-mail: tdietz@msu.edu. Author contributions: T.D. wrote the paper. The author declares no conflict of interest. This paper results from the Arthur M. Sackler Colloquium of the National Academy of Sciences, “The Science of Science Communication,” held May 21–22, 2012, at the National Academy of Sciences in Washington, DC. The complete program and audio files of most presentations are available on the NAS Web site at www.nasonline.org/science-communication. This article is a PNAS Direct Submission. B.F. is a guest editor invited by the Editorial Board.

References

1. Richerson PJ, Boyd R (2005) Not by Genes Alone: How Culture Transformed Human Evolution (Univ of Chicago Press, Chicago).
2. Butzer KW, Endfield GH (2012) Critical perspectives on historical collapse. Proc Natl Acad Sci USA 109(10):3628–3631.
3. Dietz T, Rosa EA, York R (2007) Driving the human ecological footprint. Front Ecol Environ 5(1):13–18.
4. United Nations Environment Programme (2012) Global Environmental Outlook 5 (United Nations Environment Programme, New York).
5. Rosa EA, Dietz T (2012) Human drivers of national greenhouse gas emissions. Nat Clim Chang 2(8):581–586.
6. Henry AD (2009) The challenge of learning for sustainability: A prolegomenon to theory. Human Ecology Review 16(2):131–140.
7. Snyder LJ (2011) The Philosophical Breakfast Club (Broadway Books, New York).
8. Margolis H (2002) It Started with Copernicus: How Turning the World Inside Out Led to the Scientific Revolution (Univ of Chicago Press, Chicago).
9. Bowler PJ, Morus IR (2005) Making Modern Science: A Historical Survey (Univ of Chicago Press, Chicago).
10. Dietz T, Fitzgerald A, Shwom R (2005) Environmental values. Annu Rev Environ Resour 30:335–372.
11. Schwartz SH (2011) Studying values: Personal adventure, future directions. J Cross Cult Psychol 42(2):307–319.
12. Schwartz SH, Bilsky W (1987) Toward a universal psychological structure of human values. J Pers Soc Psychol 53(3):550–562.
13. US National Research Council (2008) Public Participation in Environmental Assessment and Decision Making, eds Dietz T, Stern PC (National Academy Press, Washington, DC).
14. Chambers S (2003) Deliberative democratic theory. Annual Review of Political Science 6:307–326.
15. Ryfe DM (2005) Does deliberative democracy work? Annual Review of Political Science 8:49–71.
16. Warren ME, Pearse H, eds (2008) Designing Deliberative Democracy: The British Columbia Citizens’ Assembly (Cambridge Univ Press, Cambridge, UK).
17. Dietz T (1994) ‘What should we do?’ Human ecology and collective decision making. Human Ecology Review 1(2):301–309.
18. Mansbridge JJ (1980) Beyond Adversarial Democracy (Basic Books, New York).
19. Dewey J (1886/1969) The ethics of democracy. The Early Works of John Dewey, 1882–1898, ed Boydston JA (Southern Illinois Univ Press, Carbondale, IL), Vol 1, pp 227–249.
20. Dewey J (1923) The Public and Its Problems (Henry Holt, New York).
21. Habermas J (1970) Towards a Rational Society (Beacon, Boston).
22. Habermas J (1991) Moral Consciousness and Communicative Action (Beacon, Boston).
23. Weber E (2006) Experience-based and description-based perceptions of long-term risk: Why global warming does not scare us (yet). Clim Change 77(1-2):103–120.
24. Weber E, Stern PC (2011) Public understanding of climate change in the United States. Am Psychol 66(4):315–328.
25. Pidgeon N, Fischhoff B (2011) The role of social and decision sciences in communicating uncertain climate risks. Nature Clim Change 1(1):35–41.
26. McCright AM, Dunlap RE (2010) Anti-reflexivity: The American conservative movement’s success in undermining climate science and policy. Theory, Culture and Society 27(2-3):100–133.
27. Oreskes N, Conway EM (2010) Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming (Bloomsbury, New York).
28. Rosa E (1998) Metatheoretical foundations for post-normal risk. J Risk Res 1(1):15–44.
29. Renn O (2008) Risk Governance: Coping with Uncertainty in a Complex World (Earthscan, London).
30. Arvai J, et al. (2006) Adaptive management of the global climate problem: Bridging the gap between climate research and climate policy. Clim Change 78(1):217–225.
31. Rosa EA, McCright A, Renn O (2013) The Risk Society: Social Theory and Governance (Temple Univ Press, Philadelphia).
32. US National Research Council (2011) America’s Climate Choices (National Academies Press, Washington, DC).
33. US National Research Council (2010) Advancing the Science of Climate Change (National Academies Press, Washington, DC).
34. US National Research Council (2010) Adapting to the Impacts of Climate Change (National Academies Press, Washington, DC).
35. US National Research Council (2010) Limiting the Magnitude of Climate Change (National Academies Press, Washington, DC).
36. US National Research Council (2010) Informing an Effective Response to Climate Change (National Academies Press, Washington, DC).
37. Davis JM (2007) How to assess the risks of nanotechnology: Learning from past experience. J Nanosci Nanotechnol 7(2):402–409.
38. Renn O, Klinke A (2011) Complexity, uncertainty and ambiguity in inclusive risk governance. Risk and Social Theory in Environmental Management, eds Lockie S, Measham T (CSIRO Publishing, Collingwood, VIC, Australia), pp 53–70.
39. Rosa EA, Dietz T, Moss RH, Atran S, Moser S (2012) Managing the risks of climate change and terrorism. Solutions 3(2):59–65.
40. Rosa EA, et al. (2010) Nuclear waste: Knowledge waste? Science 329(5993):762–763.
41. Armitage DR, et al. (2009) Adaptive co-management for social–ecological complexity. Front Ecol Environ 7(2):95–102.
42. Dasgupta P (2008) Discounting climate change. J Risk Uncertain 37(2-3):141–169.
43. Portney P, Weyant J, eds (1999) Discounting and Intergenerational Equity (Resources for the Future, Washington, DC).
44. Weitzman ML (2009) On modeling and interpreting the economics of catastrophic climate change. Rev Econ Stat 91(1):1–19.
45. Nordhaus W (2012) Economic policy in the face of severe tail events. Journal of Public Economic Theory 14(2):197–219.
46. Cialdini RB (2007) Influence: The Psychology of Persuasion (Harper Collins, New York), Revised Ed.
47. Kahneman D (2011) Thinking, Fast and Slow (Farrar, Straus & Giroux, New York).
48. Dietz T, Stern PC (1995) Toward a theory of choice: Socially embedded preference construction. Journal of Socio-Economics 24(2):261–279.
49. Lord CG, Ross L, Lepper MR (1979) Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. J Pers Soc Psychol 37(11):2098–2109.
50. Munro GD, et al. (2002) Biased assimilation of sociopolitical arguments: Evaluating the 1996 U.S. presidential debate. Basic and Applied Social Psychology 24(1):15–26.
51. Corner A, Whitmarsh L, Xenias D (2012) Uncertainty, scepticism and attitudes towards climate change: Biased assimilation and attitude polarisation. Clim Change 114(3-4):463–478.
52. Henry AD, Dietz T (2011) Information, networks, and the complexity of trust in commons governance. International Journal of the Commons 5(2):188–212.
53. Sabatier PA, Weible CM (2007) The advocacy coalition framework: Innovation and clarification. Theories of the Policy Process, ed Sabatier PA (Westview Press, Boulder, CO), pp 189–222.
54. Henry AD, Prałat P, Zhang C-Q (2011) Emergence of segregation in evolving social networks. Proc Natl Acad Sci USA 108(21):8605–8610.
55. US National Research Council (1996) Understanding Risk: Informing Decisions in a Democratic Society, eds Stern PC, Fineberg HC (National Academy Press, Washington, DC).
56. US National Research Council (1999) Perspectives on Biodiversity: Valuing Its Role in an Everchanging World (National Academy Press, Washington, DC).
57. US National Research Council (2007) Analysis of Global Change Assessments: Lessons Learned (National Academy Press, Washington, DC).
58. Stern PC (2005) Deliberative methods for understanding environmental systems. Bioscience 55(11):976–982.
59. Dietz T (1987) Theory and method in social impact assessment. Sociol Inq 57(1):54–69.
60. Delli Carpini MX, Cook FL, Jacobs LR (2004) Public deliberation, discursive participation, and citizen engagement: A review of the empirical literature. Annual Review of Political Science 7:315–344.
61. Collins H, Evans R (2007) Rethinking Expertise (Univ of Chicago Press, Chicago).
62. Dietz T, Pfund A (1988) An impact identification method for development program evaluation. Policy Stud Rev 8(1):137–145.
63. Berkes F (2009) Evolution of co-management: Role of knowledge generation, bridging organizations and social learning. J Environ Manage 90(5):1692–1702.
64. Berkes F, Reid WV, Wilbanks T, Capistrano D (2006) Conclusions: Bridging scales and knowledge systems. Bridging Scales and Knowledge Systems: Concepts and Applications in Ecosystem Assessment, eds Reid WV, Berkes F, Wilbanks T, Capistrano D (Island Press, Washington, DC), pp 315–331.
65. Dietz T, Stern PC, Dan A (2009) How deliberation affects stated willingness to pay for mitigation of carbon dioxide emissions: An experiment. Land Econ 85(2):329–347.
66. US National Research Council (2002) The Drama of the Commons, eds Ostrom E, et al. (National Academy Press, Washington, DC).
67. Dietz T, Ostrom E, Stern PC (2003) The struggle to govern the commons. Science 302(5652):1907–1912.
68. Ostrom E (2010) Polycentric systems for coping with collective action and global environmental change. Glob Environ Change 20(4):550–557.
69. Henry AD (2011) Ideology, power, and the structure of policy networks. Policy Stud J 39(3):361–383.
70. Henry AD, Lubell M, McCoy M (2011) Belief systems and social capital as drivers of policy network structure: The case of California regional planning. Journal of Public Administration Research and Theory 21(3):419–444.
71. Dietz T, Rycroft RW (1987) The Risk Professionals (Russell Sage Foundation, New York).
72. Langbein LI (2005) Negotiated and Conventional Rulemaking at E.P.A.: A Comparative Case Analysis (US National Research Council, Washington, DC).
73. Fishkin JS (2009) When the People Speak: Deliberative Democracy and Public Consultation (Oxford Univ Press, Oxford).
74. Hall TE, Wilson P, Newman J (2011) Evaluating the short- and long-term effects of a modified deliberative poll on Idahoans’ attitudes and civic engagement related to energy options. Journal of Public Deliberation 7(1):Article 6. Available at http://www.publicdeliberation.net/jpd/vol7/iss1/art6. Accessed February 25, 2013.
75. Mansbridge J (2012) Deliberative polling as the gold standard. The Good Society 19(1):55–62.
76. Kasemir B, Jager J, Jaeger C, Gardner MT, eds (2003) Public Participation in Sustainability Science (Cambridge Univ Press, Cambridge, UK).
77. Coleman S, Shane P, eds (2012) Connecting Democracy: Online Consultation and the Flow of Political Communication (MIT Press, Cambridge, MA).
78. Fehr E (2009) On the economics and biology of trust. J Eur Econ Assoc 7(2-3):235–266.
79. McCright AM, Dunlap RE (2011) The politicization of climate change and polarization in the American public’s views of global warming, 2001–2010. Sociol Q 52(2):155–194.
80. Kitman JL (2000) The secret history of lead. Nation 270(11):11–44.
81. McGrayne SB (2001) Leaded gasoline, safe refrigeration, and Thomas Midgley, Jr.: May 18, 1889–November 3, 1944. Prometheans in the Lab: Chemistry and the Making of the Modern World (McGraw–Hill, New York), pp 79–105.
82. Gould SJ (1991) The smoking gun of eugenics. Nat Hist (12):8–17.
83. Ding D, Maibach E, Zhao X, Roser-Renouf C, Leiserowitz A (2011) Support for climate policy and societal action are linked to perceptions about scientific agreement. Nat Clim Change 1(9):462–466.
84. O’Neill S, Boykoff M (2011) The role of new media in engaging the public with climate change. Engaging the Public with Climate Change: Communication and Behaviour Change, eds Whitmarsh L, O’Neill SJ, Lorenzoni I (Earthscan, London), pp 233–251.
85. Maibach E, Leiserowitz A, Roser-Renouf C, Mertz CK (2011) Identifying like-minded audiences for global warming public engagement campaigns: An audience segmentation analysis and tool development. PLoS ONE 6(3):e17571.
86. Gauchat G (2012) Politicization of science in the public sphere: A study of public trust in the United States, 1974 to 2010. Am Sociol Rev 77(2):167–187.
87. McCright AM, Dunlap RE (2011) Cool dudes: The denial of climate change among conservative white males in the United States. Glob Environ Change 21(4):1163–1172.
88. Vauclair C-M, Hanke K, Fischer R, Fontaine J (2011) The structure of human values at the culture level: A meta-analytical replication of Schwartz value orientations using the Rokeach Values Survey. J Cross Cult Psychol 42(2):186–205.
89. Fontaine JRJ, Poortinga YH, Delbeke L, Schwartz SH (2008) Structural equivalence of the values domain across cultures. J Cross Cult Psychol 39(4):345–365.
90. Slimak MW, Dietz T (2006) Personal values, beliefs, and ecological risk perception. Risk Anal 26(6):1689–1705.
91. Whitfield SC, Rosa EA, Dan A, Dietz T (2009) The future of nuclear power: Value orientations and risk perception. Risk Anal 29(3):425–437.
92. Dietz T, Dan A, Shwom R (2007) Support for climate change policy: Social psychological and social structural influences. Rural Sociol 72(2):185–214.
93. Stern PC, Dietz T, Abel T, Guagnano GA, Kalof L (1999) A value-belief-norm theory of support for social movements: The case of environmentalism. Human Ecology Review 6(2):81–92.
94. Schultz PW, et al. (2005) Values and their relationship to environmental concern and conservation behavior. J Cross Cult Psychol 36(4):457–475.
95. Wilson MA, Howarth RB (2002) Discourse-based valuation of ecosystem services: Establishing fair outcomes through group deliberation. Ecol Econ 41(3):431–443.
96. Howarth RB, Wilson MA (2006) A theoretical approach to deliberative valuation: Aggregation by mutual consent. Land Econ 82(1):1–16.
97. Gastil J, Bacci C, Dollinger M (2010) Is deliberation neutral? Patterns of attitude change during “The Deliberative Polls.” Journal of Public Deliberation 6(2):Article 3. Available at http://www.publicdeliberation.net/jpd/vol6/iss2/art3. Accessed February 25, 2013.
98. Keeney R, von Winterfeldt D, Eppel T (1990) Eliciting public values for complex policy decisions. Manage Sci 36(9):1011–1030.
99. Crocker J, Niiya Y, Mischkowski D (2008) Why does writing about important values reduce defensiveness? Self-affirmation and the role of positive other-directed feelings. Psychol Sci 19(7):740–747.
100. Fischhoff B (2000) Informed consent for eliciting environmental values. Environ Sci Technol 34(8):1439–1444.
101. Pelletier D, Kraak V, McCullum C, Uusitalo U, Rich R (1999) The shaping of collective values through deliberative democracy: An empirical study from New York’s North Country. Policy Sci 32(2):103–131.
102. Yohe G, Oppenheimer M (2011) Evaluation, characterization, and communication of uncertainty by the Intergovernmental Panel on Climate Change: An introductory essay. Clim Change 108(4):629–639.

Copyright © 2005-2021 Peter Burgess. All rights reserved. This material may only be used for limited low profit purposes: e.g. socio-enviro-economic performance analysis, education and training.