Thematic Guide to Integrated Assessment Modeling


 

Treatment of Uncertainty

The most basic job of integrated assessment, even more basic than the evaluation of specific response options, is the characterization of the current state of policy-relevant knowledge. Representing uncertainty is central to this task. Some assessments have used heuristic schemes to communicate uncertainties by assigning judgmental degrees of confidence to their main concluding statements. For example, the 1990 IPCC Scientific Assessment presented its main conclusions prefaced by statements such as "We are certain of the following," "We calculate with confidence that," and "Our judgment is that" (Houghton, Jenkins, and Ephraums 1991). This approach can be used in any assessment, including one integrated by less formal means than models. But in a model-based assessment, more systematic analysis and communication of uncertainty are also possible. Because the component models used in assessments are deterministic, uncertainty is normally treated through some form of meta-model analysis.

Several approaches are possible. When uncertainty is analyzed through repeated model runs with varied inputs, there is a tradeoff between the size and complexity of the basic model and the extent of uncertainty analysis that is feasible.

Specifying a set of future scenarios is one simple way of presenting uncertainty, with the scenarios selected to span a judgmentally determined range of plausible, representative futures. The difficulty of scenario analysis is that the origin and meaning of the range bounded by the scenarios cannot normally be explored precisely.

Sensitivity analysis is a more systematic method of studying uncertainty, through which the sensitivity of outputs to variation in key input parameters, or to discrete changes in assumed models or policies, can be examined. The difficulty of a sensitivity-analysis approach is that one cannot examine sensitivity to every input, and the selection of which key inputs to analyze must reflect some prior judgment.

The most comprehensive approach to uncertainty involves specifying probability distributions for many inputs and running the models many times, sampling over input values in some efficient way. The strength of this approach is that it allows simultaneous consideration of how uncertain an input value is and how sensitive important outputs are to it. Its difficulties are twofold: it requires vast quantities of data, since many inputs must be specified not as point estimates but as distributions, and the number of repeated model runs required to sample over all inputs inevitably limits the possible complexity of the basic models.

None of these approaches is particularly strong at handling the extremes of uncertainty: low-probability, potentially catastrophic events, such as major shifts in ocean circulation or large releases of methane from clathrates.
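The probabilistic approach described above can be sketched in a few lines of code. The snippet below stands in for a full integrated assessment model with a deliberately trivial emissions-to-warming relation; the functional form, parameter values, and input distributions are all illustrative assumptions, not drawn from any particular IAM. What it shows is the method itself: each uncertain input is specified as a judgmental distribution rather than a point estimate, the model is run many times with sampled inputs, and the spread of outputs is summarized.

```python
import math
import random
import statistics

# Toy stand-in for an integrated assessment model. The functional form
# and constants are hypothetical, chosen only to illustrate sampling.
def toy_model(climate_sensitivity, emissions_growth):
    cumulative_forcing = 1.0 + 40.0 * emissions_growth  # illustrative
    return climate_sensitivity * cumulative_forcing / 3.7

random.seed(0)
N = 10_000
outputs = []
for _ in range(N):
    # Sample each uncertain input from a judgmental distribution
    # instead of fixing it at a single best-guess value.
    sensitivity = random.lognormvariate(math.log(3.0), 0.4)  # deg C per doubling
    growth = random.gauss(0.015, 0.005)                      # fraction per year
    outputs.append(toy_model(sensitivity, growth))

outputs.sort()
print(f"median warming:   {statistics.median(outputs):.2f}")
print(f"5th-95th pctile:  {outputs[int(0.05 * N)]:.2f} - {outputs[int(0.95 * N)]:.2f}")
```

Even this sketch exhibits the tradeoff noted in the text: with two uncertain inputs, ten thousand runs suffice, but as the number of uncertain inputs grows, the run count needed to cover the input space climbs rapidly, which is why this approach constrains how complex the underlying model can be.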

The crucial questions in the treatment of uncertainty bear on the communication of results, which in turn depends on how the assessment is intended to be used and by whom. Propagating point estimates through cascaded sets of large, deterministic models can provide great value in advancing the understanding of modelers and forcing disciplinary experts to address the exchange of information with other parts of the system. But assessments that present the best understanding of each component piece of the analysis with limited representation of uncertainty run the risk of excess precision. If knowledge of crucial parameters or relationships is limited, the point estimates that emerge from such a process can mean little. If a major purpose of the assessment is to identify which gaps in knowledge matter most for decision-making, then there is no substitute for full propagation of uncertainty through an assessment.
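The risk of excess precision can be made concrete. In the sketch below, a hypothetical convex damage function (exponent and scale invented for illustration) is evaluated two ways: once at a single best-guess warming value, and once over a distribution of warming. Because the function is convex, the expected damages under the distribution exceed the damages at the point estimate (Jensen's inequality), so propagating only the point estimate systematically understates the answer.

```python
import random
import statistics

# Hypothetical convex damage function: damages grow nonlinearly with
# warming. Exponent and scale are illustrative assumptions only.
def damages(warming):
    return 0.01 * max(0.0, warming) ** 2.5  # clip unphysical negatives

random.seed(1)

# Point-estimate propagation: evaluate at the single "best guess".
point_estimate = damages(2.5)

# Full propagation: evaluate over a distribution of warming instead.
samples = [damages(random.gauss(2.5, 1.0)) for _ in range(50_000)]
mean_damages = statistics.mean(samples)

print(f"point-estimate damages: {point_estimate:.3f}")
print(f"expected damages:       {mean_damages:.3f}")
# For a convex function, the mean of the outputs exceeds the output at
# the mean input, so the point-estimate run understates expected damages.
```

The gap between the two numbers is exactly the information that is lost when point estimates are cascaded through deterministic models, and it is why full propagation of uncertainty matters when knowledge of crucial parameters is limited.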

 

The next section is Discounting.

 


 

Sources

Parson, E.A. and K. Fisher-Vanden, Searching for Integrated Assessment: A Preliminary Investigation of Methods, Models, and Projects in the Integrated Assessment of Global Climatic Change. Consortium for International Earth Science Information Network (CIESIN). University Center, Mich. 1995.

 

Suggested Citation

Center for International Earth Science Information Network (CIESIN). 1995. Thematic Guide to Integrated Assessment Modeling of Climate Change [online]. Palisades, NY: CIESIN. Available at http://sedac.ciesin.columbia.edu/mva/iamcc.tg/TGHP.html [accessed DATE].

 

 


Copyright © 1997–2012.
The Trustees of Columbia University in the City of New York.