Description

Climate change is one of the largest issues that will determine our future. There is considerable uncertainty as to the nature and extent of the changes that may occur, and the ways in which such changes will be influenced by human behaviour, for example through levels of carbon dioxide emissions. One of the major sources of information for assessing such uncertainty is the analysis of computer simulators built on mathematical models of global climate, which solve equations describing the evolution of climate over time. There are many approximations and uncertainties involved in matching the output of such simulators to the actual climate. For example, evaluating a climate model requires the specification of a collection of input parameters whose true values are unknown. Evaluating the model at any individual choice of input parameters takes a long time (from days to months), so there are practical difficulties in identifying sensible choices of input parameters, quite apart from all of the other uncertainties involved in matching the computer model to the real climate system.

Members of the Durham Mathematics Department are part of a consortium (with the University of Reading, the National Oceanography Centre, Southampton, the Met Office, Imperial College, the British Antarctic Survey, and ClimatePrediction.net at Oxford) concerned with issues related to rapid climate change, and in particular with the likelihood and potential impact of a collapse of the Meridional Overturning Circulation, a major mechanism of heat transport around the world. As members of this consortium, we have access to large ensembles of runs of models developed by the Met Office, where each run is made under a different choice of input parameters and a different scenario for future carbon dioxide emissions.

This project will be concerned with uncertainty analysis for large computer models in climate science, and in particular with the practical issues raised when analysing an ensemble of model runs of the above form. A basic tool that we use is the statistical modelling of the relationship between the inputs and the outputs of the computer simulator by means of a statistical emulator. The project therefore draws on methods described in 2H Statistical Concepts and 3H Statistical Methods.

Prerequisites

Statistical Concepts II and Statistical Methods III
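To give a concrete feel for the emulation idea mentioned above, the sketch below fits a simple Gaussian-process emulator to a handful of runs of a toy "simulator". Everything here is illustrative: the simulator function, the kernel and its parameters, and the design points are hypothetical stand-ins, not the Met Office models or the particular methods used in the project.

    # Minimal sketch of a Gaussian-process emulator (illustrative only).
    # The "simulator" below is a cheap stand-in for an expensive climate
    # model; the kernel, its parameters and the design are assumptions.
    import numpy as np

    def simulator(x):
        # Hypothetical toy simulator: one input, one output.
        return np.sin(3.0 * x) + 0.5 * x

    def rbf_kernel(a, b, length=0.5, variance=1.0):
        # Squared-exponential covariance between two sets of inputs.
        d = a[:, None] - b[None, :]
        return variance * np.exp(-0.5 * (d / length) ** 2)

    # A small design: input settings at which we can afford to run the
    # simulator (standing in for an ensemble of model runs).
    x_design = np.linspace(0.0, 2.0, 8)
    y_design = simulator(x_design)

    # Condition the Gaussian process on the design runs.
    K = rbf_kernel(x_design, x_design) + 1e-8 * np.eye(len(x_design))
    x_new = np.linspace(0.0, 2.0, 100)          # untried input settings
    K_star = rbf_kernel(x_new, x_design)

    # Posterior mean and variance of the emulator at the new inputs.
    alpha = np.linalg.solve(K, y_design)
    post_mean = K_star @ alpha
    v = np.linalg.solve(K, K_star.T)
    post_var = rbf_kernel(x_new, x_new).diagonal() - np.sum(K_star.T * v, axis=0)

    # The emulator predicts the simulator output, with uncertainty,
    # at inputs where the simulator was never actually run.
    print(post_mean[:3], np.sqrt(np.maximum(post_var[:3], 0.0)))

The point is that, once conditioned on a small ensemble of expensive runs, the emulator predicts the simulator's output, together with a statement of uncertainty, at input settings that were never run; this is what makes uncertainty analysis feasible for models that take days or months per evaluation.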
Resources

An excellent web-site describing the types of analyses to which this project gives an introduction is that of the Managing Uncertainty in Complex Models (MUCM) project, another consortium in which we are involved (with the Universities of Sheffield, Aston, LSE and Southampton). There are a great many links to follow at that site. One in particular, which gives an introduction to emulation, is:

O'Hagan, A. (2006). Bayesian analysis of computer code outputs: a tutorial. Reliability Engineering and System Safety 91, 1290–1300.

At that site you will also find links to PDFs of papers which relate the general MUCM methods to the problems of climate change, for example:

Challenor, P.G., Hankin, R.K.S. and Marsh, R. (2006). Towards the probability of rapid climate change. In Avoiding Dangerous Climate Change, 55–63. Eds Schellnhuber, H.J., Cramer, W., Nakicenovic, N., Wigley, T. and Yohe, G. Cambridge University Press. [This paper shows how MUCM methods can be applied to a real problem with substantial uncertainty and important policy implications.]

More details are given on the web-site of the Rapid Climate Change project.
email: Michael Goldstein