Modelling and statistical methodologies are widely used to support the government's operational and policy decisions against complex backgrounds. Models inevitably represent the real world imperfectly, so good decisions based on modelling analysis require quantifying uncertainty in how the outputs of that analysis map on to the real-world phenomena they represent. Further research is therefore needed into how uncertainty in data and in the real world should be handled and incorporated within statistical and mathematical modelling and analysis.

Three recent documents (the Aqua Book, which gives guidance on producing quality analysis for government; the Blackett Review of Computational Modelling; and the newly revised Treasury Green Book) have stressed the importance of high-quality data, sound statistical analysis, and mathematical modelling in decision support. This project will carry out research to support these needs along two paths: first, by developing methodologies and tools in specific areas such as energy and education; second, by pairing these developments with more general thinking on how scientifically valid methods can be widely applied across government, regardless of sector.

Explaining the science

Uncertainty quantification

Uncertainty quantification (UQ) assesses uncertainty in the relationship between the outputs of mathematical and computer models and the real-world phenomena which they are intended to represent or predict.

At the simplest level, one might regard a model as taking numerical inputs and processing them to give numerical outputs; one could then assess how uncertainty in the values of the inputs propagates through the model to uncertainty in the values of the outputs. However, it is also necessary to consider uncertainty arising from the ways in which the mathematical assumptions represent the world imperfectly.
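The simple input-to-output view can be illustrated with Monte Carlo propagation: sample the uncertain inputs from their distributions, run the model on each sample, and summarise the spread of the outputs. The model and the distributions below are purely illustrative assumptions, not taken from any government analysis:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical model: output cost depends nonlinearly on two inputs.
def model(demand, unit_cost):
    return demand * unit_cost + 0.05 * demand ** 2

# Input uncertainty expressed as probability distributions (assumed here).
n = 100_000
demand = rng.normal(loc=100.0, scale=10.0, size=n)       # uncertain demand
unit_cost = rng.lognormal(mean=0.0, sigma=0.2, size=n)   # uncertain unit cost

# Propagate: each input sample gives one output sample.
outputs = model(demand, unit_cost)

print(f"mean output: {outputs.mean():.1f}")
print(f"90% interval: [{np.percentile(outputs, 5):.1f}, "
      f"{np.percentile(outputs, 95):.1f}]")
```

The resulting interval quantifies output uncertainty due to input uncertainty alone; it does not capture the structural uncertainty from imperfect modelling assumptions discussed above.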

The challenge of quantifying uncertainty increases further when the model is computationally expensive to run, so that its behaviour cannot be explored in detail with the limited number of runs that can be performed. We can address this challenge, in part, by using surrogate models (also known as emulators): statistical approximations to the behaviour of the full model which can be evaluated much more rapidly. Emulators also allow us to encode uncertainty in the behaviour of the model itself when only a small number of runs is possible.
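A common choice of emulator is a Gaussian process fitted to a small design of model runs. The sketch below, in which the "expensive simulator" is a stand-in toy function and the kernel settings are assumptions, shows how the emulator both predicts the model's output at untried inputs and reports its own uncertainty about that prediction:

```python
import numpy as np

# Stand-in for an expensive simulator (illustrative only).
def simulator(x):
    return np.sin(3 * x) + 0.5 * x

# A small, affordable design of training runs.
X_train = np.linspace(0.0, 2.0, 8)
y_train = simulator(X_train)

# Squared-exponential covariance, a common emulator choice
# (length-scale and variance here are assumed, not tuned).
def kernel(a, b, length=0.5, variance=1.0):
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / length) ** 2)

# Gaussian-process posterior conditioned on the training runs.
K = kernel(X_train, X_train) + 1e-8 * np.eye(len(X_train))  # jitter for stability
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))

X_new = np.linspace(0.0, 2.0, 5)
K_star = kernel(X_new, X_train)
mean = K_star @ alpha                          # emulator prediction
v = np.linalg.solve(L, K_star.T)
var = np.diag(kernel(X_new, X_new) - v.T @ v)  # predictive uncertainty
```

At the training inputs the emulator reproduces the simulator almost exactly with near-zero variance; between them the variance grows, encoding uncertainty about the unexplored model behaviour.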

Expert judgment

A further important issue is the use of expert judgment to inform model inputs where numerical data are not available. This will typically be necessary for future capital planning or policy issues, as by definition historical data cannot be directly relevant to many aspects of the future planning background. Uncertainty in such judgments, and its consequences for model results, must be assessed. A related matter is the assessment within modelling of intangibles such as quality of life, or productivity improvements from quicker journeys, whose inclusion within a quantitative framework inevitably comes with considerable uncertainty.
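One simple way to turn such judgments into model inputs is to elicit a small number of quantiles from the expert and fit a probability distribution to them. The sketch below fits a normal distribution to two elicited quantiles; the numbers, and the choice of a normal distribution, are illustrative assumptions only:

```python
import numpy as np

# Hypothetical elicitation: an expert judges a future cost to have a
# 5th percentile of 80 and a 95th percentile of 150 (illustrative numbers).
q05, q95 = 80.0, 150.0

# Fit a normal distribution to the two elicited quantiles analytically.
z95 = 1.6449  # 95th percentile of the standard normal
mu = (q05 + q95) / 2
sigma = (q95 - q05) / (2 * z95)

# Sample the elicited distribution to feed into downstream models.
rng = np.random.default_rng(1)
samples = rng.normal(mu, sigma, size=100_000)
```

These samples can then be propagated through a model exactly as for any other uncertain input, so the consequences of the expert's uncertainty for the model results can be assessed.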

Members of the project team have been involved in significant recent developments in UQ research: the recent programme on UQ at the Isaac Newton Institute, two EPSRC networks (M2D and CRUISSE), KTN initiatives, the SIAM UQ initiative, and the current workshop at the US SAMSI Institute. These all seek breakthroughs by bringing together experts in scientific programming, modelling, and statistical methodology. Important areas of modelling considered in these programmes include climate, geophysics, systems biology, energy, and bioinformatics.

Project aims

The goals of the project are:

  • To have direct impact on government practice around the selected case studies
  • To support improved modelling practice across government through wide dissemination of the methods developed, and specifically to encourage better ways of incorporating and quantifying uncertainty in government models
  • To develop all arising research code into software tools which can be used by a wide range of analysts outside the project team, in collaboration with the Turing’s Research Engineering group


The project will concentrate on three areas:

  1. Valuing uncertain future benefits and liabilities (e.g. cost-benefit analysis of major projects such as Crossrail, and accounting for significant liabilities on balance sheets such as clinical negligence claims and nuclear decommissioning)
  2. Resource planning over time, including quantification of, and uncertainty in, planning background and decision-making on investments (e.g. teacher training capacity, electric vehicle infrastructure)
  3. Propagating uncertainty through models (e.g. if there is uncertainty in the numerical inputs to a model, how does this map on to uncertainty in outputs and predictions?)


Contact info

[email protected]