Many scientific and industrial applications use mathematical ‘numerical methods’ for modelling dynamic systems and processes. However, there is always a difference, or error, between the real, physical system and the model’s approximation of it, and this error needs to be accurately accounted for. This project aims to use probability to better represent these errors, giving engineers the tools to produce more robust, reliable models.

Explaining the science

A range of scientific and industrial applications require the accurate modelling of dynamic systems and processes, from heart function to manufacturing equipment. Practitioners in these fields propose a model of the real system or process, consisting of mathematical equations that they believe exactly describe what is happening in reality.

However, it is often not possible to make predictions from such exact models, so ‘numerical methods’ are used instead to produce an approximate model. Numerical methods are sets of algorithms for numerical tasks, including linear algebra (the manipulation of vectors and matrices), integration, optimisation, and the solution of differential equations. The better these algorithms are at solving a numerical task, the closer the predictions of the approximate model will be to those of the exact model.
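As an illustration (not code from the project), the sketch below shows this idea for one numerical task: the differential equation dy/dt = −y with y(0) = 1 has the exact solution y(t) = exp(−t), and the explicit Euler method approximates it step by step. The more steps the method takes, the closer its prediction gets to the exact model’s.

```python
import math

def euler(f, y0, t_end, n):
    """Approximate y(t_end) for dy/dt = f(t, y) with y(0) = y0,
    using n steps of the explicit Euler method."""
    h = t_end / n
    t, y = 0.0, y0
    for _ in range(n):
        y += h * f(t, y)  # one Euler step: follow the tangent line
        t += h
    return y

# Exact model: dy/dt = -y, y(0) = 1, whose solution is y(t) = exp(-t).
exact = math.exp(-1.0)
for n in (10, 100, 1000):
    approx = euler(lambda t, y: -y, 1.0, 1.0, n)
    print(f"n = {n:5d}  error = {abs(approx - exact):.2e}")
```

Running this shows the error shrinking roughly in proportion to the step size, which is exactly the sense in which a better algorithm brings the approximate model closer to the exact one.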

One issue in practice is that the quality of these approximate models can be limited by hardware capabilities and by uncertainties in numerical calculation. Moreover, the exact difference, or error, between an exact model and its numerical approximation is usually impossible to know; instead, worst-case upper bounds on the error are often obtained. These upper bounds can be very conservative, and the errors they describe can accumulate, further reducing reliability.
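To make the conservatism of worst-case bounds concrete, here is a small illustrative example (an assumption of ours, not the project’s own code). For the explicit Euler method applied to dy/dt = −y on [0, 1], a classical worst-case bound on the global error is (M·h / 2L)(e^{LT} − 1), where L is the Lipschitz constant and M bounds |y''|; comparing it with the actual error shows the bound overstating the error by several times.

```python
import math

# Explicit Euler for dy/dt = -y, y(0) = 1 on [0, 1]; exact solution exp(-t).
def euler_error(n):
    h = 1.0 / n
    y = 1.0
    for _ in range(n):
        y += h * (-y)
    return abs(y - math.exp(-1.0))

# Classical worst-case bound on Euler's global error:
#   |error| <= (M * h / (2 * L)) * (exp(L * T) - 1)
# For this problem: Lipschitz constant L = 1, horizon T = 1, and
# M = max |y''| = 1 (since y'' = y and |y| <= 1 on [0, 1]).
def worst_case_bound(n):
    h = 1.0 / n
    return (h / 2.0) * (math.e - 1.0)

for n in (10, 100):
    print(f"n = {n:4d}  actual error = {euler_error(n):.2e}"
          f"  worst-case bound = {worst_case_bound(n):.2e}")
```

The bound is valid but several times larger than the true error; worse, bounds like this accumulate multiplicatively when one computation feeds another, which is the lack of accuracy the paragraph above refers to.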

To remedy these issues, it’s possible to use the language of statistics to produce a richer quantification of the error by using probability distributions. These distributions can model the complex size and structure of numerical errors and, if well-calibrated, can be used to monitor, propagate, and control the quality of computations and the scientific and industrial models mentioned above.
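A simple, standard example of this statistical viewpoint (chosen by us for illustration; the project’s methods are more sophisticated) is Monte Carlo integration: the central limit theorem says the numerical error is approximately Gaussian with a standard deviation that can be estimated from the samples themselves, giving a calibrated probability distribution over the error rather than a single worst-case number.

```python
import math
import random

# Monte Carlo estimate of I = integral of exp(x) over [0, 1], i.e. e - 1.
# By the central limit theorem the error is approximately N(0, sigma^2 / n),
# and sigma can be estimated from the same samples, so the method reports
# a probability distribution over its own numerical error.
random.seed(0)
n = 10_000
samples = [math.exp(random.random()) for _ in range(n)]
estimate = sum(samples) / n
sigma_hat = math.sqrt(sum((s - estimate) ** 2 for s in samples) / (n - 1))
std_error = sigma_hat / math.sqrt(n)  # spread of the error distribution

true_value = math.e - 1.0
print(f"estimate = {estimate:.4f} +/- {std_error:.4f}  (truth = {true_value:.4f})")
```

If the error model is well calibrated, the truth falls within a few standard errors of the estimate with high probability, and that calibrated uncertainty can be propagated through downstream computations, which is the role probability distributions play in the project described above.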

Project aims

The Turing’s data-centric engineering programme, run in partnership with the Lloyd’s Register Foundation, has teamed up with the Statistical and Applied Mathematical Sciences Institute (SAMSI) in North Carolina to deliver ground-breaking research into uncertainty quantification for numerical methods.

Activities are coordinated under a SAMSI ‘working group’ as part of their year-long programme on ‘Quasi Monte Carlo and High-Dimensional Sampling Methods for Applied Mathematics’. The working group consists of 22 researchers from around the world.

This working group aims to develop probabilistic numerical methods (see ‘Explaining the science’), which will provide engineers with tools to mitigate the ‘numerical risk’ associated with numerical approximations of physical models, across a range of potential applications. Specific research topics include:


  • Reference priors for the probabilistic solution of differential equations.
  • Heavy-tailed stable distributions for robust uncertainty quantification.
  • Statistical estimation with multi-resolution operator decompositions.
  • Probabilistic numerical methods as Bayesian inversion methods.


As discussed in ‘Explaining the science’, if a probabilistic method is well-calibrated, it can be used to monitor, propagate, and control the quality of computations, in a range of potential applications. These applications span multiple disciplines such as physics, mechanical and biomedical engineering, and astronomy.

One particular example of probabilistic numerical methods in action, developed by project leader Dr Chris Oates, aims to build better models of the human heart. The heart’s complex dynamical systems, regulated by electrical, chemical, and physiological variables, require intensive simulation to properly understand. Methods based on probabilistic numerics are helping to accelerate progress towards the clinical use of computational models.

The video below provides a quick overview of the project’s goals and explains some of the statistical theory underlying the work.


Recent updates


May 2018: Chris Oates, Tim Sullivan, Oksana Chkrebtii, Jon Cockayne and Han Cheng Lie presented research findings from this project at the SAMSI QMC Transition Workshop in North Carolina. Their slides from the workshop are available online.

February 2018: Turing PhD student Jon Cockayne won ‘Best Student Paper Award’ from the American Statistical Association Section on Bayesian Statistical Science. He picked up the prize for his work on Bayesian probabilistic numerical methods – co-authored by Chris Oates and Mark Girolami, director of the Turing’s data-centric engineering programme.

Research visits

As part of the project, a series of research visits have been undertaken and are planned:

  • July and August, 2017: F Schaefer to visit M Girolami and F-X Briol @ Alan Turing Institute and Imperial College London.
  • August and September, 2017: F-X Briol to visit H Owhadi, A Stuart and F Schaefer @ Caltech.
  • April 11-13, 2018: Meeting of the working group at the Alan Turing Institute, London. See the panel discussion from this meeting below.

In addition, the working group holds regular discussions; an up-to-date schedule is maintained online.



Contact info

[email protected]


SAMSI working group researchers

  • Philipp Hennig - Max Planck Institute, Tübingen
  • Houman Owhadi - California Institute of Technology
  • Florian Schaefer - California Institute of Technology
  • Andrew Stuart - California Institute of Technology
  • David Bortz - University of Colorado
  • Oksana Chkrebtii - Ohio State University
  • Vanja Dukic - University of Colorado
  • Ruituo Fan - University of North Carolina
  • Jan Hannig - University of North Carolina
  • Fred Hickernell - Illinois Institute of Technology
  • Toni Karvonen - Aalto University
  • Han Cheng Lie - Free University of Berlin
  • Jagadeeswaran Rathinavel - Illinois Institute of Technology