SigGPDE: Scaling sparse Gaussian processes on sequential data

Abstract

Making predictions and quantifying their uncertainty when the input data is sequential is a fundamental learning challenge, recently attracting increasing attention. We develop SigGPDE, a new scalable sparse variational inference framework for Gaussian Processes (GPs) on sequential data. Our contribution is twofold. First, we construct inducing variables underpinning the sparse approximation so that the resulting evidence lower bound (ELBO) does not require any matrix inversion. Second, we show that the gradients of the GP signature kernel are solutions of a hyperbolic partial differential equation (PDE). This theoretical insight allows us to build an efficient back-propagation algorithm to optimize the ELBO. We showcase the significant computational gains of SigGPDE compared to existing methods, while achieving state-of-the-art performance for classification tasks on large datasets of up to 1 million multivariate time series.
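The hyperbolic PDE mentioned in the abstract builds on the known result that the signature kernel k(s, t) of two paths x and y itself solves the Goursat equation ∂²k/∂s∂t = ⟨dx/ds, dy/dt⟩ k with boundary condition k = 1 on the axes. As a rough illustration of how such a kernel can be evaluated numerically, here is a minimal first-order finite-difference sketch; the function name, discretization scheme, and step sizes are our own assumptions, not the authors' implementation:

```python
import numpy as np

def signature_kernel(x, y):
    """Approximate the signature kernel k(x, y) by finite-difference
    integration of the Goursat PDE
        d^2 k / ds dt = <dx/ds, dy/dt> * k,   k(0, .) = k(., 0) = 1.
    x, y: discretized paths, arrays of shape (len_x, d) and (len_y, d).
    (Illustrative sketch only; SigGPDE uses its own solver.)"""
    dx = np.diff(x, axis=0)          # path increments of x, shape (m, d)
    dy = np.diff(y, axis=0)          # path increments of y, shape (n, d)
    inner = dx @ dy.T                # grid of inner products <dx_i, dy_j>
    m, n = inner.shape
    k = np.ones((m + 1, n + 1))      # boundary condition: k = 1 on the axes
    for i in range(m):
        for j in range(n):
            # explicit first-order update for the hyperbolic PDE
            k[i + 1, j + 1] = (k[i + 1, j] + k[i, j + 1]
                               - k[i, j] + inner[i, j] * k[i, j])
    return k[-1, -1]
```

For two identical linear 1-d paths on [0, 1] the exact value is the modified Bessel function I₀(2) ≈ 2.28, which the scheme approaches as the grid is refined. SigGPDE's contribution extends this picture: the *gradients* of the kernel also solve a hyperbolic PDE, enabling efficient back-propagation through the ELBO.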

Citation information

Lemercier M., Salvi C., Cass T., Bonilla E. V., Damoulas T., Lyons T. (2021). SigGPDE: Scaling Sparse Gaussian Processes on Sequential Data. arXiv:2105.04211.
