Mathematics of Machine Learning - Summer School

This programme will equip researchers with the required tools to fully engage with modern literature on the theoretical foundations of machine learning.

Course overview

This course aims to provide a rigorous theoretical account of the main ideas in machine learning, with an emphasis on supervised learning. It will teach the main mathematical techniques used in the theory of machine learning, and cover numerical examples and exercises on first-order methods applied to statistical learning problems. These resources were delivered as part of the Mathematics of Machine Learning Summer School in 2021.

Who is this course for?

Learners should already have completed introductory courses in probability theory and linear algebra, covering the basic inequalities listed in the Required Knowledge overview.

Learning outcomes

By the end of this course, learners will be able to:

  • Explain the basic formulation of machine learning as a stochastic minimization problem, and describe what it means to solve the problem optimally
  • Use fundamental tools in non-asymptotic methods for the study of random structures in high-dimensional probability, statistics, and optimization
  • Use Python (TensorFlow) to quickly set up and run numerical experiments in Jupyter notebooks on Colab
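To give a flavour of the kind of numerical experiment the course has in mind, here is a minimal sketch of gradient descent on an empirical least-squares risk (the topic of Lectures 5 and 8). This is not course-provided code: the course uses TensorFlow on Colab, while this illustration uses only NumPy so that it is self-contained, and all names and parameter values are illustrative.

```python
import numpy as np

# Illustrative sketch: minimize the empirical least-squares risk
#   R_n(w) = (1/2n) * ||X w - y||^2
# by gradient descent. Not from the course materials.

rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.standard_normal((n, d))          # design matrix
w_true = rng.standard_normal(d)          # ground-truth parameter
y = X @ w_true + 0.1 * rng.standard_normal(n)  # noisy labels

w = np.zeros(d)
step = 0.1  # constant step size, assumed small enough for the risk's smoothness
for _ in range(500):
    grad = X.T @ (X @ w - y) / n  # gradient of the empirical risk at w
    w -= step * grad

# The iterate should recover w_true up to the noise level.
print(np.linalg.norm(w - w_true))
```

Running such a loop while logging the empirical risk per iteration is the kind of experiment the exercises pair with the theory: the convergence rate observed numerically can be compared against the bounds proved in lecture.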

Details

Module 1
Lecture 1: Introduction
Lecture 2: Concentration Inequalities; Bounds in Probability

Module 2
Lecture 3: Bernstein’s Concentration Inequalities; Fast Rates
Lecture 4: Maximal Inequalities and Rademacher Complexity

Module 3
Lecture 5: Convex Loss Surrogates; Gradient Descent
Lecture 6: Mirror Descent

Module 4
Lecture 7: Stochastic Methods; Algorithmic Stability
Lecture 8: Least Squares; Implicit Bias and Regularization

Module 5
Lecture 9: High-Dimensional Statistics; Gaussian Complexity
Lecture 10: The Lasso Estimator; Proximal Gradient Methods

Instructors