Information-Theoretic Foundations and Algorithms for Large-Scale Data Inference

British Library, 17th December 2015

Organisers: Graham Cormode, Mike Davies, Miguel Rodrigues, Jared Tanner, Ramji Venkataramanan

Summary: Information theory, in interaction with mathematics, statistics, and computer science, reveals fundamental limits (i.e., what can and cannot be computed) in areas such as data communication, data compression, data security, and networking. The techniques used to establish these fundamental limits also guide the design of practical algorithms in these areas.

This workshop aims to explore the role of information theory and associated disciplines in massive data inference problems. In particular, it will consider how information-theoretic tools can be used to understand fundamental limits in large-scale data processing as well as help design optimal or nearly-optimal algorithms for large-scale data inference. It will bring together experts from the UK and beyond in information theory, signal processing, machine learning, statistics, and mathematics. The workshop will include three keynotes, short technical talks, brainstorming sessions, and a panel.

The workshop will produce recommendations for the ATI, including:

1) Key research directions in information theory and algorithms for large-scale data inference (including a list of the most pressing and promising problems);

2) Identification of emerging problems in fields such as machine learning, statistics, and computer vision, and in domains such as business and finance, bioinformatics, and medical imaging, that may benefit from information-theoretic ideas;

3) Identification of education, training, and exchange programmes in information theory and algorithms for large-scale data inference;

4) Connections to key groups within the UK and abroad that are engaged in information theory and algorithms applied to massive data processing.