Jon Crowcroft has been the Marconi Professor of Communications Systems in the Computer Laboratory since October 2001. He has worked in the area of Internet support for multimedia communications for over 30 years. Three main topics of interest have been scalable multicast routing, practical approaches to traffic management, and the design of deployable end-to-end protocols. Current active research areas are Opportunistic Communications, Social Networks, and techniques and algorithms to scale infrastructure-free systems. He leans towards a “build and learn” paradigm for research.
He graduated in Physics from Trinity College, University of Cambridge in 1979, gained an MSc in Computing in 1981 and a PhD in 1993, both from UCL. He is a Fellow of the Royal Society, a Fellow of the ACM, a Fellow of the British Computer Society, a Fellow of the IET, a Fellow of the Royal Academy of Engineering, and a Fellow of the IEEE.
He likes teaching, and has published a few books based on learning materials.
Researcher at Large role
Jon spent the last two years as chair of the Institute's Programme Committee, where a significant ongoing task has been mapping out strategy for the Turing. As the Turing continues to grow, Jon will continue this task in his new role as 'Researcher at Large'. With a remit to range over the wider set of activities at the Turing, he will be identifying what the Institute does well and uncovering gaps where it needs to do more. The role will also involve helping with the Strategic Priorities Fund cross-cutting theme on tools, practices and systems.
Computing systems at scale are the basis for much of the excitement over data science, but many challenges remain: not only handling ever larger amounts of data, but also providing tools and techniques, implemented in robust software, that statisticians and machine learning experts can use without themselves having to become experts in cloud computing. The current vision of distributed computing only really works for “embarrassingly parallel” scenarios. The challenge for the research community is to build systems that support more complex models and algorithms which do not partition so easily into independent chunks, and that deliver answers in near real-time, efficiently, on a data centre of a given size.
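The “embarrassingly parallel” case mentioned above can be sketched in a few lines of Python: each data partition is processed independently, with no communication between workers, so a simple scatter/gather suffices. The partitions and the per-chunk function here are purely illustrative.

```python
from multiprocessing import Pool


def summarise(chunk):
    # Each chunk is processed on its own, with no communication
    # between workers -- this independence is what makes the
    # workload "embarrassingly parallel".
    return sum(chunk) / len(chunk)


if __name__ == "__main__":
    # Hypothetical dataset, pre-split into independent partitions.
    partitions = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]

    # Scatter the partitions across worker processes, then gather results.
    with Pool(processes=3) as pool:
        means = pool.map(summarise, partitions)

    print(means)  # [2.0, 5.0, 8.0]
```

Models that require communication between partitions at each step (for example, iterative graph algorithms or tightly coupled optimisation) cannot be decomposed this cleanly, which is exactly the harder class of workload described above.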
Users want to integrate different tools (for example, R on Spark); they do not want to have to program for fault tolerance, yet as their tasks and data grow they will have to manage it; meanwhile, data science workloads do not resemble traditional computer science batch or single-user interactive models. These systems place novel requirements on data centre networking, operating systems, storage systems, databases, and programming languages and runtimes. As a communications systems researcher of 30 years, I am also interested in specific areas that involve networks, whether as technologies (the Internet, transportation, etc.), as observed phenomena (social media), or in the abstract (graphs).