Professor Roy Ruddle

Position

Turing Interest Group organiser, University of Leeds

Former position

Turing Fellow

Partner Institution

University of Leeds

Bio

Roy Ruddle is a Professor of Computing at the University of Leeds, and Deputy Director (Research Technology) of the Leeds Institute for Data Analytics (LIDA). He has worked in both academia and industry, and researches visualization, visual analytics and human-computer interaction in spaces that range from high-dimensional data to virtual reality. In a 12-year collaboration with pathologists at the Leeds Teaching Hospitals NHS Trust (LTHT), he developed the Leeds Virtual Microscope (LVM) for visualizing tera-pixel image collections on Powerwall and ultra-high-definition displays, leading to its use for pathology training in NHS hospitals and commercialisation by Roche.

The LVM has won awards for both its research (ACM TOCHI Best Paper, 2016) and its application (Yorkshire & Humber NHS Innovation Award for Medical Devices and Diagnostics, 2014). His industry-sponsored petrophysics research led to the PETMiner software, which integrates objective and subjective data in visualizations for interactive data analysis and has been commercialised by the spin-out Petriva Ltd. He currently researches novel visualization methods for profiling data, investigating data quality, and analysing complex coded data such as electronic health records and retail transactions. That research involves collaborations with NHS Digital, Leeds City Council, LTHT, the Bradford Institute for Health Research, Sainsbury's, and other organisations.

Research interests

Professor Roy Ruddle designs, develops and evaluates visual analytics methods that address two challenging but neglected topics: (a) data quality, and (b) broken workflows. Both are essential if data analysis pipelines and models are to be rigorously designed, and hence are fundamental to data science. Visual analytics puts users in the driving seat during the analysis of complex data by combining the unique power of humans for detecting and reasoning about patterns (via interactive visualization tools) with that of machines for handling scale (via sophisticated models and powerful on-the-fly computation).
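
To make that division of labour concrete, here is a minimal Python sketch (entirely synthetic data, with matplotlib chosen purely for illustration; it is not Professor Ruddle's software): the machine summarises a million events into an on-the-fly aggregate, and the human is left with a single picture to scan for clusters, outliers and trends.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

# Machine side: summarise one million synthetic events into a 2-D histogram,
# the kind of on-the-fly computation that makes large data viewable at all.
x = rng.normal(size=1_000_000)
y = 0.6 * x + rng.normal(scale=0.8, size=1_000_000)
counts, xedges, yedges = np.histogram2d(x, y, bins=100)

# Human side: an image the analyst can scan for patterns that are hard to
# specify in advance as a query.
plt.imshow(counts.T, origin="lower", aspect="auto",
           extent=[xedges[0], xedges[-1], yedges[0], yedges[-1]])
plt.xlabel("x")
plt.ylabel("y")
plt.title("On-the-fly aggregate of one million events")
plt.colorbar(label="count")
plt.show()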

Data quality spans everything from data that is missing ('completeness') to data that is invalid, inconsistent, incompatible, implausible or clearly erroneous (all types of 'correctness'). Broken workflows arise because users rarely investigate rigorously the knock-on consequences that choices made in one analysis step have on the output of the next, as the sketch below illustrates.
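
The following Python sketch (hypothetical data and plausibility rules, not taken from Professor Ruddle's tools) profiles a toy table for completeness and two kinds of correctness, then shows a broken-workflow effect: the choice of how to handle missing values in one step silently changes the statistic produced by the next.

import pandas as pd

# Hypothetical patient-style records: 'age' contains a missing value and an
# implausible value; 'sex' contains an inconsistently coded value.
records = pd.DataFrame({
    "age": [34, None, 167, 58],
    "sex": ["F", "M", "f", "M"],
})

# (a) Data quality profiling.
completeness = records.notna().mean()   # proportion of non-missing values per column
implausible_age = records["age"].notna() & ~records["age"].between(0, 120)
inconsistent_sex = ~records["sex"].isin(["F", "M"])   # 'f' breaks the agreed coding
print("Completeness per column:\n", completeness)
print("Implausible ages:", int(implausible_age.sum()))
print("Inconsistently coded sex values:", int(inconsistent_sex.sum()))

# (b) A broken-workflow effect: the choice made in this step (drop rows with
# missing ages versus impute them) changes the output of the next step
# (here, a simple mean), a knock-on consequence that is easy to overlook.
mean_if_dropped = records["age"].dropna().mean()
mean_if_imputed = records["age"].fillna(records["age"].median()).mean()
print(f"Mean age, missing rows dropped: {mean_if_dropped:.1f}")
print(f"Mean age, missing rows imputed: {mean_if_imputed:.1f}")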