Biology experiments can be expensive and time-intensive to perform, so researchers need to take great care in designing them. In this project, statistical strategies will be developed to help biologists choose which experiments to perform by exploiting existing datasets. This will decrease research costs and improve research outcomes.
Explaining the science
Developing tools to help biology researchers design experiments requires an interdisciplinary skill set. It requires computer science expertise in optimisation, operations research expertise in scheduling, statistical expertise to model confidence, and biology expertise to understand scientific objectives.
Several methods and tools are being developed, each tailored to a specific experimental design task:
- Using experiments performed at high temporal resolution to select a small set of time points on which to focus future research efforts.
- Developing techniques for utilising large-scale correlative datasets (such as those arising from citizen science research) to design expensive controlled experiments.
- Developing strategies for using new experimental data to allow researchers to more thoroughly analyse existing data from large international databases.
- Building software to help researchers schedule experiments to infer gene regulatory networks.
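To make the first task above concrete, here is a minimal sketch, not the project's actual method, of one simple way to choose informative time points: given densely sampled pilot time courses, greedily pick a small subset of time points whose linear interpolation best reconstructs the full curves. All function names and the error criterion are illustrative assumptions.

```python
def interpolate(times, values, t):
    """Piecewise-linear interpolation of (times, values) at time t."""
    if t <= times[0]:
        return values[0]
    for (t0, v0), (t1, v1) in zip(zip(times, values),
                                  zip(times[1:], values[1:])):
        if t0 <= t <= t1:
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
    return values[-1]

def reconstruction_error(all_times, series, chosen):
    """Total squared error when each pilot series is rebuilt
    from only the chosen time-point indices."""
    idx = sorted(chosen)
    err = 0.0
    for values in series:
        sub_t = [all_times[i] for i in idx]
        sub_v = [values[i] for i in idx]
        for t, v in zip(all_times, values):
            err += (interpolate(sub_t, sub_v, t) - v) ** 2
    return err

def select_time_points(all_times, series, k):
    """Greedily choose k time points that minimise reconstruction
    error. Endpoints are always kept so interpolation covers the
    full time range."""
    chosen = {0, len(all_times) - 1}
    while len(chosen) < k:
        best = min(
            (i for i in range(len(all_times)) if i not in chosen),
            key=lambda i: reconstruction_error(all_times, series,
                                               chosen | {i}))
        chosen.add(best)
    return sorted(chosen)
```

For example, a pilot time course with a sharp peak in the middle would lead the greedy procedure to concentrate the selected time points around that peak. A real tool of this kind would replace squared interpolation error with a statistically grounded criterion such as expected information gain, but the greedy structure is representative.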
Researchers in this project are also collaborating with Google Brain's genomics team.
In the first instance, the new techniques developed will be applied to plant science. Of particular interest is learning how plants integrate environmental signals to determine how much to grow, so we can predict how plants will respond to climate change.
In addition, it is hoped that the new methods will accelerate research across the life sciences, helping government and private research funding stretch further.
July 2018: Open position for Research Associate.
June 2018: EPSRC Innovation Fellowship successful.