12 September 2019: The Bahamas is reeling from the loss of life and wholesale destruction of critical infrastructure caused by Hurricane Dorian – the most powerful storm to strike the islands since records began – which battered the region for about 68 hours and created a storm surge of 6 metres above sea level. The Caribbean Disaster Emergency Management Agency estimates that more than 15,000 people are currently in need of help on the islands.
A team led by Turing Fellow Steven Reece at the University of Oxford has deployed cutting-edge technology to support disaster-response agencies in dealing with Dorian’s immediate aftermath. This Impact Story reveals how this unique technology reached maturity.
In a warming world, extreme weather is driving increasingly frequent and devastating storms and floods, while rising temperatures increase the severity of wildfires. Earthquakes, while showing no signs of increasing in frequency, also cause catastrophic damage. Whatever the natural disaster, responding to it is inevitably a fraught business: lives are on the line, the clock is ticking, resources are limited, and knowledge of what’s happening on the ground is hard to come by, particularly in poorer nations.
What can be gathered quickly, however, is satellite imagery. Entire disaster-struck regions can be imaged in a matter of hours, cloud cover permitting. If this vast amount of data could quickly be compared with ‘before’ images of the same area, it could be turned into something immensely useful for aid agencies and disaster responders on the ground.
Harnessing the power of thousands of human volunteers to wade through this image data – labelling things like damaged buildings, flooded areas and blocked roads – can provide rapid insights to emergency agencies. What’s more, if the quality of such crowd-sourced data can be boosted with machine learning, and ultimately used to train neural networks to label the satellite images automatically, we’d see a data-driven revolution in disaster response.
That’s where Reece and his colleagues come in. In collaboration with the Zooniverse citizen science crowd-sourcing platform and Rescue Global, his team has developed a potent mix of crowd-sourcing, cutting-edge machine learning and neural networks to rapidly supply crucial intelligence to rescue organisations during natural disasters – helping them to allocate resources and potentially save many lives. The collaboration calls itself the Planetary Response Network (PRN).
When disaster strikes
When a natural disaster strikes, the researchers use Zooniverse to show online volunteers high-resolution satellite images of before and after the event. These people rapidly mark where detrimental changes have occurred. In the aftermath of an earthquake in Ecuador in 2016, for example, more than 2,000 Zooniverse volunteers helped to analyse about 25,000 square kilometres of satellite imagery in just 12 hours.
A bigger test came in 2017, when hurricanes Irma and Maria devastated the Caribbean in quick succession. In their wake, “we were told by the Zooniverse team that roughly 300,000 classifications from 7,500 people had taken place through the platform [in just three days]”, said Rescue Global Project Director Rebekah Yore at the time. “This extraordinary effort is the equivalent to the output of one person working full-time for just over a year.” These classifications of imagery of the devastated islands covered an area of more than 11,000 square kilometres – 10 islands of the archipelago.
“You get lots of people working on this very rapidly, but they tend to come back with slightly different answers. For example, some people are better than others at recognising damage,” says Reece, who is also a Group Leader of the Data-centric Engineering programme at the Turing. So how does the team deal with these noisy, inconsistent human data? “We use machine learning to identify the ‘consensus labels’ – which data are the most accurate – using the wisdom of the crowd.” The team does this with an algorithm called ‘Bayesian classifier combination’ (BCC), developed by PhD student Edwin Simpson at the University of Oxford and then, in collaboration with Reece, extended to generate ‘heatmaps’ of the areas where damage is most serious – and therefore where emergency resources are most urgently required.
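The consensus-label idea can be illustrated with a toy sketch. The code below is not the team’s BCC implementation – it is a simplified, Dawid-Skene-style estimate in plain Python, and the tile and volunteer names are invented – but it shows the principle: alternately estimate each volunteer’s reliability from the current consensus, then re-weight their votes accordingly, rather than simply averaging.

```python
# Toy sketch of crowd "consensus labels": all names and data are hypothetical.
# Labels: 1 = volunteer marked the image tile as damaged, 0 = undamaged.

def consensus_labels(annotations, n_iters=20):
    """annotations: tile_id -> {volunteer: label}. Returns tile_id -> P(damaged)."""
    volunteers = {v for labels in annotations.values() for v in labels}
    # Start from a per-tile majority vote.
    prob = {t: sum(labels.values()) / len(labels)
            for t, labels in annotations.items()}
    acc = {v: 0.8 for v in volunteers}  # initial guess at each volunteer's accuracy
    for _ in range(n_iters):
        # Re-estimate each volunteer's accuracy against the current consensus.
        for v in volunteers:
            agree = total = 0.0
            for t, labels in annotations.items():
                if v in labels:
                    agree += prob[t] if labels[v] == 1 else 1 - prob[t]
                    total += 1
            acc[v] = agree / total
        # Re-estimate the consensus, weighting each vote by the voter's accuracy.
        for t, labels in annotations.items():
            p1 = p0 = 0.5  # uniform prior: damaged vs undamaged
            for v, label in labels.items():
                a = min(max(acc[v], 1e-6), 1 - 1e-6)  # avoid 0/1 degeneracy
                p1 *= a if label == 1 else 1 - a
                p0 *= 1 - a if label == 1 else a
            prob[t] = p1 / (p1 + p0)
    return prob

# Volunteer "cam" systematically disagrees with "ann" and "ben", so the model
# learns to discount cam's votes instead of letting them dilute the consensus.
votes = {
    "tile_a": {"ann": 1, "ben": 1, "cam": 0},
    "tile_b": {"ann": 0, "ben": 0, "cam": 1},
    "tile_c": {"ann": 1, "ben": 1, "cam": 0},
}
p = consensus_labels(votes)  # tile_a and tile_c near 1, tile_b near 0
```

In a real deployment the “tiles” would be patches of before-and-after satellite imagery and there would be thousands of volunteers, but the reliability-weighting loop is the core of how noisy crowd data becomes a usable damage map.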
Rapidly creating such heatmaps can prove extremely helpful: immediately following Irma and Maria, the PRN quickly passed such heatmaps to the United Nations, the US Federal Emergency Management Agency (FEMA), and over 60 NGOs. Hannah Pathak, who was Deputy CEO of Rescue Global at the time of the disaster, said of the heatmaps: “Before [rescue] work began in Dominica, an island that sustained 97% infrastructure damage, the PRN heat maps gave the Rescue Global team an indication of the extent of damage to the ports and airports, as well as road networks.”
“In addition to supplying an NGO with satellite communications on St Thomas island,” said Yore, the Rescue Global team “also evacuated a small number of patients with critical healthcare needs, including a pregnant lady, to San Juan. Both missions were aided by the heat maps.”
Cranking up the AI
Since this successful deployment of the PRN’s technology, Reece and his team have taken their system to another level by adding a neural network into the mix, creating an augmented system called BCCNet. While the human volunteers are rapidly labelling the changes in the satellite imagery, the neural network is busy learning how to do the job, training itself on the most accurate human data. Before long, the neural net is as good as the cream of the crowd. “We can use the trained neural net to label the images of entire regions – even entire countries – automatically,” says Reece.
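The training recipe described above can be sketched in miniature. The snippet below is not the team’s code: a single logistic “neuron” stands in for the real neural network, and the tile features, names and confidence threshold are invented. What it illustrates is the BCCNet idea of keeping only the tiles whose crowd consensus is confident, and training the automatic classifier on those.

```python
import math

def train_on_confident_labels(features, consensus, threshold=0.9,
                              epochs=200, lr=0.5):
    """features: tile_id -> feature vector; consensus: tile_id -> P(damaged).
    Trains a tiny logistic model on confidently-labelled tiles only."""
    data = [(features[t], 1 if p >= threshold else 0)
            for t, p in consensus.items()
            if p >= threshold or p <= 1 - threshold]  # drop ambiguous tiles
    dim = len(next(iter(features.values())))
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        for x, y in data:  # plain stochastic gradient descent
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            g = 1 / (1 + math.exp(-z)) - y  # prediction error
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    def classify(x):
        z = sum(wi * xi for wi, xi in zip(w, x)) + b
        return 1 / (1 + math.exp(-z))
    return classify

# Hypothetical one-feature tiles (e.g. fraction of pixels that changed);
# "tile_c" has an ambiguous crowd consensus, so it is excluded from training.
feats = {"tile_a": [0.9], "tile_b": [0.1], "tile_c": [0.5]}
crowd = {"tile_a": 0.99, "tile_b": 0.01, "tile_c": 0.6}
classify = train_on_confident_labels(feats, crowd)
```

Once trained, `classify` can be run over every tile in a region – including the vast majority the crowd never saw – which is what lets the system scale from volunteer throughput to entire countries.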
This marriage of human understanding of imagery with machine learning takes the best of human capability and rapidly multiplies its power and impact, providing wide-area situational awareness to where it is needed the most, and fast.
At the Neural Information Processing Systems conference in December 2018, the team’s paper picked up the best paper award at a workshop entitled 'Machine Learning for the Developing World'. That work was funded by the UK Space Agency’s International Partnerships Programme, a Google Impact Challenge award, and the Turing’s Data-centric Engineering programme, itself supported by Lloyd’s Register Foundation.
The neural net is currently being folded into the system, says Reece. “We haven't tested the neural net in anger yet – which is to say, in a real disaster situation. We've tested it on historical data from the 2017 Caribbean hurricanes,” he says, and he expects it to be ready to go live later this year.
The PRN deployed some of its suite of techniques and technology on before-and-after satellite imagery of the Bahamas in the wake of Hurricane Dorian. “We are now repeating the damage assessment work we did for Irma and Maria on the Bahamas,” says Reece. “We've mapped Green Turtle Cay, and this intelligence was delivered to Rescue Global and 24 Commando Royal Engineers [the British Army’s military engineers]. A further two sets of crowd labels from Zooniverse are imminent, one covering Freeport and the other Marsh Harbour. In part, this is to see if the ports are free of large debris and can therefore support aid delivery.”
24 Commando is embedded with Reece’s team, working to understand the technology and also helping the PRN to understand the information requirements of emergency responders.
[Image: Hurricane Dorian. Credit: NOAA]
What does the future hold?
When not focused on an emergency deployment such as Hurricane Dorian, the team is also improving the PRN’s Zooniverse interface, so that human volunteers have a smoother, more efficient experience when examining satellite imagery. These improvements are funded through an Impact Acceleration Account awarded by the Engineering and Physical Sciences Research Council. “In operational situations, we’ve ended up doing a lot of manual work that could be automated,” Reece says, so streamlining the crowd-sourcing side is important. The interface is nearing completion and is now being evaluated by Rescue Global.
Meanwhile, an additional focus is on using BCCNet to help the Brazilian National Audit Office (TCU) identify and monitor tailings dams. These are dams created by mining organisations to provide a body of water in which to dump the typically highly toxic by-products of mining. They are an environmental nightmare in their own right, but if they collapse, releasing their toxic payload, they are capable of killing off entire rivers. So far BCCNet has identified three dams previously unknown to the TCU. The team’s collaborator at the TCU is visiting the University of Oxford for five months from September 2019, funded entirely by the Brazilian government, to develop this project.
It is clear that the impact of the BCCNet algorithm will go way beyond disaster response. Virtually anything that humans can usefully discover from imagery or sound can ultimately be automated by the system. And to boost uptake of this technology, it has been important to the Turing that the code for the system be open source. When something has this much potential benefit to people and the planet, it would be wrong to keep it under wraps.