Introduction
Evidence is mounting of declines in wildlife across the globe, giving stark warnings about the perilous state of biodiversity. We critically lack data for insects despite the vital role they play in all ecosystems (for example, as food for birds and mammals, recycling nutrients, pollinating crops) and as indicators of climate change impacts; closing this knowledge gap for ‘the little things that run the world’ has never been more urgent. We will develop and test automated sensors, deep learning, bioacoustics and computer vision to help deliver more standardised monitoring of insects, bats and birds worldwide.
Explaining the science
Research has shown that insect populations around the world are in sharp decline. For example, a reported 75% decline in insect biomass in Germany led the media to raise concerns of an ‘Insect Armageddon’. Understanding trends in species and communities across space and time, and the factors driving these changes, is key to tackling current challenges. To develop this knowledge, we need robust monitoring methods that minimise bias and the invasiveness of data collection and maximise the quantity and quality of data collected, both spatially and temporally, and we need fast, efficient methods for analysing the vast amounts of data collected. The Automated Monitoring of Insects (AMI) trap provides a practical and cost-effective solution for standardised monitoring that is unbiased, non-invasive and able to operate at wide spatial and temporal scales.
Developments in data science are required to realise the potential of the new data streams generated by automated sensors: for example, machine learning models that locate insects in an image and classify them to species.
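As an illustration of that localisation step, the sketch below runs a generic pre-trained object detector over a trap image and saves each confident detection as a crop. The detector, file names and confidence threshold are placeholders, not the AMI trap's actual model.

```python
# Illustrative localisation step: find objects in a trap image and save
# crops for downstream classification. The COCO-pretrained detector and
# the 0.5 confidence threshold are stand-ins for the project's own model.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

image = Image.open("trap_image.jpg").convert("RGB")  # hypothetical input image
with torch.no_grad():
    prediction = model([to_tensor(image)])[0]

for i, (box, score) in enumerate(zip(prediction["boxes"], prediction["scores"])):
    if score < 0.5:  # assumed confidence threshold
        continue
    x1, y1, x2, y2 = (int(v) for v in box.tolist())
    image.crop((x1, y1, x2, y2)).save(f"crop_{i:03d}.jpg")
```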
The integration of AI in AMBER begins with curating the training data, collecting images from AMI devices and citizen science sources with the aim of downloading at least 1,000 images per species. The AMI system first detects objects, locating individual objects within images using pre-trained models. It then determines whether each detected object is a moth or a non-moth, and finally classifies the species of the moths. The moth identification and species identification models are trained on supercomputers and then compressed to run on the AMI devices' onboard Raspberry Pi.
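The following sketch shows, under assumed model files and class conventions, how that two-stage cascade (moth/non-moth filter, then species classifier) could be applied to one of the detected crops; it is not the project's released code.

```python
# A minimal sketch of the moth/non-moth filter followed by species
# classification. Model files, the class-index convention and the species
# list are illustrative assumptions.
import torch
from torchvision import transforms
from PIL import Image

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

binary_model = torch.jit.load("moth_nonmoth.pt").eval()    # hypothetical file
species_model = torch.jit.load("moth_species.pt").eval()   # hypothetical file
species_names = ["Noctua pronuba", "Autographa gamma"]     # placeholder catalogue

def classify_crop(path: str) -> str:
    x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        # Stage 1: discard crops the binary model judges not to be moths
        # (assumes class index 1 means "moth").
        if torch.softmax(binary_model(x), dim=1)[0, 1] < 0.5:
            return "non-moth"
        # Stage 2: assign the most likely species label.
        return species_names[species_model(x).argmax(dim=1).item()]

print(classify_crop("crop_000.jpg"))
```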
The role of AI in AMBER
On our High Performance Computing (HPC) resources, we create a comprehensive species catalogue tailored to our target regions by leveraging labelled images sourced from GBIF. Initially, we employ a pre-trained model to detect insects, facilitating the creation of a labelled set of crop images. These crop images then serve as the foundation for training a specialised moth detection model that distinguishes moths from non-moth insects. Following this, we train a species classifier model for efficient and precise identification of moth species. To enhance compatibility and efficiency, we compress both the moth detection and species classification models into TFLite format for deployment on the Raspberry Pi units integrated into the AMI systems.
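A hedged sketch of that final compression step is shown below; the SavedModel directory name and the use of default post-training quantisation are assumptions, not the project's published conversion settings.

```python
# Sketch of compressing a trained classifier to TFLite for the Raspberry Pi.
# "species_classifier/" is a placeholder SavedModel directory and the
# quantisation settings are illustrative.
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model("species_classifier/")
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # post-training quantisation
tflite_model = converter.convert()

with open("species_classifier.tflite", "wb") as f:
    f.write(tflite_model)

# On the device, the compressed model is loaded with the lightweight
# TFLite interpreter instead of full TensorFlow.
interpreter = tf.lite.Interpreter(model_path="species_classifier.tflite")
interpreter.allocate_tensors()
```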
Project aims
The project aims to deliver:
- An engaging guide on the importance of biodiversity, how to monitor it using novel technologies and how individuals can contribute (e.g. submitting observations of wildlife to iNaturalist or supporting wildlife-positive actions). The guide will be translated into local languages for all four demonstration countries.
- A network of 40 monitoring systems deployed and delivering biodiversity data in four demonstration regions in countries spread across the globe. The proposed locations are Uganda, Japan, Singapore and Panama, to be finalised in consultation with the abrdn Charitable Foundation.
- Four teams of local co-ordinators in place in demonstration areas, supporting deployment of sensors and development of citizen science participation for submitting observations of wildlife.
- Primary biodiversity data captured and made openly available from automated systems, comprising more than 1 million images of insects, more than 1 million minutes of bird recordings (audible sound) and more than 2 million ultrasound recordings (primarily for bats).
- More than 20,000 species occurrences shared with the Global Biodiversity Information Facility (https://www.gbif.org/) through AI processing of images and sounds collected from automated systems in biodiversity hotspots.
- A camera system and image processing system that can operate in remote locations with limited communication bandwidth and minimal human intervention. To achieve this, the camera trap's onboard computer will need to work autonomously, making intelligent decisions about how to use resources (such as battery power) and how to handle the data it collects - a paradigm known as edge computing. This will enable real-time reporting, e.g. in support of early warning of pest outbreaks; a sketch of such an onboard decision loop is given after this list.
- A global standard for data integration from automated camera systems deployed anywhere in the world.
- Over 4,000 volunteers contributing images of insects to support the development of image classification models, with over 100 species experts confirming the identification of submitted images for the AI training datasets.
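As a purely illustrative sketch of the edge-computing behaviour described above, the loop below chooses between sleeping, trimming data and capturing based on battery and storage readings; the thresholds and helper functions are hypothetical, not the deployed AMI logic.

```python
# Hypothetical edge-computing decision loop for the trap's onboard computer.
# Thresholds and sensor/storage helpers are placeholders for illustration.
import time

BATTERY_LOW_PERCENT = 20   # assumed threshold for conserving power
STORAGE_LOW_MB = 500       # assumed minimum free space for new captures

def read_battery_percent() -> float:
    return 75.0            # placeholder for a real power-management reading

def free_storage_mb() -> float:
    return 10_000.0        # placeholder for a real disk-space query

def run_capture_cycle() -> None:
    battery = read_battery_percent()
    storage = free_storage_mb()
    if battery < BATTERY_LOW_PERCENT:
        print("Low battery: skipping this cycle and sleeping")
        time.sleep(3600)
    elif storage < STORAGE_LOW_MB:
        print("Low storage: keeping detections only, discarding raw frames")
    else:
        print("Capturing images and running on-device detection")

run_capture_cycle()
```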