Why Bats?

Bats are considered to be a good indicator species, reflecting the general health of the natural environment – so a healthy bat population suggests healthy biodiversity in the local area. In this project we are exploring bat activity in one of the most iconic and high-profile of London’s regeneration areas, the Queen Elizabeth Olympic Park. We have developed a network of prototype smart bat monitors and installed them across the park in different habitats. It is hoped that this exploratory network of devices will provide the most detailed picture yet of bat life throughout this large urban area.

‘Shazam’ for Bats

Each smart bat monitor – Echo Box – works like “Shazam for bats”. It captures the soundscape of its surroundings through an ultrasonic microphone, then processes this data, turning it into an image called a spectrogram. Deep learning algorithms then scan the spectrogram image, identifying possible bat calls. We are also working towards identifying the species most likely to have made each call.

How does Echo Box work?

Measuring bat activity in the Queen Elizabeth Olympic Park provides a very interesting real-world use case that involves large amounts of sensor data – in this case acoustic data. Rather than sending all of this data to the cloud for processing, each Echo Box device will process the data itself on its own chip, removing the cost of sending large amounts of data to the cloud. We call this “edge processing” since the processing is done on devices at the edge of the network.

Inside each Echo Box is an Intel Edison with Arduino breakout, plus a Dodotronic Ultramic 192K microphone. To capture, process and identify bat calls, each Echo Box performs the following four steps:

First – a microphone on each device, capable of handling ultrasonic frequencies, captures all audio from the environment up to 96kHz. Most bat calls occur at frequencies above 20kHz (the limit of human hearing), with some species calling as high as 125kHz (although none of those species are found in the park).

Second – every 6 seconds, a 3-second sample of audio is recorded and stored as a sound file. This means that audio from the environment is captured as 3-second snapshots at a consistent sample rate across all smart bat monitors.
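
As an illustration of this capture cycle, the sketch below records a 3-second snapshot every 6 seconds at 192kHz (the Ultramic 192K's native rate, which gives the 96kHz upper limit mentioned above). The use of the sounddevice library and the file naming are assumptions made for illustration, not the project's actual on-device pipeline.

```python
# Minimal sketch of the capture cycle: every 6 seconds, record a 3-second
# snapshot at 192 kHz. Library choice (sounddevice) and file layout are
# illustrative assumptions, not the Echo Box's actual implementation.
import time
import sounddevice as sd
from scipy.io import wavfile

SAMPLE_RATE = 192_000   # samples per second, capturing audio up to 96 kHz
SNAPSHOT_SECS = 3       # length of each recorded snapshot
CYCLE_SECS = 6          # one snapshot is taken every 6 seconds

def capture_loop():
    while True:
        start = time.monotonic()
        # Record a 3-second mono snapshot from the ultrasonic microphone.
        audio = sd.rec(int(SNAPSHOT_SECS * SAMPLE_RATE),
                       samplerate=SAMPLE_RATE, channels=1, dtype="int16")
        sd.wait()  # block until the recording is complete
        wavfile.write(f"snapshot_{int(start)}.wav", SAMPLE_RATE, audio)
        # Sleep for the remainder of the 6-second cycle.
        time.sleep(max(0, CYCLE_SECS - (time.monotonic() - start)))
```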

Third – the recorded audio is then turned into a spectrogram image using a method called the Fast Fourier Transform. The spectrogram image shows the amplitude of sounds across the different frequencies over time. Bat calls can clearly be seen on the spectrogram as bright patterns (indicating a loud noise) at high frequencies.
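
A minimal sketch of this step is shown below, using scipy's short-time Fourier transform to turn a recorded snapshot into a spectrogram image. The FFT window length, overlap, file names and plotting settings here are illustrative assumptions rather than the project's actual parameters.

```python
# Sketch: turn a 3-second snapshot into a spectrogram image.
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram
import matplotlib.pyplot as plt

rate, audio = wavfile.read("snapshot_example.wav")   # hypothetical file name
signal = audio[:, 0] if audio.ndim > 1 else audio    # use a single channel

# Short-time FFT: 1024-sample windows with 50% overlap (illustrative values).
freqs, times, power = spectrogram(signal, fs=rate, nperseg=1024, noverlap=512)

# Plot amplitude (in dB) across frequency and time; bat calls appear as
# bright patterns above roughly 20 kHz.
plt.pcolormesh(times, freqs / 1000, 10 * np.log10(power + 1e-12), shading="auto")
plt.ylabel("Frequency (kHz)")
plt.xlabel("Time (s)")
plt.savefig("snapshot_example_spectrogram.png", dpi=150)
```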

Finally – an image processing technique called a convolutional neural network (CNN) is applied to the spectrogram images to look for patterns that resemble bat calls. Where suspected bat calls are found in an image, we are working towards applying the same CNN techniques to each individual call, looking at its shape in more detail to determine which species of bat most likely made it.
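
For illustration only, the sketch below shows the kind of small convolutional network that could score spectrogram patches as "bat call" or "background noise". The architecture, patch size and class labels are assumptions made for this example; the project's actual detection network is not described here.

```python
# Illustrative CNN that scores 128x128 spectrogram patches as
# "background noise" vs "bat call". Architecture and sizes are assumptions.
import torch
import torch.nn as nn

class BatCallCNN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 32 * 32, 64), nn.ReLU(),
            nn.Linear(64, num_classes),   # e.g. [background noise, bat call]
        )

    def forward(self, x):
        # x: (batch, 1, 128, 128) spectrogram patches scaled to [0, 1]
        return self.classifier(self.features(x))

# Score a batch of spectrogram patches and get per-patch call probabilities.
model = BatCallCNN()
patches = torch.rand(8, 1, 128, 128)
probs = torch.softmax(model(patches), dim=1)
```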

Where we are monitoring bats

A network of 15 smart bat monitors is installed across the Queen Elizabeth Olympic Park. The monitors are placed in different habitats across the park, as indicated on the map, and will continuously capture data on bat species and activity levels until the end of the year. Live data from the smart bat monitors can be seen here.

Related projects

ENGAGE Project
iBats
Bat Detective
ICRI Urban IoT