Benjamin C. Evans, Marcus Rowcliffe et al.
August 2024

Lens on the wild: innovations in wildlife monitoring with machine learning

Header image: camouflage trail cameras laid out on a worktop.


Benjamin C. Evans, Marcus Rowcliffe, Chris Carbone, Emma L. Cartledge, Nida Al-Fulaij, Henrietta Pringle, Richard Yarnell, Philip A. Stephens, Russell Hill, Kate Scott-Gatty, Chloe Hartland and Bella Horwood show how artificial intelligence techniques developed to study hedgehogs can be applied to animal research more widely.

Hedgehogs, among the UK’s most loved creatures, have substantially declined in number over the last 50 years.1 The National Hedgehog Monitoring Programme (NHMP) has completed its pilot year, marking the first milestone of a three-year endeavour to better understand the causes of this decline and, ultimately, to monitor the status of other wildlife populations across the UK.

Figure 1. A camera trap set up in a forested area to monitor wildlife activity as part of the National Hedgehog Monitoring Programme. (© National Hedgehog Monitoring Programme)

Image identification and labelling

The NHMP uses camera traps, small units that automatically capture sequences of images akin to short video clips when a passive infrared sensor detects movement in front of the lens (see Figure 1). The programme works in a large and growing number of survey areas across the UK, distributing cameras systematically within each site and deploying them at each location for around a month, providing a glimpse into the life of hedgehogs and other wildlife without the need for challenging nocturnal observations by people. To turn these glimpses into useful information on the size and distribution of wildlife populations, an analysis pipeline is being built that draws on tools from a unique combination of artificial intelligence, citizen science, photogrammetry, data science and statistics.

The NHMP expects to collect millions of images each year using camera traps, and the number will continue to grow as more survey sites are added. This creates the first challenge: identifying which images contain animals and, if so, which species. For project staff to look at each image and create the labels would take an impossible amount of time. Instead, harnessing developments in machine learning and computer vision, an ensemble of models that detects whether sequences of images contain animals is deployed alongside citizen science to make the task manageable.
 

Figure 2. Night-time image of a hedgehog (Erinaceus europaeus) captured as part of the National Hedgehog Monitoring Programme and tagged using artificial intelligence. The detection model has located the hedgehog, highlighting it within a green box. (© National Hedgehog Monitoring Programme)

Detection models aim to predict both the type of object present in an image and its location, producing a bounding box around each object of interest (see Figure 2). The models in use are the MegaDetector, an open-source, generalisable detection model trained to detect animals, humans and vehicles in camera trap imagery, and Conservation AI, an ongoing endeavour to produce detection models with finer-grained classification of image regions down to species level.2,3 This combination has been found to reduce the image sets requiring further processing by up to 70 per cent, saving large amounts of annotation time. Both models are accessible through user-friendly tools, including CamTrap Detector and the Conservation AI web interface.4
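
To illustrate the kind of filtering step involved, the sketch below is a minimal example rather than the NHMP pipeline or the CamTrap Detector tool. It assumes a MegaDetector v5 checkpoint (md_v5a.0.0.pt), which is a YOLOv5 model that can be loaded through torch.hub, and keeps only the frames in a sequence with an animal detection above a confidence threshold; the class indices and threshold follow common MegaDetector conventions and should be checked against the model’s documentation.

# Minimal sketch of the underlying filtering idea, not the NHMP pipeline or the
# CamTrap Detector tool. Assumes a MegaDetector v5 checkpoint (md_v5a.0.0.pt),
# a YOLOv5 model loadable through torch.hub. Class indices and the confidence
# threshold follow common MegaDetector usage; check the model card before use.
from pathlib import Path
import torch

ANIMAL_CLASS = 0        # MegaDetector classes: 0 = animal, 1 = person, 2 = vehicle
CONF_THRESHOLD = 0.2    # a typical threshold for flagging images for review

model = torch.hub.load("ultralytics/yolov5", "custom", path="md_v5a.0.0.pt")

def contains_animal(image_path: Path) -> bool:
    """Return True if any detection above the threshold is an animal."""
    results = model(str(image_path))
    detections = results.xyxy[0]    # columns: x1, y1, x2, y2, confidence, class
    for *_, conf, cls in detections.tolist():
        if conf >= CONF_THRESHOLD and int(cls) == ANIMAL_CLASS:
            return True
    return False

# Keep only the frames in a sequence that contain at least one animal detection.
sequence_dir = Path("deployment_01/sequence_0001")
frames = sorted(sequence_dir.glob("*.jpg"))
animal_frames = [p for p in frames if contains_animal(p)]
print(f"{len(animal_frames)} of {len(frames)} frames contain animals")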

Application of citizen science

After filtering the images to those that contain animals, the imagery is made available on the MammalWeb platform, a citizen science website dedicated to tagging camera trap imagery.5 This platform shows one sequence at a time, allowing users to flick between images and respond with the species seen. Algorithms are currently under development that prioritise the images shown to citizen scientists based on the degree of certainty in the machine classifications, allowing human spotters to correct the sequences that are most likely to be misclassified by the machine. Combining machine and citizen science techniques in this way capitalises on the strengths of each to complete the task rapidly and accurately with minimal human labour.
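
As an illustration of the prioritisation idea (the NHMP’s own algorithm is still under development), the sketch below ranks hypothetical image sequences by the Shannon entropy of the machine’s per-species probabilities, so the sequences the machine is least certain about are shown to citizen scientists first. The sequence IDs and probabilities are invented.

# Illustrative sketch of uncertainty-based prioritisation; sequence IDs and
# probabilities are made up, and the NHMP's algorithm may differ.
import math

def entropy(probs):
    """Shannon entropy of a probability distribution; higher = less certain."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Hypothetical machine classifications: sequence ID -> probability per species.
machine_scores = {
    "seq_0001": {"hedgehog": 0.96, "rat": 0.03, "blank": 0.01},
    "seq_0002": {"hedgehog": 0.40, "rat": 0.35, "blank": 0.25},
    "seq_0003": {"fox": 0.70, "cat": 0.25, "blank": 0.05},
}

# Show the most uncertain sequences first, so human effort goes where the
# machine is most likely to be wrong.
priority_order = sorted(machine_scores,
                        key=lambda s: entropy(machine_scores[s].values()),
                        reverse=True)
print(priority_order)   # ['seq_0002', 'seq_0003', 'seq_0001']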

Extrapolating to study specific species

Once all images are classified on the MammalWeb platform, those containing hedgehogs are exported, together with their metadata, in Camtrap DP data format. This standard for working with camera trap data has been developed over the last two years, providing portability and interoperability between platforms and tools.6 Until now, many platforms and organisations employed their own methods and formats for storing camera trap data, hindering collaboration and data processing. The camera trap data standard allows for a collaborative ecosystem of digital technologies in wildlife monitoring projects, with the wider benefits of improved data accessibility and the reproducibility of published results.
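
For readers unfamiliar with the format, a Camtrap DP export is a Frictionless Data Package: a datapackage.json descriptor alongside CSV tables for deployments, media and observations. The sketch below shows one way such an export might be read with pandas and filtered to hedgehog records; the directory name is hypothetical and the column names should be checked against the Camtrap DP version in use.

# Minimal sketch of reading a (hypothetical) Camtrap DP export with pandas.
# Column names (deploymentID, scientificName) follow the Camtrap DP standard
# but should be checked against the version exported by the platform.
from pathlib import Path
import pandas as pd

package_dir = Path("nhmp-camtrap-dp")    # hypothetical export directory
deployments = pd.read_csv(package_dir / "deployments.csv")
observations = pd.read_csv(package_dir / "observations.csv")

# Filter to hedgehog records and attach deployment metadata (camera location,
# deployment period) needed for downstream density estimation.
hedgehogs = observations[observations["scientificName"] == "Erinaceus europaeus"]
hedgehogs = hedgehogs.merge(deployments, on="deploymentID", how="left")

print(f"{len(hedgehogs)} hedgehog observations "
      f"across {hedgehogs['deploymentID'].nunique()} deployments")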

Figure 3. Camera trap image showing the calibration process: a person holds a straight pole with markings every 20 cm, used to build a 3D depth map for the random encounter model’s density estimation. (© Zoological Society of London)

The next step in the pipeline uses the Agouti platform to estimate the positions of animals.7 This process uses a combination of calibration imagery created at the time of camera deployment and a pinhole camera model to create a depth map for each image (see Figure 3). From this, the position of each animal within the image can be translated into a real-world position in front of the camera. Furthermore, positions across sequences of images can be linked together to estimate animal speeds.
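
To make the geometry concrete, the following sketch is illustrative only and not the Agouti implementation: a pinhole camera model converts an animal’s pixel position and estimated depth into a position on the ground in front of the camera, and linked positions across a short sequence give a speed estimate. The calibration values, depths and timestamps are invented.

# Illustrative pinhole-camera sketch, not the Agouti implementation; all
# calibration values, depths and timestamps below are invented.
import math

FX = 1200.0     # focal length in pixels (from calibration imagery)
CX = 960.0      # horizontal image centre in pixels

def pixel_to_ground(u: float, depth_m: float) -> tuple[float, float]:
    """Project pixel column u at a given depth to (x, z) metres: x across, z away from camera."""
    return (u - CX) * depth_m / FX, depth_m

# Animal positions in three consecutive frames: (pixel column, depth in metres, time in seconds).
track = [(700, 2.1, 0.0), (780, 2.3, 1.0), (870, 2.6, 2.0)]

positions = [pixel_to_ground(u, d) for u, d, _ in track]
times = [t for *_, t in track]

# Speed = total path length on the ground divided by elapsed time.
path = sum(math.dist(positions[i], positions[i + 1]) for i in range(len(positions) - 1))
speed = path / (times[-1] - times[0])
print(f"Estimated speed: {speed:.2f} m/s")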

As the approach is refined, one promising area of research is incorporating direct image-to-depth models, which are deep-learning models trained to predict a depth map directly from an image.8 In principle, it should be possible to combine these techniques with an object-detection model in a way that greatly reduces the human labour required for this task. More work is needed to develop and refine this approach so that it becomes fast and reliable, with limited human interaction; possible solutions for this are currently under evaluation.
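
As a speculative example of that direction rather than an NHMP method, the sketch below uses MiDaS, an open-source monocular depth model loadable via torch.hub, to predict a relative depth map for a frame and reads off the depth at the bottom of a hypothetical detection box. Because MiDaS outputs relative rather than metric depth, a calibration step would still be required to recover real-world distances.

# Speculative sketch only: monocular depth (MiDaS via torch.hub) combined with a
# hypothetical detection box. MiDaS predicts relative inverse depth, so metric
# distances would still require calibration.
import cv2
import torch

midas = torch.hub.load("intel-isl/MiDaS", "MiDaS_small")
midas.eval()
transform = torch.hub.load("intel-isl/MiDaS", "transforms").small_transform

image = cv2.cvtColor(cv2.imread("frame.jpg"), cv2.COLOR_BGR2RGB)

with torch.no_grad():
    prediction = midas(transform(image))              # relative inverse depth
    depth = torch.nn.functional.interpolate(
        prediction.unsqueeze(1), size=image.shape[:2],
        mode="bicubic", align_corners=False,
    ).squeeze().numpy()

# Hypothetical bounding box from a detector (x1, y1, x2, y2); sample the depth
# map at the bottom-centre of the box, roughly where the animal meets the ground.
x1, y1, x2, y2 = 700, 600, 900, 820
relative_depth = depth[min(y2, depth.shape[0] - 1), (x1 + x2) // 2]
print(f"Relative depth at animal position: {relative_depth:.1f}")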

Finally, the data exported from Agouti are processed to estimate animal density, which can then be used to evaluate variation across the country in response to differences in the environment or in management techniques. This is done using the random encounter model, a statistical technique for estimating the abundance of wildlife from camera trap data when individual animals cannot be identified from images.9 The method uses a process derived from physical gas modelling to extract a density signal from the camera trapping rate while controlling for confounding factors: in this case, the size of the camera detection zone and the speed of animal movement, estimated in the previous step. The link from Agouti to analysis is again facilitated by the Camtrap DP data standard, making it possible to draw on emerging statistical packages that generate abundance estimates directly from this form of data.
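
The core calculation from Rowcliffe et al. (2008) is compact enough to sketch: density equals the trapping rate multiplied by π and divided by the product of animal speed, detection radius and (2 + detection angle). The example values below are invented for illustration and are not NHMP estimates.

# Minimal sketch of the random encounter model from Rowcliffe et al. (2008):
# D = (y / t) * pi / (v * r * (2 + theta)), where y / t is the trapping rate,
# v the animal speed, and r and theta the radius and angle of the detection
# zone. Example values are invented and are not NHMP estimates.
import math

def rem_density(encounters: float, effort_days: float, speed_km_per_day: float,
                radius_km: float, angle_rad: float) -> float:
    """Animals per km^2 under the random encounter model."""
    trap_rate = encounters / effort_days
    return trap_rate * math.pi / (speed_km_per_day * radius_km * (2 + angle_rad))

# Example: 30 hedgehog encounters over 400 camera-days, animals moving ~1 km/day,
# detection zone of 5 m radius and 40 degrees.
density = rem_density(encounters=30, effort_days=400,
                      speed_km_per_day=1.0, radius_km=0.005,
                      angle_rad=math.radians(40))
print(f"Estimated density: {density:.1f} animals per km^2")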

The NHMP’s use of innovative technologies and public engagement is paving the way for more effective large-scale wildlife monitoring. By harnessing the power of machine learning and citizen science, we can gain deeper insights into the processes driving changes in hedgehog and other wildlife populations and implement better conservation strategies for them. Ensuring these technologies are open and accessible to all will be crucial to expanding their impact and fostering a collaborative approach to wildlife conservation.

Get involved

If you would like to help with this crucial task, you can spot hedgehogs and many other wildlife species as part of the MammalWeb NHMP project. Get involved through www.nhmp.co.uk.


Dr Benjamin C. Evans is a postdoctoral researcher at the Institute of Zoology at the Zoological Society of London (ZSL) and specialises in developing machine-learning methods for conservation. He focuses on automating density estimation and creating data pipelines for the NHMP. His earlier research involved developing an end-to-end semi-automated camera trap pipeline with advances in multi-frame detection and generalisable species classification methods.

Professor Marcus Rowcliffe is a conservation scientist at the Institute of Zoology, ZSL. He specialises in biodiversity monitoring using advanced technologies such as camera traps, drones and acoustic sensors. Marcus has developed key statistical tools for estimating animal abundance. His work spans urban wildlife in the UK to endangered species conservation globally.

Professor Chris Carbone is a scientist at the Institute of Zoology, ZSL. His work combines theoretical and practical research in ecology, focusing on body size, diet and human–wildlife interactions. He founded the London HogWatch project and has published on a wide range of topics focusing on mammalian ecology and biodiversity.

Dr Emma L. Cartledge is a research fellow at Nottingham Trent University, and was involved in setting up the NHMP. Emma's research interests lie at the interface of conservation science and practice. Her recent work has focused on mammalian population monitoring and survey methods.

Nida Al-Fulaij is Chief Executive Officer at the People’s Trust for Endangered Species (PTES), a UK-based wildlife organisation. She is a committed conservation leader with over two decades of experience funding and managing evidence-based conservation projects in the UK and abroad, supporting research and practical conservation on endangered species and threatened habitats. As Co-Chair of the International Union for Conservation of Nature’s (IUCN) National Committee UK Species Survival Working Group and a member of the IUCN Species Survival Commission Small Mammal Working Group, Nida has a particular interest in small mammal monitoring and conservation.

Dr Henrietta Pringle is the NHMP’s Project Coordinator at the PTES and the British Hedgehog Preservation Society. Experienced in using large-scale citizen science datasets to explore drivers of population change in birds and other species, Henrietta now focuses on the data collection side of the process, working with volunteers and biological recorders.

Dr Richard Yarnell is Associate Professor in Ecology at Nottingham Trent University. He has research interests in conservation and ecology, with a focus on hedgehogs in the UK. He uses remote sensing cameras and movement loggers to understand variability in population size across regions and how individual animals select and use habitats.

Professor Philip A. Stephens is an ecologist at the Department of Biosciences at Durham University. His research focuses on population ecology and monitoring, including the use of citizen science to inform biodiversity management and conservation.

Professor Russell Hill works at the Department of Anthropology at Durham University and is a Director of MammalWeb. Russell’s research interests span predator–prey interactions, biodiversity conservation, and interdisciplinary approaches to understanding human–wildlife interactions.

Kate Scott-Gatty manages the London HogWatch project at the Institute of Zoology, ZSL, a London-based hedgehog conservation project. She specialises in urban biodiversity monitoring using camera traps, and her previous work focused on trends in global biodiversity.

Chloe Hartland is a researcher at London HogWatch. Her research interests are in urban biodiversity monitoring, ecology and the use of population genomics for wildlife conservation and management.

Bella Horwood is a researcher at London HogWatch. Her interests include bridging urban conservation efforts with inclusive community engagement initiatives.


References 

  1.  Wembridge, D. (2011) The State of Britain’s Hedgehogs. https://www.britishhedgehogs.org.uk/leaflets/sobh.pdf (Accessed: 7 June 2024).
  2.  Beery, S., Morris, D. and Yang, S. (2019) Efficient pipeline for camera trap image review. ArXiv, Cornell University. https://doi.org/10.48550/arXiv.1907.06772 (Accessed: 7 June 2024).
  3.  Conservation AI (no date) Where conservation meets artificial intelligence. https://www.conservationai.co.uk/ (Accessed: 7 June 2024).
  4.  GitHub (no date) bencevans / camtrap-detector. https://github.com/bencevans/camtrap-detector (Accessed: 7 June 2024).
  5.  Hsing, P-Y., Hill, R.A., Smith, G.C., Bradley, S., Green, S.E., Kent, V.T., Mason, S.S., Rees, J., Whittingham, M.J., Cokill, J., MammalWeb Citizen Scientists and Stephens, P.A. (2022) Large-scale mammal monitoring: the potential of a citizen science camera-trapping project in the United Kingdom. Ecological Solutions and Evidence, 3 (4), e12180. https://doi.org/10.1002/2688-8319.12180 (Accessed: 7 June 2024).
  6.  Bubnicki, J.W., Norton, B., Baskauf, S.J., Bruce, T., Cagnacci, F., Casaer, J., Churski, M., Cromsigt, J.P.G.M., Dal Farra, S., Fiderer, C., Forrester, T.D., Hendry, H., Heurich, M. et al. (2023) Camtrap DP: an open standard for the FAIR exchange and archiving of camera trap data. Remote Sensing in Ecology and Conservation, Early view. https://doi.org/10.1002/rse2.374 (Accessed: 7 June 2024).
  7.  Casaer, J., Milotic, T., Liefting, Y., Desmet, P. and Jansen, P. (2019) Agouti: a platform for processing and archiving of camera trap images. Biodiversity Information Science and Standards, 3, e46690. https://doi.org/10.3897/biss.3.46690 (Accessed: 7 June 2024).
  8.  Haucke, T., Kühl, H.S., Hoyer, J. and Steinhage, V. (2022) Overcoming the distance estimation bottleneck in estimating animal abundance with camera traps. Ecological Informatics, 68, e101536. https://doi.org/10.1016/j.ecoinf.2021.101536 (Accessed: 7 June 2024).
  9.  Rowcliffe, J. M., Field, J., Turvey, S.T. and Carbone, C. (2008) Estimating animal density using camera traps without the need for individual recognition. Journal of Applied Ecology, 45 (4), pp. 1228–1236. https://doi.org/10.1111/j.1365-2664.2008.01473.x (Accessed: 7 June 2024).

Header image © National Hedgehog Monitoring Programme