AIDE: Accelerating image-based ecological surveys with interactive machine learning

Benjamin Kellenberger*, Devis Tuia, Dan Morris

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

27 Citations (Scopus)

Abstract

Ecological surveys increasingly rely on large-scale image datasets, typically terabytes of imagery for a single survey. The ability to collect this volume of data allows surveys of unprecedented scale, at the cost of extensive photo-interpretation labour. We present Annotation Interface for Data-driven Ecology (AIDE), an open-source web framework designed to alleviate the task of image annotation for ecological surveys. AIDE employs an easy-to-use and customisable labelling interface that supports multiple users, database storage and scalability to the cloud and/or multiple machines. Moreover, AIDE closely integrates users and machine learning models into a feedback loop, where user-provided annotations are employed to re-train the model, and the latter is applied to unlabelled images to, for example, identify wildlife. These predictions are then presented to the users in optimised order, according to a customisable active learning criterion. AIDE has a number of deep learning models built in, but also accepts custom model implementations. AIDE has the potential to greatly accelerate annotation tasks for a wide range of research employing image data. AIDE is open-source and can be downloaded for free at https://github.com/microsoft/aerial_wildlife_detection.
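The feedback loop described above can be illustrated with a minimal sketch of one common active learning criterion, least-confidence sampling, which prioritises for annotation those images the model is least sure about. This is an illustrative assumption for exposition, not AIDE's actual implementation; all names and data below are hypothetical.

```python
# Illustrative sketch (not AIDE's code): rank unlabelled images for
# annotation by a least-confidence active learning criterion, so that
# the images the model is least certain about are labelled first.

def least_confidence(probs):
    """Uncertainty score: 1 minus the top class probability (higher = more uncertain)."""
    return 1.0 - max(probs)

def rank_for_annotation(predictions):
    """Order image IDs so the most uncertain predictions come first."""
    return sorted(predictions,
                  key=lambda img: least_confidence(predictions[img]),
                  reverse=True)

# Hypothetical per-image class probabilities from the current model.
predictions = {
    "img_001.jpg": [0.95, 0.03, 0.02],  # confident -> low annotation priority
    "img_002.jpg": [0.40, 0.35, 0.25],  # uncertain -> presented first
    "img_003.jpg": [0.70, 0.20, 0.10],
}

queue = rank_for_annotation(predictions)
# queue -> ["img_002.jpg", "img_003.jpg", "img_001.jpg"]
```

In the full loop, the annotations collected for the front of this queue would be fed back to re-train the model, and the ranking recomputed over the remaining unlabelled images.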

Original language: English
Pages (from-to): 1716-1727
Journal: Methods in Ecology and Evolution
Volume: 11
Issue number: 12
Early online date: 24 Sept 2020
DOIs
Publication status: Published - Dec 2020

Keywords

  • applied ecology
  • conservation
  • monitoring (population ecology)
  • population ecology
  • statistics
  • surveys
