Interactive Coconut Tree Annotation Using Feature Space Projections

John E. Vargas-Munoz, Ping Zhou, Alexandre X. Falcao, Devis Tuia

Research output: Chapter in Book/Report/Conference proceeding › Conference paper › Academic › peer-review

11 Citations (Scopus)

Abstract

The detection and counting of coconut trees in aerial images are important tasks for environmental monitoring and post-disaster assessment. Recent deep-learning-based methods can attain accurate results, but they require a reasonably high number of annotated training samples. To obtain such large training sets with considerably reduced human effort, we present a semi-automatic sample annotation method based on the 2D t-SNE projection of the sample feature space. The proposed approach facilitates the construction of effective training sets more efficiently than traditional manual annotation, as shown in our experimental results with VHR images from the Kingdom of Tonga.
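To illustrate the kind of projection the abstract refers to, below is a minimal sketch (not the authors' implementation) that embeds patch descriptors into 2D with scikit-learn's t-SNE and plots them; the feature matrix is random stand-in data, and the 500-patch, 128-dimension sizes are arbitrary assumptions.

    import numpy as np
    import matplotlib.pyplot as plt
    from sklearn.manifold import TSNE

    # Hypothetical stand-in for CNN descriptors of candidate image patches;
    # the paper's actual feature extractor and data are not reproduced here.
    rng = np.random.default_rng(0)
    features = rng.normal(size=(500, 128))  # 500 candidate patches, 128-D features each

    # Project the high-dimensional feature space to 2D with t-SNE.
    embedding = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(features)

    # Display the projection; in an interactive annotation tool, the user would
    # select clusters of nearby points and label them in bulk rather than
    # inspecting every patch individually.
    plt.scatter(embedding[:, 0], embedding[:, 1], s=8)
    plt.title("2D t-SNE projection of patch features (synthetic example)")
    plt.show()

The premise of such projections is that visually similar samples land close together in the 2D embedding, so an annotator can assign a label to a whole group of neighboring points at once, which is where the reduction in manual effort comes from.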
Original language: English
Title of host publication: IGARSS 2019 - 2019 IEEE International Geoscience and Remote Sensing Symposium
Subtitle of host publication: Proceedings
Publisher: IEEE
Pages: 5718-5721
ISBN (Electronic): 9781538691540
ISBN (Print): 9781538691557
DOIs
Publication status: Published - 14 Nov 2019
Event: IGARSS 2019 - 2019 IEEE International Geoscience and Remote Sensing Symposium - Yokohama, Japan
Duration: 28 Jul 2019 - 2 Aug 2019

Conference/symposium

Conference/symposium: IGARSS 2019 - 2019 IEEE International Geoscience and Remote Sensing Symposium
Period: 28/07/19 - 2/08/19

Keywords

  • Coconut tree detection
  • convolutional neural networks
  • feature space projections
  • interactive annotation
