Abstract
The detection and counting of coconut trees in aerial images are important tasks for environmental monitoring and post-disaster assessment. Recent deep-learning-based methods can attain accurate results, but they require a large number of annotated training samples. To build such large training sets with considerably reduced human effort, we present a semi-automatic sample annotation method based on the 2D t-SNE projection of the sample feature space. The proposed approach can facilitate the construction of effective training sets more efficiently than traditional manual annotation, as shown in our experimental results with very-high-resolution (VHR) images from the Kingdom of Tonga.
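The abstract describes projecting sample features into 2D with t-SNE so that an annotator can label visually coherent clusters at once rather than one sample at a time. The paper does not publish code, so the following is only a minimal sketch of that general idea, using scikit-learn's `TSNE`; the random feature vectors stand in for CNN descriptors of candidate tree patches, and all names here are illustrative assumptions.

```python
# Hypothetical sketch: project high-dimensional sample features to 2D
# with t-SNE so clusters of similar samples can be annotated in bulk.
# The random vectors below are stand-ins for CNN patch descriptors.
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
features = rng.normal(size=(100, 64))  # 100 candidate patches, 64-D features

# Perplexity must be smaller than the number of samples.
proj = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(features)

# In an interactive tool, an annotator could lasso-select a cluster in
# `proj` and assign a single label to every sample in the selection,
# instead of labeling each patch individually.
print(proj.shape)
```

In practice the features would come from a pretrained or partially trained network, and the 2D scatter would be rendered in an interactive viewer; this sketch only shows the projection step.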
Original language | English |
---|---|
Title of host publication | IGARSS 2019 - 2019 IEEE International Geoscience and Remote Sensing Symposium |
Subtitle of host publication | Proceedings |
Publisher | IEEE |
Pages | 5718-5721 |
ISBN (Electronic) | 9781538691540 |
ISBN (Print) | 9781538691557 |
DOIs | |
Publication status | Published - 14 Nov 2019 |
Event | IGARSS 2019 - 2019 IEEE International Geoscience and Remote Sensing Symposium - Yokohama, Japan |
Duration | 28 Jul 2019 → 2 Aug 2019 |
Conference/symposium
Conference/symposium | IGARSS 2019 - 2019 IEEE International Geoscience and Remote Sensing Symposium |
---|---|
Period | 28/07/19 → 2/08/19 |
Keywords
- Coconut tree detection
- convolutional neural networks
- feature space projections
- interactive annotation