Fully convolutional networks for multi-temporal SAR image classification

Adugna G. Mullissa, Claudio Persello, Valentyn Tolpekin

Research output: Chapter in Book/Report/Conference proceeding › Conference paper › Academic › peer-review

26 Citations (Scopus)

Abstract

Classification of crop types from multi-temporal SAR data is a complex task because spatial and temporal features must be extracted from images affected by speckle. Previous methods applied speckle filtering and classification in two separate processing steps. This paper introduces fully convolutional networks (FCN) for pixel-wise classification of crops from multi-temporal SAR data, performing speckle filtering and classification within a single framework. Furthermore, it uses dilated kernels to increase the network's capability to learn long-range spatial dependencies. The proposed FCN was compared with patch-based convolutional neural network (CNN) and support vector machine (SVM) classifiers, and outperformed both.
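The abstract does not detail the network architecture, but the dilated-kernel idea it mentions can be illustrated independently. The sketch below (an assumption for illustration, not the authors' implementation) shows a minimal pure-Python dilated 2D convolution and a helper computing the receptive field of stacked stride-1 conv layers, which is how dilation enlarges spatial context without adding parameters:

```python
def dilated_conv2d(img, kernel, dilation=1):
    """Valid-mode 2D convolution with a dilated kernel (illustrative sketch).

    A dilation of d inserts (d - 1) gaps between kernel taps, so a k x k
    kernel covers an effective ((k-1)*d + 1)-pixel extent.
    """
    kh, kw = len(kernel), len(kernel[0])
    eh = (kh - 1) * dilation + 1  # effective kernel height
    ew = (kw - 1) * dilation + 1  # effective kernel width
    H, W = len(img), len(img[0])
    out = []
    for i in range(H - eh + 1):
        row = []
        for j in range(W - ew + 1):
            s = 0.0
            for a in range(kh):
                for b in range(kw):
                    s += kernel[a][b] * img[i + a * dilation][j + b * dilation]
            row.append(s)
        out.append(row)
    return out


def receptive_field(kernel_sizes, dilations):
    """Receptive field of a stack of stride-1 (dilated) convolution layers."""
    rf = 1
    for k, d in zip(kernel_sizes, dilations):
        rf += (k - 1) * d
    return rf
```

For example, three 3x3 layers with dilations 1, 2, 4 yield a 15-pixel receptive field versus 7 pixels for the same undilated stack, which is the "long-range spatial dependency" benefit the abstract refers to.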
Original language: English
Title of host publication: 2018 IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2018 - Proceedings
Publisher: IEEE
Pages: 6635-6638
Number of pages: 4
ISBN (Electronic): 9781538671504
ISBN (Print): 9781538671511
DOIs
Publication status: Published - 31 Oct 2018
Externally published: Yes
Event: 38th Annual IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2018 - Valencia, Spain
Duration: 22 Jul 2018 - 27 Jul 2018

Publication series

Name: International Geoscience and Remote Sensing Symposium (IGARSS)
Volume: 2018-July

Conference

Conference: 38th Annual IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2018
Country/Territory: Spain
City: Valencia
Period: 22/07/18 - 27/07/18

Keywords

  • Deep learning
  • Fully convolutional networks
  • Remote Sensing
  • SAR
  • Sentinel-1
