Classification of urban multi-angular image sequences by aligning their manifolds

Maxime Trolliet, Devis Tuia, Michele Volpi

Research output: Chapter in Book/Report/Conference proceeding › Conference paper › Academic › peer-review

Abstract

When dealing with multi-angular image sequences, problems of reflectance changes naturally arise, due either to illumination and acquisition geometry or to interactions with the atmosphere. These phenomena interact with the scene and modify the measured radiance: for example, depending on the acquisition angle, tall objects may be seen from the top or from the side, and different light scattering may affect the surfaces. This results in shifts in the acquired radiance that make multi-angular classification harder and may lead to catastrophic results, since surfaces with the same reflectance return significantly different signals. In this paper, rather than performing atmospheric or bidirectional reflectance distribution function (BRDF) correction, a non-linear manifold learning approach is used to align data structures. This method maximizes the similarity between the different acquisitions by deforming their manifolds, thus enhancing the transferability of classification models among the images of the sequence.
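The abstract does not spell out the alignment procedure. As an illustration only, manifold alignment of this kind is often implemented as a semi-supervised joint graph embedding (in the spirit of Wang and Mahadevan): each acquisition keeps its own neighborhood graph, labelled pixels of the same class act as cross-image correspondences, and a generalized eigenproblem yields a shared latent space in which a classifier trained on one image can be applied to the others. The sketch below is a minimal Python illustration of that generic idea, not the authors' implementation; the function name `align_manifolds`, its parameters, and the synthetic data are hypothetical.

```python
# Illustrative sketch of semi-supervised manifold alignment (NOT the paper's code).
# Two angular acquisitions are embedded into a common latent space so that a
# classifier trained on one image transfers to the other.
import numpy as np
from scipy.linalg import eigh
from sklearn.neighbors import kneighbors_graph, KNeighborsClassifier


def align_manifolds(X1, X2, y1, y2, n_dims=10, k=10, mu=1.0):
    """Project acquisitions X1, X2 into a shared n_dims-dimensional space.

    X1, X2 : (n1, d), (n2, d) pixel spectra from two viewing angles.
    y1, y2 : class labels, with -1 marking unlabelled pixels; labelled pixels
             of the same class act as cross-image correspondences.
    """
    X = np.vstack([X1, X2])
    n1 = X1.shape[0]

    # Within-image geometry: k-NN graph of the stacked data, with cross-image
    # edges removed so each manifold keeps its own structure.
    W = kneighbors_graph(X, k, mode="connectivity", include_self=False).toarray()
    W = np.maximum(W, W.T)
    W[:n1, n1:] = 0.0
    W[n1:, :n1] = 0.0

    # Cross-image similarity: connect labelled samples sharing the same class.
    Ws = np.zeros_like(W)
    lab1 = np.where(y1 >= 0)[0]
    lab2 = np.where(y2 >= 0)[0] + n1
    for i in lab1:
        for j in lab2:
            if y1[i] == y2[j - n1]:
                Ws[i, j] = Ws[j, i] = 1.0

    D = np.diag(W.sum(axis=1))
    Ds = np.diag(Ws.sum(axis=1))
    L = D - W      # preserves each image's internal geometry
    Ls = Ds - Ws   # pulls same-class samples of the two images together

    # Generalized eigenproblem; small regularization keeps it well posed.
    A = Ls + mu * L + 1e-6 * np.eye(L.shape[0])
    B = D + 1e-6 * np.eye(L.shape[0])
    _, vecs = eigh(A, B)
    F = vecs[:, 1:n_dims + 1]  # drop the near-constant leading eigenvector
    return F[:n1], F[n1:]


# Usage on synthetic data: train on one "angle", classify the shifted one.
rng = np.random.default_rng(0)
X1 = rng.normal(size=(200, 8)); y1 = rng.integers(0, 3, 200)
X2 = rng.normal(size=(200, 8)) + 0.5
y2 = np.full(200, -1); y2[:30] = y1[:30]        # a few labelled correspondences
Z1, Z2 = align_manifolds(X1, X2, y1, y2)
clf = KNeighborsClassifier(5).fit(Z1, y1)
pred2 = clf.predict(Z2)
```

In this reading, the trade-off parameter (here `mu`) balances preserving each image's own manifold against pulling corresponding samples together; the actual formulation and parameters used in the paper may differ.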

Original language: English
Title of host publication: Joint Urban Remote Sensing Event 2013, JURSE 2013
Publisher: IEEE
Pages: 53-56
Number of pages: 4
ISBN (Electronic): 9781479902118
ISBN (Print): 9781479902132
DOIs
Publication status: Published - 1 Jun 2013
Externally published: Yes
Event: 2013 Joint Urban Remote Sensing Event, JURSE 2013 - Sao Paulo, Brazil
Duration: 21 Apr 2013 - 23 Apr 2013

Publication series

Name: Joint Urban Remote Sensing Event 2013, JURSE 2013

Conference

Conference: 2013 Joint Urban Remote Sensing Event, JURSE 2013
Country/Territory: Brazil
City: Sao Paulo
Period: 21/04/13 - 23/04/13
