Discriminative graphical models for sparsity-based hyperspectral target detection

Umamahesh Srinivas, Yi Chen, Vishal Monga, Nasser M. Nasrabadi, Trac D. Tran

Research output: Contribution to conference › Paper › peer-review

Abstract

The inherent discriminative capability of sparse representations has recently been exploited for hyperspectral target detection. This approach relies on the observation that the spectral signature of a pixel can be represented as a linear combination of a few training spectra drawn from both target and background classes. The sparse representation corresponding to a given test spectrum captures class-specific discriminative information crucial for detection tasks. Spatio-spectral information has also been introduced into this framework via a joint sparsity model that simultaneously solves for the sparse representations of a group of spatially neighboring pixels, since such pixels are highly likely to share similar spectral characteristics. In this paper, we propose a probabilistic graphical model framework that explicitly learns the class-conditional correlations between the distinct sparse representations of different pixels in a spatial neighborhood. Simulation results show that the proposed algorithm outperforms classical hyperspectral target detection algorithms as well as support vector machines.
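
For context, the single-pixel sparsity model that this paper extends can be sketched in a few lines of Python. The sketch below pools target and background training spectra into one dictionary, recovers a sparse code for a test pixel with a greedy solver, and compares class-wise reconstruction residuals. The solver choice (orthogonal matching pursuit), the sparsity level k, and all data shapes are illustrative assumptions; the paper's actual contribution, the graphical model over the sparse codes of neighboring pixels, is not reproduced here.

```python
import numpy as np

def _unit_columns(M):
    """Scale each column (training spectrum) to unit L2 norm."""
    return M / np.linalg.norm(M, axis=0, keepdims=True)

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily approximate the sparsest x
    with at most k nonzero entries such that y ~ A @ x."""
    residual = y.copy()
    support = []
    x = np.zeros(A.shape[1])
    coeffs = np.zeros(0)
    for _ in range(k):
        # Select the dictionary atom most correlated with the residual.
        idx = int(np.argmax(np.abs(A.T @ residual)))
        if idx not in support:
            support.append(idx)
        # Refit coefficients on the current support by least squares.
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coeffs
    x[support] = coeffs
    return x

def sparsity_detector(y, A_target, A_background, k=5):
    """Detection statistic: background residual minus target residual.
    Larger values mean the pixel is better explained by target spectra."""
    A_t = _unit_columns(A_target)
    A_b = _unit_columns(A_background)
    A = np.hstack([A_t, A_b])          # combined target/background dictionary
    x = omp(A, y, k)
    n_t = A_t.shape[1]
    r_target = np.linalg.norm(y - A_t @ x[:n_t])
    r_background = np.linalg.norm(y - A_b @ x[n_t:])
    return r_background - r_target

# Toy usage with synthetic "spectra" (hypothetical shapes, not real data).
rng = np.random.default_rng(0)
bands = 200                                   # spectral bands per pixel
A_t = rng.normal(size=(bands, 20))            # target training spectra
A_b = rng.normal(size=(bands, 60))            # background training spectra
pixel = A_t[:, 0] + 0.05 * rng.normal(size=bands)
print(sparsity_detector(pixel, A_t, A_b))     # positive => target-like
```

The joint sparsity variant mentioned in the abstract would instead stack the spectra of a small spatial neighborhood and force their sparse codes to share a common support; the proposed graphical model then learns correlations between those per-pixel codes.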

Original language: English (US)
Pages: 1489-1492
Number of pages: 4
State: Published - 2012
Event: 2012 32nd IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2012 - Munich, Germany
Duration: Jul 22, 2012 to Jul 27, 2012

All Science Journal Classification (ASJC) codes

  • Computer Science Applications
  • General Earth and Planetary Sciences
