Bayesian data fusion of multiview synthetic aperture sonar imagery for seabed classification

Research output: Contribution to journal › Article › peer-review

40 Scopus citations


A Bayesian data fusion approach for seabed classification using multiview synthetic aperture sonar (SAS) imagery is proposed. The principled approach exploits all available information and produces probabilistic predictions. Each data point, corresponding to a unique 10 m × 10 m area of seabed, is represented by a vector of wavelet-based features. For each seabed type, the distribution of these features is then modeled by a unique Gaussian mixture model. When multiple views of the same data point (i.e., area of seabed) are available, the views are combined via a joint likelihood calculation. The end result of this Bayesian formulation is the posterior probability that a given data point belongs to each seabed type. It is also shown how these posterior probabilities can be exploited in a form of entropy-based active learning to determine the most useful additional data to acquire. Experimental results of the proposed multiview classification framework are shown on a large data set of real, multiview SAS imagery spanning more than 2 km² of seabed.
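The fusion and data-selection steps described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the per-class Gaussian mixture parameters, the two seabed types, and the 2-D feature space are all hypothetical stand-ins (in the paper, the features are wavelet-based and the mixtures are fit to training data). Under the usual assumption that views are conditionally independent given the seabed type, per-view log-likelihoods add; the posterior follows from Bayes' rule, and its Shannon entropy flags which data points would benefit most from an additional view.

```python
import numpy as np

# Hypothetical per-class GMM parameters (diagonal covariances) over a 2-D
# feature space. In the paper, one mixture is fit per seabed type from
# wavelet-based features of 10 m x 10 m SAS snippets.
CLASSES = ["sand", "rock"]
gmms = {
    "sand": dict(w=np.array([0.5, 0.5]),
                 mu=np.array([[0.0, 0.0], [1.0, 1.0]]),
                 var=np.array([[1.0, 1.0], [1.0, 1.0]])),
    "rock": dict(w=np.array([1.0]),
                 mu=np.array([[3.0, 3.0]]),
                 var=np.array([[1.0, 1.0]])),
}

def logsumexp(a):
    # Numerically stable log(sum(exp(a))).
    m = np.max(a)
    return m + np.log(np.sum(np.exp(a - m)))

def gmm_loglik(x, g):
    # log p(x | seabed type) under a diagonal-covariance Gaussian mixture.
    comp = -0.5 * np.sum((x - g["mu"]) ** 2 / g["var"]
                         + np.log(2 * np.pi * g["var"]), axis=-1)
    return logsumexp(np.log(g["w"]) + comp)

def fuse_views(views, prior=None):
    # Multiview Bayesian fusion: views are assumed conditionally independent
    # given the seabed type, so per-view log-likelihoods sum into a joint
    # likelihood; Bayes' rule then yields the posterior over seabed types.
    if prior is None:
        prior = np.full(len(CLASSES), 1.0 / len(CLASSES))
    logp = np.array([sum(gmm_loglik(v, gmms[c]) for v in views)
                     for c in CLASSES]) + np.log(prior)
    return np.exp(logp - logsumexp(logp))

def entropy(p):
    # Shannon entropy of the posterior; high-entropy (uncertain) data points
    # are the most useful targets for acquiring an additional view.
    return -np.sum(p * np.log(p + 1e-12))

# A second consistent view sharpens the posterior and lowers its entropy.
one_view = fuse_views([np.array([0.0, 0.0])])
two_views = fuse_views([np.array([0.0, 0.0]), np.array([0.2, -0.1])])
```

The conditional-independence assumption is what makes the joint likelihood a simple sum of per-view log-likelihoods; any view-to-view correlation would require a richer joint model.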

Original language: English (US)
Pages (from-to): 1239-1254
Number of pages: 16
Journal: IEEE Transactions on Image Processing
Issue number: 6
State: Published - 2009

All Science Journal Classification (ASJC) codes

  • Software
  • Computer Graphics and Computer-Aided Design

