Acoustic semantic labeling and fusion of human-vehicle interactions

Amir Shirkhodaie, Vinayak Elangovan, Aaron Rababaah

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

8 Scopus citations


Situational awareness in a Persistent Surveillance System (PSS) can be significantly improved by fusing data from physical (hard) sensors with information provided by human observers in the field (soft, or biological, sensors). A major challenge of this trend, however, is integrating and fusing the sensory data collected from hard sensors with the soft data gathered from human agents in a consistent and cohesive way. This paper presents an approach for semantic labeling of non-stationary vehicular acoustic events in the context of PSS. Two feature-extraction techniques, based on the discrete wavelet transform and the short-time Fourier transform, are described. A correlation-based classifier is proposed for classifying and semantically labeling vehicular acoustic events. The presented results demonstrate that the proposed solution is both reliable and effective and can be extended to future PSS applications.
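The pipeline the abstract outlines (short-time Fourier features compared against class templates by correlation) can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the function names, frame sizes, and synthetic "truck"/"car" signals are all assumptions made for the example.

```python
import numpy as np

def stft_features(signal, frame_len=256, hop=128):
    """Average magnitude spectrum over short-time Fourier frames.

    Frame sizes are illustrative; the paper does not specify them here.
    """
    window = np.hanning(frame_len)
    frames = np.stack([signal[i:i + frame_len] * window
                       for i in range(0, len(signal) - frame_len + 1, hop)])
    mags = np.abs(np.fft.rfft(frames, axis=1))  # |STFT| per frame
    return mags.mean(axis=0)                    # feature vector per event

def correlate_label(features, templates):
    """Correlation-based classifier: pick the best-correlating class template."""
    scores = {label: np.corrcoef(features, tmpl)[0, 1]
              for label, tmpl in templates.items()}
    return max(scores, key=scores.get)

# Hypothetical demo: two synthetic vehicle classes with distinct dominant tones.
np.random.seed(0)
fs = 8000
t = np.arange(fs) / fs
truck = np.sin(2 * np.pi * 120 * t)  # low-frequency rumble
car = np.sin(2 * np.pi * 900 * t)    # higher-frequency whine
templates = {"truck": stft_features(truck), "car": stft_features(car)}

noisy_truck = truck + 0.3 * np.random.randn(fs)
label = correlate_label(stft_features(noisy_truck), templates)
print(label)  # → truck
```

The same template-matching step could use discrete-wavelet-transform coefficients as the feature vector instead; only `stft_features` would change.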

Original language: English (US)
Title of host publication: Signal Processing, Sensor Fusion, and Target Recognition XX
State: Published - 2011
Event: Signal Processing, Sensor Fusion, and Target Recognition XX - Orlando, FL, United States
Duration: Apr 25 2011 - Apr 27 2011

Publication series

Name: Proceedings of SPIE - The International Society for Optical Engineering
ISSN (Print): 0277-786X


Other: Signal Processing, Sensor Fusion, and Target Recognition XX
Country/Territory: United States
City: Orlando, FL

All Science Journal Classification (ASJC) codes

  • Electronic, Optical and Magnetic Materials
  • Condensed Matter Physics
  • Computer Science Applications
  • Applied Mathematics
  • Electrical and Electronic Engineering


