Test and evaluation of soft/hard information fusion systems: A test environment, methodology and initial data sets

David L. Hall, Loretta D. More, Jake Graham, Jeffrey C. Rimland

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

Increasing interest in human-centered information fusion systems involves: (1) humans as sensors (viz., "soft sensors"), (2) humans performing pattern recognition and participating in the fusion cognitive process, and (3) human groups performing collaborative analysis (viz., "crowd-sourcing" of analysis). Test and evaluation of such systems is challenging because we must develop both representative test data (involving both physical sensors and human observers) and test environments to evaluate the performance of the hardware, software and humans-in-the-loop. This paper describes an experimental facility called an extreme events laboratory, a test and evaluation approach, and evolving test data sets for evaluation of human-centered information fusion systems for situation awareness. The data sets include both synthetic data and data obtained using human subjects in campus-wide experiments.

Original language: English (US)
Title of host publication: 13th Conference on Information Fusion, Fusion 2010
State: Published - 2010
Event: 13th Conference on Information Fusion, Fusion 2010 - Edinburgh, United Kingdom
Duration: Jul 26 2010 to Jul 29 2010

Publication series

Name: 13th Conference on Information Fusion, Fusion 2010

Other

Other: 13th Conference on Information Fusion, Fusion 2010
Country/Territory: United Kingdom
City: Edinburgh
Period: 7/26/10 to 7/29/10

All Science Journal Classification (ASJC) codes

  • Information Systems

