A Novel Framework for Evaluating Performance-Estimation Models

Research output: Contribution to journal › Article › peer-review


Abstract

A general framework for quantifying the worth of a performance-estimation model is proposed. The purpose of the model is to predict the performance of an automatic target recognition algorithm on a given set of test data, while the purpose of the framework is to quantify how well the model fulfills its task. To this end, a quantity referred to as the utility, which is based on the Kullback-Leibler divergence, is introduced. A key aspect of the framework is the inclusion of a significance function that specifies the relative importance of each point in the performance space, here assumed to be defined in terms of false alarm rate and probability of detection. Example significance functions are suggested and discussed. The functionality of the proposed framework is demonstrated on an underwater target detection application involving measured synthetic aperture sonar data. In this context, an image complexity metric is exploited to enable the development of models corresponding to different seafloor conditions and mine-hunting difficulty. The appeal of the framework is its ability to quantitatively assess the utility of competing performance-estimation models and to fairly compare the utility of a model on different test data sets.
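The abstract does not give the utility's closed form, but the ingredients it names (a Kullback-Leibler divergence weighted by a significance function over a false-alarm-rate/probability-of-detection performance space) can be illustrated. Below is a minimal Python sketch of one plausible formulation; the function name, the normalization of the distributions, the example significance values, and the exp(-D) mapping to a bounded utility are all assumptions for illustration, not the paper's actual definition.

    import numpy as np

    def weighted_kl_utility(p_observed, q_predicted, significance, eps=1e-12):
        """Hypothetical significance-weighted KL-based utility over a
        discretized (false alarm rate, probability of detection) grid.

        p_observed, q_predicted : probability mass over the same grid points
        significance            : nonnegative weight per grid point
        """
        p = np.asarray(p_observed, dtype=float) + eps
        q = np.asarray(q_predicted, dtype=float) + eps
        p /= p.sum()  # renormalize after smoothing
        q /= q.sum()
        w = np.asarray(significance, dtype=float)
        # Pointwise KL contributions, emphasized where significance is high.
        divergence = np.sum(w * p * np.log(p / q))
        # Smaller divergence -> better model; map to a utility in (0, 1].
        return float(np.exp(-divergence))

    # Example: three operating points (e.g., on a ROC curve), with more
    # significance placed on the low-false-alarm region.
    p_obs = [0.5, 0.3, 0.2]    # observed performance distribution
    q_est = [0.45, 0.35, 0.2]  # model-predicted distribution
    sig = [2.0, 1.0, 0.5]      # example significance-function values
    print(weighted_kl_utility(p_obs, q_est, sig))

Under this sketch, a model whose predicted distribution matches the observed one in the highly significant regions scores near 1, which mirrors the abstract's goal of comparing competing performance-estimation models on a common, significance-aware scale.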

Original language: English (US)
Article number: 8661781
Pages (from-to): 5285-5302
Number of pages: 18
Journal: IEEE Transactions on Geoscience and Remote Sensing
Volume: 57
Issue number: 8
DOIs
State: Published - 2019

All Science Journal Classification (ASJC) codes

  • Electrical and Electronic Engineering
  • General Earth and Planetary Sciences

