TY - GEN
T1 - Exploiting Auxiliary Information for Improved Underwater Target Classification with Convolutional Neural Networks
AU - Berthomier, Thibaud
AU - Williams, David P.
AU - D'Ales, Benoit
AU - Dugelay, Samantha
N1 - Publisher Copyright:
© 2020 IEEE.
PY - 2020/10/5
Y1 - 2020/10/5
AB - This work deals with the classification of objects as targets or clutter in synthetic aperture sonar (SAS) imagery using convolutional neural networks (CNNs). First, a new image-annotation tool is developed that allows extra auxiliary information (beyond the basic binary label) to be easily recorded about a given input image. The additional information consists of an estimate of the image quality; the local background environment; and for targets, the specific object shape, orientation, and length. The architecture of the CNNs - specifically the final dense layer and output layer - is then modified so that these extra quantities are additional outputs to be predicted simultaneously. As such, the task of the augmented CNNs becomes to provide a richer representation of an image beyond the binary label. This more complete operational picture can then better inform subsequent mine countermeasures (MCM) decisions. Experiments on a set of real, measured SAS data collected at sea demonstrate that tiny CNNs can accurately predict the additional auxiliary quantities without suffering a significant drop in binary classification performance.
UR - http://www.scopus.com/inward/record.url?scp=85104603748&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85104603748&partnerID=8YFLogxK
U2 - 10.1109/IEEECONF38699.2020.9389138
DO - 10.1109/IEEECONF38699.2020.9389138
M3 - Conference contribution
AN - SCOPUS:85104603748
T3 - 2020 Global Oceans 2020: Singapore - U.S. Gulf Coast
BT - 2020 Global Oceans 2020
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2020 Global Oceans: Singapore - U.S. Gulf Coast, OCEANS 2020
Y2 - 5 October 2020 through 30 October 2020
ER -