Abstract
Synthetic Aperture Sonar (SAS) is commonly used in many underwater applications, such as mine countermeasures, habitat mapping and archeology, because it offers high-resolution images over wide swath areas. A single-view SAS image, however, may lack critical information for an object classification task (for example, a mine hidden by a rock, or a partial image). Multi-view images of the same scene can instead provide much richer information. In this context, Thales developed a sonar capable of simultaneously processing three views at different angles. CMRE and Thales have teamed up to investigate deep learning applications for multi-view imaging. This paper demonstrates the potential benefits of such a technology for target classification. The data used for this study are real SAS data collected during sea trials by the MUSCLE. The preliminary work compares different ways of classifying with Convolutional Neural Network (CNN) architectures. Transfer learning is also performed from pre-trained models.
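The transfer-learning approach mentioned above can be sketched in its simplest form: keep the pre-trained feature extractor frozen and train only a small classification head on the new data. The snippet below is an illustrative toy reduction of that idea, not the authors' actual pipeline; the random projection standing in for frozen CNN layers, the synthetic two-class data, and all names are hypothetical.

```python
import numpy as np

# Toy sketch of transfer learning: a frozen "pre-trained" feature
# extractor (a fixed random projection standing in for frozen CNN
# layers) plus a trainable linear classification head.
# All data and names here are hypothetical, not from the paper.

rng = np.random.default_rng(0)

# Frozen feature extractor: maps 64-dim inputs to 16-dim features.
W_frozen = rng.normal(size=(64, 16))

def extract_features(x):
    """Frozen layers: W_frozen is never updated during fine-tuning."""
    return np.tanh(x @ W_frozen)

# Synthetic two-class dataset standing in for single-view image snippets.
n = 200
X = rng.normal(size=(n, 64))
true_w = rng.normal(size=64)
y = (X @ true_w > 0).astype(float)  # labels 0/1

# Trainable head: logistic regression on the frozen features.
F = extract_features(X)
w_head = np.zeros(16)
b_head = 0.0
lr = 0.5

for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(F @ w_head + b_head)))  # sigmoid
    grad_w = F.T @ (p - y) / n  # cross-entropy gradient w.r.t. head weights
    grad_b = np.mean(p - y)
    w_head -= lr * grad_w       # only the head is updated
    b_head -= lr * grad_b

acc = np.mean(((F @ w_head + b_head) > 0) == (y == 1))
print(f"training accuracy of the fine-tuned head: {acc:.2f}")
```

In a real multi-view setting, one head could be trained per view, or the frozen features of the three views could be concatenated before the head; the paper compares such architectural choices empirically.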
| Original language | English (US) |
|---|---|
| Pages (from-to) | 227-233 |
| Number of pages | 7 |
| Journal | Underwater Acoustic Conference and Exhibition Series |
| State | Published - 2019 |
| Event | 5th Underwater Acoustics Conference and Exhibition, UACE 2019, Hersonissos, Greece. Duration: Jun 30 2019 → Jul 5 2019 |
All Science Journal Classification (ASJC) codes
- Geophysics
- Oceanography
- Environmental Engineering
- Acoustics and Ultrasonics