TY - GEN
T1 - Segmentation of Breast Ultrasound Images using Densely Connected Deep Convolutional Neural Network and Attention Gates
AU - Thirusangu, Niranjan
AU - Almekkawy, Mohamed
N1 - Publisher Copyright:
© 2021 IEEE.
PY - 2021
Y1 - 2021
N2 - The ultrasound imaging modality is a popular complementary technique for diagnosing breast cancer. A standardized reporting process, the Breast Imaging Reporting and Data System (BI-RADS), is used to categorize breast cancer. The BI-RADS scale relies on several lesion features derived from ultrasound images, which makes the quality of the diagnosis highly dependent on the experience of the radiologist. Radiologists use Computer-Aided Diagnosis (CAD) systems to help detect lesions. The accuracy of a CAD system depends greatly on its segmentation stage. To increase the reliability of the diagnosis, we propose a solution based on a densely connected deep convolutional neural network and attention gates, called Attention U-DenseNet. Attention U-DenseNet is an architecture for semantic segmentation of lesions in Breast Ultrasound (BUS) images, built on U-Net, DenseNet, and attention gates. The convolutional layers of the U-Net are made densely connected using dense blocks, helping the network learn the complex patterns of BUS images, which are usually noisy and contaminated with speckle. This architecture (U-DenseNet) produced an F-score of 0.63, compared with an F-score of 0.49 for the U-Net model. Furthermore, to localize the segmentation by learning salient features, attention gates are added to the U-DenseNet architecture (Attention U-DenseNet). Attention U-DenseNet performed even better than U-DenseNet, improving the F-score to 0.75. Finally, a per-image regularized binary cross-entropy loss is employed to penalize false negatives more than false positives, since the region of interest is small.
AB - The ultrasound imaging modality is a popular complementary technique for diagnosing breast cancer. A standardized reporting process, the Breast Imaging Reporting and Data System (BI-RADS), is used to categorize breast cancer. The BI-RADS scale relies on several lesion features derived from ultrasound images, which makes the quality of the diagnosis highly dependent on the experience of the radiologist. Radiologists use Computer-Aided Diagnosis (CAD) systems to help detect lesions. The accuracy of a CAD system depends greatly on its segmentation stage. To increase the reliability of the diagnosis, we propose a solution based on a densely connected deep convolutional neural network and attention gates, called Attention U-DenseNet. Attention U-DenseNet is an architecture for semantic segmentation of lesions in Breast Ultrasound (BUS) images, built on U-Net, DenseNet, and attention gates. The convolutional layers of the U-Net are made densely connected using dense blocks, helping the network learn the complex patterns of BUS images, which are usually noisy and contaminated with speckle. This architecture (U-DenseNet) produced an F-score of 0.63, compared with an F-score of 0.49 for the U-Net model. Furthermore, to localize the segmentation by learning salient features, attention gates are added to the U-DenseNet architecture (Attention U-DenseNet). Attention U-DenseNet performed even better than U-DenseNet, improving the F-score to 0.75. Finally, a per-image regularized binary cross-entropy loss is employed to penalize false negatives more than false positives, since the region of interest is small.
UR - http://www.scopus.com/inward/record.url?scp=85124137803&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85124137803&partnerID=8YFLogxK
U2 - 10.1109/LAUS53676.2021.9639178
DO - 10.1109/LAUS53676.2021.9639178
M3 - Conference contribution
AN - SCOPUS:85124137803
T3 - LAUS 2021 - 2021 IEEE UFFC Latin America Ultrasonics Symposium, Proceedings
BT - LAUS 2021 - 2021 IEEE UFFC Latin America Ultrasonics Symposium, Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2021 IEEE UFFC Latin America Ultrasonics Symposium, LAUS 2021
Y2 - 4 October 2021 through 5 October 2021
ER -
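
N1 - Illustrative note (not part of the original record): the abstract mentions a per-image regularized binary cross-entropy that penalizes false negatives more than false positives. A minimal sketch of one way to realize such a loss is given below, assuming PyTorch; the weighting scheme and the fn_weight parameter are assumptions for illustration, not the authors' implementation.

# Hypothetical sketch only: per-image weighted binary cross-entropy that
# up-weights lesion (positive) pixels so missed lesions (false negatives)
# cost more than spurious detections (false positives).
import torch
import torch.nn.functional as F

def per_image_weighted_bce(logits: torch.Tensor,
                           targets: torch.Tensor,
                           fn_weight: float = 5.0) -> torch.Tensor:
    """logits, targets: tensors of shape (batch, 1, H, W); targets in {0, 1}.
    fn_weight is an assumed hyperparameter controlling the FN penalty."""
    # pos_weight > 1 scales the loss on positive pixels, penalizing FNs more.
    pos_weight = torch.tensor(fn_weight, device=logits.device)
    # Per-pixel BCE with logits, no reduction yet.
    loss = F.binary_cross_entropy_with_logits(
        logits, targets.float(), pos_weight=pos_weight, reduction="none")
    # Average within each image first, then across the batch, so images
    # with small regions of interest are not swamped by large ones.
    per_image = loss.flatten(start_dim=1).mean(dim=1)
    return per_image.mean()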