TY - JOUR
T1 - Estimating CT from MR Abdominal Images Using Novel Generative Adversarial Networks
AU - Qian, Pengjiang
AU - Xu, Ke
AU - Wang, Tingyu
AU - Zheng, Qiankun
AU - Yang, Huan
AU - Baydoun, Atallah
AU - Zhu, Junqing
AU - Traughber, Bryan
AU - Muzic, Raymond F.
N1 - Publisher Copyright:
© 2020, Springer Nature B.V.
PY - 2020/6/1
Y1 - 2020/6/1
N2 - Computed tomography (CT) plays key roles in radiotherapy treatment planning and PET attenuation correction (AC). Magnetic resonance (MR) imaging offers better soft-tissue contrast than CT and involves no ionizing radiation, but it cannot directly provide the information about photon interactions with tissue that is needed for radiation treatment planning and AC. Therefore, estimating synthetic CT (sCT) images from corresponding MR images, thereby obviating CT scanning, is of great interest, but it can be particularly challenging in the abdomen owing to the range of tissue types and physiologic motion. For this purpose, inspired by deep learning, we design a novel generative adversarial network (GAN) model that combines ResNet, U-net, and an auxiliary classifier-augmented GAN (RU-ACGAN for short). The significance of our effort is three-fold: 1) The generator network of RU-ACGAN combines ResNet and U-net, rather than the U-net alone that is commonly used in existing conditional GANs; this has the potential to generate more accurate sCT than existing methods. 2) Adding a classifier to the discriminator network makes the training process of the proposed model more stable and thereby improves the robustness of sCT estimation. 3) Owing to this carefully designed architecture, RU-ACGAN can estimate high-quality sCT using only a limited quantity of training data. Experimental studies on paired MR-CT images from ten subjects indicate that the proposed RU-ACGAN model captures the underlying, non-linear mapping between MR and CT images and thus achieves better performance for abdominal sCT estimation than many other existing methods.
UR - http://www.scopus.com/inward/record.url?scp=85081900435&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85081900435&partnerID=8YFLogxK
U2 - 10.1007/s10723-020-09513-3
DO - 10.1007/s10723-020-09513-3
M3 - Article
AN - SCOPUS:85081900435
SN - 1570-7873
VL - 18
SP - 211
EP - 226
JO - Journal of Grid Computing
JF - Journal of Grid Computing
IS - 2
ER -