Slice-by-slice deep learning aided oropharyngeal cancer segmentation with adaptive thresholding for spatial uncertainty on FDG PET and CT images. (arXiv:2207.01623v1 [eess.IV])

Tumor segmentation is a fundamental step for radiotherapy treatment planning.
To define an accurate segmentation of the primary tumor (GTVp) of oropharyngeal
cancer (OPC) patients, simultaneous assessment of different image modalities is
needed, and each image volume is explored slice-by-slice from different
orientations. Moreover, the fixed boundary of a manual segmentation neglects the
spatial uncertainty known to occur in tumor delineation. This study proposes a
novel automatic deep learning (DL) model to assist radiation oncologists in a
slice-by-slice adaptive GTVp segmentation on registered FDG PET/CT images. We
included 138 OPC patients treated with (chemo)radiation in our institute. Our
DL framework exploits both inter- and intra-slice context. Sequences of 3
consecutive 2D slices of concatenated FDG PET/CT images and GTVp contours were
used as input. A 3-fold cross-validation was performed three times, training on
sequences extracted from the Axial (A), Sagittal (S), and Coronal (C) plane of
113 patients. Since consecutive sequences in a volume contain overlapping
slices, each slice resulted in three outcome predictions that were averaged. In
the A, S, and C planes, the output shows areas with different predicted tumor
probabilities. The performance of the models was assessed on 25 patients
at different probability thresholds using the mean Dice Similarity Coefficient
(DSC). Predictions were the closest to the ground truth at a probability
threshold of 0.9 (DSC of 0.70 in the A, 0.77 in the S, and 0.80 in the C
plane). The promising results of the proposed DL model show that the
probability maps on registered FDG PET/CT images could guide radiation
oncologists in a slice-by-slice adaptive GTVp segmentation.
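
As a minimal sketch of the inference procedure described above, the Python snippet below slides a window of 3 consecutive slices over a volume, averages the overlapping per-slice predictions, and thresholds the resulting probability map before computing a DSC. The `predict_sequence` stand-in, the array shapes, and the toy data are assumptions for illustration only, not the authors' implementation; just the 3-slice window, the averaging of overlapping outputs, the probability thresholding, and the DSC evaluation follow the abstract.

```python
import numpy as np

def predict_sequence(pet_ct_slices):
    """Hypothetical stand-in for the trained 2D model: takes a sequence of 3
    consecutive concatenated PET/CT slices (3, H, W, channels) and returns a
    per-slice tumor probability map (3, H, W). Here it returns random values
    so the sketch runs end to end."""
    depth, height, width, _channels = pet_ct_slices.shape
    return np.random.rand(depth, height, width)

def slicewise_probability_map(pet_ct_volume):
    """Slide a window of 3 consecutive slices over one orientation of the
    volume, predict each sequence, and average the overlapping per-slice
    outputs, so interior slices receive the mean of 3 predictions."""
    depth, height, width, _channels = pet_ct_volume.shape
    prob_sum = np.zeros((depth, height, width))
    counts = np.zeros(depth)
    for start in range(depth - 2):           # all sequences of 3 consecutive slices
        seq_probs = predict_sequence(pet_ct_volume[start:start + 3])
        prob_sum[start:start + 3] += seq_probs
        counts[start:start + 3] += 1
    return prob_sum / counts[:, None, None]

def dice(pred_mask, gt_mask, eps=1e-8):
    """Dice Similarity Coefficient between two binary masks."""
    intersection = np.logical_and(pred_mask, gt_mask).sum()
    return 2.0 * intersection / (pred_mask.sum() + gt_mask.sum() + eps)

# Toy volume: (depth, H, W, 2) with PET and CT as channels, plus a box-shaped ground truth.
volume = np.random.rand(16, 64, 64, 2)
ground_truth = np.zeros((16, 64, 64), dtype=bool)
ground_truth[6:10, 20:40, 20:40] = True

probability_map = slicewise_probability_map(volume)
for threshold in (0.5, 0.7, 0.9):            # evaluate the map at several probability thresholds
    predicted_mask = probability_map >= threshold
    print(f"threshold {threshold}: DSC = {dice(predicted_mask, ground_truth):.3f}")
```

Averaging the overlapping window outputs gives each slice a consensus of three predictions, and sweeping the threshold on the resulting map is what lets the delineation be tightened or relaxed; in the study, a threshold of 0.9 yielded the predictions closest to the ground truth.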

Source: https://arxiv.org/abs/2207.01623
