Deep Learning in ex-vivo Lung Cancer Discrimination using Fluorescence Lifetime Endomicroscopic Images (2020).


Wang Q, Hopgood JR, Finlayson N, Williams GO, Fernandes S, Williams E, Akram A, Dhaliwal K, Vallejo M.



Fluorescence lifetime is effective in discriminating cancerous from normal tissue, but conventional discrimination methods rely primarily on statistical approaches in combination with prior knowledge. This paper investigates the application of deep convolutional neural networks (CNNs) for automatic differentiation of ex-vivo human lung cancer via fluorescence lifetime imaging. Around 70,000 fluorescence images from ex-vivo lung tissue of 14 patients were collected by a custom fibre-based fluorescence lifetime imaging endomicroscope. Five state-of-the-art CNN models, namely ResNet, ResNeXt, Inception, Xception, and DenseNet, were trained and tested to derive quantitative results, using accuracy, precision, recall, and the area under the receiver operating characteristic curve (AUC) as metrics. The CNNs were first evaluated on lifetime images alone. Since fluorescence lifetime is independent of intensity, further experiments were conducted by stacking intensity and lifetime images together as the input to the CNNs. As the original CNNs were implemented for RGB images, two strategies were applied: one retained the CNNs unchanged by placing the intensity and lifetime images in two of the three channels and leaving the remaining channel blank; the other adapted the CNNs to accept two-channel input. Quantitative results demonstrate that the selected CNNs are considerably superior to conventional machine learning algorithms, and that combining intensity and lifetime images yields a noticeable performance gain over using lifetime images alone. In addition, the CNNs with intensity-lifetime RGB input are comparable to the modified two-channel CNNs with intensity-lifetime two-channel input in accuracy and AUC, but significantly better in precision and recall.
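The two input strategies described above can be sketched as follows. This is a minimal illustration of the channel-stacking idea, not the authors' actual preprocessing pipeline; the image size and normalisation are assumptions.

```python
import numpy as np

def to_rgb_input(intensity, lifetime):
    """Strategy 1: keep a stock RGB CNN unchanged by placing the
    intensity and lifetime images in two channels and leaving the
    third channel blank (all zeros)."""
    blank = np.zeros_like(intensity)
    return np.stack([intensity, lifetime, blank], axis=-1)

def to_two_channel_input(intensity, lifetime):
    """Strategy 2: build a two-channel image for a CNN whose first
    convolutional layer has been modified to accept 2 input channels."""
    return np.stack([intensity, lifetime], axis=-1)

# Hypothetical 128x128 intensity and lifetime frames (values in [0, 1])
intensity = np.random.rand(128, 128).astype(np.float32)
lifetime = np.random.rand(128, 128).astype(np.float32)

rgb = to_rgb_input(intensity, lifetime)              # shape (128, 128, 3)
two_ch = to_two_channel_input(intensity, lifetime)   # shape (128, 128, 2)
```

The first strategy trades a wasted channel for the convenience of reusing pretrained RGB architectures as-is; the second requires editing only the first convolutional layer of each network.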


Posted on June 5, 2023
