Paper Title

An Unsupervised Reconstruction Method For Low-Dose CT Using Deep Generative Regularization Prior

Paper Authors

Mehmet Ozan Unal, Metin Ertas, Isa Yildirim

Paper Abstract

Low-dose CT imaging requires reconstruction from noisy indirect measurements, which can be defined as an ill-posed linear inverse problem. In addition to the conventional FBP method in CT imaging, recent compressed-sensing-based methods exploit handcrafted priors, which are mostly simplistic and hard to determine. More recently, deep learning (DL) based methods have become popular in the medical imaging field. In CT imaging, DL-based methods try to learn a function that maps low-dose images to normal-dose images. Although the results of these methods are promising, their success mostly depends on the availability of massive, high-quality datasets. In this study, we propose a method that requires neither training data nor a learning process. Our method exploits the observation that deep convolutional neural networks (CNNs) generate structured patterns more easily than noise, so a randomly initialized generative neural network can serve as a prior to regularize the reconstruction. In the experiments, the proposed method is implemented with different loss function variants. Both analytical CT phantoms and real-world CT images are used, with different numbers of views. The conventional FBP method, a popular iterative method (SART), and TV-regularized SART are used for comparison. We demonstrate that our method, with its different loss function variants, outperforms the other methods both qualitatively and quantitatively.
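The core idea described in the abstract is a deep-image-prior-style reconstruction: a randomly initialized generative CNN is optimized so that the forward projection of its output matches the noisy measurements, with the network architecture itself acting as the regularizer. The sketch below illustrates this in PyTorch under stated assumptions; it is not the authors' implementation. The dense matrix `A` is a hypothetical stand-in for the CT projection (Radon transform), and the phantom, network architecture, and hyperparameters are illustrative choices.

```python
# Minimal deep-image-prior sketch for a linear inverse problem (assumed setup,
# not the paper's code). A randomly initialized CNN is fitted to noisy
# measurements through a fixed linear forward operator.
import torch
import torch.nn as nn

torch.manual_seed(0)

n = 64            # image is n x n
m = 90 * n        # simulated measurement count (e.g., 90 views x n bins)

# Hypothetical stand-in for the forward operator; in real CT this would be
# the Radon transform / projection matrix.
A = torch.randn(m, n * n) / n

# Simple square "phantom" and noisy indirect measurements y = A x + noise.
x_true = torch.zeros(n, n)
x_true[16:48, 16:48] = 1.0
y = A @ x_true.reshape(-1) + 0.05 * torch.randn(m)

# Randomly initialized generative CNN: maps a fixed noise code to an image.
class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(8, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1),
        )

    def forward(self, z):
        return self.net(z)

g = Generator()
z = torch.randn(1, 8, n, n)   # fixed random input, never optimized

# Optimize only the network weights against a measurement-domain data-fidelity
# loss; the CNN's structural bias toward patterns regularizes the solution.
opt = torch.optim.Adam(g.parameters(), lr=1e-3)
for it in range(2000):
    opt.zero_grad()
    x_hat = g(z).reshape(-1)
    loss = torch.mean((A @ x_hat - y) ** 2)
    loss.backward()
    opt.step()

print("final measurement loss:", loss.item())
```

In practice, deep-image-prior methods rely on early stopping: with enough iterations the network eventually fits the measurement noise as well, so the iteration count (or an added penalty in the loss) acts as an implicit regularization knob.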
