Paper Title


Sparsity promoting reconstructions via hierarchical prior models in diffuse optical tomography

Paper Authors

Anssi Manninen, Meghdoot Mozumder, Tanja Tarvainen, Andreas Hauptmann

Abstract


Diffuse optical tomography (DOT) is a severely ill-posed nonlinear inverse problem that seeks to estimate optical parameters from boundary measurements. In the Bayesian framework, the ill-posedness is diminished by incorporating {\em a priori} information on the optical parameters via the prior distribution. When the target is sparse or sharp-edged, common choices for the prior model are the non-differentiable total variation and $\ell^1$ priors. Alternatively, one can hierarchically extend the variances of a Gaussian prior to obtain differentiable sparsity-promoting priors. In this approach, the variances are treated as unknowns, allowing the estimation to locate the discontinuities. In this work, we formulate hierarchical prior models for the nonlinear DOT inverse problem using exponential, standard gamma, and inverse-gamma hyperpriors. Depending on the hyperprior and the hyperparameters, the hierarchical models promote different levels of sparsity and smoothness. To compute the MAP estimates, a previously proposed alternating algorithm is adapted to the nonlinear model. We then propose an approach based on the cumulative distribution function of the hyperpriors to select the hyperparameters. We evaluate the performance of the hyperpriors with numerical simulations and show that the hierarchical models can improve the localization, contrast, and edge sharpness of the reconstructions.
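To make the hierarchical construction concrete, below is a minimal numerical sketch of the alternating MAP scheme for a conditionally Gaussian prior with an inverse-gamma hyperprior on the pixel-wise variances. A toy linear forward model stands in for the nonlinear DOT operator, and the problem sizes, hyperparameter values `beta` and `theta0`, and sparse test target are all illustrative assumptions, not the paper's settings.

```python
import numpy as np

# Toy linear inverse problem y = A x + noise, standing in for the
# nonlinear DOT forward model (an assumption for illustration only).
rng = np.random.default_rng(0)
n, m = 40, 20
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[[5, 18, 30]] = [2.0, -1.5, 1.0]   # sparse target
sigma = 0.05
y = A @ x_true + sigma * rng.standard_normal(m)

# Hierarchical model: x_i | theta_i ~ N(0, theta_i),
# theta_i ~ InvGamma(beta, theta0); values chosen for illustration.
beta, theta0 = 1.0, 1e-4
theta = np.ones(n)                        # pixel-wise prior variances

for _ in range(50):
    # x-step: Gaussian MAP estimate with the variances held fixed
    # (a Tikhonov-type solve with a diagonal weight matrix).
    M = A.T @ A / sigma**2 + np.diag(1.0 / theta)
    x = np.linalg.solve(M, A.T @ y / sigma**2)
    # theta-step: closed-form minimizer of the joint negative log-posterior
    # x_i^2/(2 theta) + (beta + 3/2) log theta + theta0/theta in theta.
    theta = (x**2 / 2 + theta0) / (beta + 1.5)
```

Entries of `x` whose variances collapse toward `theta0 / (beta + 3/2)` are shrunk strongly toward zero, while entries supported by the data keep large variances, which is how the hierarchy promotes sparsity while every step stays differentiable.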
