Paper Title

SODEN: A Scalable Continuous-Time Survival Model through Ordinary Differential Equation Networks

Authors

Weijing Tang, Jiaqi Ma, Qiaozhu Mei, Ji Zhu

Abstract

In this paper, we propose a flexible model for survival analysis using neural networks along with scalable optimization algorithms. One key technical challenge for directly applying maximum likelihood estimation (MLE) to censored data is that evaluating the objective function and its gradients with respect to model parameters requires the calculation of integrals. To address this challenge, we recognize that the MLE for censored data can be viewed as a differential-equation constrained optimization problem, a novel perspective. Following this connection, we model the distribution of event time through an ordinary differential equation and utilize efficient ODE solvers and adjoint sensitivity analysis to numerically evaluate the likelihood and the gradients. Using this approach, we are able to 1) provide a broad family of continuous-time survival distributions without strong structural assumptions, 2) obtain powerful feature representations using neural networks, and 3) allow efficient estimation of the model in large-scale applications using stochastic gradient descent. Through both simulation studies and real-world data examples, we demonstrate the effectiveness of the proposed method in comparison to existing state-of-the-art deep learning survival analysis models. The implementation of the proposed SODEN approach has been made publicly available at https://github.com/jiaqima/SODEN.
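To make the modeling idea concrete, below is a minimal sketch (not the authors' released implementation, which is available at the repository above) of how one could parameterize the cumulative hazard Lambda(t | x) as the solution of a learned ODE and train it by maximum likelihood on right-censored data. It assumes PyTorch and the torchdiffeq package; the network architecture, the per-sample time rescaling, and names such as HazardODE and neg_log_likelihood are illustrative choices, not those of SODEN.

```python
# Minimal sketch, assuming PyTorch + torchdiffeq are installed.
# Model: dLambda/dt = f_theta(Lambda, t, x) with a positive (softplus) output,
# so Lambda(t | x) is obtained by solving the ODE from Lambda(0 | x) = 0.
import torch
import torch.nn as nn
from torchdiffeq import odeint_adjoint as odeint  # gradients via adjoint sensitivity analysis


class HazardODE(nn.Module):
    """Hypothetical ODE network: dLambda/ds = t_obs * f_theta(Lambda, s * t_obs, x)."""

    def __init__(self, x_dim, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(x_dim + 2, hidden), nn.Tanh(),
            nn.Linear(hidden, 1), nn.Softplus(),   # keeps the hazard non-negative
        )
        self.x = None       # per-batch covariates, set before each solve
        self.t_obs = None   # per-sample observed or censoring times, shape (B, 1)

    def forward(self, s, Lam):
        t = s * self.t_obs                          # map rescaled time s in [0, 1] back to real time
        inp = torch.cat([Lam, t, self.x], dim=-1)
        return self.t_obs * self.net(inp)           # chain-rule factor from the time rescaling


def neg_log_likelihood(ode, x, t_obs, event):
    """Censored-data NLL: sum_i Lambda(t_i | x_i) - delta_i * log lambda(t_i | x_i).

    x: (B, x_dim) covariates; t_obs: (B, 1) times; event: (B, 1) indicators (1 = event, 0 = censored).
    """
    ode.x, ode.t_obs = x, t_obs
    Lam0 = torch.zeros(x.shape[0], 1)               # Lambda(0 | x) = 0
    s_grid = torch.tensor([0.0, 1.0])
    Lam_T = odeint(ode, Lam0, s_grid)[-1]           # Lambda at each sample's observed time
    hazard_T = ode.net(torch.cat([Lam_T, t_obs, x], dim=-1))   # lambda(t_i | x_i) = f_theta at t_i
    return (Lam_T - event * torch.log(hazard_T + 1e-8)).sum()
```

In this sketch, rescaling time so that every sample's observed time maps to s = 1 lets the whole batch share a single integration interval, and odeint_adjoint backpropagates through the solve with adjoint sensitivity analysis instead of storing intermediate solver states; both are common devices for batched neural-ODE training rather than a description of SODEN's exact mechanics.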
