Paper Title
Rapid model transfer for medical image segmentation via iterative human-in-the-loop update: from labelled public to unlabelled clinical datasets for multi-organ segmentation in CT
Paper Authors
Abstract
Despite the remarkable success of deep learning in medical image analysis, how to rapidly transfer AI models from one dataset to another for clinical applications remains under-explored. This paper presents a novel and generic human-in-the-loop scheme for efficiently transferring a segmentation model from a small-scale labelled dataset to a larger-scale unlabelled dataset for multi-organ segmentation in CT. To achieve this, we propose an igniter network that learns from the small-scale labelled dataset and generates coarse annotations to start the process of human-machine interaction. We then employ a sustainer network for the larger-scale dataset and iteratively update it on the newly annotated data. Moreover, we propose a flexible labelling strategy for the annotator to reduce the initial annotation workload. The model performance and the per-subject annotation time cost, evaluated on our private dataset, are reported and analysed. The results show that our scheme not only improves the Dice score by 19.7%, but also reduces the manual labelling time from 13.87 min to 1.51 min per CT volume during the model transfer, demonstrating promising potential for clinical use.
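The igniter/sustainer workflow described in the abstract can be sketched as a simple iterative loop. The following is a hypothetical minimal sketch, not the paper's implementation: the segmentation networks are stubbed out as majority-label classifiers, the human annotator is simulated by an `oracle` callback, and all function and variable names are assumptions introduced purely for illustration.

```python
from collections import Counter


def train_stub_model(labelled_pairs):
    """Stand-in for training a segmentation network.

    In the paper this would fit a CNN segmenter; here the "model" just
    predicts the majority label it saw during training, so the loop
    structure stays visible without any deep-learning dependencies.
    """
    majority = Counter(label for _, label in labelled_pairs).most_common(1)[0][0]
    return lambda image: majority


def human_refine(image, coarse_label, oracle):
    """Annotator corrects the model's coarse proposal.

    The flexible labelling strategy in the paper reduces this workload;
    here the human is simulated by an oracle returning the true label.
    """
    return oracle(image)


def transfer(labelled_small, unlabelled_large, oracle, rounds=3):
    """Iterative human-in-the-loop transfer from labelled to unlabelled data.

    1. Train the igniter on the small labelled dataset.
    2. In each round, the current model proposes coarse annotations for a
       batch of unlabelled images, the human refines them, and the
       sustainer is retrained on all annotations collected so far.
    """
    sustainer = train_stub_model(labelled_small)  # igniter starts the loop
    annotated = []
    batch = -(-len(unlabelled_large) // rounds)   # ceil division
    for r in range(rounds):
        for image in unlabelled_large[r * batch:(r + 1) * batch]:
            coarse = sustainer(image)             # coarse machine annotation
            annotated.append((image, human_refine(image, coarse, oracle)))
        if annotated:
            sustainer = train_stub_model(annotated)  # iterative update
    return sustainer, annotated
```

As the annotated pool grows, each round's coarse proposals come from a model trained on more of the target-domain data, which is what shrinks the per-volume correction time in the paper's setting.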