Paper Title
The unreasonable effectiveness of Batch-Norm statistics in addressing catastrophic forgetting across medical institutions
Paper Authors
Paper Abstract
Model brittleness is a primary concern when deploying deep learning models in medical settings, owing to inter-institution variations, such as patient demographics, and intra-institution variations, such as multiple scanner types. While simply training on the combined datasets is fraught with data privacy limitations, fine-tuning the model on subsequent institutions after training it on the original institution results in a decrease in performance on the original dataset, a phenomenon called catastrophic forgetting. In this paper, we investigate the trade-off between model refinement and retention of previously learned knowledge, and subsequently address catastrophic forgetting for the assessment of mammographic breast density. More specifically, we propose a simple yet effective approach: adapting Elastic Weight Consolidation (EWC) using the global batch normalization (BN) statistics of the original dataset. The results of this study provide guidance for the deployment of clinical deep learning models where continuous learning is needed for domain expansion.
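To make the abstract's idea concrete, below is a minimal, hypothetical PyTorch sketch (not the authors' released code) of one way "EWC adapted with the original dataset's global BN statistics" could look: the BN running statistics estimated on the original institution are kept frozen during fine-tuning on a new institution, while an EWC-style quadratic penalty anchors the weights to their original values. The penalty strength `lam`, the placeholder importance weights, and the toy model are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch: fine-tune on a new institution while
# (1) freezing the original institution's BN running statistics and
# (2) adding an EWC-style quadratic penalty toward the original weights.
import torch
import torch.nn as nn

def freeze_bn_statistics(model: nn.Module) -> None:
    """Keep BN layers in eval mode so running_mean/running_var stay at the
    values estimated on the original institution's data."""
    for m in model.modules():
        if isinstance(m, nn.modules.batchnorm._BatchNorm):
            m.eval()          # forward uses stored stats; no stat updates

def ewc_penalty(model: nn.Module, anchor: dict, importance: dict,
                lam: float = 100.0) -> torch.Tensor:
    """Quadratic EWC penalty: (lam/2) * sum_i F_i * (theta_i - theta*_i)^2."""
    loss = torch.zeros((), device=next(model.parameters()).device)
    for name, p in model.named_parameters():
        if name in anchor:
            loss = loss + (importance[name] * (p - anchor[name]) ** 2).sum()
    return 0.5 * lam * loss

# --- usage on a toy model (stand-in for the mammographic density classifier) ---
model = nn.Sequential(nn.Conv2d(1, 8, 3), nn.BatchNorm2d(8), nn.ReLU(),
                      nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 4))
anchor = {n: p.detach().clone() for n, p in model.named_parameters()}
# Placeholder importance weights; in practice these would be estimated on the
# original dataset (e.g. a diagonal Fisher information approximation).
importance = {n: torch.ones_like(p) for n, p in model.named_parameters()}

freeze_bn_statistics(model)
opt = torch.optim.SGD(model.parameters(), lr=1e-3)
x, y = torch.randn(2, 1, 32, 32), torch.tensor([0, 1])  # dummy new-institution batch
for _ in range(3):  # stand-in for the fine-tuning loop on the new institution
    opt.zero_grad()
    task_loss = nn.functional.cross_entropy(model(x), y)
    loss = task_loss + ewc_penalty(model, anchor, importance)
    loss.backward()
    opt.step()
```

The key design choice illustrated here is that the BN statistics act as a cheap, privacy-preserving summary of the original institution's data distribution: they travel with the model checkpoint, so no raw images need to be shared when regularizing the fine-tuning step.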