Paper Title
How Can We Develop Explainable Systems? Insights from a Literature Review and an Interview Study
Paper Authors
Paper Abstract
Quality aspects such as ethics, fairness, and transparency have proven essential for trustworthy software systems. Explainability has been identified not only as a means to achieve all three of these aspects in systems, but also as a way to foster users' trust. Despite this, research has only marginally focused on the activities and practices needed to develop explainable systems. To close this gap, we recommend six core activities and associated practices for the development of explainable systems, based on the results of a literature review and an interview study. First, we identified and summarized activities and corresponding practices in the literature. To complement these findings, we conducted interviews with 19 industry professionals who provided recommendations for the development process of explainable systems and reviewed the activities and practices based on their expertise and knowledge. We compared and combined the findings of the interviews and the literature review to recommend the activities and assess their applicability in industry. Our findings demonstrate that the activities and practices are not only feasible, but can also be integrated into different development processes.