Article

Title

Provision and evaluation of explanations within an automated planning-based approach to solving the multimorbidity problem

Authors

[ 1 ] Institute of Computing Science, Faculty of Computing and Telecommunications, Poznań University of Technology | [ P ] employee

Scientific discipline (Law 2.0)

[2.3] Information and communication technology

Year of publication

2024

Published in

Journal of Biomedical Informatics

Journal year: 2024 | Journal volume: vol. 156

Article type

scientific article

Publication language

English

Keywords
EN
  • Explainability
  • Automated planning
  • Multimorbidity
  • Clinical decision support
Abstract

EN Background: The multimorbidity problem involves the identification and mitigation of adverse interactions that occur when multiple computer-interpretable guidelines are applied concurrently to develop a treatment plan for a patient diagnosed with multiple diseases. Solving this problem requires decision support approaches that are difficult for physicians to comprehend. As such, the rationale for treatment plans generated by these approaches needs to be provided.

Objective: To develop an explainability component for an automated planning-based approach to the multimorbidity problem, and to assess the fidelity and interpretability of generated explanations using a clinical case study.

Methods: The explainability component leverages the task-network model for representing computer-interpretable guidelines. It generates post-hoc explanations composed of three aspects that answer why specific clinical actions are in a treatment plan, why specific revisions were applied, and how factors like medication cost, patient's adherence, etc. influence the selection of specific actions. The explainability component is implemented as part of MitPlan, where we revised our planning-based approach to support explainability. We developed an evaluation instrument based on the system causability scale and other vetted surveys to evaluate the fidelity and interpretability of its explanations using a two-dimensional comparison study design.

Results: The explainability component was implemented for MitPlan and tested in the context of a clinical case study. The fidelity and interpretability of the generated explanations were assessed using a physician-focused evaluation study involving 21 participants from two different specialties and two levels of experience. Results show that explanations provided by the explainability component in MitPlan are of acceptable fidelity and interpretability, and that the clinical justification of the actions in a treatment plan is important to physicians.

Conclusion: We created an explainability component that enriches an automated planning-based approach to solving the multimorbidity problem with meaningful explanations for actions in a treatment plan. This component relies on the task-network model to represent computer-interpretable guidelines and as such can be ported to other approaches that also use the task-network model representation. Our evaluation study demonstrated that explanations that support a physician's understanding of the clinical reasons for the actions in a treatment plan are useful and important.
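To illustrate the three explanation aspects named in the abstract (why an action is in the plan, why a revision was applied, and how preference factors influenced selection), the sketch below shows one possible way to attach such post-hoc explanations to actions in a task-network-derived treatment plan. This is a minimal, hypothetical Python sketch, not MitPlan's actual code or API; all class names, field names, and the clinical example are illustrative assumptions.

```python
# Illustrative sketch only (hypothetical names, not MitPlan's implementation):
# a minimal data model for post-hoc explanations attached to plan actions.
from __future__ import annotations

from dataclasses import dataclass, field
from typing import Dict, Optional


@dataclass
class ActionExplanation:
    """The three explanation aspects described in the abstract."""
    why_in_plan: str                      # why this clinical action appears in the plan
    why_revised: Optional[str] = None     # why a revision was applied, if any
    preference_factors: Dict[str, str] = field(default_factory=dict)
    # e.g. {"medication cost": "...", "patient adherence": "..."}


@dataclass
class PlanAction:
    name: str                             # task-network action, e.g. "prescribe PPI"
    source_guideline: str                 # computer-interpretable guideline it comes from
    explanation: Optional[ActionExplanation] = None


# Hypothetical usage: a planner would populate these records while searching for
# a mitigated plan, so every action carries its own rationale.
action = PlanAction(
    name="prescribe proton pump inhibitor",
    source_guideline="gastroprotection guideline",
    explanation=ActionExplanation(
        why_in_plan="mitigates an adverse drug-drug interaction identified across guidelines",
        why_revised="introduced by a revision operator resolving the interaction",
        preference_factors={"medication cost": "generic formulation preferred"},
    ),
)
print(action.explanation.why_in_plan)
```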

Date of online publication

01.07.2024

Pages (from - to)

104681-1 - 104681-25

DOI

10.1016/j.jbi.2024.104681

URL

https://www.sciencedirect.com/science/article/pii/S1532046424000996?via%3Dihub

Comments

Article Number: 104681

Ministry points / journal

100

Impact Factor

4 [List 2023]
