Feature Importance to Explain Multimodal Prediction Models: A Clinical Use Case

Jorn-Jan van de Beld*, Shreyasi Pathak, Jeroen Geerdink, Johannes H. Hegeman, Christin Seifert

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

1 Citation (Scopus)
47 Downloads (Pure)

Abstract

Surgery to treat elderly hip fracture patients may cause complications that can lead to early mortality. An early warning system for complications could prompt clinicians to monitor high-risk patients more carefully and address potential complications early, or to inform the patient. In this work, we develop a multimodal deep-learning model for post-operative mortality prediction using pre-operative and per-operative data from elderly hip fracture patients. Specifically, the pre-operative data comprise static patient data and hip and chest images acquired before surgery, and the per-operative data comprise vital signals and medications administered during surgery. We extract features from the image modalities using a ResNet and from the vital signals using an LSTM. Explainable model outcomes are essential for clinical applicability; therefore, we compute Shapley values to explain the predictions of our multimodal black-box model. We find that i) Shapley values can be used to estimate the relative contribution of each modality both locally and globally, and ii) a modified version of the chain rule can be used to propagate Shapley values through a sequence of models, supporting interpretable local explanations. Our findings imply that a multimodal combination of black-box models can be explained by propagating Shapley values through the model sequence.
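
The sketch below illustrates the general idea of per-modality Shapley attributions described in the abstract; it is not the authors' implementation. Each modality embedding is treated as one "player", absent modalities are replaced by a baseline, and exact Shapley values are computed by enumerating coalitions. The modality names, embedding sizes, the stand-in fusion_model, and the zero baseline are all illustrative assumptions.

# Minimal sketch (assumptions, not the paper's code): exact Shapley values
# over modalities for one patient's prediction.
from itertools import combinations
from math import factorial
import numpy as np

rng = np.random.default_rng(0)

MODALITIES = ["static", "hip_img", "chest_img", "vitals", "medications"]
DIMS = {"static": 8, "hip_img": 16, "chest_img": 16, "vitals": 12, "medications": 6}

# Stand-in for the trained fusion classifier: maps concatenated modality
# embeddings to a mortality risk score (here a random linear model + sigmoid).
W = rng.normal(size=sum(DIMS.values()))
def fusion_model(embeddings: dict) -> float:
    x = np.concatenate([embeddings[m] for m in MODALITIES])
    return float(1.0 / (1.0 + np.exp(-W @ x)))

# One patient's modality embeddings and a baseline (e.g., a training-set mean;
# zeros here for illustration only).
patient = {m: rng.normal(size=d) for m, d in DIMS.items()}
baseline = {m: np.zeros(d) for m, d in DIMS.items()}

def value(coalition: frozenset) -> float:
    """Model output when only modalities in `coalition` are 'present'."""
    mixed = {m: (patient[m] if m in coalition else baseline[m]) for m in MODALITIES}
    return fusion_model(mixed)

def shapley(modality: str) -> float:
    """Exact Shapley value: weighted average marginal contribution over coalitions."""
    others = [m for m in MODALITIES if m != modality]
    n = len(MODALITIES)
    phi = 0.0
    for k in range(len(others) + 1):
        for coal in combinations(others, k):
            s = frozenset(coal)
            weight = factorial(k) * factorial(n - k - 1) / factorial(n)
            phi += weight * (value(s | {modality}) - value(s))
    return phi

contributions = {m: shapley(m) for m in MODALITIES}
print(contributions)  # per-modality contributions; they sum to f(patient) - f(baseline)

With only a handful of modalities, exact enumeration of coalitions is feasible; for feature-level attributions inside each modality, a sampling-based Shapley approximation would typically be used instead.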

Original language: English
Title of host publication: Explainable Artificial Intelligence
Subtitle of host publication: Second World Conference, xAI 2024, Valletta, Malta, July 17–19, 2024, Proceedings
Editors: Luca Longo, Sebastian Lapuschkin, Christin Seifert
Place of publication: Cham, Switzerland
Publisher: Springer
Pages: 84-101
Number of pages: 18
Volume: Part IV
ISBN (Electronic): 978-3-031-63803-9
ISBN (Print): 978-3-031-63802-2
DOIs
Publication status: Published - 2024
Event: 2nd World Conference on Explainable Artificial Intelligence, xAI 2024 - Valletta, Malta
Duration: 17 Jul 2024 – 19 Jul 2024
Conference number: 2
https://xaiworldconference.com/2024/

Publication series

Name: Communications in Computer and Information Science
Publisher: Springer
Volume: 2156
ISSN (Print): 1865-0929
ISSN (Electronic): 1865-0937

Conference

Conference: 2nd World Conference on Explainable Artificial Intelligence, xAI 2024
Abbreviated title: xAI
Country/Territory: Malta
City: Valletta
Period: 17/07/24 – 19/07/24
Internet address: https://xaiworldconference.com/2024/

Keywords

  • 2024 OA procedure
  • Hip fractures
  • Mortality prediction
  • Multimodal machine learning
  • Clinical decision support
