Abstract
Path integration methods generate attributions by integrating model gradients along a trajectory from a baseline to the input, and have proven highly effective in explainability research. Although several types of baselines for the path integration process have been explored in the literature, no consensus has emerged on a single best choice. This work examines how different baseline distributions perform on explainability metrics and proposes a probabilistic path integration approach in which the baseline distribution is modeled as a mixture of distributions, learned for each combination of model architecture and explanation metric. Extensive evaluations on various model architectures show that our method outperforms state-of-the-art explanation methods across multiple metrics.
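As background for the abstract: the canonical path-integration attributor is Integrated Gradients, which integrates gradients along the straight line from a baseline to the input. The sketch below is a minimal NumPy illustration with a toy analytic model and a fixed zero baseline (not the paper's learned mixture-of-distributions baselines); it approximates the path integral with a midpoint Riemann sum and checks the completeness property, i.e. that the attributions sum to `f(x) - f(baseline)`.

```python
import numpy as np

def integrated_gradients(grad_f, x, baseline, steps=100):
    """Approximate path-integrated attributions along the straight line
    from `baseline` to `x` using a midpoint Riemann sum."""
    alphas = (np.arange(steps) + 0.5) / steps
    total = np.zeros_like(x, dtype=float)
    for a in alphas:
        point = baseline + a * (x - baseline)  # point on the integration path
        total += grad_f(point)
    # (x - baseline) scales the averaged gradients, per the IG formula.
    return (x - baseline) * total / steps

# Toy differentiable model with an analytic gradient (hypothetical example).
f = lambda x: x[0] ** 2 + 3.0 * x[1]
grad_f = lambda x: np.array([2.0 * x[0], 3.0])

x = np.array([1.0, 2.0])
baseline = np.zeros(2)  # fixed zero baseline; the paper instead samples baselines
attr = integrated_gradients(grad_f, x, baseline)

# Completeness check: attributions sum to f(x) - f(baseline).
print(attr, attr.sum(), f(x) - f(baseline))
```

The baseline-sensitivity this sketch makes visible (different baselines yield different attributions for the same input) is exactly the degree of freedom the paper turns into a learned distribution.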
| Original language | English |
|---|---|
| Title of host publication | CIKM 2024 - Proceedings of the 33rd ACM International Conference on Information and Knowledge Management |
| Publisher | Association for Computing Machinery |
| Pages | 570-580 |
| Number of pages | 11 |
| ISBN (Electronic) | 9798400704369 |
| DOIs | |
| State | Published - 21 Oct 2024 |
| Event | 33rd ACM International Conference on Information and Knowledge Management, CIKM 2024 - Boise, United States. Duration: 21 Oct 2024 → 25 Oct 2024 |
Publication series
| Name | International Conference on Information and Knowledge Management, Proceedings |
|---|---|
| ISSN (Print) | 2155-0751 |
Conference
| Conference | 33rd ACM International Conference on Information and Knowledge Management, CIKM 2024 |
|---|---|
| Country/Territory | United States |
| City | Boise |
| Period | 21/10/24 → 25/10/24 |
Bibliographical note
Publisher Copyright: © 2024 ACM.
Keywords
- computer vision
- deep learning
- explainable AI