Navigating explanations

AI Model
Agnostic, Neural Network, Reinforcement Learning, Unknown
AI Task
Behaviour Learning, Image Classification, Multi-class Classification
Application Domain
Artificial Intelligence and Robotics, Systems, Education, General, Health, Natural Language Processing
Type of Users
AI experts, Generic, Non-experts
Explanation Modality
Text, Visual
XAI Model
Exemplars, Features Importance, None
Related Papers

Users should have freedom and control when accessing and navigating explanations; the interface can support this with UI controls such as “next”, “previous”, or “exit”.
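
As a minimal sketch, these controls can be wired to a small stepper that tracks the current explanation step. The `ExplanationNavigator` class and the step shape below are hypothetical illustrations, not a prescribed API.

```typescript
// Minimal sketch of step-based explanation navigation (hypothetical API):
// "next" / "previous" move through explanation steps, "exit" leaves the flow.
type ExplanationStep = { title: string; body: string };

class ExplanationNavigator {
  private index = 0;

  constructor(private steps: ExplanationStep[], private onExit: () => void) {}

  current(): ExplanationStep {
    return this.steps[this.index];
  }

  next(): void {
    // Clamp at the last step so "next" never runs past the explanation.
    this.index = Math.min(this.index + 1, this.steps.length - 1);
  }

  previous(): void {
    this.index = Math.max(this.index - 1, 0);
  }

  exit(): void {
    this.onExit();
  }
}

// Usage: bind these methods to the "next", "previous", and "exit" buttons.
const nav = new ExplanationNavigator(
  [
    { title: "Prediction", body: "The model predicts class A." },
    { title: "Key features", body: "Feature x1 contributed most." },
  ],
  () => console.log("Explanation closed"),
);
nav.next();
console.log(nav.current().title); // "Key features"
```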

Such controls can unlock visual step-by-step explanations, which can be more appealing, relatable, and easy to understand [10.1109/VL/HCC51201.2021.9576440]. Explanations can be further improved by combining them with animations, such as fluid transitions between different-level views, to increase engagement, comprehension, and enjoyment [10.1109/TVCG.2020.3030418].
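
A cross-fade between a high-level overview and a detailed view is one way to realize such fluid transitions. The sketch below assumes a browser DOM; the element ids and timing are illustrative assumptions, not from the cited work.

```typescript
// Sketch of a fluid transition between an overview view and a detail view,
// using a CSS opacity transition (element ids and duration are assumptions).
function transitionTo(fromId: string, toId: string, durationMs = 300): void {
  const from = document.getElementById(fromId);
  const to = document.getElementById(toId);
  if (!from || !to) return;

  from.style.transition = `opacity ${durationMs}ms ease`;
  to.style.transition = `opacity ${durationMs}ms ease`;

  from.style.opacity = "0"; // fade the current level out
  setTimeout(() => {
    from.style.display = "none";
    to.style.opacity = "0";
    to.style.display = "block";
    void to.offsetHeight; // force a reflow so the fade-in below animates
    to.style.opacity = "1"; // fade the next level in
  }, durationMs);
}

// Usage: move from the high-level summary to the per-feature detail view.
transitionTo("overview-view", "detail-view");
```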

Presenting explanations in multiple “pages” or segments allows users to control the depth of information they receive, aligning with their cognitive needs and preventing information overload (“Incremental Information Disclosure”) [10.1145/3627043.3659566].
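
One way to sketch incremental information disclosure is to order explanation pages by depth and reveal one more page per user request, never everything at once. The `IncrementalExplanation` class and page shape below are hypothetical.

```typescript
// Sketch of incremental information disclosure (names are assumptions):
// each "page" adds depth, and the user decides how far to go.
interface ExplanationPage {
  depth: number; // 0 = one-line summary, higher = more detail
  content: string;
}

class IncrementalExplanation {
  private shown = 1; // start with the shallowest page only

  constructor(private pages: ExplanationPage[]) {
    // Ensure pages are ordered from shallow to deep.
    this.pages.sort((a, b) => a.depth - b.depth);
  }

  visible(): ExplanationPage[] {
    return this.pages.slice(0, this.shown);
  }

  showMore(): void {
    // Reveal one more level of detail on request, never all at once.
    this.shown = Math.min(this.shown + 1, this.pages.length);
  }
}

// Usage: deeper pages appear only when the user asks for them.
const exp = new IncrementalExplanation([
  { depth: 0, content: "Classified as 'cat' (92% confidence)." },
  { depth: 1, content: "Ear and whisker regions weighed most." },
  { depth: 2, content: "Full per-pixel saliency map." },
]);
exp.showMore();
console.log(exp.visible().map((p) => p.content)); // first two pages
```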

Navigation can also happen in the feature space to enable “what-if” analyses [10.1109/PacificVis48177.2020.7090, 10.1109/PacificVis53943.2022.00020], especially when datasets are large [10.1109/TVCG.2022.3209384].
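
A minimal what-if probe can copy an instance, override a single feature, and compare the model's predictions before and after. The `predict` function, feature names, and toy model below are stand-ins for whatever interface the real system exposes.

```typescript
// Sketch of a "what-if" probe over the feature space: copy an instance,
// override one feature, and compare predictions. `predict` is an assumed
// stand-in for the deployed model's prediction interface.
type Instance = Record<string, number>;
type Predict = (x: Instance) => number;

function whatIf(
  predict: Predict,
  instance: Instance,
  feature: string,
  newValue: number,
): { before: number; after: number } {
  const perturbed = { ...instance, [feature]: newValue };
  return { before: predict(instance), after: predict(perturbed) };
}

// Usage with a toy linear model standing in for the real one.
const toyModel: Predict = (x) => 0.3 * x.age + 0.7 * x.income;
const result = whatIf(toyModel, { age: 40, income: 50 }, "income", 80);
console.log(result); // { before: 47, after: 68 }
```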