Let the user ask questions to the system

AI Model: Neural Network, None, Probabilistic, Reinforcement Learning, White Box
AI Task: Behaviour Learning, Binary Classification, Regression, Text Classification
Application Domain: Artificial Intelligence and Robotics Systems, Finance/Economics, General, Health, Media and Communication
Type of Users: AI Experts, Domain Experts, Generic, Non-experts
Explanation Modality: Text, Visual
XAI Model: Counter-exemplars/factual, Exemplars, Features Importance

The system should let the user converse with it, posing questions as one would to chatbots and current personal-assistant devices such as Siri or Alexa [10.1016/j.artint.2021.103507, 10.1109/PacificVis53943.2022.00020].

This can also be done in the form of what-if questions that show the user counterfactual examples [10.1145/3290605.3300809]. These let the user observe the effect that modified features have on the model's prediction. Examples: "Given a house and its predicted price of $250,000, how would the price change if it had an extra bedroom?" or "Given a house and its predicted price of $250,000, what would I have to change to increase its predicted price to $300,000?"
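The two example questions above can be sketched against a toy model. This is a minimal illustration, not an implementation from the cited papers: the feature names, weights, and prices are invented, and a linear model is assumed so that both the what-if delta and the counterfactual target can be answered in closed form.

```python
# Toy linear house-price model; all weights and features are illustrative.
WEIGHTS = {"sqft": 100.0, "bedrooms": 15_000.0, "bathrooms": 10_000.0}
BIAS = 20_000.0

def predict(house: dict) -> float:
    """Predicted price of a house described by its features."""
    return BIAS + sum(WEIGHTS[f] * v for f, v in house.items())

def what_if(house: dict, feature: str, delta: float) -> float:
    """'How would the price change if <feature> changed by <delta>?'"""
    modified = {**house, feature: house[feature] + delta}
    return predict(modified) - predict(house)

def counterfactual(house: dict, feature: str, target: float) -> float:
    """'What value of <feature> would move the prediction to <target>?'
    For a linear model this is a single closed-form step."""
    gap = target - predict(house)
    return house[feature] + gap / WEIGHTS[feature]

house = {"sqft": 2000, "bedrooms": 3, "bathrooms": 2}
print(predict(house))                        # 285000.0
print(what_if(house, "bedrooms", 1))         # 15000.0 (extra bedroom)
print(counterfactual(house, "sqft", 300_000))  # 2150.0 sqft to reach $300k
```

With a non-linear black-box model the counterfactual step would instead require a search over feature perturbations, but the user-facing question stays the same.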

Follow-up questions can arise after reading explanations produced by the AI; this need is called Multi-step explainability [10.1016/j.ijhcs.2022.102941]. Obtaining answers to follow-up questions can help the user validate the AI's recommendation. Therefore, the system should allow further inquiries from the user after presenting the main explanation.
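A minimal sketch of such a multi-step exchange is a session that delivers a main explanation and then keeps routing follow-up questions within the same context. The intents, canned answers, and keyword routing below are illustrative assumptions, not the method of the cited paper.

```python
from dataclasses import dataclass, field

@dataclass
class ExplanationSession:
    """Holds one prediction and the running explanation dialogue."""
    prediction: str
    history: list = field(default_factory=list)

    def main_explanation(self) -> str:
        msg = f"The model predicted {self.prediction}, driven mainly by square footage."
        self.history.append(("system", msg))
        return msg

    def follow_up(self, question: str) -> str:
        # Naive keyword routing to a follow-up explanation strategy.
        q = question.lower()
        if "why" in q:
            answer = "Square footage had the largest positive feature importance."
        elif "what if" in q:
            answer = "Adding a bedroom would raise the predicted price (counterfactual)."
        else:
            answer = "I can explain feature importances or run what-if scenarios."
        self.history.append(("user", question))
        self.history.append(("system", answer))
        return answer

session = ExplanationSession(prediction="$250,000")
print(session.main_explanation())
print(session.follow_up("Why did you predict that?"))
print(session.follow_up("What if it had an extra bedroom?"))
```

A real system would replace the keyword routing with intent recognition and generate answers from the underlying XAI method, but the session object shows the key requirement: the context persists so each follow-up refines the previous explanation rather than starting over.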