In recent years, the field of Explainable Artificial Intelligence (XAI) has gained significant traction, aiming to make AI systems more transparent and their decisions understandable. A meta-analysis published in March 2025 examined how XAI influences human performance in decision support systems. It found that while XAI-based decision support can enhance task performance, the explanations themselves are not the primary driver of that improvement; moderating factors such as the studies' risk of bias and the type of explanation provided play a more substantial role. In other words, XAI holds promise, but its effectiveness depends on factors beyond the explanations themselves (arxiv.org).
Another comprehensive review highlighted the importance of user-centric design in XAI systems, emphasizing that developers should prioritize users' needs for explainability and ensure that XAI systems are accessible and understandable to non-expert users. Focusing on these aspects lets developers build systems that not only perform effectively but also meet user expectations, enhancing trust and usability (mdpi.com).
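As a rough illustration of that design guidance, the sketch below turns a model's feature importances into short, jargon-free sentences for non-expert users. It assumes a scikit-learn classifier and permutation importance; the function name `explain_for_non_experts`, the example dataset, and the plain-language wording are illustrative assumptions, not details taken from the cited review.

```python
# A minimal sketch of a user-centric explanation layer (assumptions: scikit-learn
# model, permutation importance as the explanation method, illustrative wording).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split


def explain_for_non_experts(model, X_test, y_test, feature_names, top_k=3):
    """Summarize the model's most influential features in plain language."""
    # Permutation importance: how much accuracy drops when a feature is shuffled.
    result = permutation_importance(model, X_test, y_test,
                                    n_repeats=10, random_state=0)
    ranked = sorted(zip(feature_names, result.importances_mean),
                    key=lambda pair: pair[1], reverse=True)[:top_k]
    return "\n".join(
        f"Predictions depend heavily on '{name}': scrambling it lowers "
        f"accuracy by roughly {importance:.1%}."
        for name, importance in ranked
    )


data = load_breast_cancer()
X_train, X_test, y_train, y_test = train_test_split(
    data.data, data.target, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print(explain_for_non_experts(model, X_test, y_test, data.feature_names))
```

The design choice reflects the review's emphasis: rather than exposing raw importance scores or technical plots, the explanation is rendered as a few short sentences a non-expert can act on.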