Invisible Users: Uncovering End-Users’ Requirements for Explainable AI via Explanation Forms and Goals. (arXiv:2302.06609v1 [cs.HC])

Non-technical end-users are silent and invisible users of the
state-of-the-art explainable artificial intelligence (XAI) technologies. Their
demands and requirements for AI explainability are not incorporated into the
design and evaluation of XAI techniques, which are developed to explain the
rationales of AI decisions to end-users and assist their critical decision-making.
This makes XAI techniques ineffective or even harmful in high-stakes
applications, such as healthcare, criminal justice, finance, and autonomous
driving systems. To systematically understand end-users’ requirements to
support the technical development of XAI, we conducted the EUCA user study with
32 layperson participants in four AI-assisted critical tasks. The study
identified comprehensive user requirements for feature-, example-, and
rule-based XAI techniques (manifested by the end-user-friendly explanation
forms) and XAI evaluation objectives (manifested by the explanation goals),
which were shown to directly inspire the proposal of new XAI
algorithms and evaluation metrics. The EUCA study findings, the identified
explanation forms and goals for technical specification, and the EUCA study
dataset support the design and evaluation of end-user-centered XAI techniques
for accessible, safe, and accountable AI.
