HBO Kennisbank: the largest knowledge base of Dutch higher professional education (HBO). Freely accessible.

Formalizing explanation design through interaction patterns in human-AI decision support

Abstract

Trust in AI is crucial for effective and responsible use in high-stakes sectors such as healthcare and finance. One of the most common techniques for mitigating mistrust in AI, and even increasing trust, is Explainable AI, which enables human understanding of decisions made by AI-based systems. Interaction design, the practice of designing interactive systems, plays an important role in promoting trust by improving explainability, interpretability, and transparency, ultimately enabling users to feel more in control of, and confident in, the system's decisions. Based on an empirical study with experts from various fields, this paper introduces the concept of Explanation Stream Patterns: interaction patterns that structure and organize the flow of explanations in decision support systems. Explanation Stream Patterns formalize explanation streams by incorporating procedures such as progressive disclosure of explanations, or more deliberate interaction with explanations through cognitive forcing functions. We argue that well-defined Explanation Stream Patterns provide practical tools for designing interactive systems that enhance human-AI decision-making.
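To make the two procedures named in the abstract concrete, the following is a minimal, purely illustrative Python sketch (not taken from the paper; all class and method names are hypothetical). It models an explanation stream that applies a cognitive forcing function, requiring the user's own judgment before any AI explanation is shown, and then discloses explanation layers progressively, one per explicit request.

```python
# Illustrative sketch only: a hypothetical explanation stream combining a
# cognitive forcing function with progressive disclosure of explanations.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class ExplanationStream:
    """Hypothetical layered explanation flow for a decision support system."""
    layers: List[str]                    # explanation layers, summary first, detail last
    revealed: int = 0                    # number of layers already shown
    user_judgment: Optional[str] = None  # the user's own decision, recorded first

    def record_judgment(self, judgment: str) -> None:
        # Cognitive forcing function: the user must commit to their own
        # judgment before the system discloses any explanation.
        self.user_judgment = judgment

    def next_layer(self) -> str:
        # Progressive disclosure: reveal exactly one further layer per request.
        if self.user_judgment is None:
            raise RuntimeError("Record your own judgment before viewing explanations")
        if self.revealed >= len(self.layers):
            return "No further detail available"
        layer = self.layers[self.revealed]
        self.revealed += 1
        return layer


# Example usage with made-up content for a loan decision:
stream = ExplanationStream(layers=[
    "Summary: model recommends approval (confidence 0.87)",
    "Key factors: income stability, low debt ratio",
    "Detail: per-feature contribution scores",
])
stream.record_judgment("approve")
print(stream.next_layer())  # the summary layer comes first
print(stream.next_layer())  # further detail only on request
```

The design choice this sketch illustrates is that both procedures gate the flow of explanations: the forcing function gates the start of the stream, and disclosure requests gate each subsequent layer.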

Published in: Proceedings of the 4th International Conference on Hybrid Human-Artificial Intelligence (HHAI 2025), IOS Press, Vol. 408, pp. 262-276
Year: 2025
DOI: 10.3233/FAIA250644
Language: English

The HBO Kennisbank contains publications from 26 Dutch universities of applied sciences.
