JOURNAL ARTICLE

Toolkit to support intelligibility in context-aware applications

Abstract

Context-aware applications should be intelligible so that users can better understand how they work and thereby trust them more. However, providing intelligibility is non-trivial: it requires the developer to understand how to generate explanations from an application's decision models. Furthermore, users need different types of explanations, which complicates the implementation of intelligibility. We have developed the Intelligibility Toolkit, which makes it easy for application developers to obtain eight types of explanations from the most popular decision models used in context-aware applications. We describe its extensible architecture and the explanation generation algorithms we developed, and we validate the usefulness of the toolkit with three canonical applications that use it to generate explanations for end-users.
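To make the abstract's idea concrete, here is a minimal sketch of how an application might expose different explanation types for a context-aware decision. All names (`RingerExplainer`, `ExplanationType`, the rule-based ringer model) are hypothetical illustrations, not the toolkit's actual API; the explanation types shown are examples of the kind the paper describes.

```java
// Hypothetical sketch: a developer wraps a simple decision model
// (should the phone ring?) in an "explainer" that can answer
// several explanation types. Illustrative only; not the
// Intelligibility Toolkit's real interface.
public class IntelligibilityDemo {

    // Example explanation types of the kind discussed in the paper.
    enum ExplanationType { WHY, WHAT_IF, HOW_TO }

    // A trivial rule-based decision model plus its explanations.
    static class RingerExplainer {
        private final boolean inMeeting;

        RingerExplainer(boolean inMeeting) {
            this.inMeeting = inMeeting;
        }

        String explain(ExplanationType type) {
            switch (type) {
                case WHY:
                    // Trace the rule that fired back to its input.
                    return inMeeting
                        ? "Ringer is silent because the calendar shows a meeting."
                        : "Ringer is on because no meeting is scheduled.";
                case WHAT_IF:
                    // Describe the outcome under a changed input.
                    return "If a meeting were scheduled, the ringer would be silenced.";
                default:
                    return "Explanation type not implemented in this sketch.";
            }
        }
    }

    public static void main(String[] args) {
        RingerExplainer ex = new RingerExplainer(true);
        System.out.println(ex.explain(ExplanationType.WHY));
        System.out.println(ex.explain(ExplanationType.WHAT_IF));
    }
}
```

The design point the paper makes is that such explanations should not be hand-written per application, as above, but generated automatically from the decision model itself.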

Keywords:
Computer science, Intelligibility, Extensibility, Architecture, Human–computer interaction, Context model, Data science, Software engineering, Artificial intelligence, Programming language

Metrics

Cited By: 134
FWCI (Field Weighted Citation Impact): 10.24
References: 47
Citation Normalized Percentile: 0.99 (in top 1% and top 10%)

Topics

Context-Aware Activity Recognition Systems (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Personal Information Management and User Behavior (Social Sciences → Decision Sciences → Information Systems and Management)
IoT and Edge/Fog Computing (Physical Sciences → Computer Science → Computer Networks and Communications)