Laboratory of Informatics of Grenoble
Équipe Ingénierie de l'Interaction Humain-Machine

Xplain: an Editor for building Self-Explanatory User Interfaces by Model-Driven Engineering

In Proceedings of the Second ACM SIGCHI Symposium on Engineering Interactive Computing Systems (EICS 2010), pages 41-46, 2010.

Alfonso García Frey, Gaëlle Calvary, Sophie Dupuy-Chessa

Abstract

Modern User Interfaces (UIs) must deal with the increasing complexity of applications in terms of functionality as well as new properties such as plasticity. The plasticity of a UI denotes its capacity to adapt to the context of use while preserving its quality. Efforts in plasticity have focused on the (meta-)modeling of the UI, but quality remains uncovered. This paper describes ongoing research into a method for developing Self-Explanatory User Interfaces, as well as an editor that implements this method. Self-explanation refers to the capacity of a UI to provide the end-user with information about its rationale (what is the purpose of the UI?), its design rationale (why is the UI structured into this set of workspaces? what is the purpose of this button?), its current state (why is the menu disabled?), as well as the evolution of that state (how can I enable this feature?). Explanations are provided by embedded models.