Laboratory of Informatics of Grenoble, Équipe Ingénierie de l'Interaction Humain-Machine


Understanding and designing microgesture interaction


Adrien Chaffangeon Caillet


Over the last three decades, some of the objects we use in our daily lives have gradually become computers. Our habits are changing with these mutations, and it is now not uncommon to interact with these computers while performing other tasks, e.g. checking our GPS position on a smartwatch while biking. Over the last ten years, a new interaction modality has emerged to meet these needs: hand microgestures. Hand microgestures, hereafter simply microgestures, are fast and subtle finger movements. They enable interaction in parallel with a main task, as they are quick and can be performed while holding an object. However, as this is a recent modality, the research field still lacks structure and sometimes coherence. For instance, there is no convention for naming or describing microgestures, which can lead to terminological inconsistencies between studies. Moreover, the literature focuses mainly on how to build systems to sense and recognize microgestures; few studies examine the properties expected of microgestures, such as speed or low impact on physical fatigue, in particular contexts of use. This thesis therefore focuses on the study of microgestures, from their description to their application in a specific field, Augmented Reality (AR), as well as their sensing and recognition.

Our scientific approach comprises three steps. In the first step, we focus on the space of possibilities. After a literature review highlighting the diversity of microgestures and the terminological issues, we present μGlyph, a notation for describing microgestures. Next, we present a user study on the constraints that holding an object imposes on the feasibility of microgestures. The results of this study were used to derive a set of three rules for determining the feasibility of microgestures in different contexts, i.e. different grasps. For ease of use, we reuse μGlyph to provide a visual description of these rules. Finally, we study different ways of making a set of microgestures compatible with many contexts, i.e. such that each microgesture in the set is feasible in all contexts.

With the space of possibilities defined, we turn to the design of systems for sensing and recognizing microgestures. After a review of such systems in the literature, we present the easily reproducible sensing systems we implemented, resulting in two gloves. We then present a user study on the impact of wearing these gloves on the feasibility of microgestures; our results suggest that the gloves have little impact. Next, we present a more comprehensive system that recognizes both microgestures and contexts. Our studies of recognition rates suggest that this system is usable for microgesture detection, with a recognition rate of 94%, but needs improvement for context recognition, with a rate of 80%. Finally, we present a proof of concept of a modular glove and a μGlyph-based recognition system to enable the unification of microgesture sensing systems.

Our final step is dedicated to interaction techniques based on microgestures. We focus on the properties of microgestures for 3D selection in AR. We designed two 3D selection techniques combining eye-gaze and microgestures for low-fatigue interaction. Our results suggest that the combination of eye-gaze and microgestures enables fast interaction while minimizing fatigue, compared with the commonly used virtual pointer. We conclude with an extension of our techniques to integrate 3D object manipulation in AR.