Laboratory of Informatics of Grenoble, Engineering Human-Computer Interaction Research Group


Bimanual Input for Multiscale Navigation with Pressure and Touch Gestures

In Proceedings of the 18th ACM International Conference on Multimodal Interaction (ICMI 2016), Tokyo, Japan, November 12-16, 2016, pages 145-152.

Sébastien Pelurson, Laurence Nigay

Abstract

We explore combining touch modalities with pressure-based modalities for multiscale navigation in bifocal views. We investigate a two-handed mobile configuration in which: 1) the dominant hand is kept free for precise touch interaction at any scale of the bifocal view, and 2) the non-dominant hand holds the device in landscape mode, keeping the thumb free for pressure input for navigation at the context scale. The pressure sensor is fixed to the front bezel. Our investigation of pressure-based modalities covers two design options: control (continuous or discrete) and inertia (with or without). The pressure-based modalities are compared to touch-only modalities: the well-known drag-flick and drag-drop modalities. The results show that the continuous pressure-based modality without inertia 1) is the fastest, along with the drag-drop touch modality, 2) is preferred by users, and 3) minimizes screen occlusion during a phase that requires navigating a large part of the information space.
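The design space above can be made concrete with a minimal sketch of continuous pressure-based rate control. This is not the authors' implementation: the class name, dead-zone threshold, velocity mapping, and decay factor are all illustrative assumptions. A normalized pressure reading above a small dead zone is mapped linearly to a panning velocity at the context scale; when inertia is enabled, releasing the sensor lets the velocity decay gradually instead of stopping immediately.

```python
class PressureNavigator:
    """Illustrative continuous pressure-to-velocity mapping (hypothetical).

    Pressure is assumed normalized to [0, 1]. Values below dead_zone
    are ignored to filter out grip noise from holding the device.
    """

    def __init__(self, dead_zone=0.1, max_velocity=1000.0,
                 use_inertia=False, decay=0.9):
        self.dead_zone = dead_zone          # pressure below this is ignored
        self.max_velocity = max_velocity    # pixels/second at full pressure
        self.use_inertia = use_inertia      # keep coasting after release?
        self.decay = decay                  # per-frame velocity decay factor
        self.velocity = 0.0

    def update(self, pressure, dt):
        """Advance one frame; return the panning displacement in pixels."""
        if pressure > self.dead_zone:
            # Rescale [dead_zone, 1] to [0, 1], then map to velocity.
            norm = (pressure - self.dead_zone) / (1.0 - self.dead_zone)
            self.velocity = norm * self.max_velocity
        elif self.use_inertia:
            self.velocity *= self.decay     # coast, decaying each frame
        else:
            self.velocity = 0.0             # stop immediately on release
        return self.velocity * dt
```

The dead zone matters in this configuration because the thumb also helps grip the device, so some baseline pressure on the bezel sensor is unavoidable.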