Efficient Finger Model and Accurate Tracking for Hover-and-Touch, Mid-air and Microgesture Interaction

1Univ. Grenoble Alpes, CNRS, Grenoble INP, LIG, 2Inria, LJK

Abstract

Bare-handed gestural interaction with computer systems is widespread, whether with touchscreens or Augmented Reality headsets. Various forms of gestural interaction exist, including hover-and-touch, mid-air and microgesture interaction. Studying the full benefits of these gestural interactions, and of their combinations, is currently not possible because of the inadequate performance of existing tracking solutions.

To address this problem, we propose a marker-based visual tracking algorithm built on a novel finger model. A key contribution is the simplicity of the finger and fingertip model (cylinders and a sphere, respectively). This simple model yields fingertip tracking with low computational cost (600 microseconds), high precision (0.02 mm) and good accuracy (one millimeter), without impeding finger movement.
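To give a rough sense of the kind of geometry such a model involves, the sketch below represents each phalanx as a cylinder between two joints and the fingertip as a sphere attached at the distal end. This is only an illustrative approximation under our own assumptions, not the authors' implementation; all class and field names are hypothetical, as are the numeric values.

```python
# Hypothetical sketch of a cylinders-plus-sphere finger model (not the paper's code).
# Each phalanx is a cylinder between two joints; the fingertip is a sphere
# attached at the distal end of the last phalanx.
from dataclasses import dataclass
import numpy as np

@dataclass
class Phalanx:
    proximal: np.ndarray  # 3D position of the joint closer to the palm (mm)
    distal: np.ndarray    # 3D position of the joint closer to the fingertip (mm)
    radius: float         # cylinder radius (mm)

@dataclass
class FingerModel:
    phalanges: list[Phalanx]   # proximal, middle, distal segments
    tip_radius: float          # radius of the fingertip sphere (mm)

    def fingertip_center(self) -> np.ndarray:
        """Place the fingertip sphere at the distal end of the last phalanx,
        offset along the segment axis by the sphere radius."""
        last = self.phalanges[-1]
        axis = last.distal - last.proximal
        axis = axis / np.linalg.norm(axis)
        return last.distal + self.tip_radius * axis

# Example: a straight finger with three segments (illustrative dimensions).
finger = FingerModel(
    phalanges=[
        Phalanx(np.array([0.0, 0.0, 0.0]),  np.array([0.0, 0.0, 45.0]), 8.0),
        Phalanx(np.array([0.0, 0.0, 45.0]), np.array([0.0, 0.0, 70.0]), 7.0),
        Phalanx(np.array([0.0, 0.0, 70.0]), np.array([0.0, 0.0, 90.0]), 6.5),
    ],
    tip_radius=6.0,
)
print(finger.fingertip_center())  # -> [0. 0. 96.]
```

With so few parameters per finger, fitting or updating such a model per frame is cheap, which is consistent with the sub-millisecond tracking cost reported above.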

We illustrate the benefits of the proposed tracking approach with a demonstration application combining hover-and-touch, mid-air and microgesture interactions for editing a 3D point cloud.