Laboratory of Informatics of Grenoble Engineering Human-Computer Interaction Research Group


Handheld AR/AV indoor navigation and detailed information with contextual interaction

Demonstration at ISMAR 2011, 10th IEEE International Symposium on Mixed and Augmented Reality, Switzerland, October 26-29, 2011.

Masakatsu Kourogi, Koji Makita, Thomas Vincent, Takashi Okuma, Jun Nishida, Tomoya Ishikawa, Laurence Nigay, Takeshi Kurata

Abstract

The demonstration shows a handheld system that guides visitors indoors to a specific exhibit and then presents detailed information about that exhibit through contextual AR/AV interaction. The system provides the following four key functions:

(1) Indoor navigation based on a PDR (Pedestrian Dead Reckoning) localization method combined with map matching, using the built-in sensors (3-axis accelerometers, gyroscopes, and magnetometers) of a waist-mounted device.

(2) Coarse estimation of location and orientation by establishing correspondences between Virtualized-Reality (VR) models of the environment and images from the handheld device's camera.

(3) Fine estimation of location and attitude of the handheld device based on visual AR tracking methods.

(4) Contextual AR/AV (Augmented Virtuality) interaction widgets (e.g., buttons and menus) that provide detailed information about the exhibit. The widgets adapt to the user's position relative to the exhibit.
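The PDR localization in function (1) can be illustrated with a minimal sketch: steps are detected as peaks in the accelerometer magnitude, and each step advances the 2-D position by an assumed step length along the current heading. The step length and detection threshold below are illustrative assumptions, not the parameters of the actual system, which additionally applies map matching.

```python
import math

# Illustrative constants, not the actual system's parameters.
STEP_LENGTH_M = 0.7          # assumed average step length (m)
ACCEL_PEAK_THRESHOLD = 11.0  # m/s^2, crude threshold for step detection

def detect_steps(accel_magnitudes):
    """Count rising crossings of the threshold as steps."""
    steps = 0
    above = False
    for a in accel_magnitudes:
        if a > ACCEL_PEAK_THRESHOLD and not above:
            steps += 1
            above = True
        elif a <= ACCEL_PEAK_THRESHOLD:
            above = False
    return steps

def pdr_update(position, heading_rad, num_steps, step_length=STEP_LENGTH_M):
    """Advance (x, y) by num_steps along the heading (0 rad = +y axis)."""
    x, y = position
    d = num_steps * step_length
    return (x + d * math.sin(heading_rad), y + d * math.cos(heading_rad))
```

For example, two detected steps walked straight along the +y axis move the estimate by 1.4 m; in the real system this dead-reckoned track would then be corrected against the floor map.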

Any participant can experience the AR/AV system by being directed to search for a target exhibit and then obtaining detailed information about it.

What makes our demonstration unique is the integration of indoor navigation capabilities with interactive AR/AV functionalities for augmenting an exhibit.
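The context switching in function (4) could, purely for illustration, be driven by the user's distance to the exhibit. The thresholds and widget names below are hypothetical assumptions; the paper only states that widgets adapt to the user's position relative to the exhibit.

```python
import math

# Hypothetical exhibit location in map coordinates (example value).
EXHIBIT_POS = (5.0, 3.0)

def contextual_widgets(user_pos, exhibit_pos=EXHIBIT_POS):
    """Pick a widget set from the user's distance to the exhibit.

    Thresholds and widget names are illustrative, not the system's.
    """
    d = math.dist(user_pos, exhibit_pos)
    if d > 5.0:
        return ["navigation_arrow"]               # still approaching: guidance only
    if d > 1.5:
        return ["overview_panel", "menu"]         # near the exhibit: overview info
    return ["detail_buttons", "menu", "av_view"]  # at the exhibit: full detail
```

A position-dependent selection like this lets the same handheld view serve both the navigation phase and the detailed-information phase without an explicit mode switch by the user.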