TAPIOCA TAngibilité Physiologique Instrumentée : Outil mixte redimensionnable pour la Conception d'Artefact
  Exploratory project funded by Persyval
  • Laboratory GIPSA-Lab (SAIGA): Franck Quaine
  • Laboratory G-SCOP (CC): Cédric Masclet
  • Laboratory LIG (IIHM): Yann Laurillau, Céline Coutrix
  • Laboratory LJK (IMAGINE): Damien Rohmer, Jean-Claude Léon
Aims and goals

Designing 3D and CAD models is a complex task. Beyond the domain-dependent skills it requires (electrical engineering, design, etc.), experts also need to be expert users, mastering complex user interfaces driven by inadequate input devices (i.e. the mouse).

This project investigates tangible interaction as a means for experts to interact with 3D and CAD models in a more natural way (NUI: Natural User Interface), and thus to facilitate the design of such models. More precisely, it explores interaction techniques that combine tangible interaction based on resizable input devices with muscle-based modalities (i.e. electromyographic (EMG) signals produced by forearm muscles).

Several goals are considered:

  • To study and demonstrate the feasibility of resizable input interaction devices
  • To promote suitable EMG signal processing models for muscle-computer interaction (performance, fatigue, etc.)
  • To design interaction techniques for 3D models that adapt to gestures and muscle activity.

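The EMG signal processing goal above typically rests on time-domain feature extraction; the project's published work also computes such features on the first difference of the EMG time series. A minimal illustrative sketch follows (the feature set and the synthetic signal are conventional choices for illustration, not the project's actual pipeline):

```python
import numpy as np

def td_features(x):
    """Common time-domain EMG features: mean absolute value (MAV),
    waveform length (WL), and root mean square (RMS)."""
    return {
        "MAV": np.mean(np.abs(x)),
        "WL": np.sum(np.abs(np.diff(x))),
        "RMS": np.sqrt(np.mean(x ** 2)),
    }

def first_difference_features(x):
    """Same features computed on the first difference of the signal,
    in the spirit of the project's pattern-recognition work."""
    return td_features(np.diff(x))

# Toy synthetic "EMG" burst: noise with a slowly varying amplitude.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 1000)
emg = rng.normal(size=t.size) * (0.2 + np.sin(np.pi * t) ** 2)

print(td_features(emg))
print(first_difference_features(emg))
```

In a muscle-computer interface, such feature vectors would then feed a classifier or a continuous control mapping.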
As a proof of concept, this project aims to design, build and develop hardware and software prototypes. In terms of user interaction, it considers positioning tasks of 3D/CAD virtual objects in various configurations, as well as the manipulation of organic virtual objects with multiple levels of detail. The project is challenging as it jointly investigates NUI-based 3D/CAD modelling tools, muscle-computer and tangible interaction, and resizable interaction devices. Furthermore, resizable interaction devices are promising and constitute an emerging research topic at the international level, but they raise both hardware and software design challenges.

Activities 2013-2015
  • Development of a first prototype of a resizable mixed object, with experiments (2013-2014);
  • M2R internship (Silvan Cabot, cognitive science) studying the use of resizable objects for CAD-related tasks (2013-2014);
  • Development of a software library to capture EMG signals, as a basis for the development of interactive systems (2013);
  • Seminar by Audrey Serna: "Interaction située, modularité, apprentissage" (26 June 2014);
  • Development of a second prototype of a resizable mixed object, with experiments (2014-2015);
  • Two DUT internships: Benjamin Lausenaz (studying the correlation between the shape of a 1D object and its associated visual representation) and Shen Mesrobian (software development of a proof-of-concept prototype enabling EMG-based interaction to control a 3D model) (2015)
Ongoing activities (2015)
  • Measurement campaign studying the correlation between the size and shape of a physical object and EMG signatures (2013-)
  • Resizable objects: studying the correlation between the shape of a 1D object and its associated visual representation
  • Software development of a proof-of-concept prototype enabling EMG-based interaction to control a 3D model
Publications
  • Phinyomark, A., Quaine, F., Charbonnier, S., Serviere, C., Tarpin-Bernard, F., Laurillau, Y., 2014. Feature extraction of the first difference of EMG time series for EMG pattern recognition. Computer Methods and Programs in Biomedicine 117, 247–256.
  • Phinyomark, A., Quaine, F., Laurillau, Y., 2014. The Relationship between Anthropometric Variables and Features of Electromyography Signal for Human-Computer Interface, in: Naik, G.R. (Ed.), Applications, Challenges, and Advancements in Electromyography Signal Processing. IGI Global, Australia, pp. 325–357.
  • Coutrix, C., Masclet, C., 2015. Shape-Change for Zoomable TUIs: Opportunities and Limits of a Resizable Slider. Proceedings of the 15th IFIP TC13 Conference on Human-Computer Interaction (INTERACT'15), 14-18 September 2015, Bamberg, Germany, Springer. In press.
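The EMG-based control prototype mentioned in the ongoing activities presumably maps muscle activity to a parameter of the 3D model. One common approach is full-wave rectification followed by envelope smoothing and a linear mapping; a minimal sketch under those assumptions (the window size, amplitude range, and scale range are hypothetical, not the prototype's actual parameters):

```python
import numpy as np

def emg_envelope(x, win=64):
    """Amplitude envelope of a raw EMG signal: full-wave rectification
    followed by a moving-average low-pass filter (window size `win`
    is an illustrative choice)."""
    rectified = np.abs(x)
    kernel = np.ones(win) / win
    return np.convolve(rectified, kernel, mode="same")

def envelope_to_scale(env, env_min=0.0, env_max=1.0,
                      scale_min=0.5, scale_max=2.0):
    """Map envelope amplitude linearly onto a scale factor for a
    virtual 3D object (all ranges are hypothetical)."""
    a = np.clip((env - env_min) / (env_max - env_min), 0.0, 1.0)
    return scale_min + a * (scale_max - scale_min)

# Stronger muscle contraction (larger envelope) -> larger object.
rng = np.random.default_rng(1)
emg = rng.normal(scale=0.3, size=2048)
scales = envelope_to_scale(emg_envelope(emg))
print(scales.min(), scales.max())
```

A real system would run this incrementally on streaming samples and feed the resulting scale (or rotation, or level of detail) to the 3D modelling tool each frame.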