Internship: UP: Under Pressure
Context
The last decades have seen the democratisation of touch surfaces. Whether on tablets or smartphones, a significant part of the population uses a touch surface on a daily basis. However, even though we often use this touch modality, we restrict ourselves to simple tasks, as opposed to the complex tasks we perform on desktop computers. This is often due to a lack of precision, a lack of flexibility in the mechanisms at hand (e.g., copy/paste) and a lack of expressivity of the touch modality (i.e., too few communication channels through which to send instructions to the machine). The latter, which is not a problem on desktop or laptop computers that offer other modalities (such as a keyboard), becomes problematic on a device that relies only on a touch sensor, and therefore limits the number of functionalities that can be offered. Most touch screens only provide the x/y location of the fingers in contact. As a consequence, applications tend to rely on interactions that are time-consuming, which can lead to frustration or a feeling of inefficiency. Given the current trend, our use of such devices will likely intensify; for instance, professional use of smartphones is no longer marginal. Hence, it is crucial to reflect now on tomorrow's interactions and avoid a daily use of ill-adapted tools.
Increasing the expressivity of the touch modality and the user's efficiency are key to facing this challenge. Numerous works in HCI focus on developing new technologies that increase the expressive power of our touch screens. Finger identification [Go17] and finger orientation [Wa09, Ro11] are a few examples that augment the interaction vocabulary. Concurrently, researchers have explored the use of such new information and guided its usage [Go16, Ma18]. One particular piece of information, the force applied on the screen, has now reached a stage where the technology is viable [Apple 3D Touch]. However, even though the technology works well, its usage has not been explored with a user-centered approach. As a result, it is often judged hard to use, dismissed as a gadget and unfortunately considered useless. The next generations of iPhone will, for instance, no longer provide the 3D Touch technology, even though recent research projects [Br09, An17, Co18, Go18] clearly showed that force input could increase efficiency and reduce user frustration.
Brewster et al. demonstrated that a simple pressure disambiguation on the keys of a virtual keyboard, to activate uppercase or not, could improve performance [Br09]. Corsten et al. studied another classical component of graphical interfaces, the drop-down list, and showed that force-based interaction, compared to classical interaction, reduces by up to 94% the physical surface needed to interact with digital elements while still improving time performance [Co18]. The use of force at the OS level has also been investigated: Antoine et al. drastically improved the automatic scrolling of content [An17], and Goguey et al. demonstrated that new force-based text-selection techniques could lead to an expert use of such mechanisms [Go18].
Expected results
The goal of the project is to explore and develop interaction techniques that further demonstrate the potential benefits of force-based interaction for day-to-day task completion. We provide a force-sensitive smartphone and the example source code of a previous project. Implementations will be done in Objective-C or Swift.
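To give a flavour of what a force-based technique looks like in code, the sketch below quantizes a normalized force reading (on iOS this would typically be obtained as `touch.force / touch.maximumPossibleForce` from UIKit's UITouch) into discrete pressure levels, in the spirit of the pressure-based key disambiguation of [Br09]. All names and the threshold value are illustrative assumptions, not taken from the provided project code.

```swift
// Hypothetical sketch: quantizing a normalized force value (0...1) into
// discrete pressure levels, as in pressure-based key disambiguation [Br09].
// The type names and the 0.5 threshold are illustrative, not from the project.

enum PressureLevel {
    case light   // normal tap, e.g. a lowercase character
    case firm    // pressure-augmented action, e.g. an uppercase character
}

struct ForceQuantizer {
    // Threshold on the normalized force separating the two levels.
    let threshold: Double

    func level(forNormalizedForce force: Double) -> PressureLevel {
        // Clamp to [0, 1] to guard against sensor readings slightly out of range.
        let clamped = min(max(force, 0), 1)
        return clamped < threshold ? .light : .firm
    }
}

let quantizer = ForceQuantizer(threshold: 0.5)
print(quantizer.level(forNormalizedForce: 0.2))  // light
print(quantizer.level(forNormalizedForce: 0.8))  // firm
```

A real technique would add hysteresis around the threshold and haptic or visual feedback so users can feel the level change, as discussed in [Go18].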
[Br09] Brewster et al. Pressure-Based Text Entry for Mobile Devices. MobileHCI'09.
[Wa09] Wang et al. Detecting and Leveraging Finger Orientation for Interaction with Direct-touch Surfaces. UIST'09.
[Ro11] Rogers et al. AnglePose: Robust, Precise Capacitive Touch Tracking via 3D Orientation Estimation. CHI'11.
[Go16] Goguey et al. The Performance and Preference of Different Fingers and Chords for Pointing, Dragging, and Object Transformation. CHI'16.
[An17] Antoine et al. ForceEdge: Controlling Autoscroll on Both Desktop and Mobile Computers Using the Force. CHI'17.
[Go17] Goguey et al. Leveraging Finger Identification to Integrate Multi-touch Command Selection and Parameter Manipulation. IJHCS'17.
[Co18] Corsten et al. Use the Force Picker, Luke: Space-Efficient Value Input on Force-Sensitive Mobile Touchscreens. CHI'18.
[Go18] Goguey et al. Improving Discoverability and Expert Performance in Force-Sensitive Text Selection for Touch Devices with Mode Gauges. CHI'18.
[Ma18] Mayer et al. Designing Finger Orientation Input for Mobile Touchscreens. MobileHCI'18.