The MINT team focuses on gestural interaction, i.e. the use of gesture for human-computer interaction (HCI). The New Oxford American Dictionary defines gesture as "a movement of part of the body, especially a hand or the head, to express an idea or meaning". In the particular context of HCI, we are more specifically interested in movements that a computing system can sense and respond to. A gesture can thus be seen as a function of time into a set of sensed dimensions that may include, but are not limited to, positional information (the pressure exerted on a contact surface being an example of a non-positional dimension).
Simple pointing gestures have long been supported by interactive graphics systems, and the advent of robust and affordable sensing technologies has somewhat broadened the use of gestures. Swiping, rotating and pinching gestures are now commonly supported on touch-sensitive devices, for example. Yet the expressive power of the available gestures remains limited. The increasing diversity and complexity of computer-supported activities calls for more powerful gestural interactions. Our goal is to foster the emergence of these new interactions and to further broaden the use of gesture by supporting more complex operations. We are developing the scientific and technical foundations required to facilitate the design, implementation and evaluation of these interactions. Our interests include:
Gestures captured using held, worn or touched objects (e.g. a mouse, a glove or a touchscreen) or contactless perceptual technologies (e.g. computer vision);
Computational representations of these gestures;
Methods for characterizing and recognizing them;
Transfer functions used for non-isometric object manipulations;
Feedback mechanisms, particularly haptic ones;
Engineering tools to facilitate the implementation of gestural interaction techniques;
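To make the notion concrete, a gesture seen as a function of time into sensed dimensions can be approximated in software as a time-ordered sequence of samples. The sketch below is purely illustrative (the `Sample` type, the `pressure` field and the `path_length` feature are our own hypothetical choices, not part of any team software); it shows one possible computational representation and a simple scalar feature that could feed a characterization or recognition method.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Sample:
    t: float         # timestamp in seconds
    x: float         # position on a contact surface
    y: float
    pressure: float  # example of a non-positional sensed dimension

# A gesture as a function of time into sensed dimensions,
# approximated here by a time-ordered list of samples.
Gesture = List[Sample]

def path_length(gesture: Gesture) -> float:
    """Total distance travelled: a simple scalar feature
    that could be used to characterize a gesture."""
    return sum(
        ((b.x - a.x) ** 2 + (b.y - a.y) ** 2) ** 0.5
        for a, b in zip(gesture, gesture[1:])
    )

# A synthetic horizontal swipe sampled at 100 Hz.
swipe = [Sample(t=i * 0.01, x=float(i), y=0.0, pressure=0.5) for i in range(11)]
print(path_length(swipe))  # 10.0
```

Richer representations would add further sensed dimensions (orientation, contact area, 3D position from computer vision), but the structure stays the same: a sequence of timestamped samples over whatever dimensions the sensing technology provides.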
A collaboration with Detjon Brahimaj and colleagues from the L2EP was presented as a paper at IHM 2023, the French national conference on Human-Computer Interaction: "Cross-modal interaction of stereoscopy, surface deformation and tactile feedback on the perception of texture roughness in an active touch condition". The paper is accessible here.
A performance of the project Vibrating Shapes, in collaboration with Sebastien Beaumont and Ivann Cruz, was selected for presentation at the 2023 Conference on New Interfaces for Musical Expression.
The "revealed" augmented reality technology designed by the team (presented here and here) will be used in the cyber-opera "Terres Rares" created by Éolie Songe.
Vincent Reynaert will present a paper entitled "The Effect of Rhythm in Mid-air Gestures on the User Experience in Virtual Reality" at INTERACT 2021.
Fatma Ben Guefrech presented the paper on "Revealable Volume Displays" at IEEE Virtual Reality and 3D User Interfaces 2021. The paper also received an honorable mention.
The team will present a paper entitled "Revealable Volume Displays: 3D Exploration of Mixed-Reality Public Exhibitions" at IEEE Virtual Reality and 3D User Interfaces 2021. More details soon!