Living in Smart Environments: Natural and economic gesture-based HCI

Interactive, intelligent computing that enables efficient human–computer interaction (HCI) is becoming increasingly important in our daily lives. Gesture recognition is the process by which the gestures produced by a user are identified by a receiving system. Gestures are expressive, meaningful body motions involving physical movements of the fingers, hands, arms, etc., performed with the intent of 1) conveying meaningful information or 2) interacting with the environment. Although current progress in gesture recognition is encouraging, further theoretical as well as computational advances are needed before gestures can be widely used for HCI. The aim of this project is to develop a novel, natural and economic gesture-based HCI built on so-called subtle gestures, i.e. systems able to recognize natural and economic gestures while taking into account the physical strain and long-term effort associated with the everyday use of certain gestures.

Keywords

Subtle gesture, gesture recognition, natural interaction, intelligent environment, HCI (human-computer interaction)

Outcomes

The main outcomes of the project can be summarized as follows: 1) Design and develop a hybrid sensor approach for gesture acquisition and recognition. This hybrid sensor will fuse multimodal signals gathered from multiple sensing technologies, either distributed in the physical environment or worn by the user; 2) Design and develop a consistent gesture language and dialogues. The gesture language will incorporate both human and technical design factors and will adapt to the context of use and the user's posture.
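
To make the hybrid-sensor idea concrete, the following is a minimal sketch of decision-level (late) fusion between an environmental modality (e.g. a depth camera) and a wearable modality (e.g. a wrist-worn accelerometer). The gesture vocabulary, feature sizes, classifiers, and fusion weights are illustrative assumptions and are not taken from the project itself.

```python
# Minimal sketch of late (decision-level) fusion of two gesture sensing
# modalities. All class names, feature sizes, and weights are illustrative
# assumptions, not part of the project description.
from dataclasses import dataclass

import numpy as np

GESTURES = ["swipe", "point", "rest"]  # hypothetical gesture vocabulary


@dataclass
class ModalityClassifier:
    """Toy per-modality classifier returning a score per gesture class."""
    weights: np.ndarray  # shape: (n_classes, n_features)

    def scores(self, features: np.ndarray) -> np.ndarray:
        logits = self.weights @ features
        exp = np.exp(logits - logits.max())  # softmax so modalities are comparable
        return exp / exp.sum()


def fuse(scores_env: np.ndarray, scores_worn: np.ndarray,
         w_env: float = 0.4, w_worn: float = 0.6) -> str:
    """Weighted late fusion of environmental and wearable modality scores."""
    combined = w_env * scores_env + w_worn * scores_worn
    return GESTURES[int(np.argmax(combined))]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    env_clf = ModalityClassifier(rng.normal(size=(3, 8)))   # e.g. depth-camera features
    worn_clf = ModalityClassifier(rng.normal(size=(3, 6)))  # e.g. accelerometer features

    env_features = rng.normal(size=8)
    worn_features = rng.normal(size=6)

    print(fuse(env_clf.scores(env_features), worn_clf.scores(worn_features)))
```

Late fusion is only one possible design; feature-level fusion or learned fusion weights could be substituted without changing the overall structure of the sketch.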

Website of the project

 

Project Information

Ongoing project, started in 2011.