ASGARD - Advanced System for Gesture and Activity Recognition and Detection

The goal of the ASGARD project is to create a system that allows people to interact with smart environments in a natural way. In particular, the project aims to integrate natural, device-free gestural interaction into a smart environment and to augment the meaning of gestures through context awareness. Moreover, the system will provide smart feedback in the environment using augmented reality.

Keywords

gesture interaction, context awareness, activity recognition, augmented reality, ambient intelligence

Outcomes

A context-aware system for deictic gesture interaction with smart environments has been developed as a proof of concept. The prototype tracks multiple users and recognizes inhabitants' postures and gestures in real time. This information, enriched with the coordinates of the smart objects, is reconstructed in a 3D model that supports the recognition process. Finally, the system executes the programmed tasks to support the users' activity. Two Microsoft Kinect depth cameras are used to acquire the data, and a framework for communication with the smart objects has been adopted.
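
As a rough illustration of how a deictic gesture could be resolved against the 3D model, the sketch below casts a pointing ray from two skeleton joints (as provided, for instance, by a Kinect skeleton) and matches it against smart-object coordinates. The joint inputs, the object registry, and the distance threshold are illustrative assumptions for this sketch, not the project's actual implementation.

    import numpy as np

    # Hypothetical smart-object registry: name -> 3D position in the room model (metres).
    SMART_OBJECTS = {
        "lamp": np.array([1.2, 0.8, 2.0]),
        "tv": np.array([3.0, 1.0, 0.5]),
    }

    def pointing_target(elbow, hand, objects=SMART_OBJECTS, max_dist=0.3):
        """Return the smart object closest to the ray cast from elbow through hand.

        `elbow` and `hand` are 3D joint positions (e.g. from a depth-camera skeleton).
        An object counts as the deictic target if its perpendicular distance to the
        pointing ray is below `max_dist` metres (an assumed threshold).
        """
        direction = hand - elbow
        direction = direction / np.linalg.norm(direction)

        best_name, best_dist = None, max_dist
        for name, pos in objects.items():
            to_obj = pos - hand
            t = np.dot(to_obj, direction)
            if t < 0:  # object lies behind the hand, ignore it
                continue
            # Perpendicular distance from the object to the pointing ray.
            dist = np.linalg.norm(to_obj - t * direction)
            if dist < best_dist:
                best_name, best_dist = name, dist
        return best_name

    # Example: a user points roughly towards the lamp.
    elbow = np.array([1.0, 1.3, 0.0])
    hand = np.array([1.05, 1.2, 0.5])
    print(pointing_target(elbow, hand))  # -> "lamp"

In a setting like the one described above, the resolved object would then be combined with the recognized gesture and the current context to select which programmed task the environment should execute.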

Website of the project

project.eia-fr.ch/asgard

Project Information

Ongoing project, PhD thesis, duration 3 years (2010 - 2013).