Every year, more and more people spend a considerable part of their lives in cars. For this reason, carmakers are trying to make this “in-vehicle life” more enjoyable by equipping cars with various In-Vehicle Infotainment Systems (IVISs). All these systems need to be controllable by car users. The common approach is to place most of the controls in the central dashboard, so that they are accessible to passengers as well. Traditionally these controls were knobs and buttons, but over the years many carmakers have replaced them with touchscreens or advanced haptic controls. The AutoNUI project aims at conceiving, developing, and studying a system for non-distracting, natural interaction with the IVIS.
The project is composed of several sub-projects. For example, in the WheelSense project, we embedded pressure sensors in the steering wheel to detect tangible gestures that the driver performs on its surface. In parallel, we recorded the driver's muscular activity using electromyography (EMG) in order to improve the accuracy of gesture recognition. The driver can interact with the IVIS by means of tangible gestures designed to support the execution of secondary tasks. The proposed interface thus aims at minimizing the driver's distraction from the primary task: driving.
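The idea of using EMG evidence to improve the accuracy of pressure-based gesture recognition can be illustrated with a simple late-fusion scheme. The sketch below is purely illustrative and is not the WheelSense implementation: the gesture names, confidence scores, and fusion weight are all assumptions, and each classifier is represented only by its per-gesture confidence output.

```python
# Hypothetical late-fusion sketch: combine per-gesture confidence scores
# from a pressure-sensor classifier and an EMG-based classifier by
# weighted averaging, then pick the highest-scoring gesture.
# All names, weights, and score values below are illustrative assumptions.

def fuse_predictions(pressure_scores, emg_scores, emg_weight=0.3):
    """Weighted late fusion of two per-gesture confidence dictionaries."""
    gestures = set(pressure_scores) | set(emg_scores)
    fused = {}
    for g in gestures:
        p = pressure_scores.get(g, 0.0)
        e = emg_scores.get(g, 0.0)
        fused[g] = (1.0 - emg_weight) * p + emg_weight * e
    best = max(fused, key=fused.get)
    return best, fused

# Example: the pressure sensors alone are ambiguous between "swipe" and
# "tap", but the EMG evidence tips the decision toward "swipe".
best, scores = fuse_predictions(
    {"swipe": 0.48, "tap": 0.47, "squeeze": 0.05},
    {"swipe": 0.70, "tap": 0.20, "squeeze": 0.10},
)
```

Late fusion keeps the two sensing channels independent, so either classifier can be retrained or replaced without touching the other; the weight controls how much the EMG channel is trusted relative to the pressure sensors.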
Keywords: multimodal interaction, tangible gestures, smart steering wheel, physiological signals, in-vehicle user interface, in-car natural interaction