Project description

Evaluation of the relevance of Human-Machine Interfaces to several sensory modalities

The widespread adoption of touch screens has removed the haptic feedback associated with physical buttons and increased the demand on visual resources. The benefit of using several sensory modalities is to avoid saturating any single one of them, in most cases vision, since each sensory modality has its own attentional reservoir. The challenge is to define the best combinations of human-machine interfaces (HMI) and to test them in different use cases.

Launched in 2018 for a period of three years, the CMI project focuses on the cockpit of the autonomous vehicle. It aims to define the best possible combinations of HMI solutions that engage different human senses (sight, hearing, touch) in order to reduce the driver's cognitive load and make interaction more intuitive.

Expected results

  • Develop and evaluate the relevance of HMI applications to multiple contextualized sensory modalities;
  • Establish the right guidelines for HMI design;
  • Explore the potential of a virtual assistant.

Implemented skills

Human-machine interaction
Systems engineering and software design

Targeted markets

  • Automobile

Supervised thesis in the framework of the project

Thesis #1: Reconfigurable virtual assistant for the interactive multimodal cockpit of an autonomous car (Université de Bordeaux).
Thesis #2: System architecture and compromise analysis of an intelligent glazing for the interactive multimodal cockpit of an autonomous car.

Mobility and autonomous transport
Project status: Completed
Industrial partner(s): Arkamys, Renault, Saint-Gobain, Valeo
Academic partner(s): CentraleSupélec, ENSC (Bordeaux), IRT SystemX
