Improving Perception & Interaction


Vehicle drivers – first responders on emergency calls and ordinary drivers of autonomous cars – must be able to access the right information at the right time and to understand their situation immediately. API (Improving Perception & Interaction), a two-year project launched in April 2017, aims to meet these needs by building an environment in which to test the latest perception and interaction technologies and evaluate their contribution to the user experience.

The API project team will work on leading-edge perception and human-machine interaction tools with a view to integrating them into smart systems capable of handling user scenarios in complex environments and stressful situations. The innovative technologies under study include augmented reality systems that display information in the field of view, voice recognition and language processing in noisy environments, and driver monitoring and motion detection systems.


The API project has five technological and scientific objectives:

  • Study and evaluate innovative perception and interaction technologies appropriate to both first responders and drivers of autonomous vehicles;
  • Define and evaluate the most suitable human-machine interface technologies;
  • Integrate these technologies into smart systems capable of analysing and instantly understanding the user’s situation;
  • Develop and integrate a vehicle demonstrator followed by a pedestrian demonstrator;
  • Provide demonstrations of the usage scenarios investigated.

Target markets

Two major case studies will be carried out as part of the API project:

  • A B2B case study led by Airbus looking at secure communication systems for first responders (police, emergency services, etc.). At present they mainly use cumbersome walkie-talkie-type systems that handle voice and data. More advanced multimedia and web-based solutions would be particularly beneficial for these users, who have specific requirements linked to the critical nature of their missions (keeping hands free, concentration, stress, etc.) and whose user experience extends beyond the vehicle into the pedestrian environment.
  • A B2C case study led by Valeo to develop a driver support system for autonomous vehicles, in which new human-machine interaction functionalities will be analysed and evaluated. This case study follows on from the LRA (Location and Augmented Reality) project, which verified the ergonomic principles of augmented reality for driving and studied technical solutions for displaying information in AR.

Find out more

Autonomous Transport

Industrial Partners: Airbus, Valeo
Project Manager: Serge Delmas