Launched as part of the IA2 programme (Artificial Intelligence and Augmented Engineering) of IRT SystemX, the CAB project aims to improve human-machine cooperation. In particular, it seeks to develop an intelligent cockpit demonstrator integrating a bidirectional, multimodal virtual assistant to support operators in their decision-making.
This 48-month collaborative R&D project brings together four industrial partners (Dassault Aviation, Orange, RTE, SNCF) around concrete use cases. It aims to define and evaluate an intelligent cockpit integrating a virtual agent that augments the operator's decision-making capacities in real time when facing complex and/or atypical situations, and that is able to learn from, as well as teach, the expert. This research work, bridging the AI and HMI fields, is unprecedented and responds to growing demand among operators of critical systems and sensitive networks.
The main scientific obstacles to overcome include: the representation and development of knowledge models; two-way learning between the operator and the AI; the virtual assistant's characterisation and understanding of a complex situation, and the relevance of its analysis; the personalisation of recommendations to the operator's situation and level of expertise (junior, senior); the explainability of the recommendations made by the virtual assistant; and their acceptability to humans.
The notion of multimodality will also be studied in this project, in order to determine which of the operator's senses are most relevant for interacting with the bidirectional assistant: visual, auditory/vocal, or tactile modes of interaction, combinations of these, or interaction via monitoring of the operator's state (eye tracking, biometric data, etc.).