We will demonstrate the added value of our platform in three pilots

We will implement three different use cases as pilots to test the integration of all the components of the CORTEX2 framework, as a rehearsal of the framework that will be made available to selected projects during the FSTP step.

The pilots will take into account not only the technology performance but also the assessment of the human, social and societal impact of the immersive platform. Hence, they will allow us to assess, in various contexts and environments, the quality of the user experience when using our solution, its acceptability with regard to ecological and societal impact, and the effects of such immersive collaboration on overall performance.


Industrial Remote Cooperation

We will demonstrate that an XR immersive experience can be achieved with heterogeneous, off-the-shelf mobile devices under limited bandwidth conditions, while improving productivity and reducing the environmental footprint.

This pilot will highlight the implementation of the following services:

  • Gesture analysis and scene semantics analysis to inject annotations in video streams.
  • Audio transcription and voice commands to control the immersive environment and to document and record the intervention.
  • Support and mixing of multiple video and IoT data sources from non-immersive devices to compose an on-demand immersive collaboration space enriched with augmented data (industrial data, gesture interpretation, 3D image insertion) that meets the needs of the front-line technician and the expert, depending on the devices involved.
  • Optimization of network bandwidth usage through orchestration of video and metadata streams, as well as distributed rendering.
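As an illustration, the bandwidth-optimization service above could orchestrate several concurrent video streams under a shared bandwidth budget. The sketch below is a minimal assumption of how such an allocator might work; the stream names, priorities and bitrate ladder are hypothetical and do not reflect the actual CORTEX2 design:

```python
# Minimal sketch of orchestrating several video streams under a shared
# bandwidth budget. Stream names, priorities and the bitrate ladder are
# illustrative assumptions, not the actual CORTEX2 orchestration logic.

# Available bitrate ladder in kbit/s, from lowest to highest quality.
LADDER = [150, 400, 800, 1500, 3000]

def allocate(streams, budget_kbps):
    """Assign each stream the highest ladder step that fits the budget.

    `streams` is a list of (name, priority) pairs; higher priority gets
    upgraded first. Every stream starts at the lowest quality so no
    source is dropped entirely.
    """
    # Start everyone at the minimum quality.
    alloc = {name: LADDER[0] for name, _ in streams}
    remaining = budget_kbps - sum(alloc.values())
    # Upgrade streams in priority order while the budget allows.
    for name, _ in sorted(streams, key=lambda s: -s[1]):
        for step in LADDER[1:]:
            cost = step - alloc[name]
            if cost <= remaining:
                remaining -= cost
                alloc[name] = step
            else:
                break
    return alloc

if __name__ == "__main__":
    streams = [("expert_cam", 2), ("technician_cam", 3), ("iot_overlay", 1)]
    print(allocate(streams, budget_kbps=4000))
```

Under a 4000 kbit/s budget, the highest-priority technician camera is upgraded first, and the low-priority IoT overlay stays at minimum quality rather than being cut off.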


Remote technical training

We will demonstrate that VR/AR allows efficient knowledge transmission in one-to-many situations, where a remote instructor can simultaneously help several trainees while referring to physical objects such as industrial equipment.

This pilot will explore the use case of a trainer who is assisted in delivering a technical learning session remotely, using VR to show the manipulation of a machine to trainees. The immersive collaboration space will enable trainees and trainer to interact in real time, not only with each other but also with the machine model.

The use case is based on training qualified staff in complex technical tasks on large, complex machines. The virtualization aspect should allow both face-to-face and remote training.

The main objectives of these training sessions are:

  • Comprehension of the main components of the machine.
  • Safe and correct operation of the vehicle within its surrounding environment.
  • Improved efficiency in using the machine, development of skills, and tuning of settings.
  • Simulation of dangerous situations in the virtual world; the collaborative aspect shall also illustrate misuse scenarios of the machine.


Business meetings

We will demonstrate that VR/AR enriched business meetings allow seamless integration of remote participants and improve productivity.

This pilot will allow us to develop an innovative business-meeting support system integrating several functionalities to improve and enrich the participants’ experience. Such a tool will facilitate the integration of remote participants using VR and AR techniques: on the one hand, by providing remote users with a perception of visual and auditory immersion close to real presence; on the other hand, by offering a representation of the remote person to the other participants of the meeting.

The following advanced AR features will be made available to reinforce user inclusion:

  • Visual and audio immersion of the remote user.
  • Overlay display modalities for visualising information about both the collaboration’s participants (name, function, profile, etc.) and the interaction: subtitles, main topics discussed, recommended actions, and other information.
  • Filmed or avatar representation of the remote user, with symbolic transcription of their non-verbal communication acts; the remote person is provided with a panel of predefined actions that are automatically recognized: request to speak, participation in a vote, expression of agreement or disagreement, etc.
  • Virtual representation of collaborative tools and artefacts such as board, projection screen and documents.
  • Advanced added-value services, such as meeting transcription and automatic subtitling, as well as document summarization and automatic minutes generation.
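The panel of predefined actions mentioned above can be thought of as a mapping from recognized gestures to symbolic events broadcast to the other participants. The following sketch illustrates that idea under stated assumptions; the action names and event format are hypothetical, not the CORTEX2 API:

```python
# Minimal sketch of mapping a remote participant's predefined actions to
# symbolic events for the other meeting participants. Action names and
# the event format are illustrative assumptions, not the CORTEX2 API.

from dataclasses import dataclass

# Panel of predefined actions and their symbolic transcriptions.
PREDEFINED_ACTIONS = {
    "raise_hand": "requests to speak",
    "thumbs_up": "expresses agreement",
    "thumbs_down": "expresses disagreement",
    "ballot": "participates in the vote",
}

@dataclass
class SymbolicEvent:
    participant: str
    description: str

def transcribe_action(participant, action):
    """Turn a recognized action into a symbolic event, or None if the
    action is not part of the predefined panel."""
    if action not in PREDEFINED_ACTIONS:
        return None
    return SymbolicEvent(participant, PREDEFINED_ACTIONS[action])

if __name__ == "__main__":
    event = transcribe_action("Alice", "raise_hand")
    print(f"{event.participant} {event.description}")
```

Restricting the panel to a closed set of actions keeps the symbolic transcription unambiguous: gestures outside the panel are simply ignored rather than guessed at.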