THE CORTEX2 PILOTS
We have demonstrated the added value of our platform in three pilots
We have implemented three different use cases as pilots to test the integration of all the components of our CORTEX2 framework.
The pilots have assessed not only the technology's performance but also the human, social, and societal impact of the immersive platform. They have enabled us to evaluate the quality of the user experience when using our solution in various contexts and environments, its acceptability with respect to ecological and societal impacts, and the effect of such immersive collaboration on overall performance.
PILOT 01
Industrial Remote Cooperation
We have demonstrated that an XR immersive experience can be achieved with heterogeneous and off-the-shelf mobile devices, even under limited bandwidth conditions, while improving productivity and reducing the environmental footprint.
We have implemented these services:
- Gesture analysis and scene semantic analysis to inject annotations into video streams.
- Audio transcription and voice commands to control the immersive environment and to document and record the intervention.
- Support for and mixing of multiple video and IoT data sources from non-immersive devices, composing an on-demand immersive collaboration space enriched with augmented data (industrial data, gesture interpretation, 3D image insertion) that meets the needs of the front-line technician and the expert, depending on the devices involved.
- Optimisation of network bandwidth usage through the orchestration of video and metadata streams and the distribution of rendering, as sketched below.
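To make the bandwidth optimisation concrete, here is a minimal sketch of a per-stream bitrate allocation; the `Stream` type and `orchestrate` function are hypothetical names introduced for this illustration and are not the CORTEX2 API. Video feeds start at their minimum acceptable bitrate and are upgraded greedily within the measured uplink budget, while lightweight IoT and metadata streams are kept at full rate.

```python
# Hypothetical sketch of bandwidth-aware stream orchestration (not the CORTEX2 API).
from dataclasses import dataclass

@dataclass
class Stream:
    name: str
    kind: str            # "video", "iot", or "metadata"
    max_kbps: int        # bitrate requested by the source
    min_kbps: int        # lowest acceptable bitrate

def orchestrate(streams: list[Stream], budget_kbps: int) -> dict[str, int]:
    """Return a per-stream bitrate allocation that fits the uplink budget."""
    # Metadata and IoT streams are cheap and kept at full rate.
    alloc = {s.name: s.max_kbps for s in streams if s.kind != "video"}
    remaining = budget_kbps - sum(alloc.values())

    videos = [s for s in streams if s.kind == "video"]
    # Start every video at its floor, then spend the leftover budget greedily.
    for v in videos:
        alloc[v.name] = v.min_kbps
        remaining -= v.min_kbps
    for v in sorted(videos, key=lambda s: s.max_kbps):
        extra = min(v.max_kbps - v.min_kbps, max(remaining, 0))
        alloc[v.name] += extra
        remaining -= extra
    return alloc

# Example: two camera feeds plus sensor telemetry over a 3 Mbit/s uplink.
allocation = orchestrate(
    [
        Stream("technician_cam", "video", max_kbps=2500, min_kbps=500),
        Stream("overview_cam", "video", max_kbps=1500, min_kbps=300),
        Stream("machine_telemetry", "iot", max_kbps=50, min_kbps=50),
    ],
    budget_kbps=3000,
)
print(allocation)
```

In this example, the two camera feeds share whatever budget remains once the telemetry stream is served, which is the kind of trade-off the orchestration layer has to make under limited bandwidth conditions.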
PILOT 02
Remote Technical Training
We have demonstrated that VR/AR enables efficient knowledge transmission in one-to-many situations, where the remote instructor can simultaneously assist multiple trainees while referencing physical objects, such as industrial equipment.
This pilot explores the use case of a trainer who is assisted in delivering a technical learning session remotely using VR, demonstrating the manipulation of a machine to trainees. The immersive collaboration space enables trainees and trainers to interact in real time, not only with each other but also with the machine model.
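Such real-time interaction with a shared machine model can be thought of as state synchronisation: every manipulation is applied to a shared session state and fanned out to all connected participants. The sketch below illustrates the idea with hypothetical names (`MachineModelSession`, `update_part`); it is not the CORTEX2 implementation.

```python
# Hypothetical sketch of sharing machine-model state between a trainer and
# several trainees (not the CORTEX2 implementation): any manipulation of a
# part is broadcast so that every participant renders the same model state.
import asyncio
import json
from dataclasses import dataclass, field

@dataclass
class MachineModelSession:
    """Keeps the shared state of one virtual machine and its subscribers."""
    part_states: dict = field(default_factory=dict)   # part id -> state dict
    subscribers: list = field(default_factory=list)   # one queue per participant

    def join(self) -> asyncio.Queue:
        queue: asyncio.Queue = asyncio.Queue()
        self.subscribers.append(queue)
        return queue

    async def update_part(self, participant: str, part_id: str, state: dict) -> None:
        """Apply a manipulation and fan it out to all participants."""
        self.part_states[part_id] = state
        message = json.dumps({"by": participant, "part": part_id, "state": state})
        for queue in self.subscribers:
            await queue.put(message)

async def demo() -> None:
    session = MachineModelSession()
    trainee_view = session.join()
    # The trainer opens an access panel; every trainee receives the update.
    await session.update_part("trainer", "access_panel", {"open": True})
    print(await trainee_view.get())

asyncio.run(demo())
```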
The use case involves training qualified staff in complex technical tasks on large machines. The virtualisation aspect allows both face-to-face and remote training.
The main objectives of these training sessions are:
- The comprehension of the main components of a machine.
- The safe and correct operation of a vehicle, taking its surrounding environment into account.
- The improvement of efficiency when using a machine.
- The improvement of skills and the tuning of settings.
The use of the virtual world enables the simulation of hazardous situations, and the collaborative aspect facilitates the illustration of misuse scenarios involving a machine.
PILOT 03
Business Meetings
We have demonstrated that VR/AR-enriched business meetings allow seamless integration of remote participants and improve productivity.
This pilot has allowed us to develop an innovative business meeting support system that integrates several functionalities to improve and enrich the participants’ experience. The tool facilitates the integration of remote participants using VR and AR techniques: on the one hand, it provides remote users with visual and auditory immersion close to real presence; on the other hand, it offers the other meeting participants a representation of the remote person.
The following advanced AR features have been made available to reinforce user inclusion:
- Visual and audio immersion of the remote user.
- Display modalities such as overlays for the visualisation of information concerning both the collaboration’s participants (name, function, profile, etc.) and the interaction itself, including subtitles, the main topics discussed, recommended actions, and any other relevant information.
- Filmed or avatar-based representation of the remote user, with symbolic transcription of their non-verbal communication acts: the remote person is provided with a panel of predefined actions that are automatically recognised, such as requesting to speak, participating in a vote, or expressing agreement or disagreement (see the sketch after this list).
- Virtual representation of collaborative tools and artefacts such as a board, a projection screen and documents.
- Advanced value-added services, such as meeting transcription and automatic subtitling, as well as document summarisation and automatic minutes generation.
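As an illustration of how the symbolic transcription of non-verbal acts could be surfaced to the other participants, the sketch below maps a recognised predefined action to an overlay symbol and a subtitle line; the names (`Action`, `MeetingEvent`, `to_event`) and icon identifiers are hypothetical, not the CORTEX2 API.

```python
# Hypothetical sketch (not the CORTEX2 API): a recogniser reports one of the
# predefined non-verbal actions, and the meeting session turns it into an
# overlay symbol and a subtitle line shown to the other participants.
from dataclasses import dataclass
from enum import Enum

class Action(Enum):
    REQUEST_TO_SPEAK = "requests to speak"
    VOTE = "takes part in the vote"
    AGREE = "expresses agreement"
    DISAGREE = "expresses disagreement"

# Icon identifiers used by the overlay renderer (illustrative values).
SYMBOLS = {
    Action.REQUEST_TO_SPEAK: "icon_hand_raised",
    Action.VOTE: "icon_ballot",
    Action.AGREE: "icon_thumbs_up",
    Action.DISAGREE: "icon_thumbs_down",
}

@dataclass
class MeetingEvent:
    participant: str
    symbol: str
    subtitle: str

def to_event(participant: str, action: Action) -> MeetingEvent:
    """Translate a recognised predefined action into a displayable event."""
    return MeetingEvent(
        participant=participant,
        symbol=SYMBOLS[action],
        subtitle=f"{participant} {action.value}",
    )

# Example: the remote participant raises a hand and the recogniser reports it.
event = to_event("Remote participant", Action.REQUEST_TO_SPEAK)
print(event.symbol, "-", event.subtitle)
```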
