COoperative Real-Time EXperiences with EXtended reality

What is extended reality (XR)?

XR is an umbrella term for the immersive technologies that connect physical and virtual worlds, from those that already exist (augmented reality (AR), virtual reality (VR), and mixed reality (MR)) to those still to be created.

Widespread adoption of remote work

  • The share of employees who usually or sometimes work from home rose from 14.6% to 24.4% between 2019 and 2021.
  • In Europe, the proportion of people who work remotely went from 5% to 40% due to the COVID-19 pandemic.
  • Today, all the signs are that remote work is here to stay: 72% of employees say their organisation is planning some form of permanent teleworking, and 97% would like to work remotely, at least part of their working day, for the rest of their career.

Difficulty for companies to adapt to new ways of working, where collaboration is vital

  • Existing services and applications aimed at facilitating remote team collaboration — from video conferencing systems to project management platforms — are not yet ready to efficiently and effectively support all types of activities.
  • Extended reality (XR)-based tools, which can enhance remote collaboration and communication, present significant challenges for most businesses (for example, because they require a large investment and are difficult to use).
  • Skills training at production sites, remote support for complex maintenance tasks, and planning activities that depend on local physical configurations are examples of such activities.


To ensure broad adoption and fast scaling of our XR remote cooperation solution, we will develop an innovative digital workplace using the teleconferencing solution Rainbow, from our partner Alcatel-Lucent Enterprise, as its backbone.

Our XR framework will be open, versatile, inclusive, scalable and privacy-aware.


Our project pursues six specific scientific and technical objectives leading to innovative solutions.

The main objective of CORTEX2 is to develop an open, versatile, inclusive and scalable digital workplace, addressing the limitations of current technologies in supporting large numbers of simultaneous users joining from possibly heterogeneous devices.

The use of videoconferencing systems has a significant environmental footprint. For example, one hour of streaming or videoconferencing can emit between 150 and 1,000 grams of carbon dioxide, depending on the service used.
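To make the scale of these figures concrete, the following minimal sketch estimates a meeting's emissions range from the per-hour figures quoted above. The function name and the assumption that the per-hour figure applies per participant stream are illustrative, not from the project.

```python
# Illustrative only: rough CO2 range for a video call, using the 150-1,000 g
# per streaming-hour figures quoted in the text. Assumes (hypothetically) that
# the per-hour figure applies to each participant's stream independently.
LOW_G_PER_HOUR = 150     # best case quoted above
HIGH_G_PER_HOUR = 1000   # worst case quoted above

def meeting_emissions_g(participants: int, duration_hours: float) -> tuple[float, float]:
    """Return a (low, high) CO2 estimate in grams for the whole meeting."""
    low = participants * duration_hours * LOW_G_PER_HOUR
    high = participants * duration_hours * HIGH_G_PER_HOUR
    return low, high

low, high = meeting_emissions_g(participants=8, duration_hours=1.5)
print(f"8-person, 90-minute call: {low:.0f}-{high:.0f} g CO2")
```

Even under the best-case figure, an hour-long team call adds up quickly, which is why the project targets resource-efficient transmission methods.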

During online calls, many documents are shared, often just to convey the general idea of their content. Another of the project’s innovations will be to automatically summarise long documents before sharing them and send the full version only on demand.
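As a toy illustration of the idea (not the project's actual method), a document can be reduced to its highest-scoring sentences before sharing. The sketch below uses simple word-frequency scoring; all names are assumptions for illustration.

```python
# Naive extractive summarisation sketch: score each sentence by the corpus-wide
# frequency of its words, keep the top few in their original order.
import re
from collections import Counter

def summarise(text: str, max_sentences: int = 2) -> str:
    """Return the highest-scoring sentences of `text`, in original order."""
    # Split on sentence-ending punctuation followed by whitespace.
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))
    # Rank sentence indices by total word frequency, highest first.
    ranked = sorted(
        range(len(sentences)),
        key=lambda i: -sum(freq[w] for w in re.findall(r"[a-z']+", sentences[i].lower())),
    )
    keep = sorted(ranked[:max_sentences])  # restore document order
    return " ".join(sentences[i] for i in keep)
```

In the scenario described above, a recipient would receive this short summary by default and request the full document only when needed.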

Extended reality experiences should be easy to use even for occasional users without a strong technical background. Our objective is to simplify the use of AR by including several technical modules in our framework:

  • Instantaneous 3D modelling
  • Natural gesture recognition and interpretation
  • Semantic matching of surrounding spaces

High-level semantic understanding of visual scenes and audio conversations will benefit users of remote tele-cooperation tools, since it enables additional services such as automatic meeting summaries, visual AR support for spoken conversation and alignment of semantic spaces.

The objective is to create more immersive experiences for videoconference participants by integrating rich contextual IoT information into video streams, rendered as AR annotations on top of displayed objects and persons.
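The mapping described above can be pictured as a small data-flow step: IoT readings are attached to the on-screen objects their devices belong to. This is a hypothetical sketch under assumed names, not the CORTEX2 API.

```python
# Hypothetical sketch: turning IoT sensor readings into AR annotation records
# anchored to objects detected in the video stream. All type and field names
# are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class SensorReading:
    device_id: str   # IoT device that produced the reading
    metric: str      # e.g. "temperature"
    value: float
    unit: str

@dataclass
class ARAnnotation:
    anchor_object: str  # displayed object the label is rendered on top of
    label: str          # text shown in the AR overlay

def annotate(readings: list[SensorReading], anchors: dict[str, str]) -> list[ARAnnotation]:
    """Attach each reading to the on-screen object its device is mapped to,
    skipping devices with no visible anchor in the current frame."""
    return [
        ARAnnotation(anchor_object=anchors[r.device_id],
                     label=f"{r.metric}: {r.value} {r.unit}")
        for r in readings
        if r.device_id in anchors
    ]
```

In practice the anchor mapping would come from object detection on the video stream; here it is simply passed in as a dictionary.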

Beyond scientific and technical objectives…

CORTEX2 will create a novel technology to facilitate remote collaborative work, which raises ethical, legal and social challenges we will evaluate during the project.


Easy-to-use and powerful XR experiences with instant 3D reconstruction of environments and objects, and simplified use of natural gestures in collaborative meetings.

Full integration of internet of things (IoT) devices into XR experiences to optimise interaction with running systems and processes.

Full support for augmented reality (AR) experiences as an extension of video conferencing systems when using heterogeneous service end devices through a novel Mediation Gateway platform.

Resource-efficient teleconferencing tools through innovative transmission methods and automatic summarisation of long shared documents.

Fusion of vision and audio for multichannel semantic interpretation and enhanced tools such as virtual conversational agents and automatic meeting summarisation.

Optimal extension possibilities and broad adoption by delivering the core system with open APIs and launching open calls to enable further technical extensions, more comprehensive use cases, and deeper evaluation and assessment.