Our paper, “Augmented telecommunication in factory setting”, was presented at the 21st EuroXR International Conference (EuroXR 2024) and is published in its proceedings.

Introduction

As a result of the COVID-19 pandemic, remote work, or the “home office”, has been normalized by many companies. Although existing telecommunication tools are sufficient for many mainstream tasks, they lack the 3D interaction capabilities needed for complex tasks that would otherwise require physical presence for engaged problem solving.

As an early implementation of collaborative 3D communication tools, Microsoft developed the HoloLens system (Chen et al., 2015), which introduces a novel interaction model for supporting collaboration between a head-mounted display (HMD) user and remote participants. The HoloLens allows remote companions to join the AR space by hitching onto the view of the primary HMD user through Skype-enabled devices such as tablets or PCs. This system facilitates synchronous interaction in a shared 3D space with digital objects, allowing remote parties to contribute to tasks and have their input reflected back to the primary user in real time, thus enabling new scenarios for remote collaboration.

Further advancements in remote collaboration systems have focused on complex tasks like environmental pollution analysis, which require expertise from multiple fields. One such system, designed by Mahmood et al. (2019), uses mixed reality to support co-presence and collaborative analysis, demonstrating improved remote analysis through shared user and data spaces. Drey et al. (2022) explored how the benefits of pair learning and virtual reality (VR) can be combined by comparing symmetric systems, where both peers use VR, and asymmetric systems, where only one peer uses VR and the other uses a tablet. They found that the symmetric system significantly enhanced presence and immersion and reduced cognitive load, all of which are important for learning. However, both systems resulted in similar learning outcomes, demonstrating that symmetric and asymmetric setups alike are effective for co-located VR pair learning.

In industrial and technical settings, operating machinery often requires assistance or training that is difficult to acquire from traditional documentation or voice/video calls alone. These methods often fail to convey spatial relationships, leading to miscommunication and repeated explanations. To address these challenges, we start by 3D-scanning the machines and environments ahead of time, so that they are available when the application runs. The technician on site is assisted by a remote expert, with the option for additional observers, using multi-device support. Depending on available hardware, participants join the session through their respective devices (PC, XR headset), with the software adapting to features like webcams and tracking.
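The idea of adapting the session to each participant's hardware can be sketched as a small capability-negotiation step. The names below (`DeviceProfile`, `session_features`, the individual flags) are illustrative assumptions, not the actual implementation described in the paper:

```python
from dataclasses import dataclass

@dataclass
class DeviceProfile:
    # Capability flags a client could report when joining (hypothetical)
    has_webcam: bool
    has_6dof_tracking: bool  # e.g. an XR headset with pose tracking
    role: str                # "technician", "expert", or "observer"

def session_features(profile: DeviceProfile) -> set:
    """Enable only the features the reported hardware supports."""
    features = {"voice"}  # assumed baseline for every participant
    if profile.has_webcam:
        features.add("video")
    if profile.has_6dof_tracking:
        # lets the remote expert follow the technician's viewpoint
        features.add("pose_sharing")
    if profile.role == "expert":
        features.add("annotation_authoring")
    return features
```

A PC-based expert with a webcam but no headset would, under this sketch, get voice, video, and annotation authoring, while an XR-equipped technician additionally shares pose.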

In this setup, the expert views a virtual representation of the object or environment and can track the technician’s pose to better understand what they are looking at. The expert can place and manipulate 3D annotations in the scene and provide additional guidance via voice and video. The technician sees the actual scene through a webcam or XR headset, with the expert’s annotations superimposed at the matching 3D positions. This setup enables efficient collaboration between the expert and the technician to solve complex problems more effectively.
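Superimposing an annotation at its matching 3D position amounts to projecting the annotation's scene-space point into the technician's current camera view. A minimal sketch using a pinhole camera model (no distortion, camera-space coordinates); the structure and parameter names are assumptions for illustration, not the paper's rendering pipeline:

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    position: tuple  # (x, y, z) in the technician's camera frame, metres
    label: str

def project_to_view(point, focal_px, cx, cy):
    """Project a 3D camera-space point onto the image plane
    with a simple pinhole model: u = f*x/z + cx, v = f*y/z + cy."""
    x, y, z = point
    if z <= 0:
        return None  # point is behind the camera, so it is not drawn
    u = focal_px * x / z + cx
    v = focal_px * y / z + cy
    return (u, v)

# An annotation straight ahead of the camera lands at the principal point
note = Annotation(position=(0.0, 0.0, 2.0), label="loosen this bolt")
pixel = project_to_view(note.position, focal_px=800, cx=640, cy=360)
```

In practice the annotation would first be transformed from the shared scene frame into the camera frame using the tracked pose, which is the step that keeps the overlay anchored as the technician moves.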

While existing systems such as HoloLens and mixed reality platforms focus on immersive experiences, our approach is tailored to the industrial environment. It addresses the current issue of heterogeneous hardware availability and usage, allowing flexibility through multi-device integration and adapting to the hardware on hand.

Authors

Narek Minaskan, Bastian Krayer, Alain Pagani, and Didier Stricker

Read the full publication


Access all our CORTEX2 publications.

Subscribe to our newsletter

This project has received funding from the European Union’s Horizon Europe research and innovation programme under grant agreement N° 101070192. Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union or European Union’s Horizon Europe research and innovation programme. Neither the European Union nor the granting authority can be held responsible for them.