As part of the CORTEX2 Innovators Support Programme, the SELFEX2 team has successfully developed and validated a mixed reality training platform that integrates augmented and virtual reality devices — such as HoloLens 2, MANUS and MAGOS haptic gloves — into remote learning environments. Now that the Programme has concluded, SELFEX2 has demonstrated both the technical feasibility and user value of its solution. From system usability to communication quality and task accuracy, the results point to a promising future for immersive remote training.

Keep reading to learn about SELFEX2’s key achievements and next steps.

SELFEX2’s progress on the CORTEX2 Programme

Q: How would you summarise the advances SELFEX2 has made during Phase 2 of the CORTEX2 Support Programme?

A: The SELFEX2 team has developed and integrated SELFEX2 into the CORTEX2 framework and subsequently deployed an MVP. The initial architecture scheme was followed and, together with the information provided by CORTEX2, real-time behaviour was achieved using Rainbow calls. As a result, the SELFEX2 platform can launch a Rainbow call between SELFEX2 users, sharing video, audio and kinematic information (finger and wrist positions).
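To illustrate what sharing kinematic data alongside a call can look like, the sketch below packages a single glove sample (wrist position plus finger angles) into a compact message suitable for a real-time data channel. All names, fields and units here are illustrative assumptions, not the actual SELFEX2 or Rainbow wire format.

```python
import json
import time

def make_kinematic_frame(wrist_pos, finger_angles):
    """Package one glove sample as a compact JSON message.
    Field names and layout are illustrative assumptions,
    not the actual SELFEX2 wire format."""
    return json.dumps({
        "t": time.time(),         # sender timestamp (seconds)
        "wrist": wrist_pos,       # [x, y, z] position in metres
        "fingers": finger_angles, # flexion angle per joint, degrees
    })

def parse_kinematic_frame(payload):
    """Decode a received frame back into Python structures."""
    msg = json.loads(payload)
    return msg["wrist"], msg["fingers"], msg["t"]

# Round-trip a single hypothetical sample
frame = make_kinematic_frame([0.1, 0.2, 0.3], [10, 25, 40, 5, 0])
wrist, fingers, t = parse_kinematic_frame(frame)
```

In practice, frames like this would be sent at the glove’s sampling rate over the call’s data channel, with the embedded timestamp allowing the receiving side to estimate transmission delay.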

Q: What has SELFEX2 achieved now that the Programme is complete?

A: As part of the validation plan, tests were performed using four complementary approaches: P2P and conference evaluation, evaluation from the actuator’s perspective, evaluation as an external observer of the system’s behaviour, and a combined evaluation.

HoloLens 2

After the evaluation, questionnaires were sent to the participants to collect their assessment of the different components of the system. The forms cover aspects such as the use of HoloLens 2, the screen display, the experience with the MANUS and MAGOS gloves, and the quality of communication in both peer-to-peer calls and multi-conference sessions. The questions evaluate comfort, usability, movement synchronisation, audio and video quality, and perceived value in the learning process. In addition to these subjective opinions, objective data were also collected: the useful time spent in each session (taking preparation and execution times into account) and the transmission delays for both audio (HoloLens 2) and glove data (MANUS and MAGOS). Combining user feedback with these technical measurements has provided a clear view of the system’s performance and its potential for real-world training environments.
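The delay measurement described above can be sketched as follows: if each message carries a sender timestamp, the receiver can estimate the one-way delay and compare the average against the acceptance target. This is a minimal illustration assuming synchronised sender and receiver clocks; the function and the sample values are hypothetical, not SELFEX2’s measured data.

```python
def one_way_delay_ms(send_ts: float, recv_ts: float) -> float:
    """One-way delay in milliseconds, from a sender timestamp embedded
    in the message to the receipt time (assumes synchronised clocks)."""
    return (recv_ts - send_ts) * 1000.0

# Hypothetical (send, receive) timestamp pairs in seconds
pairs = [(0.000, 0.041), (1.000, 1.039), (2.000, 2.043)]
delays = [one_way_delay_ms(s, r) for s, r in pairs]
avg_delay = sum(delays) / len(delays)

# Compare against the acceptance target for glove data (< 1 s)
within_target = avg_delay < 1000.0
```

Averaging many such samples per session is what yields figures like the per-device delay averages reported below.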

SFC, as the end user, established a set of acceptance criteria to verify fulfilment of the project requirements.

The overall average rating for the MANUS and MAGOS gloves exceeded the expected threshold (>2), with scores of 3.08 and 2.48, respectively, for the models evaluated. Both results fall within the ‘acceptable’ range, reflecting a positive user perception of comfort and usability during the tests. In terms of accuracy, the MANUS and MAGOS gloves also met the expected threshold (>2), with average scores of 3 and 2, respectively, confirming their suitability for the evaluated tasks.

In terms of visualisation, the feedback from participants was positive. HoloLens 2 scored 3.23 on average and the screen display 2.53, both above the defined acceptance threshold and in line with the expected usability level for this kind of training. The measured communication delays stayed within acceptable limits for the tests performed: the audio delay through HoloLens 2 averaged 3,449 ms (target <5 s), while data transmission from the gloves via DataChannel maintained an excellent 41 ms average (target <1 s). Both results supported the correct execution of the training sessions.

In summary, SELFEX2 demonstrated technical and functional feasibility in remote training environments, with full integration of devices such as HoloLens 2 and haptic gloves (MANUS and MAGOS). Execution times, communication quality and user experience have reached acceptable levels for the evaluated scenarios.

Q: What would you highlight about the Support Programme? What has helped advance your solution the most?

A: The highlight of the CORTEX2 Programme has been access to key resources that enabled a thorough and rigorous evaluation of the solution. Thanks to the Programme, we had the environment, devices and technical support to validate our proposal under real conditions. This opportunity has been fundamental to advancing the development and confirming the technical and functional feasibility of the solution.

Q: What’s the status of SELFEX2 after completing the Programme? What are your next steps?

A: Following the conclusions obtained, several areas for improvement have been identified, opening the door to future optimisations in accuracy, comfort and visualisation:

  • Virtualisation of the scene
  • Increased number of antennas for motion capture
  • Improved computer performance

Check out SELFEX2’s previous interview and stay updated on its progress!

Want to know more about other CORTEX2 innovators’ updates? Browse all our supported teams on the CORTEX2 website:

Open Call 1 winners  –  Open Call 2 winners

Subscribe to our newsletter

This project has received funding from the European Union’s Horizon Europe research and innovation programme under grant agreement N° 101070192. Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union or European Union’s Horizon Europe research and innovation programme. Neither the European Union nor the granting authority can be held responsible for them.