In the rapidly evolving world of extended reality (XR), intuitive interaction is key. Among the innovators in CORTEX2’s support programme is the team behind MGL, a project that is pushing gesture recognition to the next level. MGL offers developers a high-precision toolkit for recognising hand gestures — enabling more natural, seamless interaction in immersive environments.

Learn more about their work and how CORTEX2 is helping them bring their vision to life.

Q: How would you describe MGL in one sentence?

A: The Magos Gestures Library (MGL), led by QUANTA & QUALIA, enables gesture recognition for intuitive interactions in XR environments.

Q: What problem are you solving, and what makes your solution unique?

A: Current hand-tracking solutions often fall short when it comes to precision, flexibility, and integration. They can feel clunky, unreliable, or limited to predefined gestures — posing challenges for developers and users alike.

Our solution changes that. MGL delivers millimetre-accurate tracking of finger joint movements, supports dynamic customisation of gestures, and integrates seamlessly with the CORTEX2 framework. That means more natural, personalised, and precise interactions, tailored to different use cases and user needs.

Q: What are the key objectives of MGL?

A: We are focused on three core goals:

  • Gesture recognition module development: Build a module that recognises at least 15 foundational gestures (like thumbs up, pinch, wave, and fist), offering reliable and intuitive gesture-based interaction in XR.

  • Customisation and extensibility: Enable developers to dynamically add and customise gestures, providing flexibility for diverse applications.

  • Seamless integration: Ensure smooth integration into the CORTEX2 framework, making MGL scalable and compatible with a range of XR scenarios.
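To make the first objective concrete, gesture recognition over precise joint tracking can be sketched as simple geometric checks on fingertip positions. The snippet below is an illustrative sketch only — the function and constant names are assumptions for this article, not MGL’s actual API — showing how a pinch might be detected from millimetre-scale thumb and index fingertip coordinates.

```python
import math

# Assumed threshold (mm) below which thumb and index tips count as a "pinch".
# The tracker is assumed to report 3D joint positions in millimetres.
PINCH_THRESHOLD_MM = 20.0

def is_pinch(thumb_tip, index_tip, threshold_mm=PINCH_THRESHOLD_MM):
    """Return True when the thumb and index fingertips are close enough
    together to count as a pinch gesture."""
    return math.dist(thumb_tip, index_tip) < threshold_mm

# Example fingertip positions (mm) from a single tracked frame:
print(is_pinch((0.0, 0.0, 0.0), (12.0, 5.0, 3.0)))   # tips ~13.3 mm apart -> True
print(is_pinch((0.0, 0.0, 0.0), (60.0, 10.0, 0.0)))  # tips ~60.8 mm apart -> False
```

In practice a production library would also smooth over several frames and add hysteresis so the gesture does not flicker at the threshold, but the core idea is this kind of per-frame geometric test.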

CORTEX2 support programme progress

Q: What have you achieved so far?

A: We have already hit some significant milestones:

  • Completed a full library of foundational gestures — 15 in total, covering core actions like pinch, fist, and thumbs up.

  • Enabled dynamic gesture customisation, so that developers can tailor interactions to their specific use cases.

  • Built a Unity demo scene, offering a hands-on demonstration of all available gestures in action.

These steps not only showcase the system’s capabilities but also provide a flexible, user-friendly solution for XR environments, enhancing usability and interaction design.
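The dynamic customisation mentioned above — letting developers add their own gestures at runtime — can be pictured as a registry of named predicates over a hand pose. The sketch below is purely illustrative: the registry, the per-finger “curl” representation, and every name in it are assumptions for this article, not MGL’s actual interface.

```python
# Hypothetical gesture registry: each gesture is a named predicate that maps
# a hand pose to True/False. A pose is sketched here as per-finger curl values
# in [0, 1], where 0 means fully extended and 1 means fully curled.
gesture_registry = {}

def register_gesture(name, predicate):
    """Dynamically add (or override) a gesture definition."""
    gesture_registry[name] = predicate

def recognise(pose):
    """Return the names of all registered gestures matching this pose."""
    return [name for name, pred in gesture_registry.items() if pred(pose)]

# Two of the foundational gestures, expressed as curl-based rules:
register_gesture("fist", lambda p: all(curl > 0.8 for curl in p.values()))
register_gesture("thumbs_up", lambda p: p["thumb"] < 0.2
                 and all(p[f] > 0.8 for f in ("index", "middle", "ring", "pinky")))

pose = {"thumb": 0.1, "index": 0.9, "middle": 0.95, "ring": 0.9, "pinky": 0.85}
print(recognise(pose))  # ['thumbs_up']
```

Because new predicates can be registered at any time, applications can tailor the gesture set to their own use cases — the flexibility the achievements above describe — without changes to the recognition loop itself.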

Q: How is participating in CORTEX2 supporting MGL?

A: The programme has given us far more than just funding.

  • Mentorship and collaboration: CORTEX2 connects us with technical experts and a broader community of innovators. The guidance and feedback we have received have helped us solve challenges and refine our product.

  • Visibility and impact: Being part of the ecosystem opens doors — to new partnerships, wider audiences, and real-world adoption opportunities.

This support is helping us fast-track development and move closer to real-world deployment.

Q: What are your next steps within the CORTEX2 programme?

A: We are heading into the next phase of the programme with clear priorities:

  • Final integration of MGL into the CORTEX2 framework to ensure seamless compatibility.

  • Comprehensive validation, including extensive testing of gesture accuracy and reliability in real-world settings.

  • Demonstration and refinement: We will showcase the library in industrial applications, collect feedback, and continue to improve the system.

  • Collaboration and knowledge sharing: We will keep working closely with CORTEX2 partners to align objectives and spark further innovation.

MGL is setting a new standard for gesture-based interaction in XR — precision, personalisation, and plug-and-play readiness all in one. Stay tuned as they bring the power of natural gestures to immersive experiences across industries.


Learn more about MGL and stay updated on its progress!

Want to explore more XR innovation? Browse all our supported projects on the CORTEX2 website: 

Open Call 1 winners  –  Open Call 2 winners

Subscribe to our newsletter

This project has received funding from the European Union’s Horizon Europe research and innovation programme under grant agreement N° 101070192. Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union or European Union’s Horizon Europe research and innovation programme. Neither the European Union nor the granting authority can be held responsible for them.