Q: What is INTERACT in one sentence?

A: INTERACT is an AI-driven project that delivers real-time sign language translation through 3D avatars in Extended Reality (XR), enabling deaf, hard-of-hearing, and multilingual participants to engage fully in business meetings and teleconferences by integrating speech-to-text, sentiment analysis, and collaborative tools within an inclusive, scalable environment built on the CORTEX2 architecture.

Q: What problem are you solving? What makes your solution unique?

A: INTERACT addresses exclusion in remote professional communication by enabling deaf, hard-of-hearing, and multilingual individuals to participate fully in business meetings and teleconferences, where current systems often lack real-time sign language translation, emotional context awareness, and inclusive collaborative tools. What makes INTERACT unique is its integration of AI-powered 3D avatars capable of performing real-time sign language translation within immersive XR environments on Meta Quest 3 headsets, enhanced by sentiment analysis that allows avatars to reflect emotional cues and foster empathetic interactions. Unlike traditional solutions, INTERACT supports multilingual speech-to-text and text-to-sign translation in International Sign Language (ISL), while leveraging the CORTEX2 architecture, including the Rainbow SDK and Mediation Gateway. The system also provides added functionalities such as automatic meeting summarisation, live subtitling, and information overlays, all optimised for low-bandwidth environments and sustainable operation. This convergence of accessibility, emotional intelligence, multilingual support, and immersive interaction positions INTERACT as a pioneering and holistic solution for inclusive digital communication.
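As one concrete way to picture the live-subtitling functionality mentioned above: a caption stream can be thought of as transcript text segmented into short on-screen lines. The sketch below is purely illustrative; the function name, the 42-character line budget, and the sample text are assumptions for this example, not part of INTERACT's actual implementation.

```python
# Toy sketch of live subtitling: split transcript text into caption
# lines no longer than a fixed character budget, breaking only at word
# boundaries. All names and limits here are illustrative assumptions.
def chunk_subtitles(transcript: str, max_chars: int = 42) -> list[str]:
    lines, current = [], ""
    for word in transcript.split():
        candidate = f"{current} {word}".strip()
        if len(candidate) <= max_chars:
            current = candidate
        else:
            if current:
                lines.append(current)
            # A single word longer than the budget is kept whole.
            current = word
    if current:
        lines.append(current)
    return lines

captions = chunk_subtitles(
    "Welcome everyone, today we will review the quarterly accessibility "
    "roadmap and demo the new avatar features."
)
```

In a real pipeline, each caption line would also carry timing information from the speech recogniser so it can be displayed in sync with the speaker.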

Q: What are INTERACT’s main objectives?

A: The key objectives of INTERACT are to develop AI-powered 3D avatars for real-time sign language translation within immersive AR/XR environments, enhance multilingual communication through integrated speech-to-text and text-to-sign translation in International Sign Language, and enable empathetic interactions via sentiment analysis. The project aims to ensure accessibility and inclusivity in teleconferencing by leveraging the CORTEX2 architecture. Additionally, it focuses on validating the platform in real-world business and educational settings, collecting user feedback to refine avatar behaviour, and promoting broader adoption through strategic dissemination and stakeholder engagement.

CORTEX2 Support Programme progress

Q: What were the main activities implemented and milestones achieved during Sprint 1 of the CORTEX2 Support Programme?

A: During Sprint 1, INTERACT focused on defining technical specifications, developing a detailed test plan, and establishing a robust data management framework aligned with FAIR principles. Key activities included configuring the Whisper speech-to-text model, exploring AI models for sentiment analysis and sign language translation, and integrating initial components of the CORTEX2 architecture. The team also created the project website and launched dissemination activities via LinkedIn. The main milestone achieved was the delivery of Deliverable D1: Specification and Test Plan, laying the groundwork for integration and development in the following sprints.
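The sentiment-analysis exploration described above can be pictured, at its simplest, as mapping transcript text to an emotion label that could later drive avatar behaviour. The toy lexicon below is an invented stand-in for the trained AI models the project is evaluating; every word list, label, and threshold here is an assumption made for illustration only.

```python
# Toy lexicon-based sentiment scorer, illustrating only the *idea* of
# deriving an emotional cue from transcript text. Word lists, labels,
# and the scoring rule are invented for this sketch.
POSITIVE = {"great", "thanks", "happy", "agree", "excellent", "welcome"}
NEGATIVE = {"problem", "delay", "sorry", "disagree", "blocked", "concern"}

def sentiment_label(sentence: str) -> str:
    words = {w.strip(".,!?").lower() for w in sentence.split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"
```

A trained model would of course capture far more nuance (negation, sarcasm, intensity), which is exactly why the project evaluates dedicated AI models rather than a fixed lexicon.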

Q: What have you achieved so far?

A: During Sprint 2, INTERACT successfully integrated core CORTEX2 components, such as the Rainbow SDK, Mediation Gateway, and sentiment-enhanced avatars, into an immersive XR environment, enabling real-time speech-to-text transcription and initial sign language translation features. ISL models were built using online datasets, and sign animations were extracted using AI models, with avatar animations and emotional expressions tested for realistic interactions. A proof-of-concept prototype and demonstration video were delivered as key milestones, showcasing the system’s functionality and accessibility impact. These achievements marked a major step forward in validating the platform’s technical feasibility and readiness for user testing in real-world scenarios.
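One way to picture the sentiment-enhanced avatars mentioned above: an emotion label selects a set of facial blendshape weights on the 3D avatar. The mapping below is entirely hypothetical; the shape names and weight values are invented for illustration and do not reflect INTERACT's actual avatar rig.

```python
# Hypothetical mapping from a sentiment label to avatar facial
# blendshape weights (0.0–1.0). Shape names and values are invented
# for illustration; a real rig would blend many more channels.
EXPRESSIONS: dict[str, dict[str, float]] = {
    "positive": {"smile": 0.8, "brow_raise": 0.3, "frown": 0.0},
    "negative": {"smile": 0.0, "brow_raise": 0.1, "frown": 0.7},
    "neutral":  {"smile": 0.1, "brow_raise": 0.0, "frown": 0.0},
}

def expression_weights(label: str) -> dict[str, float]:
    # Fall back to a neutral face for any unrecognised label.
    return EXPRESSIONS.get(label, EXPRESSIONS["neutral"])
```

In an XR engine these weights would typically be interpolated over a few hundred milliseconds rather than applied instantly, so expression changes read as natural.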

Q: How is participating in CORTEX2 supporting INTERACT?

A: Participating in CORTEX2 has provided our team with essential funding, access to advanced XR technologies, and technical support through the CORTEX2 architecture, enabling us to accelerate development, validate our solution in real-world scenarios, and scale an inclusive communication platform for deaf, hard-of-hearing, and multilingual users.

Q: What are your next steps within the CORTEX2 Programme?

A: Our next steps within the CORTEX2 Programme include conducting pilot demonstrations involving deaf participants, gathering user feedback through questionnaires, refining avatar animations and multilingual translation features based on real-world interactions, and validating our final KPIs through comprehensive user testing to ensure the platform’s accessibility, usability, and impact.


Learn more about INTERACT and stay updated on its progress!

Want to explore more XR innovation? Browse all our supported projects on the CORTEX2 website: 

Open Call 1 winners  –  Open Call 2 winners

Subscribe to our newsletter

This project has received funding from the European Union’s Horizon Europe research and innovation programme under grant agreement N° 101070192. Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union or European Union’s Horizon Europe research and innovation programme. Neither the European Union nor the granting authority can be held responsible for them.