INTERACT

Inclusive Networking for Translation and Embodied Real-Time Augmented Communication Tool with Sign Language Integration

Objective: Develop AI-powered avatars for real-time sign language translation in augmented reality (AR) to support multilingual and hearing-impaired participants in business meetings.

INTERACT will leverage the CORTEX2 architecture to deliver scalable, sentiment-aware interactions in teleconferencing settings. It will improve accessibility for deaf and hard-of-hearing individuals, promoting inclusion in business meetings.

LEAD ORGANISATION

DASKALOS-APPS (France)

TOPIC

Embodied Avatar

STAY UPDATED ON THE CORTEX2 INNOVATORS' JOURNEY

Discover how our Open Call winners are pushing the boundaries of XR technology, exploring impactful use cases, and bringing bold innovations to life.

Stay tuned to our news, follow us on social media, and sign up for our newsletter for updates on their progress, insights from their projects, and highlights from their development journey.