
Announcing the results of CORTEX2's second Open Call

We are excited to announce the results of our second Open Call, which launched on 13 June and closed on 15 August 2024. We received 91 applications from 25 countries, covering all the call topics. Thanks to all applicants for your interest in joining our journey to democratise extended reality solutions for easy, efficient, and accessible remote collaboration, and good luck!

This opportunity is aimed at extended reality (XR) innovators, from tech startups and SMEs to researchers, who will co-develop our innovative XR teleconference platform by introducing new modules and features that expand its functionality.

Selected beneficiaries will receive funding (up to 100,000 EUR per project) and access to our 9-month support programme, which includes tailored guidance and support, access to tech and business experts, capacity building on CORTEX2 and XR technologies & trends, and resources to facilitate the integration and understanding of the CORTEX2 platform.

The CORTEX2 Open Call #2 results

Our second call attracted a large number of applications from diverse origins: 91 were submitted out of 149 started, including 19 proposals from consortia of two organisations.

Regarding the distribution of interest by topic, the open topic received the most applications (24), followed by the Virtual Experiences Editor (16) and the Embodied Avatar (11).

Distribution of applications submitted to the Open Call #2 topics

  • Open topic: 24 applications
  • Virtual Experiences Editor: 16 applications
  • Embodied Avatar: 11 applications
  • MPRR (Multi-Person Reaction Recognition): 9 applications
  • Real-time voice translation: 9 applications
  • Gaussian-splatting-based reality capture for VR: 8 applications
  • 3D model database: 7 applications
  • Anonymizing meeting's content for privacy-free data storage: 5 applications
  • Smart generator: 2 applications

The evaluation process is set to conclude in September 2024, with results expected by the end of that month.


Meet our XR innovators from Open Call #1 

We recently welcomed our Open Call #1 winners to CORTEX2: twenty teams of exceptional professionals with innovative and diverse solutions that align with our mission to accelerate and democratise XR technology across Europe.

These teams will co-develop our cutting-edge XR platform to create value-added services and engage new use cases to demonstrate its adaptability in different domains. They will receive funding and mentorship to bring their visionary concepts to life.

We are incredibly excited about their potential and the impact they will make in the XR field, and we look forward to seeing how they contribute to the CORTEX2 ecosystem!



Stay tuned for more updates on the progress of these groundbreaking projects and the results of our Open Call #2.



CORTEX2 Open Call #1 winners

Meet the CORTEX2 Open Call #1 winners

We are excited to announce the CORTEX2 Open Call #1 winners: twenty teams, led by innovators from all over Europe, who will join us in shaping the future of extended reality (XR).

Meet our Open Call #1 XR innovators

Track 1: Co-development

These teams will co-develop our CORTEX2 platform to build value-added services, leveraging their expertise in specific market segments.

CDLPG: Co-development of a Dynamic Library of Personalised Gestures

  • Lead organisation: Sensorama Lab (Ukraine)
  • Topic: Dynamic library of personalised gestures

Developing a dynamic library of personalised gestures is essential in enhancing the user experience in augmented reality (AR) and virtual reality (VR) applications. CDLPG will create a module capable of accurately capturing and interpreting a wide range of hand gestures that can be used across various AR/VR applications within the CORTEX2 platform. This technology will make interactions more intuitive and facilitate greater adoption of AR/VR technology in different sectors.

TIP: The Infinity Palette

  • Lead organisation: 3D Interactive (Sweden)
  • Topic: 3D objects library

TIP aims to enrich the CORTEX2 platform with an innovative 2D/3D asset library optimised for Unity and Mozilla Hubs. Focusing on education, entertainment and culture sectors, they plan to create immersive and adaptable learning environments, including a traditional classroom, a group study room, and a library for individual learning, alongside interactive spaces for virtual concerts and cultural exhibitions. These environments, comprising a blend of static and dynamic assets, will be customisable to user needs. With these versatile, high-quality 3D assets, they aim to foster engaging educational experiences and enhance the cultural and entertainment appeal of the CORTEX2 platform, potentially broadening its user base and community engagement.

MGL: Magos Gestures Library

  • Lead organisation: QUANTA & QUALIA (Greece)
  • Topic: Dynamic library of personalised gestures

This project aims to develop and integrate the Magos Gestures Library (MGL) module into the CORTEX2 framework, enhancing the interaction landscape of extended reality (XR) applications. Its significance lies in its transformative potential for almost all kinds of XR applications, including simulations, collaborative training, and realistic interactions in industries such as healthcare, aerospace, and Industry 5.0.

MGL, an extension of the innovative Magos solution, focuses on advancing hand-tracking and gesture recognition technologies. Thanks to its highly accurate tracking system, it specifically emphasises personalised and dynamic gestures. The Magos Gestures Library will enable users to define and recognise specific gestures — a minimum of 15 — tailored to distinct actions within XR environments.
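To illustrate the define-then-recognise pattern described above, here is a minimal TypeScript sketch of a gesture registry. All names (GestureRegistry, the landmark format, the distance threshold) are hypothetical illustrations, not the published Magos or CORTEX2 APIs.

```typescript
// Minimal sketch of a "define, then recognise" gesture registry.
// All names and the landmark format are hypothetical, for illustration only.

type Landmark = { x: number; y: number; z: number }; // one hand joint
type HandFrame = Landmark[];                         // e.g. 21 joints per hand

interface GestureTemplate {
  name: string;     // e.g. "pinch-rotate", mapped to an XR action
  frame: HandFrame; // reference pose recorded during personalisation
}

function meanJointDistance(a: HandFrame, b: HandFrame): number {
  const n = Math.min(a.length, b.length);
  let sum = 0;
  for (let i = 0; i < n; i++) {
    sum += Math.hypot(a[i].x - b[i].x, a[i].y - b[i].y, a[i].z - b[i].z);
  }
  return sum / n;
}

class GestureRegistry {
  private templates: GestureTemplate[] = [];

  // A user "defines" a personalised gesture by recording a reference pose.
  register(name: string, frame: HandFrame): void {
    this.templates.push({ name, frame });
  }

  // Return the closest registered gesture, or null if none is close enough.
  recognise(frame: HandFrame, threshold = 0.05): string | null {
    let best: GestureTemplate | null = null;
    let bestDist = Infinity;
    for (const t of this.templates) {
      const dist = meanJointDistance(t.frame, frame);
      if (dist < bestDist) {
        best = t;
        bestDist = dist;
      }
    }
    return best !== null && bestDist < threshold ? best.name : null;
  }
}
```

A production module would match over time windows and normalise for hand size and orientation; this sketch only shows the contract of registering a personalised pose and recognising it later.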

ARY: ARY the AR Media

  • Lead organisation: ARY (France)
  • Topic: Open

ARY is an augmented reality (AR) media that offers the capability to anchor 3D objects, videos, pictures or PDF files into indoor environments and make them available to anyone using a smartphone or other device.
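As a rough illustration of what anchoring media indoors implies, the TypeScript sketch below models an anchored item. The type and its fields are assumptions made for illustration, not ARY's actual data model.

```typescript
// Hypothetical shape of an anchored media item: not ARY's real schema, just
// an illustration of what "anchoring content indoors" has to record.

type MediaKind = "model3d" | "video" | "picture" | "pdf";

interface IndoorAnchor {
  id: string;
  kind: MediaKind;
  uri: string;                                   // where the asset is hosted
  space: string;                                 // identifier of the indoor space
  position: { x: number; y: number; z: number }; // metres, space-local frame
  rotation: { x: number; y: number; z: number; w: number }; // quaternion
  scale: number;
}

// Example: a picture pinned to a lobby wall, visible to any smartphone
// client that can resolve the "hq-lobby" space to a local coordinate frame.
const poster: IndoorAnchor = {
  id: "lobby-poster-01",
  kind: "picture",
  uri: "https://example.org/assets/poster.png",
  space: "hq-lobby",
  position: { x: 1.2, y: 1.6, z: -0.4 },
  rotation: { x: 0, y: 0, z: 0, w: 1 },
  scale: 1,
};
```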

EXTERNALISE: Enabling Support for Externalising Models in XR Collaboration

  • Lead organisation: MOVERSE (Greece)
  • Topic: Open

EXTERNALISE will develop a disruptive multi-user collaboration and communication module that will change how teams and groups collaborate remotely. By focusing on digitising and streaming the human characteristics that encode the nonverbal cues that compose human body language, EXTERNALISE will enrich users’ representation and boost their expressivity during communication.

VISOR: VIrtualization Service for Object Reconstruction

  • Lead organisation: Phasmatic (Greece)
  • Topic: Real-time virtualiser

Efficient 3D reconstruction is a key operation that can transform the content generation process of many applications, such as mixed reality (MR) applications, movies, game development, telepresence, 3D printing, and 3D eCommerce. VISOR proposes a web service that takes images or a video stream of a small object and generates a digital twin as a triangular mesh. The mesh can be used by all current XR applications and game engines and visualised on any device, enabling easy sharing of the 3D model across multiple stakeholders and environments.
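As a sketch of how a client might drive such a reconstruction service, the TypeScript below uploads object images and polls for the resulting mesh. The endpoints, job model, and field names are invented for illustration; VISOR's actual API may differ.

```typescript
// Hypothetical client flow for an image-to-mesh reconstruction service.
// Endpoints and field names are invented; this is not VISOR's published API.

interface ReconstructionJob {
  id: string;
  status: "queued" | "running" | "done" | "failed";
  meshUrl?: string; // e.g. a glTF/GLB triangular mesh when status === "done"
}

async function reconstructObject(images: Blob[], baseUrl: string): Promise<string> {
  // 1. Submit the captured views of the object.
  const form = new FormData();
  images.forEach((img, i) => form.append("images", img, `view-${i}.jpg`));
  const submit = await fetch(`${baseUrl}/jobs`, { method: "POST", body: form });
  let job: ReconstructionJob = await submit.json();

  // 2. Poll until the service has produced a mesh.
  while (job.status === "queued" || job.status === "running") {
    await new Promise((resolve) => setTimeout(resolve, 2000));
    const poll = await fetch(`${baseUrl}/jobs/${job.id}`);
    job = await poll.json();
  }

  if (job.status !== "done" || !job.meshUrl) {
    throw new Error(`Reconstruction failed for job ${job.id}`);
  }
  // 3. The mesh URL can now be shared with any XR client or game engine.
  return job.meshUrl;
}
```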

SENSO3D: Revolutionizing Virtual Spaces: SENSO3D's Comprehensive 3D Object Library

  • Lead organisation: Senso3D (Portugal)
  • Topic: 3D objects library

SENSO3D is dedicated to building a comprehensive 3D object library meticulously curated for home appliances and household components. What sets SENSO3D apart is its innovative AI-powered model selector tool, designed to identify elements in 2D photos and seamlessly replace them with appropriate 3D models from an extensive database. This technology promises to transform 2D images into enriched 3D scenes, unlocking a world of possibilities across various domains.

The project envisions creating detailed and accurate 3D models for XR applications, focusing on elder care, language learning, and interactive education. By converting 2D images into immersive 3D environments, SENSO3D enhances visualisation and interaction, offering substantial benefits to users, including those with special needs.
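A minimal sketch of the detect-and-replace pipeline described above could look like the following TypeScript; the stub detector and the tiny catalogue are placeholders standing in for SENSO3D's AI model and extensive database.

```typescript
// Hypothetical sketch of a "detect in 2D, replace with 3D" pipeline.
// The stub detector and tiny catalogue stand in for SENSO3D's AI model
// and extensive 3D database.

interface Detection {
  label: string;                                       // e.g. "sofa"
  box: { x: number; y: number; w: number; h: number }; // pixel bounding box
}

interface PlacedModel {
  modelUrl: string;      // 3D asset chosen from the library
  box: Detection["box"]; // where it replaces the detected 2D element
}

// Stub detector for illustration: a real system would run an AI model here.
async function detectObjects(photo: Blob): Promise<Detection[]> {
  void photo; // the stub ignores the actual pixels
  return [{ label: "sofa", box: { x: 40, y: 120, w: 300, h: 160 } }];
}

// Placeholder catalogue mapping detected labels to 3D models.
const catalogue = new Map<string, string>([
  ["refrigerator", "https://example.org/models/fridge.glb"],
  ["sofa", "https://example.org/models/sofa.glb"],
]);

async function photoToScene(photo: Blob): Promise<PlacedModel[]> {
  const placed: PlacedModel[] = [];
  for (const d of await detectObjects(photo)) {
    const modelUrl = catalogue.get(d.label);
    if (modelUrl) placed.push({ modelUrl, box: d.box });
    // Unmatched labels could fall back to manual selection in a real tool.
  }
  return placed;
}
```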

MHI: Multiplayer Haptic Interactions

  • Lead organisation: SenseGlove (The Netherlands)
  • Topic: Collaboration hand object manipulation

SenseGlove is developing a multiplayer toolkit designed to empower XR developers to seamlessly create interactive virtual environments featuring haptic gloves and hand tracking. Their strategy involves harnessing the capabilities of the CORTEX2 framework, which specialises in the management of multiplayer objects, avatars, and scenes.

Complementing this toolkit, SenseGlove will provide end-users with a template scene comprising ten pre-fabricated interactable assets. This serves as an entry point, enabling non-developers to effortlessly construct multiplayer interactive scenarios, ensuring smooth interoperability between hand tracking and haptic (and force-feedback) gloves.
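To make the interoperability point concrete, here is a small TypeScript sketch in which one shared "grab" event both updates the multiplayer object state and triggers local force feedback. Every name here (the event shape, the broadcast callback, pulse) is a placeholder, not the SenseGlove or CORTEX2 API.

```typescript
// Hypothetical sketch: one interaction event updates the shared multiplayer
// state and triggers local force feedback. Names are placeholders, not the
// SenseGlove or CORTEX2 APIs.

interface GrabEvent {
  objectId: string; // which shared interactable was grabbed
  userId: string;
  source: "handTracking" | "hapticGlove"; // both inputs map to the same event
}

interface HapticDevice {
  pulse(intensity: number, durationMs: number): void;
}

class SharedScene {
  private owners = new Map<string, string>(); // objectId -> owning userId

  constructor(
    private send: (e: GrabEvent) => void, // network broadcast to peers
    private haptics?: HapticDevice,       // absent when using bare hand tracking
  ) {}

  grab(objectId: string, userId: string, source: GrabEvent["source"]): void {
    if (this.owners.has(objectId)) return; // someone else holds it already
    this.owners.set(objectId, userId);
    this.send({ objectId, userId, source }); // replicate to other participants
    this.haptics?.pulse(0.6, 120);           // feel the grab, if gloved
  }

  release(objectId: string, userId: string): void {
    if (this.owners.get(objectId) === userId) this.owners.delete(objectId);
  }
}
```

Treating bare hand tracking and gloved input as the same event type is one plausible way to keep the two input paths interoperable.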

FLYTEX: Enhancing Videoconferences with Real-Time IoT Data in the Agrifood Sector

FLYTEX aims to revolutionise decision-making in the agricultural sector by providing real-time IoT sensor data during videoconferences. Focusing on an industry that is rapidly adopting digital technologies, the project will gather and share expert insights efficiently. This approach not only aligns with the trend of digitalisation in agriculture but also introduces a new dimension of strategic decision-making by enabling data-driven insights. By integrating IoT data into communication platforms, FLYTEX enhances the quality, speed, and efficacy of decision-making processes, making it a vital tool in the modern agricultural landscape.
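As an illustration of that integration, the sketch below subscribes to a hypothetical sensor stream over WebSocket and surfaces readings in a call overlay. The URL, message shape, and overlay callback are assumptions, not FLYTEX's actual design.

```typescript
// Hypothetical sketch: streaming IoT sensor readings into a videoconference
// overlay. The URL, message shape, and overlay callback are invented for
// illustration; they are not FLYTEX's actual design.

interface SensorReading {
  sensorId: string;  // e.g. "field-7-soil-moisture"
  value: number;
  unit: string;      // e.g. "%", "degC"
  timestamp: string; // ISO 8601
}

function streamSensorsIntoCall(
  wsUrl: string,                                  // e.g. "wss://example.org/iot"
  showInOverlay: (reading: SensorReading) => void // supplied by the call UI
): () => void {
  const socket = new WebSocket(wsUrl);
  socket.onmessage = (event) => {
    const reading: SensorReading = JSON.parse(event.data);
    showInOverlay(reading); // every participant sees the same live value
  };
  return () => socket.close(); // call this when the meeting ends
}

// Usage: streamSensorsIntoCall("wss://example.org/iot", (r) =>
//   console.log(`${r.sensorId}: ${r.value}${r.unit} at ${r.timestamp}`));
```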

RAX: Realistic Avatars for XR

  • Lead organisation: IGOODI (Italy)
  • Topic: User representation and user avatar customisation

The RAX project takes inspiration from its namesake (a mountain range in the Alps) to set a high but realistic ambition: developing a scalable, automatic, integrated tool for realistic, customisable, interoperable, and multimodal avatars. It will be integrated with the CORTEX2 technological ecosystem to extend its capabilities by covering user representation and user avatar customisation.

Track 2: Use-cases

These teams will address and/or propose use cases for deploying the CORTEX2 framework and the features developed in the co-development track (Track 1).

vScientist: Immersive Exploration of Fluid Dynamics: Developing an XR/VR Platform for CFD Virtual Testing in Education and Social Inclusion

VR technology can drive a paradigm shift in education, offering high-quality, inclusive, and sustainable tools built on real-life examples. The vScientist project will develop a comprehensive XR & VR platform for enhancing the learning and accessibility of Computational Fluid Dynamics (CFD) for students and individuals from diverse backgrounds, promoting social inclusion and accessibility in STEM fields. Through this novel platform, users will visualise, interact with, and analyse 3D virtual experiments in an immersive environment, or run their own in seconds using machine learning.

SELFEX2: Real-time Remote Dexterity Training for “Hands-on” Industrial Applications

SELFEX2 aims to improve manufacturing training processes using wearable finger-tracking gloves and XR. It will allow for a synchronous self-training approach that provides a quantifiable degree of readiness to execute a dexterity-based task in the workplace.

By integrating the SELFEX2 concept into the CORTEX2 framework, real-time training will become possible between a teacher (a senior operator) at one location and several junior operators at remote locations, who can learn to execute dexterity-based tasks through a combination of video, voice, and an XR representation of the teacher's hands.

FocusVR-ADHD: FocusVR: ADHD VR Solutions

FocusVR: ADHD VR Solutions integrates cutting-edge VR technology to revolutionise ADHD management. It aims to design immersive VR scenarios specifically tailored to improve cognitive skills in ADHD patients, including attention, memory, and emotional regulation. This innovative approach addresses the need for engaging, customised cognitive training, aligning with CORTEX2’s healthcare vision. The project promises significant advancements in ADHD therapy, offering a novel, effective tool for patients and clinicians alike, and setting new standards in VR-based healthcare solutions.

CORE-MHC: CO-facilitated and REmote Mixed-reality Mental Health Care interventions

CORE-MHC aims to define the next generation of XR experiences for mental healthcare. It will develop a gamified platform to support remote and co-facilitated mental health treatment using mixed reality. By making therapy engaging, interactive, and accessible, it will address the treatment gap in mental healthcare, enhance healthcare accessibility, and positively impact the European healthcare landscape.

C.A.R.E. XR: Critical Awareness and Response Enhancement with eXtended Reality

The C.A.R.E. XR project aims to revolutionise emergency management by integrating XR with the Next-Generation Incident Command System (NICS). This initiative will enhance situational awareness and decision-making through real-time, 3D XR visualisations and IoT data integration. Leveraging machine learning and natural language processing, the project will improve communication efficiency between first responders and command centres.

Aligning with the CORTEX2 framework, it is set to establish a new standard in emergency response, reduce operational times, and increase safety for both responders and civilians. This initiative represents a major step towards the future of technologically advanced emergency services.

XRisis: Emergency Crisis Simulation & Preparedness Metaverse Toolkit

XRisis will create inclusive, engaging and easily repeatable simulated virtual crisis environments. It will implement and pilot an MVP for collaborative emergency and crisis management training and upskilling.

Three exemplary crisis management pilots will be built on top of CORTEX2 and its services and tested remotely by a minimum of 30 crisis response personnel across a minimum of four countries. This will provide evidence-based validation that a real-time XR communications environment can improve collaborative learning experiences, increase adoption, drive down costs, increase training delivery efficiency, and decrease logistical complexity.

HYMNE: Hybrid Music, New Experiences

  • Organisations:
    Technology developer: 4DR Studios (Netherlands)
    Technology adopter: Effenaar (Netherlands)
  • Domain: Entertainment and culture

The HYMNE use case in CORTEX2 aims to open a new angle on hybrid music events. It will focus on gathering the largest audience over time (rather than simultaneously) by creating unique, interactive, immersive concerts in which the audience plays an important part.

Three-time Grammy winner and guitarist Steve Vai believes hybrid concerts will further revolutionise the music industry. By 2026, the project aims to organise full hybrid events: live shows augmented with an XR experience. After the event, the audience can book an interactive VR concert. Artists will perform both in the live show and in the VR concert, volumetrically recorded or live-streamed. Steve Vai will provide feedback on the virtual stage performance during the project.

SCIPLANT: Sustainable City Planning Tool

SCIPLANT is an innovative XR-based application designed to revolutionise urban planning. It integrates immersive technologies with real-time data to create dynamic urban models, enhancing efficiency and accuracy in city planning.

Aimed at fostering sustainable development, this tool facilitates collaborative decision-making among urban planners, architects, government officials, and citizens. Its gender-neutral design and customisable features ensure accessibility and inclusivity, accommodating diverse user needs. SCIPLANT is a testament to innovation in digital urban planning, paving the way for future advancements in sustainable city design.

XRehab: Extended Reality for Neurological Rehabilitation

  • Organisations:
    Technology developer: NEMO Lab (Italy)
    Technology adopter: Deep Reality (Italy)
  • Domain: Accessibility and Social Inclusion

The project will design, create, and test a cutting-edge virtual reality simulation environment tailored to support rehabilitation in hospital settings. Its goal is to provide a versatile tool available across all existing VR platforms, offering immersive or semi-immersive experiences with various modes of interaction. It will enhance patient involvement, enable remote accessibility, and contribute to a more comprehensive understanding of disease progression.

AgriVision: Extended Reality for Efficient and Sustainable Farming

AgriVision integrates XR with farm management information systems (FMIS), revolutionising how farmers can interpret complex data through intuitive, immersive visualisations. This innovation not only facilitates quicker and more efficient decision-making but also introduces a paradigm shift in farmer-consultant interactions, enabling real-time immersive XR visits and remote collaboration. This significantly diminishes the need for frequent in-person consultations, enhancing overall farming operational efficiency.

AgriVision will be offered in two versions: a “lite version” that runs on farmers' mobile devices and a “pro version” that fully exploits XR capabilities on dedicated XR devices, such as HoloLens. The project will involve agricultural experts/consultants and farmers for pilot testing and feedback collection.


The CORTEX2 team is proud to welcome these fantastic teams into our journey. Stay tuned for more updates on their progress!



[RECORDING] CORTEX2 Open Call 2 Webinar 2: Application topics

On 31 July 2024, we held our second info webinar about our Open Call 2. In it, our technical colleagues presented the topics to apply to, covering challenges, requirements, expected outcomes, and what (technical) support and resources we will provide to the winners during our 9-month support programme.

Now that we have developed the backbone and specific features of our innovative extended reality (XR) teleconference platform, we are looking for partners — companies (tech startups/SMEs) and research institutions (universities, NGOs, foundations, associations) — to collaborate with us on further developing it, providing new modules and features to expand its functionalities.

Selected applicants will receive up to €100,000 and access to our 9-month support programme, which includes tailored guidance and support, as well as access to technology and business experts, capacity building, and resources to facilitate the integration and understanding of our platform.

The open call topics

As an applicant, you should choose one of these topics to apply to. If you don't find a suitable one, you can also apply under the open topic with your own project idea aligned with the CORTEX2 framework and objectives.

  1. Embodied Avatar
  2. Smart generator
  3. Virtual Experiences Editor
  4. MPRR (Multi-Person Reaction Recognition)
  5. Gaussian-splatting-based reality capture for VR
  6. 3D model database
  7. Real-time voice translation
  8. Anonymizing meeting’s content for privacy-free data storage
  9. OPEN TOPIC: Submit your own project idea

The call is open until 15 August 2024 at 17:00 CET.

Check the Open Call 2 website, carefully review the call documents and the recording below, and prepare a successful application.

Apply now!

https://www.youtube.com/watch?v=JlkiyNHxRMo

