Announcing the winners of the CORTEX2 Open Call #2
We are excited to introduce the CORTEX2 Open Call #2 winners: an impressive group of innovators and visionaries committed to shaping the future of extended reality (XR) technologies and developing inclusive, efficient, and immersive experiences. After a competitive selection process, 10 groundbreaking projects were chosen, representing 12 beneficiaries from 8 countries, including 7 SMEs, 2 startups, and 3 research organisations (ROs).
The selected projects will contribute to the co-development of the CORTEX2 platform, tackling key challenges in XR, including sign language translation, immersive 3D collaboration, and anonymisation of data in XR environments.
They will demonstrate XR’s transformative potential in improving remote collaboration, communication, and training across industries.
Meet these projects and explore how they are advancing the XR landscape:
INTERACT: Inclusive Networking for Translation and Embodied Real-Time Augmented Communication Tool with Sign Language Integration
- Lead organisation: DASKALOS-APPS (France)
- Topic: Embodied Avatar
Objective: Develop AI-powered avatars for real-time sign language translation in augmented reality (AR) to support multilingual and hearing-impaired participants in business meetings.
INTERACT will leverage the CORTEX2 architecture to deliver scalable, sentiment-aware interactions in teleconferencing settings. It will improve accessibility for deaf and hard-of-hearing individuals, promoting inclusion in business meetings.
VISIXR: Vision AI for XR
- Lead organisation: ZAUBAR (Germany)
- Topic: Smart Generator
Objective: Develop a cutting-edge Smart Generator tool enabling real-time, AI-driven modification and understanding of 2D/3D assets within real-time 3D environments, integrating these into the CORTEX2 ecosystem.
VIRTEX: Virtual Experience Creation Platform for Seamless Integration of CORTEX2 Services in Industrial and Commercial XR Applications
- Lead organisations: MetaMedicsVR (Spain), Ludwig-Maximilians-Universität (Germany)
- Topic: Virtual Experiences Editor
Objective: Democratising XR technologies and enhancing remote collaboration across diverse industrial and commercial sectors.
To achieve this mission, VIRTEX aims to develop an innovative editor that integrates CORTEX2 services into VR/Web 3D applications. The project responds to the need for more effective remote collaboration and training tools, a need underscored by recent global disruptions such as pandemics and armed conflicts.
NITRUS: Nonverbal Interactive Tool for gesture Recognition in mUltiperson Scenarios
- Lead organisation: Logicmelt Technologies SL (Spain)
- Topic: MPRR (Multi-Person Reaction Recognition)
Objective: Use state-of-the-art AI models to create a gesture recognition solution that enriches how people interact in digital experiences.
Its key goals are to gather and create unbiased gesture datasets, benchmark and train AI algorithms for gesture recognition, optimise and develop pipelines for the real-time execution of the algorithms, and integrate them into the CORTEX2 framework.
NODAV: Next-Generation 3DGS Optimisation And Visualisation
- Lead organisation: i2CAT Foundation (Spain)
- Topic: Gaussian-splatting-based reality capture for VR
Objective: Advance Gaussian Splatting (GS) technology to enable high-quality reconstruction of static scenarios.
The project aims to tackle challenges in data efficiency by implementing pruning, data reduction, and compression solutions for improved transmission and storage of GS reconstructions. Additionally, NODAV seeks to unlock the potential of GS for real-time rendering in Unity across VR, mobile, and web applications. By addressing these critical aspects, NODAV enhances GS technology’s accessibility and scalability, paving the way for seamless integration into modern interactive and immersive platforms.
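One of the data-efficiency ideas mentioned above, pruning low-contribution splats and reducing the precision of the survivors, can be sketched in a few lines. Everything below (the record layout, field names, and thresholds) is an illustrative assumption for the sketch, not NODAV's actual implementation.

```python
# Hypothetical sketch: reduce a Gaussian-splat scene by (1) dropping splats
# whose opacity is too low to contribute visibly and (2) snapping positions
# to a coarse grid so they compress better. Thresholds are placeholders.

def prune_splats(splats, opacity_threshold=0.05):
    """Keep only splats whose opacity meets the visibility threshold."""
    return [s for s in splats if s["opacity"] >= opacity_threshold]

def quantize_positions(splats, step=0.01):
    """Snap each position coordinate to a grid of size `step`."""
    for s in splats:
        s["position"] = tuple(round(c / step) * step for c in s["position"])
    return splats

if __name__ == "__main__":
    scene = [
        {"position": (0.1234, 0.5678, 0.9), "opacity": 0.8},
        {"position": (1.0, 2.0, 3.0), "opacity": 0.01},  # nearly invisible
    ]
    reduced = quantize_positions(prune_splats(scene))
    print(len(reduced))  # the near-transparent splat is pruned
```

Real pipelines operate on millions of splats with per-splat covariances and spherical-harmonics colours, but the principle (discard what the renderer cannot see, then quantise what remains) is the same.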
SAME-XR: Scalable Asset Management and Conversion Engine for XR Development
- Lead organisation: Nara Eğitim Teknolojileri AŞ (Turkey)
- Topic: 3D Model Database
Objective: Develop a comprehensive asset management tool to streamline 3D asset development workflow and integrate it into the CORTEX2 platform.
SAME-XR will simplify asset management for XR developers, saving time and improving workflows.
Some of its key innovations will be:
- Cloud-native 3D asset database with a comprehensive toolset
- Centralised library for 3D assets
- Unity GUI editor tool and add-ons for 3D software
- Asset discovery through integration with external databases
- Secure asset management with version control
- Support for multiple file formats and optimisation tools
- Web-based interface with 3D and AR preview capabilities
- Multi-user XR collaboration/prototyping
PETER: Preserve Emotions in Translations for Extended Reality
- Lead organisations: Università degli Studi di Cagliari, R2M Solution S.r.l. (Italy)
- Topic: Real-time voice translation
Objective: Develop a quasi-real-time voice-to-voice translation system for XR applications that preserves the speaker's emotional tone.
The project will deliver fast, emotionally aware translations to enhance engagement and communication effectiveness and reduce misunderstandings while using XR technologies.
XR-CARE: Extended Reality Collaborative Anonymisation for Remote Healthcare
- Lead organisation: Logimade Lda (Portugal)
- Topic: Anonymising meeting content for privacy-free data storage
Objective: Anonymise XR teleconference data while preserving usability.
The project will anonymise multiple data types, including high-definition video, audio, and physiological data, protecting personal information while keeping the rest of the content intact and usable.
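For structured meeting metadata, one common anonymisation strategy is to replace direct identifiers with keyed pseudonyms while leaving the rest of each record untouched. The sketch below illustrates that idea only; the field names and the keyed-hash approach are assumptions for illustration, not XR-CARE's actual pipeline, which must also handle video, audio, and physiological streams.

```python
# Illustrative sketch: pseudonymise identifier fields with an HMAC so the
# same person maps to the same stable token, while non-sensitive fields
# (e.g. physiological readings) stay usable. Field names are hypothetical.
import hmac
import hashlib

SENSITIVE_FIELDS = {"name", "email"}  # hypothetical identifier fields

def pseudonymise(record, key):
    """Return a copy of `record` with sensitive fields replaced by stable,
    key-dependent pseudonyms; all other fields pass through unchanged."""
    out = {}
    for field, value in record.items():
        if field in SENSITIVE_FIELDS:
            digest = hmac.new(key, value.encode(), hashlib.sha256).hexdigest()
            out[field] = digest[:12]  # short, stable pseudonym
        else:
            out[field] = value
    return out

if __name__ == "__main__":
    key = b"secret-session-key"
    rec = {"name": "Alice", "email": "alice@example.com", "heart_rate": 72}
    anon = pseudonymise(rec, key)
    print(anon["heart_rate"])  # non-sensitive content preserved: 72
```

Keeping the pseudonyms stable (same input, same token) is what preserves usability: analysts can still correlate records per participant without ever seeing the underlying identity.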
VEM: Virtual Encrypted Meetings
- Lead organisation: Simplito sp. z o. o. (Poland)
- Topic: Open – Own project idea
Objective: Enhance data privacy and security in VR communication platforms.
The project's key features are comprehensive end-to-end message encryption (server & user level) and full data protection, ensuring secure VR communication for users.
R3in3D: RGBD Real-Time Representations of Humans and Objects in 3D
- Lead organisation: TNO, Netherlands Organisation for Applied Scientific Research (Netherlands)
- Topic: Open – Own project idea
Objective: Provide immersive 3D representation of humans and objects for XR applications.
R3in3D aims to redefine collaborative business meetings and remote training experiences in XR, fostering natural and engaging communication between participants and improving decision-making, knowledge retention, and skill acquisition.
The project will develop tools for realistic 3D capture and rendering modules for Unity and WebXR. Thus, it will allow the building of holographic communication pipelines while making the most of next-generation networks (5G and 6G) and “network slices”. These customisable pipelines allow the 3D representation of a real-world workstation, enabling a novel, immersive way of training in XR environments and business meetings.
Next step: The CORTEX2 Support Programme
Over the next 9 months, these projects will receive funding, resources, and expert guidance to advance their groundbreaking solutions and bring tangible results to life—all while they help shape the CORTEX2 platform, ensuring it evolves into a robust, scalable solution.
We look forward to showcasing their progress and achievements as they work to shape the future of XR.
Stay tuned for updates on their progress!
Follow us on X and LinkedIn
Subscribe to our newsletter
CORTEX2 will participate at EuroXR Conference
We are thrilled to announce that CORTEX2 will participate in the 21st EuroXR International Conference (EuroXR 2024), a leading event in the fields of virtual reality (VR), augmented reality (AR), and mixed reality (MR). This year, the conference is co-organized by the Institute of Communication and Computer Systems (ICCS) of the National Technical University of Athens (NTUA) and will take place in Athens, Greece, from November 27 to 29, 2024. It will provide an excellent opportunity for us to showcase our advancements in extended reality (XR) and engage with the broader XR community.
Every year, EuroXR attracts a diverse audience, including researchers, developers, industry leaders, and policymakers, eager to learn about the latest developments in VR, AR, and MR, and passionate about the transformative potential of XR technologies.
This year’s conference will highlight several EU-funded projects, including CORTEX2, SUN-XR, DIDYMOS-XR, THEIA-XR, SHARESPACE and LUMINOUS, along with initiatives from the Alliance4XR project.
Join us in shaping the future of XR at EuroXR 2024
We will seize this opportunity to connect with the XR community, share our work, and further strengthen our collaborations. We look forward to seeing many of you in Athens!
Learn more about EuroXR 2024 and secure your spot.
Announcing the results of CORTEX2's second Open Call
We are excited to announce the results of our second Open Call, which launched on 13 June and closed on 15 August 2024, and proved a great success: we received 91 applications from 25 countries, covering all the call topics. Thanks to all applicants for your interest in joining our journey to democratise extended reality solutions for easy, efficient, and accessible remote collaboration, and good luck!
This opportunity invites extended reality (XR) innovators, from tech startups/SMEs to researchers, to co-develop our innovative XR teleconference platform by introducing new modules and features that enhance its functionality.
Selected beneficiaries will receive funding (up to 100,000 EUR per project) and access to our 9-month support programme, which includes tailored guidance and support, access to tech and business experts, capacity building on CORTEX2 and XR technologies & trends, and resources to facilitate the integration and understanding of the CORTEX2 platform.
The CORTEX2 Open Call #2 results
Our second call attracted a large number of applications from diverse origins: 91 proposals were submitted out of 149 started, 19 of them by consortia of 2 organisations.
Regarding interest distribution by topic, the open topic has received the most applications at 24, followed by Virtual Experiences Editor with 16 applications and Embodied Avatar with 11 applications.
Distribution of applications submitted to the Open Call #2 topics
- Open topic: 24 applications
- Virtual Experiences Editor: 16 applications
- Embodied Avatar: 11 applications
- MPRR (Multi-Person Reaction Recognition): 9 applications
- Real-time voice translation: 9 applications
- Gaussian-splatting-based reality capture for VR: 8 applications
- 3D model database: 7 applications
- Anonymising meeting content for privacy-free data storage: 5 applications
- Smart generator: 2 applications
The evaluation process is set to conclude by September 2024, and results are expected by the end of the month.
Meet our XR innovators from Open Call #1
We recently welcomed our Open Call #1 winners to CORTEX2: twenty teams of exceptional professionals with innovative and diverse solutions that align with our mission to accelerate and democratise XR technology across Europe.
These teams will co-develop our cutting-edge XR platform to create value-added services and new use cases that demonstrate its adaptability across domains. They will receive funding and mentorship to bring their visionary concepts to life.
We are incredibly excited about their potential and the impact they will make in the XR field, and we look forward to seeing how they contribute to the CORTEX2 ecosystem!
Stay tuned for more updates on the progress of these groundbreaking projects and the results of our Open Call #2.
Follow us on X and LinkedIn
Subscribe to our newsletter
Meet the CORTEX2 Open Call #1 winners
We are excited to announce the CORTEX2 Open Call #1 winners: twenty teams led by innovators from all over Europe who will join us in shaping the future of extended reality (XR).
Meet our Open Call #1 XR innovators
Track 1: Co-development
These teams will co-develop our CORTEX2 platform to build value-added services, leveraging their expertise in specific market segments.
CDLPG: Co-development of a Dynamic Library of Personalised Gestures
- Lead organisation: Sensorama Lab (Ukraine)
- Topic: Dynamic library of personalised gestures
Developing a dynamic library of personalised gestures is essential in enhancing the user experience in augmented reality (AR) and virtual reality (VR) applications. CDLPG will create a module capable of accurately capturing and interpreting a wide range of hand gestures that can be used across various AR/VR applications within the CORTEX2 platform. This technology will make interactions more intuitive and facilitate greater adoption of AR/VR technology in different sectors.
TIP: The Infinity Palette
- Lead organisation: 3D Interactive (Sweden)
- Topic: 3D objects library
TIP aims to enrich the CORTEX2 platform with an innovative 2D/3D asset library optimised for Unity and Mozilla Hubs. Focusing on education, entertainment and culture sectors, they plan to create immersive and adaptable learning environments, including a traditional classroom, a group study room, and a library for individual learning, alongside interactive spaces for virtual concerts and cultural exhibitions. These environments, comprising a blend of static and dynamic assets, will be customisable to user needs. With these versatile, high-quality 3D assets, they aim to foster engaging educational experiences and enhance the cultural and entertainment appeal of the CORTEX2 platform, potentially broadening its user base and community engagement.
MGL: Magos Gestures Library
- Lead organisation: QUANTA & QUALIA (Greece)
- Topic: Dynamic library of personalised gestures
This project aims to develop and integrate the Magos Gestures Library (MGL) module into the CORTEX2 framework, enriching the interaction landscape of extended reality (XR) applications. Its significance lies in its transformative potential for almost all kinds of XR applications, such as simulations, collaborative training, and realistic interactions in sectors such as healthcare, aerospace, and Industry 5.0.
MGL, an extension of the innovative Magos solution, focuses on advancing hand-tracking and gesture recognition technologies. Thanks to its highly accurate tracking system, it specifically emphasises personalised and dynamic gestures. The Magos Gestures Library will enable users to define and recognise specific gestures — a minimum of 15 — tailored to distinct actions within XR environments.
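A personalised gesture library of the kind described above can be thought of as a registry of named pose templates matched against live input. The sketch below shows that pattern with a nearest-neighbour match over toy feature vectors; the feature encoding, threshold, and class names are simplifying assumptions, not the Magos tracking pipeline.

```python
# Hypothetical sketch: users register named gestures as hand-pose feature
# vectors; recognition returns the closest registered gesture, or None if
# nothing is close enough. The 3-element vectors are placeholder features.
import math

class GestureLibrary:
    def __init__(self, threshold=0.5):
        self.gestures = {}          # name -> reference feature vector
        self.threshold = threshold  # max distance that counts as a match

    def define(self, name, features):
        """Register a user-defined gesture under `name`."""
        self.gestures[name] = list(features)

    def recognise(self, features):
        """Return the closest registered gesture, or None if too far."""
        best_name, best_dist = None, float("inf")
        for name, ref in self.gestures.items():
            d = math.dist(ref, features)  # Euclidean distance
            if d < best_dist:
                best_name, best_dist = name, d
        return best_name if best_dist <= self.threshold else None

if __name__ == "__main__":
    lib = GestureLibrary()
    lib.define("thumbs_up", [1.0, 0.0, 0.0])
    lib.define("open_palm", [0.0, 1.0, 1.0])
    print(lib.recognise([0.9, 0.1, 0.0]))  # close to thumbs_up
```

Production systems replace the toy vectors with high-dimensional joint-angle features and the nearest-neighbour match with a trained classifier, but the define/recognise interface stays the same.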
ARY: ARY the AR Media
- Lead organisation: ARY (France)
- Topic: Open
ARY is an augmented reality (AR) media that offers the capability to anchor 3D objects, videos, pictures or PDF files into indoor environments and make them available to anyone using a smartphone or other device.
EXTERNALISE: Enabling Support for Externalising Models in XR Collaboration
- Lead organisation: MOVERSE (Greece)
- Topic: Open
EXTERNALISE will develop a disruptive multi-user collaboration and communication module that will change how teams and groups collaborate remotely. By focusing on digitising and streaming the human characteristics that encode the nonverbal cues that compose human body language, EXTERNALISE will enrich users’ representation and boost their expressivity during communication.
VISOR: VIrtualization Service for Object Reconstruction
- Lead organisation: Phasmatic (Greece)
- Topic: Real-time virtualiser
Efficient 3D reconstruction is a key operation that can transform the content generation process of many applications, such as mixed reality (MR) applications, movies, game development, telepresence, 3D printing and 3D eCommerce. VISOR proposes a web service that will take images or a video stream of a small object and generate a digital twin as a triangular mesh that can be used by all current XR applications and game engines and be visualised on any device, enabling easy sharing of the 3D model across multiple stakeholders and environments.
SENSO3D: Revolutionising Virtual Spaces with a Comprehensive 3D Object Library
- Lead organisation: Senso3D (Portugal)
- Topic: 3D objects library
SENSO3D is dedicated to building a comprehensive 3D object library meticulously curated for home appliances and household components. What sets SENSO3D apart is its innovative AI-powered model selector tool, designed to identify elements in 2D photos and seamlessly replace them with appropriate 3D models from an extensive database. This breakthrough technology promises to transform 2D images into enriched 3D scenes, unlocking possibilities across various domains.
The project envisions creating detailed and accurate 3D models for XR applications, focusing on elder care, language learning, and interactive education. By converting 2D images into immersive 3D environments, SENSO3D enhances visualisation and interaction, offering substantial benefits to users, including those with special needs.
MHI: Multiplayer Haptic Interactions
- Lead organisation: SenseGlove (The Netherlands)
- Topic: Collaboration hand object manipulation
SenseGlove is developing a multiplayer toolkit designed to empower XR developers to seamlessly create interactive virtual environments featuring haptic gloves and hand tracking. Their strategy involves harnessing the capabilities of the CORTEX2 framework, specialising in the management of multiplayer objects, avatars, and scenes.
Complementing this toolkit, SenseGlove will provide end-users with a template scene comprising ten pre-fabricated interactable assets. This serves as an entry point, enabling non-developers to effortlessly construct multiplayer interactive scenarios, ensuring smooth interoperability between hand tracking and haptic (and force-feedback) gloves.
FLYTEX: Enhancing Videoconferences with Real-Time IoT Data in the Agrifood Sector
- Lead organisation: FlyThings Technologies (Spain)
- Topic: IoT Adaptability of the CORTEX2 Framework
FLYTEX aims to revolutionise decision-making in the agricultural sector by providing real-time IoT sensor data during videoconferences. Focusing on an industry that is rapidly adopting digital technologies, the project aims to gather and share expert insights efficiently. This innovative approach not only aligns with the trend of digitalisation in agriculture but also introduces a new dimension of strategic decision-making by enabling data-driven insights. By integrating IoT data into communication platforms, FLYTEX enhances decision-making processes' quality, speed, and efficacy, making it a vital tool in the modern agricultural landscape.
RAX: Realistic Avatars for XR
- Lead organisation: IGOODI (Italy)
- Topic: User representation and user avatar customisation
The RAX project takes inspiration from its namesake (a mountain range in the Alps) to set a high but realistic ambition: developing a scalable, automatic, integrated tool for realistic, customisable, interoperable, and multimodal avatars. It will be integrated with the CORTEX2 technological ecosystem to extend its capabilities by covering user representation and user avatar customisation.
Track 2: Use-cases
These teams will address and/or propose use cases for deploying the CORTEX2 framework and the features developed in co-development Track 1.
vScientist: Immersive Exploration of Fluid Dynamics: Developing an XR/VR Platform for CFD Virtual Testing in Education and Social Inclusion
- Organisations:
Technology developer: National Technical University of Athens (Greece)
Technology adopter: MultiFluidX - Lyras EE (Greece) - Domain: Education
VR technology can provide a paradigm shift in education with high-quality, inclusive, and sustainable tools using real-life examples. The vScientist project will develop a comprehensive XR & VR platform for enhancing the learning and accessibility of Computational Fluid Dynamics (CFD) for students and individuals from diverse backgrounds, promoting social inclusion and accessibility in STEM fields. Through this novel platform, users will visualise, interact with, and analyse 3D virtual experiments in an immersive environment, or run their own in seconds using machine learning.
SELFEX2: Real-time Remote Dexterity Training for “Hands-on” Industrial Applications
- Organisations:
Technology developer: Automotive Technology Center of Galicia — CTAG (Spain)
Technology adopter: SmartFlexCell Solutions (Spain) - Domain: Industry
SELFEX2 aims to improve manufacturing training processes using wearable finger-tracking gloves and XR. It will allow for a synchronous self-training approach that provides a quantifiable degree of readiness to execute a dexterity-based task in the workplace.
By integrating the SELFEX2 concept into the CORTEX2 framework, real-time training will be possible between a teacher (senior operator) at one location and several junior operators in remote locations, who can learn to execute dexterity-based tasks through a combination of video, voice, and the XR representation of the teacher's hands.
FocusVR-ADHD: FocusVR: ADHD VR Solutions
- Organisations:
Technology developer: RTE Lab sp. z o.o. (Poland)
Technology adopter: Medical University of Lodz (Poland) - Domain: Healthcare
FocusVR: ADHD VR Solutions integrates cutting-edge VR technology to revolutionise ADHD management. It aims to design immersive VR scenarios specifically tailored to improve cognitive skills in ADHD patients, including attention, memory, and emotional regulation. This innovative approach addresses the need for engaging, customised cognitive training, aligning with CORTEX2’s healthcare vision. The project promises significant advancements in ADHD therapy, offering a novel, effective tool for patients and clinicians alike, and setting new standards in VR-based healthcare solutions.
CORE-MHC: CO-facilitated and REmote Mixed-reality Mental Health Care interventions
- Organisations:
Technology developer: Institute of Computer Technology — ITI (Spain)
Technology adopter: Fundación SASM (Spain) - Domain: Healthcare
CORE-MHC aims to define the next generation of XR experiences for mental healthcare. It will develop a gamified platform to support remote and co-facilitated mental health treatment using mixed reality. By making therapy engaging, interactive, and accessible, it will address the treatment gap in mental healthcare, enhance healthcare accessibility, and positively impact the European healthcare landscape.
C.A.R.E. XR: Critical Awareness and Response Enhancement with eXtended Reality
- Organisations:
Technology developer: South East European University (North Macedonia)
Technology adopter: Crisis Management Center (North Macedonia) - Domain: Emergency and crisis
The C.A.R.E. XR project aims to revolutionise emergency management by integrating XR with the Next-Generation Incident Command System (NICS). This initiative will enhance situational awareness and decision-making through real-time, 3D XR visualisations and IoT data integration. Leveraging machine learning and natural language processing, the project will improve communication efficiency between first responders and command centres.
Aligning with the CORTEX² framework, it is set to establish a new standard in emergency response, reduce operational times, and increase safety for both responders and civilians. This initiative represents a major step towards the future of technologically advanced emergency services.
XRisis: Emergency Crisis Simulation & Preparedness Metaverse Toolkit
- Organisations:
Technology developer: Nuwa Ltd - XR Ireland (Ireland)
Technology adopter: Action Contre La Faim (France) - Domain: Emergency and crisis
XRisis will create inclusive, engaging and easily repeatable simulated virtual crisis environments. It will implement and pilot an MVP for collaborative emergency and crisis management training and upskilling.
Three exemplary crisis management pilots will be built on top of CORTEX2 and its services and tested remotely by a minimum of 30 crisis response personnel across a minimum of four countries. This will provide evidence-based validation that a real-time XR communications environment can improve collaborative learning experiences, increase adoption, reduce costs, increase training delivery efficiencies, and decrease logistical complexities.
HYMNE: Hybrid Music, New Experiences
- Organisations:
Technology developer: 4DR Studios (Netherlands)
Technology adopter: Effenaar (Netherlands) - Domain: Entertainment and culture
The HYMNE use case in CORTEX2 aims to open a new angle on hybrid music events. It will focus on gathering the largest audience over time (rather than simultaneously) by creating unique, interactive, immersive concerts in which the audience plays an important part.
Three-time Grammy winner and guitarist Steve Vai believes hybrid concerts will further revolutionise the music industry. By 2026, the project aims to organise full hybrid events: live shows augmented with an XR experience. After the event, the audience can book an interactive VR concert. Artists will perform both in the live show and the VR concert, volumetrically recorded or live-streamed. Steve Vai will provide feedback on the virtual stage performance during the project.
SCIPLANT: Sustainable City Planning Tool
- Organisation: Technology developer: Mercury Retrograde (Portugal)
- Domain: Smart cities
SCIPLANT is an innovative XR-based application designed to revolutionise urban planning. It integrates immersive technologies with real-time data to create dynamic urban models, enhancing efficiency and accuracy in city planning.
Aimed at fostering sustainable development, this tool facilitates collaborative decision-making among urban planners, architects, government officials, and citizens. Its gender-neutral design and customisable features ensure accessibility and inclusivity, accommodating diverse user needs. SCIPLANT is a testament to innovation in digital urban planning, paving the way for future advancements in sustainable city design.
XRehab: Extended Reality for Neurological Rehabilitation
- Organisations:
Technology developer: NEMO Lab (Italy)
Technology adopter: Deep Reality (Italy) - Domain: Accessibility and Social Inclusion
The project will design, create, and test a cutting-edge virtual reality simulation environment tailored to support rehabilitation in hospital settings. Its goal is to provide a versatile tool available across all existing VR platforms, offering immersive or semi-immersive experiences with various modes of interaction. It will enhance patient involvement, enable remote accessibility, and contribute to a more comprehensive understanding of disease progression.
AgriVision: Extended Reality for Efficient and Sustainable Farming
- Organisations:
Technology developer: bSpoke Solutions (Greece)
Technology adopter: University of Macedonia (Greece) - Domain: Open
AgriVision integrates XR with farm management information systems (FMIS), revolutionising how farmers can interpret complex data through intuitive, immersive visualisations. This innovation not only facilitates quicker and more efficient decision-making but also introduces a paradigm shift in farmer-consultant interactions, enabling real-time immersive XR visits and remote collaboration. This significantly diminishes the need for frequent in-person consultations, enhancing overall farming operational efficiency.
AgriVision will be offered in two versions: a “lite version” that runs on farmers' mobile devices and a “pro version” that fully exploits XR capabilities on dedicated XR devices, such as HoloLens. The project will involve agricultural experts/consultants and farmers for pilot testing and feedback collection.
The CORTEX2 team is proud to welcome these fantastic teams into our journey. Stay tuned for more updates on their progress!
Follow us on X and LinkedIn
Subscribe to our newsletter
[RECORDING] CORTEX2 Open Call 2 Webinar 2: Application topics
On 31 July 2024, we held our second info webinar about our Open Call 2. In it, our technical colleagues presented the application topics, covering challenges, requirements, expected outcomes, and the (technical) support and resources we will provide to the winners during our 9-month support programme.
Now that we have developed the backbone and specific features of our innovative extended reality (XR) teleconference platform, we are looking for partners — companies (tech startups/SMEs) and research institutions (universities, NGOs, foundations, associations) — to collaborate with us on further developing it, providing new modules and features to expand its functionalities.
Applicants will become eligible to receive up to €100,000 and access our 9-month support programme. This includes tailored guidance and support, as well as access to technology and business experts, capacity building, and resources to facilitate the integration and understanding of our platform.
The open call topics
As an applicant, you should choose one of these topics to apply to. If you don’t find a suitable one, you can also apply for an open topic aligned with the CORTEX2 framework and objectives.
- Embodied Avatar
- Smart generator
- Virtual Experiences Editor
- MPRR (Multi-Person Reaction Recognition)
- Gaussian-splatting-based reality capture for VR
- 3D model database
- Real-time voice translation
- Anonymising meeting content for privacy-free data storage
- OPEN TOPIC: Submit your own project idea
The open call runs until 15 August 2024 at 17:00 CET.
Check the Open Call 2 website, carefully review the call documents and recording below, and prepare to make a successful application.
Apply now!
https://www.youtube.com/watch?v=JlkiyNHxRMo
Follow us on X and LinkedIn
Subscribe to our newsletter
[RECORDING] CORTEX2 Open Call 2 Webinar 1: How to apply?
On 12 July 2024, we held our first info webinar about our Open Call 2. In it, our colleagues Ellie Shtereva and Alain Pagani shared the keys to this funding opportunity for technology startups/SMEs and research organisations.
Now that we have developed the backbone and specific features of our innovative XR teleconference platform, we are looking for partners — companies (tech startups/SMEs) and research institutions (universities, NGOs, foundations, associations) — to collaborate with us on further developing it and scaling their immersive communication technologies.
Applicants will become eligible to receive up to €100,000 and access our 9-month support programme. This includes tailored guidance and support, as well as access to technology and business experts, capacity building, and resources to facilitate the integration and understanding of our platform.
The open call runs until 15 August 2024 at 17:00 CET.
Check the Open Call 2 website, carefully review the call documents and recording below, and prepare to make a successful application.
Apply now!
https://www.youtube.com/watch?v=qwS6rDoYDxg
Follow us on X and LinkedIn
Subscribe to our newsletter
CORTEX2 Open Call 2 Webinar 2: Application topics
Join our second informative webinar on 31 July 2024 to learn about our Open Call 2 topics and how to apply successfully!
When? On Wednesday, 31 July 2024, at 10:00 am CET (Brussels time).
For whom? For companies (tech startups/SMEs) and research institutions (universities, NGOs, foundations, associations) eager to join us in revolutionising remote collaboration with extended reality (XR) by applying to our Open Call 2.
Register now!
About the webinar
In this webinar, we will share the keys to our Open Call 2 topics:
- Embodied Avatar
- Smart generator
- Virtual Experiences Editor
- MPRR (Multi-Person Reaction Recognition)
- Gaussian-splatting-based reality capture for VR
- 3D model database
- Real-time voice translation
- Anonymising meeting content for privacy-free data storage
- OPEN TOPIC: Submit your own project idea
Our technical team will present each topic, covering its challenges, requirements, and expected outcomes (including deliverables per sprint), as well as the technical support and resources we will provide to the winning teams.
About the open call
We are looking for XR innovators — from businesses to research institutions — to collaborate with us on developing our extended reality (XR) teleconference platform, providing new modules and features to expand its functionalities.
Applicants will become eligible to receive up to €100,000 and access our 9-month support programme. This includes tailored guidance and support, as well as access to technology and business experts, capacity building, and resources to facilitate the integration and understanding of our platform.
Please check the Open Call 2 website, carefully review the documents, and prepare your questions. Our colleagues will ensure that they are answered.
The open call runs from 13 June until 15 August 2024 at 17:00 CET.
Book your place
CORTEX2 Open Call 2 Webinar 1: How to apply?
Join our first informative webinar on 12 July 2024 to learn how to successfully apply to our Open Call 2!
When? On Friday, 12 July 2024, at 11:00 am CET (Brussels time).
For whom? For companies (tech startups/SMEs) and research institutions (universities, NGOs, foundations, associations) eager to join us in revolutionising remote collaboration with extended reality (XR) by applying to our Open Call 2.
Register now!
In this event, we will share the details of our Open Call 2, for which we are seeking XR innovators — from businesses to research institutions — to collaborate with us on developing our XR teleconference platform and scaling their immersive communication technologies.
Applicants will become eligible to receive up to €100,000 and access our 9-month support programme. This includes tailored guidance and support, as well as access to technology and business experts, capacity building, and resources to facilitate the integration and understanding of our platform.
Check the Open Call 2 website, carefully review the documents, and prepare your questions. Our colleagues Iwa Stefanik and Ellie Shtereva from F6S, along with project coordinator Alain Pagani from DFKI, will make sure they are answered.
The open call runs from 13 June until 15 August 2024 at 17:00 CET.
Discover this exciting opportunity and learn how to make a successful application!
Book your place
Agenda
11:00 am - Welcome (Alain Pagani)
11:05 am - Brief presentation of CORTEX2 (Alain Pagani)
11:15 am - Overview of CORTEX2 Open Call 2, including objectives, requirements, and relevant considerations (Iwa Stefanik and Ellie Shtereva)
11:30 am - Technical description of CORTEX2 services (Alain Pagani)
11:40 am - Q&A, final considerations and closing
Recap of CORTEX2's 3rd Progress Meeting in Castellón
Our CORTEX2 team gathered in Castellón, Spain, from March 21st to 22nd, 2024, for our third Progress Meeting, graciously hosted by Universitat Jaume I. The purpose of this meeting was to delve into the latest developments within our project, discuss the significant milestones we have achieved, conduct pilot demonstrations, and plan for the future.
Representatives from all ten organisations comprising CORTEX2 travelled from all over Europe to reconnect, update each other on their respective contributions to the project, and align on the crucial next steps necessary to continue advancing towards our goals.

During the first day of the meeting, we carried out a comprehensive review covering different areas of our work, from our extended reality modules and services to the ethical, legal, and social implications of the immersive technologies we are developing.
Furthermore, we had the opportunity to discuss our progress in innovation, dissemination, and exploitation strategies, as well as Financial Support for Third Parties (FSTP), emphasising the collaborative efforts that lie ahead in selecting the third parties from our Open Call #1, which we hope to announce soon.
We dedicated the second day to demonstrating and testing our three pilots. This allowed us to gather valuable insights and feedback on our progress to date and illuminate areas where further improvements are needed. We also took the opportunity to record videos of the process so we can showcase the advancements we have made.
We concluded this fruitful and productive meeting, which marked another milestone in our CORTEX2 journey, with a summary of key takeaways, conclusions, and clear next steps.
Special thanks to our amazing colleagues at Universitat Jaume I for an excellent welcome in Castellón!
To stay up to date with our innovations, aimed at shaping a future of enhanced remote collaboration, and our opportunities for XR innovators, we invite you to subscribe to our newsletter and connect with us on social media (LinkedIn and Twitter).
Celebrating the results of our first Open Call
At CORTEX2, we are excited to announce that our first Open Call, launched on 24 October 2023 and closed on 16 January 2024, has been a great success, exceeding our expectations. We received 146 applications from 41 countries! Thanks to everyone who has shown interest in being part of our journey, and good luck!
The CORTEX2 Open Call #1 results
Our first open call attracted a high number of applications — 146 submitted out of 264 started. In terms of interest, Track 1 (for co-developers) attracted 103 started applications, of which 49 were submitted, while Track 2 (for use cases) attracted 161 started applications, of which 97 were submitted.
- Total submitted applications: 146 from 41 countries (264 started)
- Track 1 (co-development): 49 applications submitted (103 started)
- Track 2 (use-case): 97 applications submitted (161 started)
The evaluation process for Track 1 is set to conclude in mid-March 2024, with results expected by the end of the month. The Track 2 evaluation, which started later, will conclude by the end of April 2024.
A big thank you to all applicants for your interest in joining us in democratising extended reality (XR) solutions for easy, efficient, and accessible remote collaboration!
Want to meet our XR innovators?
We will select 20 teams, composed of innovative research-oriented institutions and business-driven organisations, to co-develop our XR platform and engage new use cases that demonstrate its replicability across different domains. They will also receive funding (€100,000 for co-developers and €200,000 for use cases) and join a 12-month support programme offering continuous guidance and support, access to tech and business experts, capacity building on CORTEX2 and XR technologies and trends, and resources to facilitate the integration and understanding of our CORTEX2 platform.
Stay tuned to our website and social media channels (LinkedIn and Twitter) to meet them!