Announcing the results of CORTEX2's second Open Call
We are excited to announce that our second Open Call, launched on 13 June and closed on 15 August 2024, has been a great success. We received 91 applications from 25 countries, covering all the call topics. Thanks to all applicants for your interest in joining our journey to democratise extended reality solutions for easy, efficient, and accessible remote collaboration, and good luck!
This opportunity invites extended reality (XR) innovators, from tech startups and SMEs to researchers, to co-develop our innovative XR teleconference platform by introducing new modules and features that expand its functionalities and opportunities.
Selected beneficiaries will receive funding (up to 100,000 EUR per project) and access to our 9-month support programme, which includes tailored guidance and support, access to tech and business experts, capacity building on CORTEX2 and XR technologies & trends, and resources to facilitate the integration and understanding of the CORTEX2 platform.
The CORTEX2 Open Call #2 results
Our second call has attracted a large number of applications from diverse origins, with 91 submitted out of 149 started; 19 proposals involved a consortium of two organisations.
In terms of interest by topic, the open topic received the most applications (24), followed by Virtual Experiences Editor (16) and Embodied Avatar (11).
Distribution of applications submitted to the Open Call #2 topics
- Open topic: 24 applications
- Virtual Experiences Editor: 16 applications
- Embodied Avatar: 11 applications
- MPRR (Multi-Person Reaction Recognition): 9 applications
- Real-time voice translation: 9 applications
- Gaussian-splatting-based reality capture for VR: 8 applications
- 3D model database: 7 applications
- Anonymizing meeting's content for privacy-free data storage: 5 applications
- Smart generator: 2 applications
The evaluation process is set to conclude in September 2024, with results expected by the end of that month.
Meet our XR innovators from Open Call #1
We recently welcomed our Open Call #1 winners to CORTEX2: twenty teams of exceptional professionals with innovative and diverse solutions that align with our mission to accelerate and democratise XR technology across Europe.
These teams will co-develop our cutting-edge XR platform to create value-added services, and engage new use cases to demonstrate its adaptability in different domains. They will receive funding and mentorship to bring their visionary concepts to life.
We are incredibly excited about their potential and the impact they will make in the XR field, and we look forward to seeing how they contribute to the CORTEX2 ecosystem!
Stay tuned for more updates on the progress of these groundbreaking projects and the results of our Open Call #2.
Follow us on X and LinkedIn
Subscribe to our newsletter
Meet the CORTEX2 Open Call #1 winners
We are excited to announce the CORTEX2 Open Call #1 winners! Twenty teams led by innovators from all over Europe who will join us in shaping the future of extended reality (XR).
Meet our Open Call #1 XR innovators
Track 1: Co-development
These teams will co-develop our CORTEX2 platform to build value-added services, leveraging their expertise on specific market segments.
CDLPG: Co-development of a Dynamic library of personalized gestures
- Lead organisation: Sensorama Lab (Ukraine)
- Topic: Dynamic library of personalised gestures
The development of a dynamic library of personalised gestures is essential in enhancing the user experience in augmented reality (AR) and virtual reality (VR) applications. CDLPG will create a module capable of accurately capturing and interpreting a wide range of hand gestures that can be used across various AR/VR applications within the CORTEX2 platform. This technology will make interactions more intuitive and will facilitate greater adoption of AR/VR technology in different sectors.
TIP: The Infinity Palette
- Lead organisation: 3D Interactive Sthlm (Sweden)
- Topic: 3D objects library
TIP aims to enrich the CORTEX2 platform with an innovative 2D/3D asset library optimised for Unity and Mozilla Hubs. Focusing on education, entertainment and culture sectors, they plan to create immersive and adaptable learning environments, including a traditional classroom, a group study room, and a library for individual learning, alongside interactive spaces for virtual concerts and cultural exhibitions. These environments, comprising a blend of static and dynamic assets, will be customizable to user needs. With these versatile, high-quality 3D assets, they aim to foster engaging educational experiences and enhance the cultural and entertainment appeal of the CORTEX2 platform, potentially broadening its user base and community engagement.
MGL: Magos Gestures Library
- Lead organisation: QUANTA & QUALIA (Greece)
- Topic: Dynamic library of personalised gestures
This project aims to develop and integrate the Magos Gestures Library (MGL) module into the CORTEX2 framework, enhancing the interaction landscape of extended reality (XR) applications. Its significance lies in its transformative potential for almost all kinds of XR applications, such as simulations, collaborative training, and realistic interactions in sectors such as healthcare, aerospace, and Industry 5.0.
MGL, an extension of the innovative Magos solution, focuses on advancing hand tracking and gesture recognition technologies, with a specific emphasis on personalized and dynamic gestures, thanks to its highly accurate tracking system. The Magos Gestures Library will enable users to define and recognise specific gestures — a minimum of 15 — tailored to distinct actions within XR environments.
ARY: ARY the AR Media
- Lead organisation: ARY
- Topic: Open
ARY is an augmented reality (AR) media platform that offers the capability to anchor 3D objects, videos, pictures or PDF files in indoor environments and make them available to anyone using a smartphone or other device.
EXTERNALISE: Enabling support for externalising models in XR collaboration
- Lead organisation: MOVERSE (Greece)
- Topic: Open
EXTERNALISE will develop a disruptive multi-user collaboration and communication module that will change how teams and groups collaborate remotely. By focusing on digitising and streaming the human characteristics that encode the nonverbal cues that compose human body language, EXTERNALISE will enrich users’ representation and boost their expressivity during communication.
VISOR: VIrtualization Service for Object Reconstruction
- Lead organisation: Phasmatic PC (Greece)
- Topic: Real-time virtualizer
Efficient 3D reconstruction is a key operation that can transform the content generation process of many applications, such as mixed reality (MR) applications, movies, game development, telepresence, 3D printing and 3D eCommerce. VISOR proposes a web service that will take images or a video stream of a small object and generate a digital twin as a triangular mesh that can be used by all current XR applications and game engines and be visualised on any device, enabling easy sharing of the 3D model across multiple stakeholders and environments.
SENSO3D: Revolutionizing Virtual Spaces: SENSO3D's Comprehensive 3D Object Library
- Lead organisation: Senso3D (Portugal)
- Topic: 3D objects library
SENSO3D is dedicated to building a comprehensive 3D object library meticulously curated for home appliances and household components. But what sets SENSO3D apart is its innovative AI-powered model selector tool, designed to identify elements in 2D photos and seamlessly replace them with appropriate 3D models from an extensive database. This breakthrough technology promises to transform 2D images into enriched 3D scenes, unlocking a world of possibilities across various domains.
The project envisions creating detailed and accurate 3D models for XR applications, with a focus on areas such as elder care, language learning, and interactive education. By converting 2D images into immersive 3D environments, SENSO3D enhances visualisation and interaction, offering substantial benefits to users, including those with special needs.
MHI: Multiplayer Haptic interactions
- Lead organisation: SenseGlove (The Netherlands)
- Topic: Collaboration hand object manipulation
SenseGlove is embarking on the development of a multiplayer toolkit, designed to empower XR developers in seamlessly creating interactive virtual environments featuring haptic gloves and hand tracking. Their strategy involves harnessing the capabilities of the CORTEX2 framework, specialising in the management of multiplayer objects, avatars, and scenes.
Complementing this toolkit, SenseGlove will provide end-users with a template scene comprising 10 pre-fabricated interactable assets. This serves as an entry point, enabling non-developers to effortlessly construct multiplayer interactive scenarios, ensuring smooth interoperability between hand tracking and haptic (and force-feedback) gloves.
FLYTEX: Enhancing Videoconferences with Real-Time IoT Data in agrifood sector
- Lead organisation: FlyThings Technologies (Spain)
- Topic: IoT Adaptability of the CORTEX2 Framework
FLYTEX aims to revolutionise decision-making in the agricultural sector by providing real-time IoT sensor data during videoconferences. Focusing on an industry that is rapidly adopting digital technologies, the project aims to gather and share expert insights efficiently. This innovative approach not only aligns with the trend of digitalisation in agriculture but also introduces a new dimension of strategic decision-making by enabling data-driven insights. By integrating IoT data into communication platforms, FLYTEX enhances the quality, speed, and efficacy of decision-making processes, making it a vital tool in the modern agricultural landscape.
RAX: Realistic Avatars for XR
- Lead organisation: IGOODI SRL (Italy)
- Topic: User representation and user avatar customization
The RAX project takes inspiration from its namesake (a mountain range in the Alps) to set a high but realistic ambition for developing a scalable, automatic, integrated tool for realistic, customisable, interoperable, and multimodal avatars. It will be integrated with the CORTEX2 technological ecosystem to extend its capabilities by covering User Representation and User Avatar Customization.
Track 2: Use-cases
These teams will address and/or propose use cases for deploying the CORTEX2 framework and the features developed in co-development Track 1.
vScientist: Immersive Exploration of Fluid Dynamics: Developing an XR/VR Platform for CFD virtual testing in Education and Social Inclusion
- Organisations:
Technology developer: National Technical University of Athens (Greece)
Technology adopter: MultiFluidX - Lyras EE - Domain: Education
VR technology can provide a paradigm shift in education with high-quality, inclusive, and sustainable tools using real-life examples. The vScientist project will develop a comprehensive XR & VR platform for enhancing the learning and accessibility of Computational Fluid Dynamics (CFD) for students and individuals from diverse backgrounds, promoting social inclusion and accessibility in STEM fields. Through this novel platform, users will visualise, interact with, and analyse 3D virtual experiments in an immersive environment or run their own in seconds using machine learning.
SELFEX2: Real-Time remote dexterity training for “hands-on” industrial applications
- Organisations:
Technology developer: CTAG – Automotive Technology Center of Galicia (Spain)
Technology adopter: SmartFlexCell Solutions S.A. - Domain: Industry
SELFEX2 aims to improve training processes in manufacturing by using wearable finger tracking gloves and XR. It will allow for a synchronous self-training approach that provides a quantifiable degree of readiness to execute a dexterity-based task in the workplace.
Integrating the SELFEX2 concept into the CORTEX2 framework will enable real-time training between a senior operator acting as teacher at one location and several junior operators at remote locations, who can learn to execute dexterity-based tasks through a combination of video, voice, and an XR representation of the teacher's hands.
FocusVR-ADHD: FocusVR: ADHD VR Solutions
- Organisations:
Technology developer: RTE Lab sp. z o.o. (Poland)
Technology adopter: Medical University of Lodz - Domain: Healthcare
FocusVR: ADHD VR Solutions integrates cutting-edge VR technology to revolutionize ADHD management. It aims to design immersive VR scenarios specifically tailored to improve cognitive skills in ADHD patients, including attention, memory, and emotional regulation. This innovative approach addresses the need for engaging, customised cognitive training, aligning with CORTEX2’s healthcare vision. The project promises significant advancements in ADHD therapy, offering a novel, effective tool for patients and clinicians alike, and setting new standards in VR-based healthcare solutions.
CORE-MHC: CO-facilitated and REmote Mixed-Reality Mental Health Care interventions
- Organisations:
Technology developer: Instituto Tecnológico de Informática (Spain)
Technology adopter: Fundación SASM - Domain: Healthcare
CORE-MHC aims to define the next generation of XR experiences for mental healthcare. It will develop a gamified platform to support remote and co-facilitated mental health treatment using mixed reality. By making therapy engaging, interactive, and accessible, it will address the treatment gap in mental healthcare, enhance healthcare accessibility, and positively impact the European healthcare landscape.
C.A.R.E. XR: Critical Awareness and Response Enhancement with eXtended Reality
- Organisations:
Technology developer: South East European University (North Macedonia)
Technology adopter: Crisis Management Center - Domain: Emergency and crisis
The C.A.R.E. XR project aims to revolutionise emergency management by integrating XR with the Next-Generation Incident Command System (NICS). This initiative will enhance situational awareness and decision-making through real-time, 3D XR visualizations, and IoT data integration. Leveraging machine learning and natural language processing, the project will improve communication efficiency between first responders and command centres.
Aligning with the CORTEX2 framework, it is set to establish a new standard in emergency response, reduce operational times, and increase safety for both responders and civilians. This initiative represents a major step towards the future of technologically advanced emergency services.
XRisis: Emergency Crisis Simulation & Preparedness Metaverse Toolkit
- Organisations:
Technology developer: Nuwa Ltd - XR Ireland (Ireland)
Technology adopter: Action Contre La Faim - Domain: Emergency and crisis
XRisis will create inclusive, engaging and easily repeatable simulated virtual crisis environments. It will implement and pilot an MVP for collaborative emergency and crisis management training and upskilling.
Three exemplar crisis management pilots will be built on top of CORTEX2 and its services and tested remotely by a minimum of 30 crisis response personnel across a minimum of four countries. This will provide evidence-based validation that a real-time XR communications environment can improve collaborative learning experiences, increase adoption, reduce costs, increase training delivery efficiencies, and decrease logistical complexities.
HYMNE: Hybrid Music, New Experiences
- Organisations:
Technology developer: 4DR Studios BV (Netherlands)
Technology adopter: Effenaar - Domain: Entertainment and culture
The HYMNE use case in CORTEX2 aims to open a new angle on hybrid music events. It will focus on gathering the largest audience over time (rather than at the same time) by creating unique interactive immersive concerts in which the audience plays an important part.
3-time Grammy winner and guitarist Steve Vai believes that hybrid concerts will further revolutionise the music industry. In 2026, the project aims to be able to organise full hybrid events: live shows augmented with an XR experience. After the event, the audience can book an interactive VR concert. Artists will perform both in the live show and in the VR concert, volumetrically recorded or live-streamed. Steve Vai will provide feedback on the virtual stage performance during the project.
SCIPLANT: Sustainable City Planning Tool
- Organisation: Technology developer: Mercury Retrograde LDA (Portugal)
- Domain: Smart cities
SCIPLANT is an innovative XR-based application designed to revolutionise urban planning. It integrates immersive technologies with real-time data to create dynamic urban models, enhancing efficiency and accuracy in city planning.
Aimed at fostering sustainable development, this tool facilitates collaborative decision-making among urban planners, architects, government officials, and citizens. Its gender-neutral design and customisable features ensure accessibility and inclusivity, accommodating a diverse range of user needs. SCIPLANT is a testament to innovation in digital urban planning, paving the way for future advancements in sustainable city design.
XRehab: Extended Reality for Neurological Rehabilitation
- Organisations:
Technology developer: NEMO Lab SRL (Italy)
Technology adopter: Deep Reality SRL - Domain: Accessibility and Social Inclusion
The project will design, create, and test a cutting-edge virtual reality simulation environment tailored for use in hospital settings to support rehabilitation. Its goal is to provide a versatile tool available across all existing VR platforms, offering immersive or semi-immersive experiences with various modes of interaction. It will enhance patient involvement, enable remote accessibility, and contribute to a more comprehensive understanding of disease progression.
AgriVision: Extended Reality for Efficient and Sustainable Farming
- Organisations:
Technology developer: bSpoke Solutions L.P. (Greece)
Technology adopter: University of Macedonia - Domain: Open
AgriVision integrates XR with farm management information systems (FMIS), revolutionising how farmers can interpret complex data through intuitive, immersive visualisations. This innovation not only facilitates quicker and more efficient decision-making but also introduces a paradigm shift in farmer-consultant interactions, enabling real-time immersive XR-visits and remote collaboration. This significantly diminishes the need for frequent in-person consultations, enhancing overall farming operational efficiency.
AgriVision will be offered in two versions: a “lite version” designed to run on farmers' mobile devices and a “pro version” aiming to fully utilise XR capabilities on dedicated XR devices, such as HoloLens. The project will involve agricultural experts/consultants and farmers for pilot testing and feedback collection.
The CORTEX2 team is very proud to welcome these amazing teams into our journey. Stay tuned for more updates on their progress!
[RECORDING] CORTEX2 Open Call 2 Webinar 2: Application topics
On 31 July 2024, we held our second info webinar about our Open Call 2. In it, our technical colleagues presented the topics to apply to, covering challenges, requirements, expected outcomes, and what (technical) support and resources we will provide to the winners during our 9-month support programme.
Now that we have developed the backbone and specific features of our innovative extended reality (XR) teleconference platform, we are looking for partners — companies (tech startups/SMEs) and research institutions (universities, NGOs, foundations, associations) — to collaborate with us on further developing it, providing new modules and features to expand its functionalities.
Applicants will become eligible to receive up to €100,000 and access our 9-month support programme. This includes tailored guidance and support, as well as access to technology and business experts, capacity building, and resources to facilitate the integration and understanding of our platform.
The open call topics
As an applicant, you should choose one of these topics to apply to. If you don’t find a suitable one, you can also apply for an open topic aligned with the CORTEX2 framework and objectives.
- Embodied Avatar
- Smart generator
- Virtual Experiences Editor
- MPRR (Multi-Person Reaction Recognition)
- Gaussian-splatting-based reality capture for VR
- 3D model database
- Real-time voice translation
- Anonymizing meeting’s content for privacy-free data storage
- OPEN TOPIC: Submit your own project idea
The open call will be open until 15 August 2024 at 17:00 CET.
Check the Open Call 2 website, carefully review the call documents and recording below, and prepare to make a successful application.
Apply now!
https://www.youtube.com/watch?v=JlkiyNHxRMo
[RECORDING] CORTEX2 Open Call 2 Webinar 1: How to apply?
On 12 July 2024, we held our first info webinar about our Open Call 2. In it, our colleagues Ellie Shtereva and Alain Pagani shared the keys to this funding opportunity for technology startups/SMEs and research organisations.
Now that we have developed the backbone and specific features of our innovative XR teleconference platform, we are looking for partners — companies (tech startups/SMEs) and research institutions (universities, NGOs, foundations, associations) — to collaborate with us on further developing it and scaling their immersive communication technologies.
Applicants will become eligible to receive up to €100,000 and access our 9-month support programme. This includes tailored guidance and support, as well as access to technology and business experts, capacity building, and resources to facilitate the integration and understanding of our platform.
The open call will be open until 15 August 2024 at 17:00 CET.
Check the Open Call 2 website, carefully review the call documents and recording below, and prepare to make a successful application.
Apply now!
https://www.youtube.com/watch?v=qwS6rDoYDxg
CORTEX2 Open Call 2 Webinar 2: Application topics
Join our second informative webinar on 31 July 2024 to learn about our Open Call 2 topics and how to apply successfully!
When? On Wednesday, 31 July 2024, at 10:00 am CET (Brussels time).
For whom? For companies (tech startups/SMEs) and research institutions (universities, NGOs, foundations, associations) eager to join us in revolutionising remote collaboration with extended reality (XR) by applying to our Open Call 2.
Register now!
About the webinar
In this webinar, we will share the keys to our Open Call 2 topics:
- Embodied Avatar
- Smart generator
- Virtual Experiences Editor
- MPRR (Multi-Person Reaction Recognition)
- Gaussian-splatting-based reality capture for VR
- 3D model database
- Real-time voice translation
- Anonymizing meeting’s content for privacy-free data storage
- OPEN TOPIC: Submit your own project idea
Our technical team will present each topic, covering challenges, requirements, and expected outcomes (including deliverables per sprint), as well as the (technical) support and resources we will provide to the winning teams.
About the open call
We are looking for XR innovators — from businesses to research institutions — to collaborate with us on developing our extended reality (XR) teleconference platform, providing new modules and features to expand its functionalities.
Applicants will become eligible to receive up to €100,000 and access our 9-month support programme. This includes tailored guidance and support, as well as access to technology and business experts, capacity building, and resources to facilitate the integration and understanding of our platform.
Please check the Open Call 2 website, carefully review the documents, and prepare your questions. Our colleagues will ensure that they are answered.
The open call is open from 13 June to 15 August 2024 at 17:00 CET.
Book your place
CORTEX2 Open Call 2 Webinar 1: How to apply?
Join our first informative webinar on 12 July 2024 to learn how to successfully apply to our Open Call 2!
When? On Friday, 12 July 2024, at 11:00 am CET (Brussels time).
For whom? For companies (tech startups/SMEs) and research institutions (universities, NGOs, foundations, associations) eager to join us in revolutionising remote collaboration with extended reality (XR) by applying to our Open Call 2.
Register now!
In this event, we will share the details of our Open Call 2, for which we are seeking XR innovators — from businesses to research institutions — to collaborate with us on developing our XR teleconference platform and scaling their immersive communication technologies.
Applicants will become eligible to receive up to €100,000 and access our 9-month support programme. This includes tailored guidance and support, as well as access to technology and business experts, capacity building, and resources to facilitate the integration and understanding of our platform.
Check the Open Call 2 website, carefully review the documents, and prepare your questions. Our colleagues Iwa Stefanik and Ellie Shtereva from F6S, along with project coordinator Alain Pagani from DFKI, will make sure to answer them.
The open call is open from 13 June to 15 August 2024 at 17:00 CET.
Discover this exciting opportunity and learn how to make a successful application!
Book your place
Agenda
11:00 am - Welcome (Alain Pagani)
11:05 am - Brief presentation of CORTEX2 (Alain Pagani)
11:15 am - Overview of CORTEX2 Open Call 2, including objectives, requirements, and relevant considerations (Iwa Stefanik and Ellie Shtereva)
11:30 am - Technical description of CORTEX2 services (Alain Pagani)
11:40 am - Q&A, final considerations and closing
Recap of CORTEX2's 3rd Progress Meeting in Castellón
Our CORTEX2 team gathered in Castellón, Spain, from 21 to 22 March 2024 for our third Progress Meeting, graciously hosted by Universitat Jaume I. The purpose of this meeting was to delve into the latest developments within our project, discuss the significant milestones we have achieved, conduct pilot demonstrations, and plan for the future.
Representatives from all ten organizations comprising CORTEX2 travelled from all over Europe to reconnect, update each other on their respective contributions to the project, and align on the crucial next steps necessary to continue advancing towards our goals.
During the first day of the meeting, we carried out a comprehensive review covering different areas of our work, from our extended reality modules and services to the ethical, legal, and social implications of the immersive technologies we are developing.
Furthermore, we had the opportunity to discuss our progress in innovation, dissemination, and exploitation strategies, as well as Financial Support for Third Parties (FSTP), emphasising the collaborative efforts that lie ahead in selecting the third parties from our Open Call #1, which we hope to announce soon.
We dedicated the second day to demonstrating and testing our three pilots. This allowed us to gather valuable insights and feedback on our progress to date and illuminated areas where further improvements are needed. We also took the opportunity to record videos of the process to showcase the advancements we have made.
We concluded this fruitful and productive meeting, which marked another milestone in our CORTEX2 journey, with a summary of key takeaways, conclusions, and clear next steps.
Special thanks to our amazing colleagues at Universitat Jaume I for an excellent welcome in Castellón!
To stay up to date with our innovations, aimed at shaping a future of enhanced remote collaboration, and our opportunities for XR innovators, we invite you to subscribe to our newsletter and connect with us on social media (LinkedIn and Twitter).
Celebrating the results of our first Open Call
At CORTEX2, we are excited to announce that our first Open Call, launched on 24 October 2023 and closed on 16 January 2024, has been a great success, exceeding our expectations. We received 146 applications from 41 countries! Thanks to everyone who has shown interest in being part of our journey, and good luck!
The CORTEX2 Open Call #1 results
Our first open call has attracted a high number of applications — 146 submitted out of 264 started. Regarding interest distribution, Track 1 (for co-developers) has attracted 103 started applications, out of which 49 have been submitted, and Track 2 (for use-cases) has attracted 161 started applications, out of which 97 have been submitted.
- Total submitted applications: 146 from 41 countries (264 started)
- Track 1 (co-development): 49 applications submitted (103 started)
- Track 2 (use-case): 97 applications submitted (161 started)
The evaluation process for Track 1 is set to conclude in mid-March 2024, with results expected by the end of the month. For Track 2, starting later, the evaluation process will conclude by the end of April 2024.
A big thank you to all applicants for your interest in joining us in democratising extended reality (XR) solutions for easy, efficient, and accessible remote collaboration!
Want to meet our XR innovators?
We will select 20 teams, comprising innovative research-oriented institutions and business-driven organisations, to co-develop our XR platform and engage new use cases to demonstrate its replicability in different domains. They will also receive funding (€100,000 for co-developers and €200,000 for use-case teams) and be part of a 12-month support programme, in which they will have continuous guidance and support, access to tech and business experts, capacity building on CORTEX2 and XR technologies & trends, and access to resources to facilitate the integration and understanding of our CORTEX2 platform.
Stay tuned to our website and social media channels (LinkedIn and Twitter) to meet them!
CORTEX2 Open Call 1 Webinar 3: Decoding Open Call #1 Topics & Domains
On 14 December 2023, we held our third and final informative webinar on our Open Call 1, in which we went through the technical topics to apply to its Track 1 and 2 and held a live Q&A that allowed the innovators who joined us to better understand the key elements of this exciting opportunity.
On this occasion, the attendees had the opportunity to learn all about the open call Topics from Track 1 and the aspects of desired Domains in Track 2. Our technical team, represented by our partners from the Deutsches Forschungszentrum für Künstliche Intelligenz (DFKI), Actimage GmbH, Intracom Telecom, Alcatel-Lucent Enterprise and LINAGORA, covered the open call challenges, requirements, expected outcomes, including three deliverables per sprint, and what (technical) support/resources we will provide to the winning parties.
The CORTEX2 Open Call 1 key details
Now that we have developed the backbone and specific features of our CORTEX2 XR platform, we are looking for partners to provide new modules to improve its functionalities, adaptability and replicability in new domains and use cases. In particular, we are looking for research-oriented institutions (universities, foundations, associations, NGOs, etc.) and business-driven organisations (tech startups and SMEs).
The open call beneficiaries will receive up to €200,000, tailored coaching and monitoring, access to technology and business experts, capacity building on the CORTEX2 platform and XR technologies & trends, and exploitation and commercialisation support during a programme of up to 12 months.
Through this call, we will recruit and support 20 teams through one of our two tracks:
- Track 1: Co-development. For tech startups and SMEs to co-develop our platform's modules and features and shape the future of teleconferencing together.
- Track 2: Use-case. For tech adopters and developers to apply the platform in diverse domains.
The Open Call will be open until 16 January 2024 at 17:00 CET.
Head to our website to find all the relevant information and documents to read before applying, and watch the complete webinar below to decode the Topics and Domains of this opportunity!
https://www.youtube.com/watch?v=7vu5e52UFCw&list=PLTWMKUuxFCYHDexHP5VqnPYj-gLk8pBtv&index=5
Don't miss our other two informative webinars: head to our dedicated YouTube playlist to watch them, and follow us on LinkedIn and Twitter to keep up to date on all things CORTEX2!
CORTEX2 Open Call 1 Webinar 2: Breaking Essentials for Successful Applying
On 6 December 2023, we held our second informative webinar about our Open Call 1. In it, our colleagues Iwa Stefanik and Ellie Shtereva, from F6S, shared the key requirements to apply to our €3M open call successfully.
They also dived into what to expect before, during and after applying:
- I applied, what’s next? The evaluation process
- I was selected, what’s next? The programme details
- I passed the review. When and how will I receive the funding?
Head to our website to find all the relevant information and documents to read before applying, and watch the complete webinar below to break down the essentials for a successful application!
https://www.youtube.com/watch?v=3RCttPtqXBU&list=PLTWMKUuxFCYHDexHP5VqnPYj-gLk8pBtv&index=4
Make sure you sign up for our last webinar, which we'll hold on 14 December 2023, to learn more about the open call Topics and Domains with our technical team and get all your questions answered before submitting your application.
Don't miss our other two informative webinars: head to our dedicated YouTube playlist to watch them, and follow us on LinkedIn and Twitter to keep up to date on all things CORTEX2!