From wildfires and floods to large-scale accidents, emergency responders need reliable, real-time information to manage crises safely and efficiently. The CARE XR project is exploring how immersive technologies can enhance communication, improve situational awareness, and facilitate faster and more informed decision-making.

Keep reading to learn how CARE XR is working to reshape emergency response — and where it’s headed next.

Q: How would you describe CARE XR in one sentence?

A: CARE XR improves situational awareness during emergency situations.

Q: What problem are you solving? What makes your solution unique?

A: The CARE XR project aims to transform emergency management by addressing the challenges faced by first responders and planners during crises.

Existing systems, like the Next-Generation Incident Command System (NICS), rely heavily on 2D maps and static visualisations, which often fall short of providing the depth of information needed to navigate complex emergencies. Additionally, communication systems are typically one-directional, and the overload of unprocessed data can hinder swift decision-making. These limitations make it harder for emergency teams to respond effectively, risking delays and mismanagement.

The solution offered by CARE XR is unique because it integrates cutting-edge technologies, including extended reality (XR), the Internet of Things (IoT), and artificial intelligence (AI), into the emergency response process. By replacing 2D maps with immersive 3D visualisations, it gives first responders and planners a much clearer view of the situation. IoT devices collect real-time data, such as temperature or smoke levels, and overlay this information onto the 3D models, making it easier to assess risks and resources. The system also uses machine learning to process large amounts of data quickly, ensuring that decision-makers focus only on the most important information. Moreover, it incorporates voice recognition and natural language processing to enhance communication, allowing responders to exchange critical updates efficiently.

This project stands out not just for its technology but for its vision of collaboration and accessibility. By enabling two-way interaction through XR tools, it empowers responders on the ground to communicate dynamically with command centres. The seamless integration of all these technologies into a unified system ensures faster response times, better resource management, and, ultimately, safer outcomes for both responders and the public. The CARE XR platform has the potential to set a new global standard for emergency management, demonstrating how advanced technologies can save lives and improve safety during critical incidents.

Q: What are CARE XR’s key objectives?

A:

  • Enhance emergency response: integrate XR, IoT, and ML with NICS to improve decision-making and resource management during emergencies.
  • Develop an XR platform: create a real-time, interactive 3D visualisation tool enriched with IoT data and ML analytics.
  • Improve communication: implement advanced systems for real-time audio and video communication using ML and NLP to prioritise critical information.

CORTEX2 support programme progress

Q: What have you achieved so far?

A: We have successfully achieved full integration with Rainbow, enabling the application to send data in various formats, including text, audio, and video. Additionally, it supports the automatic creation of bubbles (Rainbow's group conversation spaces) and the ability to add users to them.

Moreover, we have integrated IoT devices with CORTEX2. These devices can transmit measurements to CORTEX2 endpoints at specified intervals. To facilitate this process, we developed a fully functional Python wrapper, which can also be utilised by other CORTEX2 users working with IoT devices.
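The wrapper itself is not reproduced here, but the underlying pattern — reading a sensor and posting its measurements to an endpoint at a fixed interval — can be sketched roughly as follows. The endpoint URL, the payload schema, and the `read_sensor` callback are illustrative assumptions, not the actual CORTEX2 API.

```python
import json
import time
import urllib.request

# Hypothetical ingestion endpoint; the real CORTEX2 URL and schema differ.
ENDPOINT = "https://example.org/cortex2/iot/measurements"

def build_payload(device_id: str, readings: dict) -> bytes:
    """Package one set of sensor readings as a JSON payload (illustrative schema)."""
    return json.dumps({
        "device": device_id,
        "timestamp": time.time(),
        "readings": readings,  # e.g. {"temperature": 21.4, "smoke": 0.02}
    }).encode("utf-8")

def send_measurement(device_id: str, readings: dict) -> None:
    """POST a single measurement to the endpoint."""
    req = urllib.request.Request(
        ENDPOINT,
        data=build_payload(device_id, readings),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # fire-and-forget for brevity; real code checks the response

def run(device_id: str, read_sensor, interval_s: float = 10.0) -> None:
    """Transmit readings at a fixed interval, as the wrapper does."""
    while True:
        send_measurement(device_id, read_sensor())
        time.sleep(interval_s)
```

A real deployment would add authentication, retries, and error handling around `send_measurement`; the sketch only shows the periodic read-and-post loop.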

Lastly, we have applied scene reconstruction tools based on Gaussian Splatting. At this stage, we can generate point clouds in .ply format. Our next objective is to generate meshes from these .ply files.
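For readers unfamiliar with the format: a .ply point cloud is just a short header followed by one vertex per line. A minimal sketch of writing and reading an ASCII .ply file is shown below; the actual Gaussian Splatting outputs carry many more per-point attributes (colour, opacity, covariance) than this toy example.

```python
def write_ply(path, points):
    """Write a minimal ASCII .ply point cloud from (x, y, z) tuples."""
    header = "\n".join([
        "ply",
        "format ascii 1.0",
        f"element vertex {len(points)}",
        "property float x",
        "property float y",
        "property float z",
        "end_header",
    ])
    with open(path, "w") as f:
        f.write(header + "\n")
        for x, y, z in points:
            f.write(f"{x} {y} {z}\n")

def read_ply(path):
    """Read back the vertex list from a minimal ASCII .ply file."""
    with open(path) as f:
        lines = f.read().splitlines()
    body = lines[lines.index("end_header") + 1:]
    return [tuple(float(v) for v in line.split()) for line in body]
```

Turning such a point cloud into a mesh is typically done with a surface reconstruction algorithm (Poisson reconstruction is a common choice), which is the step the team describes as its next objective.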

Q: How is participating in CORTEX2 supporting CARE XR?

A: Participation in CORTEX2 has been invaluable to the success of our project. We have greatly benefited from the extensive libraries provided by CORTEX2, as well as from the expertise and insights shared by teams that have implemented these libraries, such as DFKI and Intracom. Furthermore, the mentorship and guidance of our mentor have been instrumental in helping us tackle challenges effectively. His support has not only helped us identify solutions but also provided clarity on how to approach complex problems, significantly enhancing our team's progress and confidence.

Q: What are your next steps within the programme?

A: Our next steps within the CORTEX2 programme involve finalising the prototype application and transitioning toward a more stable and robust version. We are preparing to implement a newly designed UI and will also revise the interface for XR headsets to ensure optimal functionality and user experience.

On the technical side, we aim to complete the implementation of speech recognition and summarisation tools, which are yet to be deployed by CORTEX2. To mitigate potential delays, we are also formulating a contingency plan to utilise external libraries for these features if necessary. This dual approach ensures we maintain momentum and continue progressing effectively.


Learn more about CARE XR and stay updated on its progress!

Want to explore more XR innovation? Browse all our supported projects on the CORTEX2 website: 

Open Call 1 winners  –  Open Call 2 winners

Subscribe to our newsletter

This project has received funding from the European Union’s Horizon Europe research and innovation programme under grant agreement N° 101070192. Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union or European Union’s Horizon Europe research and innovation programme. Neither the European Union nor the granting authority can be held responsible for them.