Sprint 2 of the CORTEX2 Innovators Support Programme has moved the SENSO3D team forward, bringing significant advancements across AI capabilities, asset creation, and the development of immersive environments. The team is now focusing on conference room components and deploying AI-powered tools for rapid scene generation, which enhances accessibility and utility in virtual environments. From building a large-scale 3D model repository to delivering polished Unity scenes ready for real-world use, SENSO3D is laying a strong foundation for scalable, high-quality extended reality (XR) content.

Continue reading to learn more about the SENSO3D team’s major milestones and next steps within the CORTEX2 programme.

SENSO3D’s progress during Sprint 2 of the CORTEX2 programme

Q: How would you summarise SENSO3D’s latest advancements within the CORTEX2 programme?

A: During Sprint 2 of the SENSO3D project, progress was made in various key areas:

  • Expansion of the 3D model library: Over 100 new high-quality 3D models were added, including office furniture, audiovisual equipment, and decorative elements. These were developed using structured-light scanning, photogrammetry, and AI-driven reconstruction.
  • Enhanced AI capabilities: Object detection algorithms were refined, enabling classification into 50 categories using the OMNI3D dataset and the F-Cube R-CNN model. New AI techniques significantly improved 2D-to-3D reconstruction, allowing fast and accurate model generation from a single image.
  • Unity integration & scene development: A streamlined import workflow was implemented, automating model retrieval, scaling, texture assignment, and collision detection. Two new immersive environments, a 3D Lobby and a 3D Conference Room, were developed for AR/VR applications.
  • Object restoration & optimisation: Mesh anomalies, texture misalignments, and polygon inconsistencies were resolved to ensure models met AR/VR standards. AI-enhanced model restoration techniques were applied to scanned assets.
  • Deviations & Adjustments: The project shifted its focus from household objects to conference rooms due to market demand. Additionally, commercially available models were integrated to accelerate progress, and a new Prompt-Based Scene Creation Tool was introduced to automate virtual environment generation.

Q: What key milestones has SENSO3D achieved, and what impact have they had?

A: Sprint 2 was a crucial phase for the SENSO3D project, bringing several key advancements in AI, 3D modelling, and integration workflows.

Here are the most significant milestones and their impact:

Expansion of the 3D model library

  • Milestone: Over 100 new high-quality 3D models were added to the repository, including office chairs, podiums, AV equipment, and decorative elements. These models were developed using structured-light scanning, photogrammetry, and AI-driven reconstruction.
  • Impact: Enriched the SENSO3D asset library with realistic, high-fidelity models tailored for AR/VR environments. Provided developers with ready-to-use, customisable assets for immersive virtual spaces.

AI enhancements

  • Milestone:
    – AI-driven 3D object identification improved, allowing classification into 50 distinct categories.
    – Advanced 2D-to-3D reconstruction techniques (OMNI3D dataset, F-Cube R-CNN model) drastically enhanced accuracy and efficiency.
    – Introduced Sparse View 3D reconstruction, allowing high-quality 3D models to be created from just four images.
  • Impact:
    – Faster and more accurate object detection and reconstruction reduce manual modelling efforts.
    – Enabled scalability, making SENSO3D more accessible for various industries requiring AR/VR-ready assets.
    – Positioned the project ahead of schedule in AI advancements, exceeding initial expectations.

Unity integration & scene development

  • Milestone:
    – Developed a streamlined Unity import workflow, including:
        – Automated model retrieval and texture assignments.
        – Collision detection & prefab creation for AR/VR compatibility.
    – Designed two immersive environments:
        – 3D Lobby – A welcoming virtual entrance with interactive elements.
        – 3D Conference Room – A functional, realistic meeting space for AR/VR collaboration.
  • Impact:
    – Simplified asset deployment for developers by ensuring seamless integration into Unity.
    – Created realistic, immersive environments for virtual meetings, exhibitions, and training sessions.
    – Strengthened the project’s usability for real-world applications, particularly in business and education sectors.

Object restoration & quality standardisation

  • Milestone:
    – Developed a rigorous AI-driven restoration process to fix mesh gaps, texture misalignments, and polygon irregularities.
    – Improved UV mapping for accurate texture application.
    – Optimised polygon count to balance visual fidelity and AR/VR performance.
  • Impact:
    – Ensured high-quality, optimised assets for resource-limited platforms like VR headsets & mobile AR applications.
    – Reduced manual intervention in model corrections, saving development time.
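
A restoration pass like the one described usually ends with an automated quality gate. The following is a minimal sketch under assumed thresholds; the 50,000-triangle budget and the report fields are illustrative, not the project's actual AR/VR standards.

```python
# Assumed per-asset polygon budget for resource-limited platforms.
MAX_TRIANGLES = 50_000

def restoration_report(triangle_count: int, open_mesh_edges: int,
                       uv_overlaps: int) -> list[str]:
    """Return the remaining fixes an asset needs, or ['pass'] if none."""
    issues = []
    if open_mesh_edges > 0:
        # Open edges indicate mesh gaps left by the scan.
        issues.append(f"fill {open_mesh_edges} mesh gap(s)")
    if uv_overlaps > 0:
        # Overlapping UV islands cause texture misalignment.
        issues.append(f"re-unwrap {uv_overlaps} overlapping UV island(s)")
    if triangle_count > MAX_TRIANGLES:
        # Over-budget meshes need decimation before AR/VR deployment.
        ratio = triangle_count / MAX_TRIANGLES
        issues.append(f"decimate mesh by {ratio:.1f}x")
    return issues or ["pass"]
```

A gate like this lets only assets reporting `["pass"]` into the repository, which is what reduces manual correction work.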

Introduction of the prompt-based scene creation tool

  • Milestone:
    – Developed a prototype AI tool that allows users to create virtual scenes from text prompts.
    – Automates search, retrieval, and placement of 3D objects into Unity environments.
  • Impact:
    – Revolutionises scene creation for non-technical users, making AR/VR design accessible.
    – Reduces manual work, accelerating content generation for XR experiences.
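
At its core, a prompt-based tool of this kind maps words in the prompt to library assets and then places them. The toy sketch below shows only that core idea: the library entries, keyword matching, and row-layout rule are all invented for illustration, and a real tool would use NLP and spatial constraints instead.

```python
# Invented mini-library mapping keywords to asset paths.
LIBRARY = {
    "table": "assets/conference_table.fbx",
    "chair": "assets/office_chair.fbx",
    "screen": "assets/wall_screen.fbx",
}

def build_scene(prompt: str, spacing: float = 1.5) -> list[dict]:
    """Match library keywords in the prompt and place each hit along a row."""
    placements = []
    for keyword, asset_path in LIBRARY.items():
        if keyword in prompt.lower():
            placements.append({
                "asset": asset_path,
                # Naive row layout; a real tool would solve spatial constraints.
                "position": (len(placements) * spacing, 0.0, 0.0),
            })
    return placements

scene = build_scene("A conference room with a table and a screen")
```

Here the prompt yields two placements (the table, then the screen), which a Unity integration layer could instantiate as prefabs.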

Strategic shift from household objects to conference room focus

  • Milestone:
    – Based on market demand, the project pivoted from household objects to conference room components.
    – Integrated commercially available 3D models to accelerate progress.
  • Impact:
    – Aligns with real-world use cases, enhancing project relevance and adoption.
    – Expedited development by combining scanned + commercial models instead of manually creating everything.

Q: What are SENSO3D’s next steps within the CORTEX2 programme?

A: Our main tasks for the next phase are:

Finalising the 3D model repository

  • Tasks:
    – Complete the categorisation, optimisation, and metadata tagging of all 3D assets.
    – Ensure all models meet predefined quality standards for geometry, textures, and Unity compatibility.
    – Expand the library with additional AI-generated models.
  • Expected impact:
    – A fully structured and searchable 3D object repository ready for use in AR/VR applications.
    – Enhanced discoverability and reusability for developers working within the CORTEX2 framework.

Refining & polishing Unity environments

  • Tasks:
    – Finalise the 3D Lobby and Conference Room environments, ensuring high visual and functional quality.
    – Implement additional interactivity (e.g., dynamic object manipulation, adjustable furniture layouts).
    – Conduct end-to-end testing of the Unity import pipeline.
  • Expected impact:
    – A seamless user experience in virtual environments, increasing adoption for business and training use cases.
    – Robust, high-performance Unity scenes for AR/VR applications.

Enhancing AI capabilities & scene creation tool

  • Tasks:
    – Further optimise AI object detection & reconstruction for improved accuracy and speed.
    – Refine the Prompt-Based Scene Creation Tool, enabling users to generate entire 3D scenes from text descriptions.
    – Integrate NLP (Natural Language Processing) enhancements for better understanding of user input.
  • Expected impact:
    – Faster, AI-assisted scene creation, reducing manual effort in virtual space design.
    – Broader accessibility, allowing even non-technical users to design immersive environments effortlessly.

Performance optimisation for AR/VR deployment

  • Tasks:
    – Implement performance optimisation techniques such as:
        – Level-of-Detail (LOD) adjustments for scalability.
        – Texture compression to reduce load times.
        – Occlusion culling to improve rendering efficiency.
    – Conduct rigorous performance profiling across VR headsets, desktops, and mobile devices.
  • Expected impact:
    – Optimised performance across different platforms, ensuring smooth interactions and high frame rates.
    – Scalability for enterprise use cases, making SENSO3D accessible to a broader audience.
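
Of the techniques listed, LOD selection is the simplest to sketch: render a cheaper mesh the further the camera is from the object, which is conceptually how Unity's LOD Group works. The distance thresholds below are made-up values for illustration, not the project's tuned settings.

```python
# Assumed distance thresholds (metres) mapped to LOD indices.
LOD_THRESHOLDS = [
    (5.0, 0),            # closer than 5 m  -> full-detail mesh
    (15.0, 1),           # 5 to 15 m        -> reduced mesh
    (float("inf"), 2),   # beyond 15 m      -> low-poly proxy
]

def pick_lod(camera_distance_m: float) -> int:
    """Return the LOD index to render for a given camera distance."""
    for limit, lod_index in LOD_THRESHOLDS:
        if camera_distance_m < limit:
            return lod_index
    return LOD_THRESHOLDS[-1][1]  # fallback: coarsest level
```

Tuning these thresholds per platform (standalone VR headset vs. desktop) is typically where the profiling work mentioned above feeds back in.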

Pilot testing & user feedback collection

  • Tasks:
    – Deploy the 3D environments in pilot scenarios, including:
        – Virtual business meetings
        – Collaborative training sessions
        – Interactive exhibitions
    – Gather feedback from real users to refine usability and interaction design.
  • Expected impact:
    – Real-world validation of SENSO3D’s capabilities within the CORTEX2 ecosystem.
    – Iterative improvements based on user insights for better adoption.

Preparing for project handover & integration into CORTEX2

  • Tasks:
    – Ensure full compatibility with CORTEX2’s WebXR and WebGL frameworks.
    – Finalise technical documentation for developers who will integrate SENSO3D into their solutions.
    – Conduct a final review & testing phase before official project completion.
  • Expected impact:
    – Seamless integration into the CORTEX2 ecosystem, allowing wider adoption of SENSO3D assets.
    – A well-documented, scalable, and ready-to-use 3D object library for XR applications.

Check out SENSO3D’s previous interview and stay updated on its progress!

Want to know more about other CORTEX2 innovators’ updates? Browse all our supported teams on the CORTEX2 website:

Open Call 1 winners  –  Open Call 2 winners

Subscribe to our newsletter

This project has received funding from the European Union’s Horizon Europe research and innovation programme under grant agreement N° 101070192. Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union or European Union’s Horizon Europe research and innovation programme. Neither the European Union nor the granting authority can be held responsible for them.