Q: What is NODAV in one sentence?
A: Efficient compression and visualisation tools for 3D Gaussian Splatting (3DGS), enabling real-time rendering of lightweight 3D scenes across VR, web, and mobile platforms.
Q: What problem are you solving? What makes your solution unique?
A: 3D Gaussian Splatting (3DGS) offers high-quality real-time 3D scene rendering, but the resulting datasets are large and computationally intensive—making them impractical for storage, transmission, and rendering on low-end or mobile devices. NODAV combines advanced pruning, quantisation, and real-time encoding/decoding techniques specifically tailored to 3DGS data, and integrates them into a unified pipeline that works seamlessly in Unity across VR, desktop, and web platforms—delivering high visual quality with drastically reduced file sizes (target: <50MB) and minimal performance trade-offs.
Q: What are NODAV’s main objectives?
A:
- Reduce the size of 3D Gaussian Splatting (.ply and .splat) files through pruning and quantisation, aiming for lightweight 3D scene representations.
- Develop an efficient real-time compression framework to encode/decode 3DGS data, enabling scalable cloud storage and streaming.
- Integrate the compressed 3DGS data into the Unity engine for cross-platform visualisation on VR headsets, mobile devices, and desktop.
- Seamlessly embed all modules into the CORTEX2 framework, including data synchronisation, scene storage, and client-side rendering capabilities.
- Conduct technical validation and user testing to ensure visual quality and usability across devices.
- Share outcomes through scientific publications, public demos, dataset releases, and social media engagement to maximise impact and transparency.
CORTEX2 Support Programme progress
Q: What were the main activities implemented and milestones achieved during Sprint 1 of the CORTEX2 Support Programme?
A:
Main activities:
- Developed an automatic evaluation tool to measure the quality of 3D Gaussian Splatting (3DGS) reconstructions.
- Built a pipeline using 360° cameras for efficient and high-quality scene capture.
- Created a dataset of 15 museum rooms for testing and validation.
- Applied data compression using pruning and spherical-harmonics coefficient reduction, cutting file size by up to 10×.
- Designed an encoding method using video compression (H.265 preferred over H.264 for quality).
- Developed a cross-platform renderer using Unity and Vulkan for Windows and Android VR devices.
Key milestones:
- Quality evaluation system running with key image metrics.
- Compression pipeline achieved up to 10× data reduction.
- First version of encoding/decoding module completed.
- Renderer deployed successfully on Windows and Meta Quest 3 (Android VR).
- Achieved 100 FPS on Windows VR; 15–20 FPS on Android VR.
- Dataset successfully recorded and processed (more than 10 scenes).
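The quality evaluation system above runs "key image metrics" comparing reference photographs against 3DGS renders; the exact metrics are not named here, so as a minimal sketch we assume one common choice, PSNR (the function name and toy data below are illustrative, not the project's actual tool):

```python
import math

def psnr(reference, rendered, max_val=255.0):
    """Peak signal-to-noise ratio (dB) between two equally sized grayscale
    images given as flat lists of pixel values. Higher is better;
    identical images score infinity."""
    mse = sum((r - t) ** 2 for r, t in zip(reference, rendered)) / len(reference)
    if mse == 0:
        return float("inf")
    return 10.0 * math.log10(max_val ** 2 / mse)

# Toy 4x4 images: a reference and a render with one wrong pixel.
ref = [0] * 16
noisy = [16] + [0] * 15
score = psnr(ref, noisy)
```

In practice such per-image scores would be averaged over held-out camera views of each scene to rank compression settings.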
Q: What have you achieved so far?
A:
Main activities:
- Introduced quantisation techniques to enhance the existing compression pipeline. Achieved up to 25× data reduction with minimal quality loss.
- Switched to lossless LZW compression for critical attributes to maintain visual quality while reducing size.
- Built a modular, API-based pipeline integrating reconstruction, compression, quantisation, and encoding, deployable via Docker.
- Implemented single-pass multiview rendering to improve VR performance.
- Integrated WebGPU rendering for browser-based visualisation.
- Focused on untethered VR (Quest 3) with gains in rendering efficiency and stability.
Key milestones:
- Compression pipeline achieved up to 25× reduction (goal: 30×).
- New lossless encoder fully implemented and tested.
- Cross-platform rendering extended to Web (WebGPU) and optimised for VR (single-pass).
- Created a containerised service exposing each pipeline component via REST API.
- Achieved >50 FPS in WebGPU, >100 FPS on desktop, and improved Quest 3 performance.
- KPIs for compression and dataset recording have been met; work on rendering speed and quality is ongoing.
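The combination of lossy quantisation with a lossless stage described above can be illustrated in miniature. This is a sketch under assumed parameters (uniform 8-bit quantisation and a textbook LZW encoder), not the project's actual implementation:

```python
def quantise_uint8(values, lo, hi):
    """Lossy step: map floats in [lo, hi] to 8-bit codes."""
    scale = 255.0 / (hi - lo)
    return bytes(round((v - lo) * scale) for v in values)

def dequantise_uint8(codes, lo, hi):
    """Inverse map; per-value error is bounded by (hi - lo) / 510."""
    step = (hi - lo) / 255.0
    return [lo + c * step for c in codes]

def lzw_encode(data: bytes) -> list:
    """Lossless step: textbook LZW, emitting one dictionary code per
    longest already-seen prefix, so repeated runs shrink."""
    table = {bytes([i]): i for i in range(256)}
    w, out = b"", []
    for b in data:
        wc = w + bytes([b])
        if wc in table:
            w = wc
        else:
            out.append(table[w])
            table[wc] = len(table)
            w = bytes([b])
    if w:
        out.append(table[w])
    return out

# Toy Gaussian opacities: quantise, then losslessly pack the byte stream.
opacities = [0.02, 0.5, 0.5, 0.5, 0.5, 0.97]
codes = quantise_uint8(opacities, 0.0, 1.0)
packed = lzw_encode(codes)
restored = dequantise_uint8(codes, 0.0, 1.0)
```

The lossy stage fixes the maximum reconstruction error, while the lossless stage exploits redundancy in the quantised codes; attributes where even bounded error is unacceptable can skip quantisation and go straight to the lossless encoder, as the pipeline above does for critical attributes.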
Q: How is participating in CORTEX2 supporting NODAV?
A: CORTEX2 has directly supported our technical progress by providing a platform to validate and optimise our 3D Gaussian Splatting pipeline in real-world XR environments. It enabled us to test rendering performance on untethered VR devices, refine our compression pipeline, and integrate our system with CORTEX2’s web-based and mobile platforms.
Q: What are your next steps within the CORTEX2 Programme?
A: Our next steps can be summarised as:
- Full deployment and validation of our reconstruction, pruning, and compression pipeline within the CORTEX2 ecosystem.
- A set of pilots/demos showcasing the technology we have developed to other partners and potential customers.
Learn more about NODAV and stay updated on its progress!
Want to explore more XR innovation? Browse all our supported projects on the CORTEX2 website:
