Viewing Gaussian Splats in Augmented Reality


Gaussian splats are gaining traction as an intriguing area of interest in computer graphics. As developers focused on AR/VR middleware, we wanted to explore the challenges of adapting existing Gaussian splat solutions for Augmented Reality (AR) viewing. Our device of choice for this experiment was the Microsoft HoloLens 2.

Most early adopters and enthusiasts began their Gaussian splat journey with the SIBR_Viewer. Keen to build on this momentum, we decided to adapt it to leverage our AR/VR library. While the SIBR viewer is valued for its open-source availability and rudimentary stereo support, that support is limited to anaglyph stereo, a red/cyan image format that requires special glasses to view. We realized, however, that our Quaternar Remoting library could bridge this gap and enable AR integration.

Technical Journey

At the heart of our experiment lay core/view/RenderingMode.cpp, which handles anaglyph stereo rendering. By integrating the Quaternar library, we could synchronize the camera’s location at the start of each frame with the user’s head position in the real world. But this integration was not without its challenges:

  • Coordinate System Mismatch: OpenXR and SIBR_viewer operate on different coordinate systems. We had to apply coordinate transformations to keep the two consistent.
  • Anaglyph Limitations: In anaglyph mode, both cameras share the same settings, including their field of view and view frustum. However, with the HoloLens, each eye’s display has slight manufacturing differences, leading to distinct view frustums for each eye.
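To make the first point concrete, here is a minimal sketch of the kind of conversion involved: turning an OpenXR head pose (position plus orientation quaternion) into a view matrix by inverting the rigid transform. The types and function names are illustrative, not SIBR’s or OpenXR’s actual API, and any additional axis flips that SIBR’s own convention requires would be layered on top of this.

```cpp
#include <cassert>

// Illustrative types; not the actual OpenXR or SIBR structures.
struct Vec3 { float x, y, z; };
struct Quat { float x, y, z, w; };     // OpenXR-style (x, y, z, w) layout
struct Mat4 { float m[4][4]; };        // row-major: m[row][col]

// Build a rotation matrix from a unit quaternion.
Mat4 rotationFromQuat(const Quat& q) {
    Mat4 r{};
    r.m[0][0] = 1 - 2*(q.y*q.y + q.z*q.z);
    r.m[0][1] = 2*(q.x*q.y - q.z*q.w);
    r.m[0][2] = 2*(q.x*q.z + q.y*q.w);
    r.m[1][0] = 2*(q.x*q.y + q.z*q.w);
    r.m[1][1] = 1 - 2*(q.x*q.x + q.z*q.z);
    r.m[1][2] = 2*(q.y*q.z - q.x*q.w);
    r.m[2][0] = 2*(q.x*q.z - q.y*q.w);
    r.m[2][1] = 2*(q.y*q.z + q.x*q.w);
    r.m[2][2] = 1 - 2*(q.x*q.x + q.y*q.y);
    r.m[3][3] = 1;
    return r;
}

// The view matrix is the inverse of the camera's world-space pose (R, t):
// view = [R^T | -R^T t]. For a rotation, the transpose is the inverse.
Mat4 viewFromOpenXrPose(const Vec3& pos, const Quat& orient) {
    Mat4 r = rotationFromQuat(orient);
    Mat4 v{};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            v.m[i][j] = r.m[j][i];          // transpose = inverse rotation
    v.m[0][3] = -(v.m[0][0]*pos.x + v.m[0][1]*pos.y + v.m[0][2]*pos.z);
    v.m[1][3] = -(v.m[1][0]*pos.x + v.m[1][1]*pos.y + v.m[1][2]*pos.z);
    v.m[2][3] = -(v.m[2][0]*pos.x + v.m[2][1]*pos.y + v.m[2][2]*pos.z);
    v.m[3][3] = 1;
    return v;
}
```
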

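The per-eye frustums are a textbook case of off-axis projection: OpenXR reports each eye’s field of view as four half-angles (left, right, up, down), from which an asymmetric projection matrix can be assembled. A sketch using standard OpenGL clip-space conventions; the Mat4 type is illustrative:

```cpp
#include <cassert>
#include <cmath>

struct Mat4 { float m[4][4]; };  // row-major: m[row][col]

// Off-axis perspective projection from per-eye field-of-view half-angles,
// as reported by OpenXR's XrFovf (angleLeft/angleDown are typically
// negative). OpenGL clip-space conventions: -Z forward, depth in [-1, 1].
Mat4 projectionFromFov(float angleLeft, float angleRight,
                       float angleUp, float angleDown,
                       float zNear, float zFar) {
    const float tanL = std::tan(angleLeft);
    const float tanR = std::tan(angleRight);
    const float tanU = std::tan(angleUp);
    const float tanD = std::tan(angleDown);
    const float width  = tanR - tanL;
    const float height = tanU - tanD;

    Mat4 p{};
    p.m[0][0] = 2.0f / width;
    p.m[0][2] = (tanR + tanL) / width;    // horizontal off-center shift
    p.m[1][1] = 2.0f / height;
    p.m[1][2] = (tanU + tanD) / height;   // vertical off-center shift
    p.m[2][2] = -(zFar + zNear) / (zFar - zNear);
    p.m[2][3] = -2.0f * zFar * zNear / (zFar - zNear);
    p.m[3][2] = -1.0f;
    return p;
}
```

With a symmetric field of view the off-center terms vanish and this reduces to the familiar perspective matrix; the HoloLens eyes simply never hit that symmetric case.
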
Having navigated these initial challenges, our next step was to capture the OpenGL-rendered scene and send it to our Quaternar library. Here we encountered the hurdle of mismatched texture-coordinate conventions between OpenGL and DX11 (the Quaternar library’s internal API). The result? A vertically flipped texture.
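The flip stems from a well-known convention clash: OpenGL places the texture origin at the bottom-left corner, while Direct3D 11 expects it at the top-left. A simple CPU-side fix is to mirror the rows before handing the frame over; this is only a sketch, since in practice one would more likely flip the V coordinate or the viewport on the GPU.

```cpp
#include <algorithm>
#include <cassert>
#include <cstdint>
#include <vector>

// Swap the pixel rows of an RGBA buffer in place, converting between
// OpenGL's bottom-left and Direct3D 11's top-left texture origin.
void flipRowsRGBA(std::vector<uint8_t>& pixels, int width, int height) {
    const int rowBytes = width * 4;             // 4 bytes per RGBA pixel
    std::vector<uint8_t> tmp(rowBytes);
    for (int top = 0, bottom = height - 1; top < bottom; ++top, --bottom) {
        uint8_t* a = pixels.data() + top * rowBytes;
        uint8_t* b = pixels.data() + bottom * rowBytes;
        std::copy(a, a + rowBytes, tmp.data());
        std::copy(b, b + rowBytes, a);
        std::copy(tmp.begin(), tmp.end(), b);
    }
}
```
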

But with these issues addressed, the moment of truth arrived. I put on the HoloLens, and what lay before me was a stunning bike rendered entirely from Gaussian splats.

A Picture is Worth a Thousand Words… Sometimes

My excitement led me to capture this experience using the Windows Device Portal via Mixed Reality Capture. To my surprise, the captured image vastly differed from what I had witnessed.

To comprehend this discrepancy, one must delve into the mechanics of the HoloLens 2. It functions by projecting lasers onto the user’s eyes through semi-reflective mirrors. Because the device adds light to the real world rather than blocking it, it is incapable of displaying the color black: a black pixel simply emits no light, which the brain interprets as transparent. This distinction between black and transparent becomes glaringly evident when utilizing Mixed Reality Capture.

Further investigation revealed that the Quaternar Remoting library, while using Microsoft Holographic Remoting for data transfer to HoloLens, only sends RGB data, omitting the Alpha channel. While this is satisfactory for regular viewing, it distorts screenshot captures.
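To illustrate why the missing alpha matters, here is a deliberately simplified compositing model, not Mixed Reality Capture’s actual pipeline. With alpha, a hologram pixel can be blended over the camera feed using the standard Porter-Duff “over” operator, so even a dark, fully opaque pixel covers the background. With RGB only, an additive display contributes nothing for black, so dark content washes out against the feed.

```cpp
#include <cassert>

struct Rgb { float r, g, b; };   // channel values in [0, 1]

// Porter-Duff "over": requires the hologram's alpha channel.
Rgb overBlend(Rgb holo, float alpha, Rgb camera) {
    return { holo.r * alpha + camera.r * (1 - alpha),
             holo.g * alpha + camera.g * (1 - alpha),
             holo.b * alpha + camera.b * (1 - alpha) };
}

// Additive fallback: what an additive display effectively does when only
// RGB survives the remoting link. Black contributes nothing.
Rgb additiveBlend(Rgb holo, Rgb camera) {
    auto clamp01 = [](float v) { return v > 1.0f ? 1.0f : v; };
    return { clamp01(holo.r + camera.r),
             clamp01(holo.g + camera.g),
             clamp01(holo.b + camera.b) };
}
```

A dark gray, fully opaque hologram pixel over a bright camera pixel stays dark under “over” blending, but under additive blending it barely changes the camera feed at all, which matches the vanishing dark shades we observed in the captures.
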

To validate this, I designed a basic Unity application with a gradient of cubes.

The results mirrored our earlier observations. Darker shades were invisible in the AR capture, underscoring the issue of the missing Alpha channel. This limitation also affects blending of remotely rendered scenes with other on-device content. Notably, this issue only affects Holographic Remote Rendering; local rendering on HoloLens retains the alpha channel, ensuring consistent visuals in both the headset and Mixed Reality capture.

A Real-world Experiment

Eager to further understand the intricacies of this rendering challenge, I embarked on creating my own Gaussian splatting models. Using a total of 150 pictures of a plant in our workspace, I generated a 3D model. The digital representation was impressively accurate.

Ground truth (photo)
Rendered Gaussians

However, when viewed through the lens of Mixed Reality Capture, the same limitations appeared.

This reiterated the challenge we faced with the missing Alpha channel. While it might seem a minor omission in regular viewing, its absence becomes a significant hurdle in applications where blending and transitioning between different layers and elements are critical.


As we continue our exploration, we’re keen on investigating various remote rendering solutions, including potential in-house developments.