Poster

Efficient 3D Gaussian Splatting with Compressed Model Training

Sankeerth Durvasula · Sharanshangar Muhunthan · Zain Moustafa · Richard Chen · Ruofan Liang · Yushi Guan · Nilesh Ahuja · Nilesh Jain · Selvakumar Panneer · Nandita Vijaykumar


Abstract:

3D Gaussian Splatting (3DGS) is a state-of-the-art technique for modeling real-world scenes with high quality and real-time rendering. Typically, a higher-quality representation is achieved by using a larger number of 3D Gaussians. However, large Gaussian counts significantly increase the GPU memory required to store model parameters. A large model thus requires powerful GPUs with high memory capacity for training, and suffers slower training and rendering due to inefficient memory access and data movement. In this work, we introduce ContraGS, a method that enables training directly on compressed 3DGS representations without reducing the Gaussian count, and thus with little loss in model quality. ContraGS leverages codebooks to compactly store the set of Gaussian parameter vectors throughout the training process, thereby significantly reducing memory consumption. While codebooks have been shown to be highly effective at compressing fully trained 3DGS models, training directly on codebook representations has been an unsolved challenge. ContraGS solves the problem of learning the non-differentiable parameters in codebook-compressed representations by posing parameter estimation as a Bayesian inference problem. To this end, ContraGS provides a framework that effectively uses MCMC sampling over a posterior distribution of these compressed representations. We demonstrate that ContraGS significantly reduces peak memory during training (by 3.49× on average) and accelerates training and rendering (by 1.36× and 1.88× on average, respectively), while retaining quality close to the state of the art.
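The two ideas in the abstract can be made concrete with a short, self-contained sketch. The snippet below is not the ContraGS implementation: the codebook sizes, the toy `per_gaussian_loss`, and the Metropolis-style `mh_update_indices` step are hypothetical stand-ins for the paper's rendering loss and sampler, shown only to illustrate how codebook storage saves memory and how non-differentiable indices might be updated by sampling.

```python
# A minimal, hypothetical sketch (not the ContraGS implementation) of
# (1) storing per-Gaussian parameters as indices into a shared codebook and
# (2) updating the non-differentiable indices with a Metropolis-style step.
import torch

N, K, D = 100_000, 4096, 48   # Gaussians, codebook entries, params per Gaussian

codebook = torch.randn(K, D, requires_grad=True)   # learned with SGD as usual
indices = torch.randint(0, K, (N,))                # discrete, non-differentiable

# Memory: a dense model stores N*D floats; the compressed model stores K*D
# floats plus N small integer indices -- a large saving whenever K << N.
dense_mib = N * D * 4 / 2**20
compressed_mib = (K * D * 4 + N * 2) / 2**20       # int16 indices cover K <= 65536
print(f"dense: {dense_mib:.1f} MiB  compressed: {compressed_mib:.1f} MiB")

target = torch.randn(N, D)                         # toy stand-in for scene data

def per_gaussian_loss(params):
    """Toy per-Gaussian loss; a real system would use the rendering loss."""
    return ((params - target) ** 2).mean(dim=1)    # shape (N,)

def mh_update_indices(indices, codebook, temperature=0.01):
    """One Metropolis step: propose a new codebook entry per Gaussian and
    accept with probability min(1, exp((loss_cur - loss_prop) / T))."""
    proposal = torch.randint(0, codebook.shape[0], indices.shape)
    with torch.no_grad():
        loss_cur = per_gaussian_loss(codebook[indices])
        loss_prop = per_gaussian_loss(codebook[proposal])
        log_accept = (loss_cur - loss_prop) / temperature
        accept = torch.rand(indices.shape) < log_accept.exp().clamp(max=1.0)
    return torch.where(accept, proposal, indices)

indices = mh_update_indices(indices, codebook)     # lower-loss assignments win
```

In a full trainer, gradient steps on the codebook entries would presumably be interleaved with sampling sweeps over the indices, so that both the continuous codebook and the discrete assignments improve over the course of training.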
