Real-Time Cloud Rendering

Parker Ford
Master of Science Thesis Project, June 2024

[Proposal   Thesis   Presentation]

Paper: P. Ford and K. Sung, “Real-Time Atmospheric Cloud Rendering System,” in Proceedings of the 37th Conference on Graphics, Patterns and Images (SIBGRAPI 2024). Available: Full Paper, Presentation Slides, Parker at the Conference, Parker working VERY hard with new Brazilian Friends.

Rendering realistic clouds is an important aspect of creating believable virtual worlds. The detailed shapes and complex light interactions of clouds make this a daunting task for a real-time application. Our solution, based on Schneider’s cloud modeling and Fong’s volumetric rendering frameworks for low-altitude cloudscapes, delivers both realism and real-time performance. To approximate radiance measurements efficiently, we adopt Hillaire’s energy-conserving integration method for light scattering. To simulate the effect of multiple light scattering, we follow Wrenninge’s approach for computing the multi-bounce diffusion of light within a volume. To capture the details of light interacting with microscopic water droplets, we approximate the complex behavior of Mie scattering with Jendersie and d’Eon’s phase function model. To capture these details at nominal computational cost, we introduce a temporal anti-aliasing strategy that unifies the sampling of the area over a pixel and the interval through the volumetric participating medium.

The resulting system renders scenes of expansive cloudscapes well within real-time requirements, achieving frame times of 2 to 3 milliseconds on a typical machine. Users can adjust parameters to control various types of low-altitude cloud formations and weather conditions, with presets available for transitioning easily between settings. Our combination of techniques in the volumetric rendering process enhances both efficiency and visual fidelity, and the novel volumetric temporal anti-aliasing approach efficiently and effectively unifies the sampling of pixel areas and volumetric intervals. Looking forward, this technique could be adapted for real-time applications such as video games and flight simulators. Further improvements could refine the cloud modeling system, incorporating procedural generation of high-altitude clouds to broaden the range of cloudscapes that can be represented. Additionally, our volumetric rendering framework could be paired with recent investigations into voxel-based cloud rendering.

Under the supervision of Dr. Kelvin Sung, Division of Computing Software Systems, UW Bothell.