Real-time Techniques

Members: Rui Wang, Shi Li, Dejing He, Yuzhi Liang, Hujun Bao

Alumni: Chao Xu, Minghao Pan, Xiang Han

Multi-Scale Hybrid Micro-Appearance Modeling and Realtime Rendering of Thin Fabrics

Chao Xu1, Rui Wang1, Shuang Zhao2, Hujun Bao1,
1State Key Lab of CAD&CG, Zhejiang University
2University of California, Irvine

Accepted to IEEE Transactions on Visualization and Computer Graphics (TVCG).

Abstract:

Micro-appearance models offer state-of-the-art quality for cloth renderings. Unfortunately, they usually rely on 3D volumes or fiber meshes that are not only data-intensive but also expensive to render. Traditional surface-based models, on the other hand, are lightweight and fast to render but normally lack the fidelity and details important for design and prototyping applications. We introduce a multi-scale, hybrid model to bridge this gap for thin fabrics. Our model enjoys both the compactness and fast rendering of traditional surface-based models and the rich details of micro-appearance models. Further, we propose a new algorithm to convert state-of-the-art micro-appearance models into our representation while qualitatively preserving the detailed appearance. We demonstrate the effectiveness of our technique by integrating it into a real-time rendering system.

Download:

Paper
Supplemental Video

Real-Time Rendering of Stereo-Consistent Contours

Dejing He, Rui Wang, Hujun Bao,
State Key Lab of CAD&CG, Zhejiang University

IEEE VR 2019.

Abstract:

Line drawing is an important and concise way to depict the shape of an object. Stereo line drawing, a combination of line drawing and stereo rendering, not only conveys shape efficiently but also gives users the visual experience of a stereoscopic 3D world. Contours are the most important lines to draw. However, because of their view-dependent nature, contours must be rendered consistently for the two eyes; otherwise, they cause binocular rivalry and viewing discomfort. This paper proposes a novel solution for drawing stereo-consistent contours in real time. First, we extend the concept of epipolar-slidability and derive a new criterion that checks epipolar-slidability via the monotonicity of the trajectory of contour points' viewpoints. Then, we design an algorithm that tests the epipolar-slidability of contours through an image-space search rather than by sampling multiple viewpoints. Results show that the proposed method has a much lower cost than previous works and therefore enables real-time rendering and editing of stereo-consistent contours, such as changing camera viewpoints, editing object geometry, and tweaking parameters to show contours at different levels of detail.
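
The monotonicity criterion is easiest to see with a direct per-sample check, even though the paper's actual algorithm replaces such sampling with an image-space search. Below is a minimal Python sketch under one reading of the criterion: eye positions are interpolated linearly along the baseline, and a surface point x with normal n lies on the contour for viewpoint e exactly when n · (x − e) = 0. The function names and the per-chain formulation are illustrative, not taken from the paper.

import numpy as np

def slide_parameter(x, n, eye_l, eye_r):
    # Solve n . (x - e(t)) = 0 with e(t) = (1 - t) * eye_l + t * eye_r:
    # the baseline parameter at which x becomes a contour point.
    denom = np.dot(n, eye_r - eye_l)
    if abs(denom) < 1e-8:
        return None  # baseline (nearly) parallel to the tangent plane
    return np.dot(n, x - eye_l) / denom

def is_epipolar_slidable(points, normals, eye_l, eye_r):
    # Accept a contour chain when its viewpoint parameters vary
    # monotonically along the chain.
    eye_l = np.asarray(eye_l, float)
    eye_r = np.asarray(eye_r, float)
    ts = []
    for x, n in zip(points, normals):
        t = slide_parameter(np.asarray(x, float), np.asarray(n, float), eye_l, eye_r)
        if t is None:
            return False
        ts.append(t)
    d = np.diff(ts)
    return bool(np.all(d >= 0.0) or np.all(d <= 0.0))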

Download:

Paper
Supplemental Video

Real-Time Linear BRDF MIP-Mapping

Chao Xu1, Rui Wang1, Shuang Zhao2, Hujun Bao1,
1State Key Lab of CAD&CG, Zhejiang University
2University of California, Irvine

Computer Graphics Forum (Eurographics Symposium on Rendering), 2017.

Abstract:

We present a new technique to jointly MIP-map BRDF and normal maps. Starting from an instant BRDF map, our technique builds its MIP-mapped versions with a highly efficient algorithm that interpolates von Mises-Fisher (vMF) distributions. In our BRDF MIP-maps, each pixel stores a vMF mixture approximating the average of all BRDF lobes from the finest level. Our method is capable of jointly MIP-mapping BRDF and normal maps, even those with high-frequency variations, in real time while preserving high-quality reflectance details. Further, it is very fast, easy to implement, and requires no precomputation.
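
Linear interpolation of vMF lobes can be carried out with the classic moment trick from frequency-domain normal map filtering, which this line of work builds on: each lobe is mapped to its first moment, moments are averaged linearly, and a lobe is refit from the average. A minimal Python sketch is below, assuming single lobes rather than the paper's per-pixel vMF mixtures; the helper names are illustrative.

import numpy as np

def vmf_to_moment(mu, kappa):
    # First moment (mean resultant vector) of a vMF lobe on the sphere:
    # E[omega] = A(kappa) * mu, with A(kappa) = coth(kappa) - 1/kappa.
    a = 1.0 / np.tanh(kappa) - 1.0 / kappa
    return a * np.asarray(mu, float)

def moment_to_vmf(r):
    # Refit (mu, kappa) from an averaged moment via Banerjee et al.'s
    # approximation kappa ~= r * (3 - r^2) / (1 - r^2), r = |moment|.
    r_len = min(np.linalg.norm(r), 1.0 - 1e-6)  # guard the delta-lobe limit
    kappa = r_len * (3.0 - r_len ** 2) / (1.0 - r_len ** 2)
    return np.asarray(r, float) / max(r_len, 1e-12), kappa

def downsample_vmf(child_lobes):
    # One 2x2 MIP reduction: average the children's moments, then refit.
    # Applied recursively, this builds the whole MIP chain.
    r_avg = np.mean([vmf_to_moment(mu, k) for mu, k in child_lobes], axis=0)
    return moment_to_vmf(r_avg)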

Download:

Paper
Supplemental Video

Realtime Rendering Glossy to Glossy Reflections in Screen Space

Chao Xu, Rui Wang, Hujun Bao,
State Key Lab of CAD&CG, Zhejiang University

Computer Graphics Forum, 34(7), pp. 57-66, Pacific Graphics 2015.

Abstract:

Glossy-to-glossy reflections are light bounced between glossy surfaces. Such directional light transport is important for humans to perceive glossy materials but difficult to simulate. This paper proposes a new method for rendering screen-space glossy-to-glossy reflections in real time. We use spherical von Mises-Fisher (vMF) distributions to model glossy BRDFs at surfaces, and employ the screen-space directional occlusion (SSDO) rendering framework to trace indirect light transport bounced in screen space. As our main contributions, we derive a new parameterization of the vMF distribution that converts the non-linear fit of multiple vMF distributions into a linear sum in the new space. We then present a new linear filtering technique to build MIP-maps on glossy BRDFs, which allows us to create filtered radiance transfer functions at runtime and efficiently estimate indirect glossy-to-glossy reflections. We demonstrate our method in a real-time application rendering scenes with dynamic glossy objects. Compared with SSDO alone, our approach requires only one extra texture and has a negligible overhead, a 3% to 6% loss in frame rate, but enables glossy-to-glossy reflections.
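
One reason vMF lobes suit bounced glossy transport is that they are closed under multiplication: since a vMF lobe is proportional to exp(kappa * mu · omega), the product of two lobes satisfies exp(k1 * mu1 · omega) * exp(k2 * mu2 · omega) = exp((k1 * mu1 + k2 * mu2) · omega). The Python sketch below shows just this identity; it is not the paper's full SSDO pipeline, and the reparameterization described in the abstract is a separate contribution.

import numpy as np

def vmf_product(mu1, k1, mu2, k2):
    # The (unnormalized) product of two vMF lobes is again a vMF lobe
    # whose parameters satisfy k * mu = k1 * mu1 + k2 * mu2.
    v = k1 * np.asarray(mu1, float) + k2 * np.asarray(mu2, float)
    k = np.linalg.norm(v)
    if k < 1e-12:
        return np.array([0.0, 0.0, 1.0]), 0.0  # opposing lobes cancel: uniform
    return v / k, k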

Download:

Paper
Supplemental Video

Parallel and Adaptive Visibility Sampling for Rendering Dynamic Scenes with Spatially-Varying Reflectance

Rui Wang, Minghao Pan, Xiang Han, Weifeng Chen, Hujun Bao
State Key Lab of CAD&CG, Zhejiang University

CAD/GRAPHICS 2013, Computers & Graphics, vol.38, pp. 374-381, February 2014.

Download:

Paper
Supplemental Document

Analytic Double Product Integrals for All-Frequency Relighting

Rui Wang, Minghao Pan, Weifeng Chen, Zhong Ren, Kun Zhou, Wei Hua, Hujun Bao
State Key Lab of CAD&CG, Zhejiang University

IEEE Transactions on Visualization and Computer Graphics (TVCG), vol.19, no.7, pp. 1133-1142, July 2013.

Abstract:

This paper presents a new technique for real-time relighting of static scenes with all-frequency shadows from complex lighting and highly specular reflections from spatially-varying BRDFs. The key idea is to depict the boundaries of visible regions using piecewise linear functions and to convert the shading computation into double product integrals: integrals of the product of lighting and BRDF over visible regions. By representing lighting and BRDF with spherical Gaussians and approximating their product locally with Legendre polynomials in each visible region, we show that such double product integrals can be evaluated in analytic form. Given the precomputed visibility, our technique computes the visibility boundaries on the fly at each shading point and performs the analytic integral to evaluate the shading color. The result is a real-time all-frequency relighting technique for static scenes with dynamic, spatially-varying BRDFs that generates more accurate shadows than state-of-the-art real-time precomputed radiance transfer (PRT) methods.
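
The analytic evaluation rests in part on spherical Gaussians being closed under multiplication. With the common unnormalized convention G(v; p, lambda) = exp(lambda * (v · p - 1)) (assumed here; the paper's exact convention may differ), the product of two SGs is a scaled SG, as the Python sketch below shows. The Legendre-polynomial step that localizes the integral to visible regions is not reproduced.

import numpy as np

def sg_product(p1, lam1, p2, lam2):
    # G1(v) * G2(v) = exp((lam1*p1 + lam2*p2) . v - lam1 - lam2)
    #               = c * G(v; pm, lam_m), where
    # lam_m = |lam1*p1 + lam2*p2|, pm = (lam1*p1 + lam2*p2) / lam_m,
    # and the scale is c = exp(lam_m - lam1 - lam2).
    v = lam1 * np.asarray(p1, float) + lam2 * np.asarray(p2, float)
    lam_m = np.linalg.norm(v)
    c = np.exp(lam_m - lam1 - lam2)
    return v / lam_m, lam_m, c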

Download:

Paper
Supplemental Document