11 Environment and ambient illumination
Steve Marschner, CS5625 Spring 2020
Real-time environment illumination
Now incident radiance field is stored in a texture, not defined by a polygon
- this makes it easy to do mirror reflections: single cubemap lookup
Two special kinds of BRDFs are convenient
- diffuse BRDFs: irradiance depends only on surface normal (not v)
…leads to irradiance environment maps
- rotationally symmetric lobes (e.g. Phong): it’s a convolution of the environment map
…leads to prefiltered environment maps
Standard reflection map
For a mirror-specular surface, the illumination integral reduces to
Lr(v) = Li(r(v, n))
This is a function only of r
- so store it in a cubemap and look up using r(v, n).
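A minimal sketch of the lookup, assuming nothing beyond numpy: `reflect` implements r = 2(n · v)n − v, and `cubemap_face_uv` is an illustrative direction-to-face mapping (the face ordering is an assumption of this sketch, not any graphics API's convention).

```python
import numpy as np

def reflect(v, n):
    """Mirror the unit view direction v about the unit normal n: r = 2(n.v)n - v."""
    return 2.0 * np.dot(n, v) * n - v

def cubemap_face_uv(r):
    """Map a direction r to a face index and (u, v) in [0, 1]^2.
    Face ordering here is illustrative, not a graphics-API convention."""
    a = np.abs(r)
    major = int(np.argmax(a))                    # dominant axis: 0 = x, 1 = y, 2 = z
    face = 2 * major + (0 if r[major] >= 0.0 else 1)
    minor = [i for i in range(3) if i != major]
    u = 0.5 * (r[minor[0]] / a[major] + 1.0)     # project minor axes onto the face
    w = 0.5 * (r[minor[1]] / a[major] + 1.0)
    return face, (u, w)

n = np.array([0.0, 1.0, 0.0])
v = np.array([0.6, 0.8, 0.0])                    # unit view vector
r = reflect(v, n)
face, (u, w) = cubemap_face_uv(r)
print(r, face)    # r = (-0.6, 0.8, 0): v mirrored about n, lands on the +y face
```

In a real renderer the whole lookup is a single cubemap texture fetch; the point here is only that r is the sole input.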
Irradiance environment map
For a diffuse surface, the illumination integral reduces to
Lr = (R/π) ∫Ω Li(l) (n · l) dl
This is a function only of n
- so store it in a cubemap and look up using n.
Irradiance environment map
(panels: environment map; irradiance map)
Gary King in GPU Gems 2 http://http.developer.nvidia.com/GPUGems2/gpugems2_chapter10.html
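The reduction above can be checked numerically. A minimal sketch (numpy; a Fibonacci-spiral direction set stands in for the cubemap texels, an assumption to keep it self-contained):

```python
import numpy as np

def sphere_dirs(k):
    """k roughly uniform unit directions (Fibonacci spiral)."""
    i = np.arange(k) + 0.5
    phi = np.arccos(1.0 - 2.0 * i / k)            # uniform in cos(polar angle)
    theta = np.pi * (1.0 + 5.0 ** 0.5) * i        # golden-angle azimuth
    return np.stack([np.sin(phi) * np.cos(theta),
                     np.sin(phi) * np.sin(theta),
                     np.cos(phi)], axis=1)

def irradiance(normals, dirs, radiance):
    """E(n) = sum_l L(l) max(0, n . l) dw: the cosine-weighted convolution
    of the environment, precomputed once per irradiance-map texel."""
    dw = 4.0 * np.pi / len(dirs)                  # solid angle per sample
    cos = np.maximum(normals @ dirs.T, 0.0)       # clamp to the upper hemisphere
    return (cos * radiance).sum(axis=1) * dw

# In a constant-radiance environment E = pi * L for every normal,
# so the diffuse shade (R / pi) * E reduces to R * L, as it should.
dirs = sphere_dirs(50000)
E = irradiance(sphere_dirs(6), dirs, np.ones(len(dirs)))
print(E / np.pi)        # all entries ≈ 1
```

The real precomputation does this once per texel of a (typically very low-resolution) irradiance cubemap, since irradiance varies smoothly with n.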
Irradiance map illumination
McGuire et al. HPG ’11 10.1145/2018323.2018327
Prefiltered environment map
For a general specular surface, approximate the BRDF-cosine product by a function
fr(v, l) (n · l) ≈ gσ(l · r)
- where g defines a lobe of width σ that is symmetric around r
The illumination integral is then
Lr(v) ≈ ∫Ω gσ(l · r) Li(l) dl
This depends only on r and the scalar σ
- so store it in an array of cubemaps indexed by σ, and look up using r
Irradiance environment mapping
Akenine-Möller et al. RTR 3e
(panels: environment map for specular surface; prefiltered map for glossy surface; prefiltered map for diffuse surface)
Prefiltered environment map
(panels: environment map; prefiltered for Phong)
Gary King in GPU Gems 2 http://http.developer.nvidia.com/GPUGems2/gpugems2_chapter10.html
blitzcode.net
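The prefiltering step can be sketched the same way. Here a normalized Phong-like kernel (l · r)^s stands in for gσ, with the exponent s playing the role of the width parameter (smaller σ, larger s); the direction set and bright-spot environment are assumptions of the sketch:

```python
import numpy as np

def sphere_dirs(k):
    """k roughly uniform unit directions (Fibonacci spiral)."""
    i = np.arange(k) + 0.5
    phi = np.arccos(1.0 - 2.0 * i / k)
    theta = np.pi * (1.0 + 5.0 ** 0.5) * i
    return np.stack([np.sin(phi) * np.cos(theta),
                     np.sin(phi) * np.sin(theta),
                     np.cos(phi)], axis=1)

def prefilter(centers, dirs, radiance, exponent):
    """One level of a prefiltered map: for each lobe center r, average the
    environment under a normalized kernel g(l . r) = max(0, l . r)^s.
    Larger s = narrower lobe = shinier level of the stack."""
    w = np.maximum(centers @ dirs.T, 0.0) ** exponent
    w /= w.sum(axis=1, keepdims=True)             # normalize weights to sum to 1
    return w @ radiance

dirs = sphere_dirs(20000)
L = np.where(dirs[:, 2] > 0.99, 10.0, 0.1)        # bright spot near +z, dim elsewhere
c = np.array([[0.0, 0.0, 1.0]])                   # lobe aimed at the spot
wide = prefilter(c, dirs, L, 1.0)[0]
narrow = prefilter(c, dirs, L, 200.0)[0]
print(wide, narrow)   # the narrow (shiny) level picks up much more of the spot
```

Each exponent (each σ) produces one cubemap of the stack; at shading time the renderer interpolates between levels based on surface roughness.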
Prefiltered map variants
Many approaches to approximating the BRDF with symmetric lobes
- classic approach: a single lobe, which loses the stretching of reflections at grazing angles
- many techniques use multiple samples, similar to anisotropic texture filtering
- the modern view treats the whole precomputation as approximating more accurate BRDF lobes using multiple samples from arbitrary maps stored in mipmaps
Manson & Sloan 2016
Shadow baking
(panels: rendering with no shadows, darker diffuse floor; floor shaded with irradiance from shadow texture; irradiance texture computed using a rectangular light)
a convex diffuse object in a constant-radiance environment
a non-convex diffuse scene under constant-radiance illumination
a non-convex diffuse object in a constant-radiance environment
- object in a constant-radiance environment with no shadowing
Akenine-Möller et al. RTR 3e
slide courtesy of Kavita Bala, Cornell University
AO Maps
Ray traced vertex AO
Kavan et al. EGSR 2011
(panels: ambient occlusion sampled at vertices, interpolated as vertex color; ambient occlusion sampled inside triangles, vertex values fit to samples, interpolated as vertex color; ambient occlusion computed at each pixel, ground truth)
NVIDIA OptiX implementation images
https://developer.nvidia.com/optix-prime-baking-sample
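The kind of baking shown in these images amounts to hemisphere ray casting per vertex. A toy sketch, using analytic sphere occluders instead of a triangle mesh (an assumption to keep it self-contained):

```python
import numpy as np

def hemisphere_dirs(n, k, rng):
    """k uniform random unit directions in the hemisphere around unit normal n."""
    d = rng.normal(size=(k, 3))
    d /= np.linalg.norm(d, axis=1, keepdims=True)
    d[d @ n < 0.0] *= -1.0                       # flip into the upper hemisphere
    return d

def ray_hits_sphere(o, d, c, r):
    """For rays o + t d (one origin, many directions): does each hit sphere (c, r)?"""
    oc = o - c
    b = d @ oc
    disc = b * b - (oc @ oc - r * r)
    t = -b - np.sqrt(np.maximum(disc, 0.0))      # nearer root of the quadratic
    return (disc > 0.0) & (t > 1e-6)

def bake_vertex_ao(p, n, spheres, k=4096, seed=0):
    """AO at vertex p = fraction of hemisphere rays that escape all occluders."""
    d = hemisphere_dirs(n, k, np.random.default_rng(seed))
    blocked = np.zeros(k, dtype=bool)
    for c, r in spheres:
        blocked |= ray_hits_sphere(p, d, np.asarray(c, float), r)
    return 1.0 - blocked.mean()

p, n = np.zeros(3), np.array([0.0, 0.0, 1.0])
open_ao = bake_vertex_ao(p, n, [])
capped_ao = bake_vertex_ao(p, n, [((0, 0, 2), 1.0)])
print(open_ao)      # 1.0: nothing blocks the hemisphere
print(capped_ao)    # ≈ 0.87: the sphere overhead blocks a 30-degree cone
```

The baked value is then stored per vertex (or per texel of an AO map) and interpolated at render time, as in the images above.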
slide courtesy of Kavita Bala, Cornell University
EnvMap
slide courtesy of Kavita Bala, Cornell University
AO
slide courtesy of Kavita Bala, Cornell University
Total
Akenine-Möller et al. RTR 3e
slide courtesy of Kavita Bala, Cornell University
Ambient Occlusion: Improvement
- At each point find
– fraction of the hemisphere that is occluded
– also, the average unoccluded direction B (the bent normal)
- Use B for lighting (see later)
slide courtesy of Kavita Bala, Cornell University
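A sketch of the improved quantity: alongside the occluded fraction, accumulate the unoccluded directions and normalize their mean to get the bent normal B. The occluder here is hypothetical, simply masking the +x half of the hemisphere:

```python
import numpy as np

def hemisphere_dirs(n, k, rng):
    """k uniform random unit directions in the hemisphere around unit normal n."""
    d = rng.normal(size=(k, 3))
    d /= np.linalg.norm(d, axis=1, keepdims=True)
    d[d @ n < 0.0] *= -1.0
    return d

def ao_and_bent_normal(dirs, unoccluded):
    """AO = fraction of unoccluded directions; bent normal B = normalized
    mean of the unoccluded directions (used in place of n for lighting)."""
    ao = unoccluded.mean()
    b = dirs[unoccluded].mean(axis=0)
    return ao, b / np.linalg.norm(b)

rng = np.random.default_rng(1)
dirs = hemisphere_dirs(np.array([0.0, 0.0, 1.0]), 4096, rng)
ao, B = ao_and_bent_normal(dirs, dirs[:, 0] < 0.0)   # +x half blocked
print(round(ao, 2), np.round(B, 2))   # ao ≈ 0.5; B tilts toward -x, stays above the surface
```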
What about B?
- The average unoccluded direction gives an idea of where the main illumination is coming from
slide courtesy of Kavita Bala, Cornell University
SSAO
- Restrict the hemisphere to a limited radius
– Why? Think of AO inside a box: with an unbounded radius every point would be fully occluded
- Typically add a drop-off as you get to the hemisphere boundary
slide courtesy of Kavita Bala, Cornell University
Crytek for Crysis: SSAO
- Take the z-buffer
- Consider a sphere around a point p
– distribute samples inside the sphere
– project them to screen space and compare against the z-buffer
Martin Mittring, Crytek GmbH http://crytek.com/cryengine/presentations/finding-next-gen-cryengine--2
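The sampling loop can be sketched against a synthetic depth buffer. This toy version assumes an orthographic view, so "project to screen space" is just dropping to pixel coordinates; that is a simplification of the actual view-space math:

```python
import numpy as np

def ssao(depth, x, y, radius=4.0, n=256, seed=0):
    """Crytek-style occlusion estimate for pixel (x, y): sample points in a
    sphere around the pixel's position and count how many fall behind the
    depth stored in the z-buffer (orthographic simplification)."""
    rng = np.random.default_rng(seed)
    h, w = depth.shape
    p = np.array([x, y, depth[y, x]], dtype=float)
    s = rng.uniform(-1.0, 1.0, size=(4 * n, 3))          # rejection-sample a ball
    s = s[(s * s).sum(axis=1) <= 1.0][:n] * radius + p
    sx = np.clip(np.rint(s[:, 0]).astype(int), 0, w - 1)
    sy = np.clip(np.rint(s[:, 1]).astype(int), 0, h - 1)
    behind = s[:, 2] > depth[sy, sx]                     # sample is under the surface
    return behind.mean()

depth = np.full((64, 64), 10.0)          # a flat wall at depth 10
occ = ssao(depth, 32, 32)
print(occ)                                # ≈ 0.5 even though nothing occludes p
```

That ≈ 0.5 on a flat surface is the well-known artifact of sampling a full sphere: flat geometry comes out half-occluded (gray), which motivates the normal-oriented hemisphere variant compared on the next slides.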
Hemisphere vs. Sphere
John Chapman http://john-chapman-graphics.blogspot.co.uk/2013/01/ssao-tutorial.html
Irradiance map + SSAO
McGuire et al. HPG ’11 10.1145/2018323.2018327
slide courtesy of Kavita Bala, Cornell University
slide by Frédo Durand, MIT
Denoising from 1 image
- We can’t take an average over multiple images
Noisy input
slide by Frédo Durand, MIT
Denoising from 1 image
- We can’t take an average over multiple images
- Idea 1: take a spatial average
- Most pixels have roughly the same color as their neighbors
- Noise looks high frequency => do a low pass
- Here: Gaussian blur
Noisy input
slide by Frédo Durand, MIT
Gaussian blur
- Noise is mostly gone
- But image is blurry
- duh!
After Gaussian blur
slide by Frédo Durand, MIT
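The low-pass step is a separable convolution. A minimal numpy sketch, assuming a simple edge-padded boundary:

```python
import numpy as np

def gaussian_kernel(sigma, radius):
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x * x / (2.0 * sigma * sigma))
    return k / k.sum()                      # normalize so flat areas keep their value

def gaussian_blur(img, sigma=2.0):
    """Separable Gaussian blur: filter rows, then columns (edge-padded)."""
    r = int(3.0 * sigma)
    k = gaussian_kernel(sigma, r)
    pad = np.pad(img, r, mode='edge')
    rows = np.apply_along_axis(lambda a: np.convolve(a, k, 'valid'), 1, pad)
    return np.apply_along_axis(lambda a: np.convolve(a, k, 'valid'), 0, rows)

rng = np.random.default_rng(0)
noisy = 0.5 + 0.2 * rng.normal(size=(40, 40))   # flat gray image + noise
smooth = gaussian_blur(noisy)
print(noisy.std(), smooth.std())   # the noise is mostly gone...
# ...but any edges would be smeared too, which is the problem the next slides address.
```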
Gaussian blur
- Noise is mostly gone
- But image is blurry
- duh!
- Question: how to blur/smooth/abstract the image, but without destroying important features?
After Gaussian blur
adapted from slide by Frédo Durand, MIT
slide by Frédo Durand, MIT
Bilateral filter
- [Tomasi and Manduchi 1998]
–http://www.cse.ucsc.edu/~manduchi/Papers/ICCV98.pdf
- Developed for denoising
- Related to
–SUSAN filter [Smith and Brady 95]
http://citeseer.ist.psu.edu/smith95susan.html
–Digital-TV [Chan, Osher and Chen 2001]
http://citeseer.ist.psu.edu/chan01digital.html
–sigma filter http://www.geogr.ku.dk/CHIPS/Manual/f187.htm
- Full survey: http://people.csail.mit.edu/sparis/publi/2009/fntcgv/Paris_09_Bilateral_filtering.pdf
slide by Frédo Durand, MIT
Start with Gaussian filtering
- Here, input is a step function + noise
J = f ⊗ I
(figure: input and output signals)
slide by Frédo Durand, MIT
Gaussian filter as weighted average
- Weight of ξ depends on distance to x
J(x) = ∑ξ f(x, ξ) I(ξ)
(figure: input and output, kernel f centered at x)
slide by Frédo Durand, MIT
The problem of edges
- Here, I(ξ) “pollutes” our estimate J(x)
- It is too different from I(x)
J(x) = ∑ξ f(x, ξ) I(ξ)
slide by Frédo Durand, MIT
Principle of Bilateral filtering
[Tomasi and Manduchi 1998]
- Penalty g on the intensity difference
J(x) = (1/k(x)) ∑ξ f(x, ξ) g(I(ξ) − I(x)) I(ξ)
slide by Frédo Durand, MIT
Bilateral filtering
[Tomasi and Manduchi 1998]
- Spatial Gaussian f
J(x) = (1/k(x)) ∑ξ f(x, ξ) g(I(ξ) − I(x)) I(ξ)
slide by Frédo Durand, MIT
Bilateral filtering
[Tomasi and Manduchi 1998]
- Spatial Gaussian f
- Gaussian g on the intensity difference
J(x) = (1/k(x)) ∑ξ f(x, ξ) g(I(ξ) − I(x)) I(ξ)
slide by Frédo Durand, MIT
Normalization factor
[Tomasi and Manduchi 1998]
- k(x) = ∑ξ f(x, ξ) g(I(ξ) − I(x))
J(x) = (1/k(x)) ∑ξ f(x, ξ) g(I(ξ) − I(x)) I(ξ)
slide by Frédo Durand, MIT
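Putting the pieces together, a brute-force sketch of the full filter, with the spatial kernel f, the range kernel g, and the k(x) normalization; the test image is a hypothetical noisy step edge:

```python
import numpy as np

def bilateral(img, sigma_s=2.0, sigma_r=0.2):
    """Brute-force bilateral filter:
    J(x) = (1/k(x)) * sum_xi f(x, xi) g(I(xi) - I(x)) I(xi)
    with Gaussian spatial weight f and Gaussian range weight g."""
    r = int(3.0 * sigma_s)
    h, w = img.shape
    yy, xx = np.mgrid[-r:r + 1, -r:r + 1]
    f = np.exp(-(xx * xx + yy * yy) / (2.0 * sigma_s ** 2))       # spatial kernel
    pad = np.pad(img, r, mode='edge')
    out = np.empty_like(img)
    for y in range(h):
        for x in range(w):
            win = pad[y:y + 2 * r + 1, x:x + 2 * r + 1]
            g = np.exp(-(win - img[y, x]) ** 2 / (2.0 * sigma_r ** 2))  # range kernel
            wgt = f * g
            out[y, x] = (wgt * win).sum() / wgt.sum()             # divide by k(x)
    return out

# Noisy step edge: each side gets smoothed, but the edge survives because the
# range kernel g gives near-zero weight across the intensity jump.
rng = np.random.default_rng(0)
step = np.zeros((16, 16)); step[:, 8:] = 1.0
noisy = step + 0.05 * rng.normal(size=step.shape)
out = bilateral(noisy)
print(out[:, :6].mean(), out[:, 10:].mean())   # stays near 0 and near 1
```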
Bilateral filtering is non-linear
[Tomasi and Manduchi 1998]
- The weights are different for each output pixel
Cornell CS6640 Fall 2012
Effects of bilateral filter
(figure: results for varying size of domain filter and size of range filter)
[Tomasi & Manduchi 1998]
slide by Frédo Durand, MIT
Bilateral filter
Noisy input
After Gaussian blur
adapted from slide by Frédo Durand, MIT
slide by Frédo Durand, MIT