16 Soft Illumination Effects
Steve Marschner, CS5625 Spring 2019


SLIDE 1

16 Soft Illumination Effects

Steve Marschner CS5625 Spring 2019

SLIDE 2

Irradiance environment mapping

Akenine-Möller et al. RTR 3e

environment map for specular surface
prefiltered map for glossy surface
prefiltered map for diffuse surface

SLIDE 3

Prefiltered environment map

environment map
prefiltered for Phong

Gary King in GPU Gems 2 http://http.developer.nvidia.com/GPUGems2/gpugems2_chapter10.html

SLIDE 4

Irradiance environment map

environment map
irradiance map

Gary King in GPU Gems 2 http://http.developer.nvidia.com/GPUGems2/gpugems2_chapter10.html
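The diffuse prefiltering that turns an environment map into an irradiance map can be sketched numerically: for each normal n, integrate incoming radiance against the clamped cosine, E(n) = ∫ L(ω) max(0, n·ω) dω. A minimal pure-Python sketch (not King's GPU implementation; `radiance` is a stand-in for an actual environment-map lookup):

```python
import math

# Minimal sketch of diffuse prefiltering (not the GPU Gems 2 implementation):
# for each normal n, E(n) = sum over the sphere of L(w) * max(0, n.w) * dw.

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sphere_dirs(n_theta=64, n_phi=16):
    """Directions and solid-angle weights tiling the whole sphere."""
    dirs = []
    for i in range(n_theta):
        theta = (i + 0.5) * math.pi / n_theta
        for j in range(n_phi):
            phi = (j + 0.5) * 2 * math.pi / n_phi
            d = (math.sin(theta) * math.cos(phi),
                 math.sin(theta) * math.sin(phi),
                 math.cos(theta))
            dw = math.sin(theta) * (math.pi / n_theta) * (2 * math.pi / n_phi)
            dirs.append((d, dw))
    return dirs

def irradiance(normal, radiance, dirs):
    """Cosine-weighted integral of incoming radiance over the sphere."""
    n = normalize(normal)
    total = 0.0
    for d, dw in dirs:
        cosine = dot(n, d)
        if cosine > 0.0:
            total += radiance(d) * cosine * dw
    return total

# Constant-radiance environment L = 1: irradiance should be pi for any normal.
dirs = sphere_dirs()
E = irradiance((0.0, 0.0, 1.0), lambda d: 1.0, dirs)
print(abs(E - math.pi) < 1e-2)
```

A real irradiance map evaluates this integral once per texel of the output map; the quadrature above is the brute-force version of what King's spherical-harmonic method computes much faster.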

SLIDE 5

Irradiance map illumination

McGuire et al. HPG ’11 10.1145/2018323.2018327

SLIDE 6

Shadow baking

Rendering with no shadows, darker diffuse floor
Floor shaded with irradiance from shadow texture
Irradiance texture computed using rectangular light

SLIDE 7

Irradiance map illumination

McGuire et al. HPG ’11 10.1145/2018323.2018327

SLIDE 8

a convex diffuse object in a constant-radiance environment

SLIDE 9

a non-convex diffuse scene under constant-radiance illumination

SLIDE 10

a non-convex diffuse object in a constant-radiance environment

SLIDE 11

a non-convex diffuse scene under constant-radiance illumination

SLIDE 12
a non-convex diffuse object in a constant-radiance environment with no shadowing
SLIDE 13

Akenine-Möller et al. RTR 3e

SLIDE 14

slide courtesy of Kavita Bala, Cornell University

AO Maps

SLIDE 15

Ray traced vertex AO

Kavan et al. EGSR 2011
ambient occlusion sampled at vertices, interpolated as vertex color
ambient occlusion sampled inside triangles, vertex values fit to samples, interpolated as vertex color
ambient occlusion computed at each pixel (ground truth)

SLIDE 16

NVIDIA OptiX implementation images

https://developer.nvidia.com/optix-prime-baking-sample

SLIDE 17

NVIDIA OptiX implementation images

https://developer.nvidia.com/optix-prime-baking-sample

SLIDE 18

NVIDIA OptiX implementation images

https://developer.nvidia.com/optix-prime-baking-sample

SLIDE 19

slide courtesy of Kavita Bala, Cornell University

EnvMap

SLIDE 20

slide courtesy of Kavita Bala, Cornell University

AO

SLIDE 21

slide courtesy of Kavita Bala, Cornell University

Total

SLIDE 22

Akenine-Möller et al. RTR 3e

SLIDE 23

slide courtesy of Kavita Bala, Cornell University

Ambient Occlusion: Improvement

  • At each point find
– Fraction of hemisphere that is occluded
– Also, average unoccluded direction B (bent normal)
  • Use B for lighting (see later)
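Both quantities on this slide can be illustrated with a toy hemisphere sampler. The `occluded` predicate below is a stand-in for real ray casts against the scene; the wall occluder is hypothetical:

```python
import math
import random

# Sketch: sample the hemisphere above a point (normal = +z), ask an
# `occluded` predicate about each direction, then report the visible
# fraction and the bent normal B = normalized mean unoccluded direction.

def hemisphere_dirs(n=64, seed=1):
    rng = random.Random(seed)
    dirs = []
    while len(dirs) < n:
        d = (rng.uniform(-1, 1), rng.uniform(-1, 1), rng.uniform(-1, 1))
        length = math.sqrt(sum(x * x for x in d))
        if 0.0 < length <= 1.0 and d[2] > 0.0:  # rejection-sample upper half
            dirs.append(tuple(x / length for x in d))
    return dirs

def ao_and_bent_normal(dirs, occluded):
    visible = [d for d in dirs if not occluded(d)]
    accessibility = len(visible) / len(dirs)
    if not visible:
        return 0.0, None
    s = [sum(c) for c in zip(*visible)]
    length = math.sqrt(sum(x * x for x in s))
    return accessibility, tuple(x / length for x in s)

# Hypothetical occluder: a wall blocking every direction with x > 0.
access, B = ao_and_bent_normal(hemisphere_dirs(), lambda d: d[0] > 0)
print(0.3 < access < 0.7)   # about half the hemisphere is open
print(B[0] < 0)             # B leans away from the wall
```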

SLIDE 24

slide courtesy of Kavita Bala, Cornell University

What about B?

  • The unoccluded direction gives an idea of where the main illumination is coming from

SLIDE 25

slide courtesy of Kavita Bala, Cornell University

Computing AO using shadow maps

4 samples / 32 samples

  • Create shadow maps from N point lights on sphere
  • Check visibility of point wrt each light and determine occlusion: accumulation buffer
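The accumulation step can be sketched without real shadow maps; here `visible` stands in for the per-light shadow-map visibility test, and the ground-plane scene is hypothetical:

```python
import math

# Accumulation-buffer sketch of this slide's procedure: N point lights on a
# sphere, one 0/1 visibility result per light, averaged into one value.

def lights_on_sphere(n=32):
    """Fibonacci spiral: reasonably uniform point lights on the unit sphere."""
    golden = math.pi * (3.0 - math.sqrt(5.0))
    pts = []
    for i in range(n):
        z = 1.0 - 2.0 * (i + 0.5) / n
        r = math.sqrt(1.0 - z * z)
        a = golden * i
        pts.append((r * math.cos(a), r * math.sin(a), z))
    return pts

def accessibility(visible, lights):
    """Average the visibility results, as an accumulation buffer would."""
    return sum(1.0 for L in lights if visible(L)) / len(lights)

# Toy scene: a ground plane hides every light below the horizon (z < 0).
a = accessibility(lambda L: L[2] > 0, lights_on_sphere(32))
print(abs(a - 0.5) < 0.1)   # about half the lights are visible
```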

SLIDE 26

slide courtesy of Kavita Bala, Cornell University

Computing the values: SM

512 samples

SLIDE 27

slide courtesy of Kavita Bala, Cornell University

SSAO

  • Restrict the hemisphere
– Why? Think of AO in box
  • Typically add a drop-off as you get to the hemisphere boundary

SLIDE 28

slide courtesy of Kavita Bala, Cornell University

Crytek for Crysis: SSAO

  • Take z-buffer
  • Consider sphere around a point p
– Distribute samples
– Project to screen space and compare to z-buffer
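A toy 1D version of this test (assumed names; a real implementation works on a 2D depth buffer with the camera's projection matrix):

```python
import random

# Toy screen-space AO sketch: scatter samples in a sphere around p,
# "project" each to a pixel, and count a sample as occluded when the
# stored depth is nearer than the sample's depth.

def ssao(p_px, p_depth, depth_buffer, radius=2.0, n=64, seed=0):
    rng = random.Random(seed)
    occluded = 0
    for _ in range(n):
        dx = rng.uniform(-radius, radius)          # screen-space offset
        dz = rng.uniform(-radius, radius)          # depth offset
        sx = int(round(p_px + dx))                 # projected pixel
        stored = depth_buffer.get(sx, float("inf"))
        if stored < p_depth + dz:                  # geometry in front of sample
            occluded += 1
    return 1.0 - occluded / n                      # 1 = fully open

# A flat floor at depth 10: about half the sphere samples land behind it,
# the well-known bias of sphere (rather than hemisphere) sampling.
floor = {x: 10.0 for x in range(-10, 11)}
print(0.3 < ssao(0, 10.0, floor) < 0.7)
```

The half-occluded result on a flat surface is exactly the hemisphere-vs-sphere issue discussed a few slides later.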

SLIDE 29

Martin Mittring, Crytek GmbH http://crytek.com/cryengine/presentations/finding-next-gen-cryengine--2

SLIDE 30

Martin Mittring, Crytek GmbH http://crytek.com/cryengine/presentations/finding-next-gen-cryengine--2

SLIDE 31

Martin Mittring, Crytek GmbH http://crytek.com/cryengine/presentations/finding-next-gen-cryengine--2

SLIDE 32

Hemisphere vs. Sphere

John Chapman http://john-chapman-graphics.blogspot.co.uk/2013/01/ssao-tutorial.html

SLIDE 33

Irradiance map + SSAO

McGuire et al. HPG ’11 10.1145/2018323.2018327

SLIDE 34

Irradiance map + SSAO

McGuire et al. HPG ’11 10.1145/2018323.2018327

SLIDE 35

Irradiance map + SSAO

McGuire et al. HPG ’11 10.1145/2018323.2018327

SLIDE 36

Irradiance map + SSAO

McGuire et al. HPG ’11 10.1145/2018323.2018327

SLIDE 37

slide courtesy of Kavita Bala, Cornell University

SLIDE 38

slide courtesy of Kavita Bala, Cornell University

SLIDE 39

slide courtesy of Kavita Bala, Cornell University

SLIDE 40

slide courtesy of Kavita Bala, Cornell University

SLIDE 41

slide courtesy of Kavita Bala, Cornell University

SLIDE 42

slide by Frédo Durand, MIT

Denoising from 1 image

  • We can’t take average over multiple images

Noisy input

SLIDE 43

slide by Frédo Durand, MIT

Denoising from 1 image

  • We can’t take average over multiple images
  • Idea 1: take a spatial average
  • Most pixels have roughly the same color as their neighbors
  • Noise looks high frequency => do a low pass
  • Here: Gaussian blur

Noisy input
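The low-pass idea can be sketched in 1D on the same signal the later slides use, a noisy step. This is an illustrative sketch, not code from the slides:

```python
import math
import random

# Gaussian-blur a noisy step signal: the noise shrinks, but so does the
# edge, which is what motivates the bilateral filter later.

def gaussian_blur(signal, sigma=2.0):
    r = int(3 * sigma)
    w = [math.exp(-(k * k) / (2.0 * sigma * sigma)) for k in range(-r, r + 1)]
    norm = sum(w)
    out = []
    for x in range(len(signal)):
        acc = sum(w[k + r] * signal[min(max(x + k, 0), len(signal) - 1)]
                  for k in range(-r, r + 1))      # clamp at the borders
        out.append(acc / norm)
    return out

rng = random.Random(0)
step = [0.0] * 50 + [1.0] * 50                    # clean edge at index 50
noisy = [v + rng.gauss(0.0, 0.2) for v in step]
blurred = gaussian_blur(noisy)

flat = range(5, 40)                               # region far from the edge
mad = lambda s: sum(abs(s[i]) for i in flat) / len(flat)
print(mad(blurred) < mad(noisy))                  # noise is mostly gone
print(blurred[50] - blurred[49] < 0.5)            # ...but the edge is blurry
```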

SLIDE 44

slide by Frédo Durand, MIT

Gaussian blur

  • Noise is mostly gone
  • But image is blurry
  • duh!

After Gaussian blur

SLIDE 45

slide by Frédo Durand, MIT

Gaussian blur

  • Noise is mostly gone
  • But image is blurry
  • duh!
  • Question: how to blur/smooth/abstract image, but without destroying important features?

After Gaussian blur

adapted from slide by Frédo Durand, MIT

SLIDE 46

slide by Frédo Durand, MIT

Bilateral filter

  • [Tomasi and Manduchi 1998]
– http://www.cse.ucsc.edu/~manduchi/Papers/ICCV98.pdf
  • Developed for denoising
  • Related to
– SUSAN filter [Smith and Brady 95] http://citeseer.ist.psu.edu/smith95susan.html
– Digital-TV [Chan, Osher and Chen 2001] http://citeseer.ist.psu.edu/chan01digital.html
– sigma filter http://www.geogr.ku.dk/CHIPS/Manual/f187.htm
  • Full survey: http://people.csail.mit.edu/sparis/publi/2009/fntcgv/Paris_09_Bilateral_filtering.pdf

SLIDE 47

slide by Frédo Durand, MIT

Start with Gaussian filtering

  • Here, input is a step function + noise

J = f ⊗ I   (input I, output J, Gaussian filter f)

SLIDE 48

slide by Frédo Durand, MIT

Gaussian filter as weighted average

  • Weight of ξ depends on distance to x

J(x) = Σ_ξ f(x, ξ) I(ξ)

SLIDE 49

slide by Frédo Durand, MIT

The problem of edges

  • Here, I(ξ) “pollutes” our estimate J(x)
  • It is too different from I(x)

J(x) = Σ_ξ f(x, ξ) I(ξ)

SLIDE 50

slide by Frédo Durand, MIT

Principle of Bilateral filtering

[Tomasi and Manduchi 1998]

  • Penalty g on the intensity difference

J(x) = (1/k(x)) Σ_ξ f(x, ξ) g(I(ξ) − I(x)) I(ξ)

SLIDE 51

slide by Frédo Durand, MIT

Bilateral filtering

[Tomasi and Manduchi 1998]

  • Spatial Gaussian f

J(x) = (1/k(x)) Σ_ξ f(x, ξ) g(I(ξ) − I(x)) I(ξ)

SLIDE 52

slide by Frédo Durand, MIT

Bilateral filtering

[Tomasi and Manduchi 1998]

  • Spatial Gaussian f
  • Gaussian g on the intensity difference

J(x) = (1/k(x)) Σ_ξ f(x, ξ) g(I(ξ) − I(x)) I(ξ)

SLIDE 53

slide by Frédo Durand, MIT

Normalization factor

[Tomasi and Manduchi 1998]

  • k(x) = Σ_ξ f(x, ξ) g(I(ξ) − I(x))

J(x) = (1/k(x)) Σ_ξ f(x, ξ) g(I(ξ) − I(x)) I(ξ)
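The bilateral filter defined on the preceding slides, as a minimal 1D sketch: spatial Gaussian f, range Gaussian g on the intensity difference, per-pixel normalization k(x). Parameter values here are illustrative, not from the slides:

```python
import math
import random

# 1D bilateral filter: J(x) = (1/k(x)) * sum over xi of
# f(x, xi) * g(I[xi] - I[x]) * I[xi], with k(x) the sum of the weights.

def bilateral(I, sigma_s=3.0, sigma_r=0.2):
    r = int(3 * sigma_s)
    J = []
    for x in range(len(I)):
        acc = 0.0
        k = 0.0
        for xi in range(max(0, x - r), min(len(I), x + r + 1)):
            f = math.exp(-((xi - x) ** 2) / (2.0 * sigma_s ** 2))   # spatial
            g = math.exp(-((I[xi] - I[x]) ** 2) / (2.0 * sigma_r ** 2))  # range
            acc += f * g * I[xi]
            k += f * g
        J.append(acc / k)   # per-pixel normalization k(x)
    return J

rng = random.Random(0)
step = [0.0] * 50 + [1.0] * 50
noisy = [v + rng.gauss(0.0, 0.05) for v in step]
out = bilateral(noisy)

flat = range(5, 40)
print(sum(abs(out[i]) for i in flat) < sum(abs(noisy[i]) for i in flat))
print(out[50] - out[49] > 0.8)   # the edge survives, unlike with Gaussian blur
```

Across the step, g kills the weight of cross-edge neighbors (a difference of ~1 with sigma_r = 0.2 gives g ≈ e^−12.5), so each side is averaged only with itself.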

SLIDE 54

slide by Frédo Durand, MIT

Bilateral filtering is non-linear

[Tomasi and Manduchi 1998]

  • The weights are different for each output pixel

SLIDE 55

Cornell CS6640 Fall 2012

Effects of bilateral filter

(image array varying size of domain filter vs. size of range filter)

[Tomasi & Manduchi 1998]

SLIDE 56

slide by Frédo Durand, MIT

Bilateral filter

Noisy input
After Gaussian blur

adapted from slide by Frédo Durand, MIT

SLIDE 57

slide by Frédo Durand, MIT

Bilateral filter

Noisy input
After bilateral filter