

SLIDE 1

Week 7 - Friday

SLIDE 2

 What did we talk about last time?
  • Lighting in MonoGame
  • Cube example
  • Antialiasing

SLIDE 3

SLIDE 4

SLIDE 5

SLIDE 6

SLIDE 7

SLIDE 8

SLIDE 9

 Partially transparent objects significantly increase the difficulty of rendering a scene
 We will talk about really difficult effects like frosted glass or light bending later
 Just rendering transparent objects at all is a huge pain because the Z-buffer doesn't work anymore

 Workarounds:

  • Screen door transparency
  • Sorting
  • Depth peeling
SLIDE 10

 We render an object with a checkerboard pattern of holes in it, leaving whatever is beneath the object showing through
 Problems:
  • It really only works for 50% transparent objects
  • Only one overlapping transparent object really works

 But it is simple and inexpensive
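To make the checkerboard idea concrete, here is a minimal sketch using a toy framebuffer represented as a flat Python list; the names (`screen_door_draw`, the "bg"/"obj" markers) are illustrative, not from any real graphics API:

```python
# Screen-door transparency sketch: draw a "transparent" object by
# writing only the pixels where a checkerboard pattern has a hole,
# leaving the background visible in the other half of the pixels.

def screen_door_draw(framebuffer, obj_color, width, height):
    """Overwrite only checkerboard pixels, simulating 50% transparency."""
    for y in range(height):
        for x in range(width):
            if (x + y) % 2 == 0:          # checkerboard: half the pixels
                framebuffer[y * width + x] = obj_color
    return framebuffer

fb = ["bg"] * 16                           # a 4x4 background
screen_door_draw(fb, "obj", 4, 4)
# Exactly half the pixels now show the object, half the background.
```

Note that the 50% limitation falls straight out of the pattern: the checkerboard fixes the object/background pixel ratio at one half.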

SLIDE 11

 Most transparency methods use the over operator, which combines two colors using the alpha of the one you're putting on top

 co = αscs + (1 - αs)cd

  • cs is the new (source) color
  • cd is the old (destination) color
  • co is the resulting (over) color
  • αs is the opacity (alpha) of the object
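The over operator from the slide can be sketched directly, applied per RGB channel (a toy scalar version, not any real blending API):

```python
# Over operator: c_o = alpha_s * c_s + (1 - alpha_s) * c_d, per channel.
def over(alpha_s, c_s, c_d):
    """Blend source color c_s with opacity alpha_s over destination c_d."""
    return tuple(alpha_s * s + (1 - alpha_s) * d for s, d in zip(c_s, c_d))

# A half-transparent red drawn over an opaque blue background:
blended = over(0.5, (1.0, 0.0, 0.0), (0.0, 0.0, 1.0))   # (0.5, 0.0, 0.5)
```

With αs = 1 the source simply replaces the destination, which is why opaque rendering is a special case of over.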
SLIDE 12

 The over operator is order dependent
 To render correctly we can do the following:
  • Render all the opaque objects
  • Sort the transparent objects by their centroids' distance from the viewer
  • Render the transparent objects in back-to-front order
 To make sure that you don't draw on top of an opaque object, you test against the Z-buffer but don't update it
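The sort-then-composite pass can be sketched for a single pixel, using scalar gray values for brevity (the tuple layout and function names are illustrative):

```python
# Back-to-front compositing: sort transparent surfaces by distance from
# the viewer (farthest first), then apply the over operator in order.

def over(alpha_s, c_s, c_d):
    """Scalar over operator: c_o = alpha_s * c_s + (1 - alpha_s) * c_d."""
    return alpha_s * c_s + (1 - alpha_s) * c_d

def composite_transparents(background, transparents):
    """background: opaque color already in the framebuffer.
    transparents: list of (distance_from_viewer, alpha, gray_value)."""
    color = background
    # Back to front: farthest surfaces are applied first.
    for _, alpha, c in sorted(transparents, reverse=True):
        color = over(alpha, c, color)
    return color

# Two 50%-opaque white layers over a black opaque background:
result = composite_transparents(0.0, [(5.0, 0.5, 1.0), (10.0, 0.5, 1.0)])
# Far layer lifts black to 0.5, near layer lifts that to 0.75.
```

Reversing the draw order would give a different (wrong) answer whenever the layer colors differ, which is exactly the order dependence the slide warns about.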

SLIDE 13

 It is not always possible to sort polygons
  • They can interpenetrate
 Hacks:
  • At the very least, use a Z-buffer test but not replacement
  • Turning off culling can help
  • Or render transparent polygons twice: once for each face
SLIDE 14

 It is possible to use two depth buffers to render transparency correctly
 First render all the opaque objects, updating the first depth buffer
  • Make the second depth buffer maximally close
 On the second (and future) rendering passes, render those fragments that are closer than the z values in the first depth buffer but further than the values in the second depth buffer
  • Update the second depth buffer
 Repeat the process until no pixels are updated
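The peeling loop can be sketched for a single pixel; a real implementation runs whole GPU passes and composites the peeled layers, so treat the names and data layout here as illustrative:

```python
# Per-pixel depth-peeling sketch: each pass keeps the nearest fragment
# that is still farther than the "peel" depth found in the previous
# pass, so the transparent layers come out in front-to-back order.

def peel_layers(fragments, opaque_depth):
    """fragments: list of (depth, color) transparent fragments at one
    pixel; opaque_depth: the first depth buffer's value there."""
    peel = 0.0                        # second depth buffer, maximally close
    layers = []
    while True:
        # Closer than the opaque surface, but farther than the peel depth:
        candidates = [f for f in fragments if peel < f[0] < opaque_depth]
        if not candidates:
            break                     # no pixels updated: done
        nearest = min(candidates)     # nearest surviving fragment wins
        layers.append(nearest)
        peel = nearest[0]             # update the second depth buffer
    return layers

layers = peel_layers([(0.7, "red"), (0.3, "green"), (0.5, "blue")],
                     opaque_depth=0.9)
# Layers come out front to back: green, then blue, then red.
```

Each iteration "peels" exactly one layer per pixel, which is why the number of passes equals the worst-case transparent depth complexity.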

SLIDE 15

[Image: the same scene rendered with depth peeling using 1 layer, 2 layers, 3 layers, and 4 layers]

SLIDE 16

 Alpha values can be used for antialiasing, by lowering the opacity of edges that partially cover pixels
 Additive blending is an alternative to the over operator
  • co = αscs + cd
  • This is only useful for effects like glows where the new color never makes the original darker
  • Unlike transparency, it can be applied in any order
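A small sketch of additive blending, with a clamp added so channels stay in range (the clamp is an assumption of this toy version, not stated on the slide):

```python
# Additive blending: c_o = alpha_s * c_s + c_d, clamped to 1.0 per
# channel. The destination is never darkened, so draw order is free.

def additive_blend(alpha_s, c_s, c_d):
    return tuple(min(1.0, alpha_s * s + d) for s, d in zip(c_s, c_d))

base = (0.1, 0.1, 0.1)
# Two glow sprites blended in either order give the same result:
a = additive_blend(0.5, (0.4, 0.0, 0.0),
                   additive_blend(0.5, (0.0, 0.4, 0.0), base))
b = additive_blend(0.5, (0.0, 0.4, 0.0),
                   additive_blend(0.5, (0.4, 0.0, 0.0), base))
```

The order independence is what makes additive blending attractive for particle effects such as fire and glows, where sorting thousands of sprites would be wasteful.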
SLIDE 17

SLIDE 18

 I don't want to go deeply into gamma
 The trouble is that real light has a wide range of color values that we need to store in some limited range (such as 0 – 255)
 Then, we have to display these values, moving back from the limited range to the "real world" range

SLIDE 19

 Physical computations should be performed in the linear (real) space
 To convert that linear space into nonlinear frame buffer space, we have to raise values by a power, typically 0.45 for PCs and 0.55 for Macs
 Each component of physical color (0.3, 0.5, 0.6) is raised to 0.45, giving (0.582, 0.732, 0.794), then scaled to the 0-255 range, giving (148, 187, 203)
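The slide's numbers can be reproduced in a couple of lines (the function name is just for illustration):

```python
# Gamma-encode a linear color for an 8-bit framebuffer: raise each
# channel to the gamma exponent (0.45 here), then scale to 0-255.

def gamma_encode(linear_rgb, gamma_exp=0.45):
    return tuple(round(255 * c ** gamma_exp) for c in linear_rgb)

encoded = gamma_encode((0.3, 0.5, 0.6))   # (148, 187, 203)
```

Note how gamma encoding spends more of the 0-255 range on dark values, which is where the eye is most sensitive.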

SLIDE 20

 Usually, gamma correction is taken care of for you
 If you are writing something where you need to do computations in the "real life" color space (such as a raytracer), you may have to worry about it
 Calculations in the wrong space can have visually unrealistic effects

SLIDE 21

SLIDE 22

 We've got polygons, but they are all one color
  • At most, we could have different colors at each vertex
 We want to "paint" a picture on the polygon
  • Because the surface is supposed to be colorful
  • To appear as if there is greater complexity than there is (a texture of bricks rather than a complex geometry of bricks)
  • To apply other effects to the surface such as changes in material or normal

SLIDE 23

 We never get tired of pipelines
  • Go from object space to parameter space
  • Go from parameter space to texture space
  • Get the texture value
  • Transform the texture value

 The stages and the functions that connect them: object space → (projector function) → parameter space → (corresponder function) → texture space → (obtain value) → texture value → (value transform function) → transformed value
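The four stages above can be sketched for a single fragment; every function here is an illustrative stand-in (a planar projection, nearest-texel lookup, and a darkening transform), not a real graphics API:

```python
# Texturing-pipeline sketch: one fragment through all four stages.

def projector(obj_point):
    """Object space -> parameter space: a planar projection that
    simply drops the z coordinate."""
    x, y, _ = obj_point
    return (x, y)                      # (u, v) in [0, 1]

def corresponder(uv, width, height):
    """Parameter space -> texture space: map (u, v) to integer texels."""
    u, v = uv
    return (int(u * (width - 1)), int(v * (height - 1)))

def obtain_value(texture, texel):
    """Texture space -> texture value: nearest-texel lookup."""
    x, y = texel
    return texture[y][x]

def value_transform(value):
    """Transform the texture value: here, darken each channel by half."""
    return tuple(c / 2 for c in value)

texture = [[(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)],
           [(0.0, 0.0, 1.0), (1.0, 1.0, 1.0)]]   # a 2x2 RGB texture
uv = projector((1.0, 1.0, 0.3))
texel = corresponder(uv, 2, 2)
color = value_transform(obtain_value(texture, texel))
```

Keeping the stages separate like this mirrors the diagram: each arrow in the pipeline is one small function.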

SLIDE 24

 The projector function goes from the model space (a 3D location on a surface) to a 2D (u,v) coordinate on a texture
 Usually, this is based on a map from the model to the texture, made by an artist
  • Tools exist to help artists "unwrap" the model
  • Different kinds of mapping make this easier
 In other scenarios, a mapping could be determined at run time

SLIDE 25

 From (u,v) coordinates we have to find a corresponding texture pixel (or texel)
 Often this just maps directly from u,v ∈ [0,1] to a pixel in the full width, height range
 But matrix transformations can be applied
 Also, values outside of [0,1] can be given, with different choices of interpretation
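Two common interpretations of out-of-range (u,v) values can be sketched as follows (the function names `repeat` and `clamp` are descriptive labels, not any particular API's names):

```python
# Corresponder choices for u (or v) outside [0, 1].

def repeat(u):
    """Tile the texture: keep only the fractional part of u."""
    return u % 1.0

def clamp(u):
    """Clamp to the texture's edge: stretch the border texels outward."""
    return min(1.0, max(0.0, u))

# repeat(1.25) tiles back to 0.25; clamp(1.25) sticks at the edge, 1.0.
```

Repeat is handy for tiling surfaces like brick walls; clamp avoids the seam that repeat would show when a texture isn't designed to tile.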

SLIDE 26

 Usually the texture value is just an RGB triple (or an RGBα value)
 But, it could be procedurally generated
 It could be a bump mapping or other surface data
 It might need some transformation after retrieval

SLIDE 27

SLIDE 28

SLIDE 29

 Image texturing

  • Magnification and minification
  • Mipmapping
  • Anisotropic filtering

 MonoGame examples

  • Textures in shader code
SLIDE 30

 No class on Monday or Wednesday!
  • Because of October Break and then the debates
 Keep working on Project 2
 Keep working on Assignment 3
 Keep reading Chapter 6