SLIDE 1

CS 488 More Shading and Illumination


Luc RENAMBOT

slide-2
SLIDE 2

Illumination

  • No Lighting
  • Ambient model
  • Light sources
  • Diffuse reflection
  • Specular reflection
  • Model: ambient + specular + diffuse
  • Shading: flat, gouraud, phong, ...

SLIDE 3

Texture Mapping

  • Texture mapping is the process of taking a 2D image and mapping it onto a polygon in the scene
  • This texture acts like a painting, adding 2D detail to the 2D polygon
  • Instead of filling a polygon with a color in the scan conversion process, we fill the pixels of the polygon with the pixels of the texture (texels)

SLIDE 4

Mapping

  • Various spaces are involved: the texture map is a 2D image
  • It is mapped onto a 2D polygon (or set of 2D polygons)
  • The texture, the polygon(s), and the screen all have their own coordinate systems:
  • Texture in (u,w) coordinates: u = j(s,t), w = k(s,t)
  • Polygon in (s,t) coordinates: s = f(u,w), t = g(u,w)
  • Polygon in (x,y,z) coordinates
  • Screen in (x,y) coordinates

SLIDE 5

Texture Coordinates

SLIDE 6

Mapping

  • What we want are linear equations of the form:
  • s = A * u + B
  • t = C * w + D
  • which make s and t functions of the texture space
  • By mapping the four corners of the texture space to the four corners of the object we get the values for A, B, C, and D in these equations
  • The inverse of these equations gives the mapping from object space to texture space
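The corner-matching step above can be sketched in code. This is an illustrative helper, not code from the slides: `corner_mapping` and `inverse_mapping` are made-up names, and the corners are assumed to be axis-aligned ranges so each axis maps independently.

```python
# Sketch (illustrative, not from the slides): derive A, B, C, D for the
# linear maps s = A*u + B and t = C*w + D by matching the texture-space
# corner range (u0..u1, w0..w1) to the object-space range (s0..s1, t0..t1).

def corner_mapping(u0, u1, w0, w1, s0, s1, t0, t1):
    """Return (A, B, C, D) so that s = A*u + B and t = C*w + D."""
    A = (s1 - s0) / (u1 - u0)
    B = s0 - A * u0
    C = (t1 - t0) / (w1 - w0)
    D = t0 - C * w0
    return A, B, C, D

def inverse_mapping(A, B, C, D, s, t):
    """Invert the maps: object space (s, t) back to texture space (u, w)."""
    return (s - B) / A, (t - D) / C
```

For example, mapping the unit texture square onto an object spanning s in [2, 4] and t in [0, 8] gives A = 2, B = 2, C = 8, D = 0.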

SLIDE 7

Mapping

  • When doing the scan conversion of the polygon onto the screen, the pixels at the corners of the polygon are mapped onto the corners of the texture
  • Each pixel (in screen space) can now be related to one or more texels (in texture space)
  • This allows the pixel value to be determined by averaging one or more texel values

SLIDE 8

Algorithms

  • Several algorithms, including:
  • Catmull: continue to subdivide the object until the subdivided component is within a single pixel. The object decides what the pixel is going to be - this can cause problems
  • Blinn & Newell: maps each pixel from screen space to object space to texture space

SLIDE 9

Examples

SLIDE 10

Borders

  • Textures can usually be defined to either repeat or clamp at the edges, which determines what happens if the texture is not 'big enough' to cover the object (that is, if the pixel coordinates transformed into (u,w) coordinates fall outside the space occupied by the texture)
  • If the texture repeats, then the same texture pattern repeats itself over and over again on the polygon (useful for wood floors, brick walls, or stucco walls), so a very small texture can cover a very large space; otherwise the texture can be told to clamp at the edges
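The two border modes can be sketched as one-liners; `repeat` and `clamp` are illustrative names, assuming texture coordinates normalized to [0, 1].

```python
# Sketch of the two border modes: given a texture coordinate u that may
# fall outside [0, 1], "repeat" wraps it while "clamp" pins it to the edge.

def repeat(u):
    return u % 1.0                    # wraps: 1.25 -> 0.25, -0.25 -> 0.75

def clamp(u):
    return min(max(u, 0.0), 1.0)      # pins: 1.25 -> 1.0, -0.25 -> 0.0
```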

SLIDE 11

Clamp and Repeat

SLIDE 12

Example

SLIDE 13

Shadows

  • The lighting algorithms discussed last time worked on each object separately
  • Objects were not able to affect the illumination of other objects. This is not terribly realistic. In the 'real world' objects can cast shadows on other objects.
  • We have used visible surface algorithms to determine which polygonal surfaces are visible to the viewer. We can use similar algorithms to determine which surfaces are 'visible' to a light source - and are therefore lit
  • Ambient light will still affect all polygons in the scene, but the diffuse and specular components will depend on whether the polygon is visible to the light

SLIDE 14

Transparency

  • We have assumed that objects are all opaque, but many objects in the 'real world' are transparent or translucent
  • These surfaces also tend to refract the light coming through them
  • Dealing with refraction is quite difficult, while transparency is relatively easy

SLIDE 15

Interpolated Transparency

  • Kt1 is the transparency of (nearer) object 1 (0 - totally opaque, 1 - totally transparent)
  • If Kt1 is 0, the nearer object is totally opaque and the far object contributes nothing where they overlap
  • If Kt1 is 1, the nearer object is totally transparent and the near object contributes nothing where they overlap
  • Between 0 and 1, the intensity is the interpolation of the intensities of the two objects
  • Each pixel is linearly interpolated


Iλ = (1 − Kt1) Iλ1 + Kt1 Iλ2
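Applied per colour channel, the blend above looks like this; a minimal sketch, where the function name and the RGB-tuple representation are assumptions.

```python
# Interpolated transparency from the slide: Iλ = (1 - Kt1)·Iλ1 + Kt1·Iλ2,
# applied per colour channel. Colours are (R, G, B) tuples in [0, 1].

def interpolated_transparency(kt1, near_rgb, far_rgb):
    """Blend near and far colours; kt1=0 is opaque, kt1=1 fully transparent."""
    return tuple((1.0 - kt1) * n + kt1 * f for n, f in zip(near_rgb, far_rgb))
```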

SLIDE 16

Screen-door transparency

  • This is the same idea as interpolated transparency, except that each individual pixel is given either the value of the near object or the value of the far object
  • The ratio of pixels given to the far versus the near is Kt1. This is faster to compute but gives a much less pleasing effect.
  • Basically it is using dithering to generate transparency
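A minimal sketch of the idea, assuming a 2x2 ordered-dither mask; the mask values and function name are illustrative, not from the slides.

```python
# Screen-door sketch: each pixel takes either the near or the far colour;
# a fixed dither mask makes roughly a fraction Kt1 of pixels show the far
# object (illustrative 2x2 thresholds, not from the slides).

def screen_door(kt1, near_rgb, far_rgb, x, y):
    """Pick near or far colour per pixel using a 2x2 dither mask."""
    mask = [[0.125, 0.625],
            [0.875, 0.375]]          # thresholds for a 2x2 dither cell
    return far_rgb if kt1 > mask[y % 2][x % 2] else near_rgb
```

With Kt1 = 0.5, exactly half the pixels in each 2x2 cell show the far object.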

SLIDE 17

Filtered Transparency

  • Otλ is the transparency color of (nearer) object 1
  • In all of these cases, the value of Iλ2 may itself be the result of a transparency calculation

Iλ = Iλ1 + Kt1 Otλ Iλ2
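Channel by channel, the filtered-transparency equation can be sketched as follows; the function name and RGB tuples are assumptions.

```python
# Filtered transparency from the slide: Iλ = Iλ1 + Kt1·Otλ·Iλ2, where Otλ
# tints the far colour by the near object's transparency colour per channel.

def filtered_transparency(kt1, near_rgb, ot_rgb, far_rgb):
    """Add the far colour, filtered through the near object's tint."""
    return tuple(n + kt1 * o * f for n, o, f in zip(near_rgb, ot_rgb, far_rgb))
```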

SLIDE 18

Transparency

  • Screen-door transparency is easy to implement along with the z-buffer, since the order in which polygons are drawn does not affect screen-door transparency
  • For the others, the order of drawing is important: one of the advantages of using a z-buffer is that the order in which the polygons are drawn became irrelevant. Here it is again necessary to draw the polygons back to front so that transparency can be correctly calculated
  • Solution: draw all opaque polygons first, then sort and draw the transparent ones

SLIDE 19

Raytracing

  • Traces the paths of reflected and transmitted rays through an environment
  • Recursive processing
  • A ray is traced for each pixel from the viewer's eye into the scene
  • The ray is infinitely thin
  • The ray is infinitely long
  • Surfaces are perfectly smooth

SLIDE 20

Features

  • Hidden surfaces
  • Shadows
  • Reflection
  • Refraction
  • Global specular interaction
  • Orthographic and perspective views

SLIDE 21

Raytracing

  • The power of this kind of system is that instead of just having one ray (as in visible surface determination, or shadows), that one ray can generate other rays which continue through the scene
  • Recursive algorithm

SLIDE 22

Raytracing

  • Given:
  • Polygon vertices
  • V - input vector
  • Light sources
  • Want to find:
  • R' - reflected vector
  • P' - transmitted vector

SLIDE 23

Raytracing

  • Reflected
  • R' = 2N' + V', where V' = V / |V · N|
  • Transmitted
  • P' = Kp * (N' + V') - N'
  • where Kp determines the amount of refraction
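The reflected-ray formula can be checked with a small sketch, assuming N' is the unit surface normal and V points from the eye toward the surface; the function names are illustrative.

```python
# Sketch of the slide's reflected-ray formula (Whitted's formulation),
# assuming N is the unit surface normal and V points toward the surface.
# V' = V / |V·N| scales V so that V'·N = -1, then R' = 2N + V'.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def reflect(V, N):
    s = abs(dot(V, N))
    Vp = tuple(v / s for v in V)              # V' = V / |V·N|
    return tuple(2 * n + vp for n, vp in zip(N, Vp))
```

For a ray heading down-right onto a floor with normal (0, 1, 0), the result is the expected mirror bounce up-right.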

SLIDE 24

Illumination

  • For light Li:
  • I = Ka Ia + Σ over all lights ( Kd Ip (N' · L') + Ks Ip (R' · V')^n ) + Kr Ir + Kt It
  • Most of this we talked about last week, but now there are two new terms
  • Kr Ir deals with the reflected light
  • Kt It deals with the transmitted light
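A scalar sketch of the equation above for one wavelength; the function signature and per-light tuple layout are assumptions, and negative dot products are clamped to zero as is conventional.

```python
# Sketch of the slide's illumination equation with the two new recursive
# terms: I = Ka·Ia + Σ_lights (Kd·Ip·(N'·L') + Ks·Ip·(R'·V')^n) + Kr·Ir + Kt·It.
# All quantities here are scalar intensities for a single wavelength.

def illumination(ka, ia, lights, kr, ir, kt, it):
    """lights: list of (kd, ks, ip, n_dot_l, r_dot_v, shininess) tuples."""
    total = ka * ia
    for kd, ks, ip, n_dot_l, r_dot_v, shininess in lights:
        total += kd * ip * max(n_dot_l, 0.0)
        total += ks * ip * max(r_dot_v, 0.0) ** shininess
    return total + kr * ir + kt * it
```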

SLIDE 25

Algorithm

  • So if we follow V from the eye through a given pixel on the screen and into the scene, we can see its interaction as shown here:

SLIDE 26

shade(object, ray, point, normal, depth)
{
    color = ambient term
    for (each light) {
        sRay = ray from light to point
        if (dot product of normal and direction to light is positive) {
            compute how much light is blocked by opaque and transparent surfaces
            scale diffuse and specular terms before adding them to color
        }
    }
    if (depth < maxDepth) {
        if (object is reflective) {
            rRay = ray in reflection direction from point
            rColor = trace(rRay, depth+1)
            scale rColor by specular coefficient and add to color
        }
        if (object is transparent) {
            tRay = ray in refraction direction from point
            if (total internal reflection does not occur) {
                tColor = trace(tRay, depth+1)
                scale tColor by transmission coefficient and add to color
            }
        }
    }
    shade = color
}

SLIDE 27

//-------------------------------------------------------------------
trace(ray, depth)
{
    determine closest intersection of the ray with an object
    if (object is hit by ray) {
        compute normal at intersection
        return(shade(closest object hit, ray, intersection, normal, depth))
    } else
        return(BACKGROUND_VALUE)
}
//-------------------------------------------------------------------
main()
{
    for each scan line in the image
        for each pixel in the scan line {
            determine ray from center of projection through that pixel
            pixel = trace(ray, 1)
        }
}
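For concreteness, here is a minimal runnable version of the trace()/shade() structure above for a scene of spheres, with ambient + diffuse shading plus one recursive reflection term. The scene, constants, and vector helpers are all illustrative, not from the slides.

```python
# Minimal recursive ray tracer following the trace()/shade() pseudocode:
# nearest sphere hit, ambient + diffuse shading, one reflection bounce.
import math

BACKGROUND, MAX_DEPTH = (0.0, 0.0, 0.0), 3
LIGHT_DIR = (0.0, 0.0, -1.0)   # unit direction from surface toward the light

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def add(a, b): return tuple(x + y for x, y in zip(a, b))
def scale(a, s): return tuple(x * s for x in a)
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def norm(a): return scale(a, 1.0 / math.sqrt(dot(a, a)))

def hit_sphere(center, radius, origin, d):
    """Return the nearest positive ray parameter t, or None (d is unit)."""
    oc = sub(origin, center)
    b = 2.0 * dot(oc, d)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-6 else None

def trace(origin, d, depth, scene):
    best = None
    for center, radius, color, kr in scene:
        t = hit_sphere(center, radius, origin, d)
        if t is not None and (best is None or t < best[0]):
            best = (t, center, color, kr)
    if best is None:
        return BACKGROUND
    t, center, color, kr = best
    point = add(origin, scale(d, t))
    normal = norm(sub(point, center))
    diffuse = max(dot(normal, LIGHT_DIR), 0.0)
    out = scale(color, 0.1 + 0.9 * diffuse)          # ambient + diffuse
    if kr > 0.0 and depth < MAX_DEPTH:               # recursive reflection
        r = sub(d, scale(normal, 2.0 * dot(d, normal)))
        out = add(out, scale(trace(point, r, depth + 1, scene), kr))
    return out

# One red, slightly reflective sphere in front of the eye at the origin.
scene = [((0.0, 0.0, 5.0), 1.0, (1.0, 0.0, 0.0), 0.3)]
pixel = trace((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), 1, scene)
```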

SLIDE 28

Examples

SLIDE 29

Radiosity

  • Radiosity is a method of trying to simulate lighting effects using much more realistic models than were used previously: lighting simulation
  • Closed environment, so light energy is conserved: all of the light energy is accounted for
  • No need for an ambient light term anymore, as what the ambient term simulated will now be specifically computed

SLIDE 30

Radiosity

  • Radiosity is the rate at which energy leaves a surface (via emittance, reflectance, or transmittance)
  • Light interactions are computed first for the entire scene, without regard for the viewpoint: diffuse reflection
  • Generate images from any viewpoint

SLIDE 31

Radiosity

  • Light sources are not treated as separate from the objects in the scene
  • All objects emit light, which can give more realistic effects when areas are giving off light rather than several discrete sources
  • Space is divided into n discrete finite-sized patches which emit and reflect light uniformly over their area

SLIDE 32

Radiosity Equation

  • Bi - radiosity of patch i
  • Bj - radiosity of patch j
  • Ei - rate at which light is emitted from patch i
  • ρi - reflectivity of patch i
  • Fj-i - form factor (configuration factor): fraction of energy that leaves j and arrives at i
  • Ai - area of patch i
  • Aj - area of patch j

Bi = Ei + ρi * Σ (j=1..n) Bj Fj-i (Aj / Ai)

SLIDE 33

Radiosity

  • So the radiosity of a unit area is the sum of the emitted light + the reflected incident light
  • Note that the summation includes patch i - that is, an object can reflect light onto itself

SLIDE 34

Radiosity System

  • Rewrite the previous equation
  • End up with a set of simultaneous equations to solve, one for each object in the scene: a linear system to solve
  • Since these equations have nothing to do with the user's viewpoint, there is no need to recompute if the user moves through the scene

Bi − ρi * Σ (j=1..n) Fi-j Bj = Ei
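For concreteness, the rewritten equation is the linear system (I − diag(ρ)·F)·B = E. The sketch below solves it with plain Gaussian elimination on a made-up two-patch scene; all numbers and the function name are illustrative.

```python
# Sketch: Bi − ρi·Σj Fij·Bj = Ei is the linear system (I − diag(ρ)·F)·B = E.
# Solved here with plain Gaussian elimination and partial pivoting.

def solve_radiosity(rho, F, E):
    """Solve (I - diag(rho)·F)·B = E for the patch radiosities B."""
    n = len(E)
    # Build the augmented matrix [I - diag(rho)·F | E].
    M = [[(1.0 if i == j else 0.0) - rho[i] * F[i][j] for j in range(n)] + [E[i]]
         for i in range(n)]
    for col in range(n):                          # forward elimination
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    B = [0.0] * n                                 # back substitution
    for i in reversed(range(n)):
        B[i] = (M[i][n] - sum(M[i][j] * B[j] for j in range(i + 1, n))) / M[i][i]
    return B
```

Two facing patches that each see only the other (F = [[0,1],[1,0]], ρ = 0.5) with patch 0 emitting E0 = 1 give B0 = 4/3 and B1 = 2/3: patch 1 glows purely from reflected light.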

SLIDE 35

SLIDE 36

Example

SLIDE 37

Example

SLIDE 38

Special Topics (web)

  • Stereopsis
  • Virtual Reality
  • Scientific Visualization
  • Medical Visualization
  • New devices
  • GeoWall, Paris, Tiled display

SLIDE 39

SLIDE 40

The END

SLIDE 41

Next

  • CS 588 - Comp. Graphics II: current topics in computer graphics - generally with students presenting papers from the most recent couple of years of SIGGRAPH publications, and creating projects based around these ideas
  • CS 522 - Human-Computer Interaction
  • CS 523 - Multi-Media Systems
  • CS 527 - Computer Animation
  • CS 528 - Virtual Reality
