texture mapping
SLIDE 1

texture mapping

SLIDE 2

why texture mapping?

  • objects have spatially varying details

represent as geometry: correct, but very expensive

SLIDE 3

why texture mapping?

use simple geometry
store varying properties in images
map them to objects

SLIDE 4

why texture mapping?

produces compelling results

[Jeremy Birn]

SLIDE 5

why texture mapping?

easily change object appearance

[Praun et al., 2001]

SLIDE 6

mapping function

surfaces are 2D domains; determine a function that maps them to images

SLIDE 7

mapping functions - projections

maps 3D surface points to 2D image coordinates; different types of projections

  • often corresponding to simple shapes

useful for simple objects

[Wolfe / SG97 Slide set]

f : ℝ³ → [0, 1]²

SLIDE 8

projections - planar

SLIDE 9

projections - cubical

SLIDE 10

projections - cylindrical

SLIDE 11

projections - spherical

SLIDE 12

projections

planar projection along the xy plane of size (w, h): f(p) = (px/w, py/h)
use an affine transform to orient the plane differently
spherical projection of the unit sphere: consider the point in spherical coordinates, f(p) = (θ, φ)
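The planar and spherical mapping functions above can be sketched in Python (function names are mine, not from the slides; the spherical remap to [0, 1]² uses one common convention):

```python
import math

def planar_project(p, w, h):
    # Planar projection onto an xy plane of size (w, h):
    # f(p) = (px / w, py / h).
    px, py, pz = p
    return (px / w, py / h)

def spherical_project(p):
    # Point on the unit sphere in spherical coordinates (theta, phi),
    # remapped so both image coordinates land in [0, 1].
    px, py, pz = p
    theta = math.acos(max(-1.0, min(1.0, pz)))   # polar angle in [0, pi]
    phi = math.atan2(py, px) % (2.0 * math.pi)   # azimuth in [0, 2*pi)
    return (phi / (2.0 * math.pi), theta / math.pi)
```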

SLIDE 13

projections

cylindrical projection of a unit cylinder of height h: consider the point in cylindrical coordinates, f(p) = (θ, py/h)
treat the caps separately
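A minimal sketch of the cylindrical case, assuming the cylinder axis runs along y (my function name, not from the slides):

```python
import math

def cylindrical_project(p, h):
    # Unit cylinder of height h, axis along y: f(p) = (theta, py / h),
    # with theta remapped to [0, 1]. Cap handling is omitted; the slide
    # notes the caps are treated separately (e.g. with planar maps).
    px, py, pz = p
    theta = math.atan2(pz, px) % (2.0 * math.pi)
    return (theta / (2.0 * math.pi), py / h)
```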

SLIDE 14

looking up texture values

normal: do not repeat the texture; clamp image coordinates to [0, 1], then look up
tiled: repeat the texture multiple times; take the mod of the image coordinates, then look up
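The two lookup modes can be sketched as (list-of-lists texture, nearest-texel lookup; names are mine):

```python
def lookup(texture, u, v, tiled=False):
    # texture: 2D list of texels indexed [row][col].
    # normal mode: clamp coordinates to [0, 1]; tiled mode: take them mod 1.
    if tiled:
        u, v = u % 1.0, v % 1.0
    else:
        u = min(max(u, 0.0), 1.0)
        v = min(max(v, 0.0), 1.0)
    rows, cols = len(texture), len(texture[0])
    i = min(int(v * rows), rows - 1)
    j = min(int(u * cols), cols - 1)
    return texture[i][j]
```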

SLIDE 15

texture mapping artifacts

tiling textures might introduce seams: discontinuities in the mapping function
change textures to be "tileable" when possible

SLIDE 16

texture mapping artifacts

mapping textures will introduce distortions: unavoidable artifacts from local scale and rotation differences

SLIDE 17

mapping function - explicit coordinates

store texture coordinates on control points
interpolate as any other parameter, following the interpolation rule defined by the surface type
parametric surfaces: can use the parameters directly
known as UV mapping

SLIDE 18

uv mapping vs. projection

parameterization projection

SLIDE 19

uv mapping subdivision surfaces

level 0 level 1 level 2

SLIDE 20

uv mapping parametric surfaces

[Wolfe / SG97 Slide set]

SLIDE 21

uv mapping polygon meshes

SLIDE 22

uv mapping polygon meshes - "pelting"

[Piponi et al., 2000]

SLIDE 23

uv mapping polygon meshes - "atlas"

break up the model into pieces packed into a single texture [(c) Discreet]

SLIDE 24

interpolating uv coordinates

pay attention when rasterizing triangles
for raytracing, just use barycentric coordinates

[figure: texture, linear interp. vs. persp. interp.]

[MIT OpenCourseware]
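Why linear interpolation fails under projection: interpolating uv directly in screen space ignores depth. A sketch of the standard perspective-correct fix along one edge (interpolate uv/w and 1/w, then divide; names are mine):

```python
def perspective_interp(uv0, uv1, w0, w1, t):
    # Perspective-correct interpolation between two projected vertices.
    # w0, w1 are the clip-space w values of the endpoints; t in [0, 1]
    # is the screen-space interpolation parameter.
    inv_w = (1 - t) / w0 + t / w1
    u = ((1 - t) * uv0[0] / w0 + t * uv1[0] / w1) / inv_w
    v = ((1 - t) * uv0[1] / w0 + t * uv1[1] / w1) / inv_w
    return (u, v)
```

At the screen-space midpoint of an edge whose far end is three times deeper, the correct uv is 0.25, not the 0.5 a linear interpolation would give.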

SLIDE 25

painting textures on models

if painting is required, paint directly on surfaces
the system determines the inverse mapping to update the image
seams/distortions are present, but the user does not notice them

SLIDE 26

texture magnification

linearly interpolate the closest pixels in the texture

[figure: texture vs. rendered image] [MIT OpenCourseware]
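The bilinear interpolation described above, sketched for a list-of-lists texture (my function name):

```python
def bilinear_sample(texture, u, v):
    # Bilinear magnification: blend the four closest texels.
    rows, cols = len(texture), len(texture[0])
    x = u * (cols - 1)
    y = v * (rows - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, cols - 1), min(y0 + 1, rows - 1)
    fx, fy = x - x0, y - y0
    top = (1 - fx) * texture[y0][x0] + fx * texture[y0][x1]
    bot = (1 - fx) * texture[y1][x0] + fx * texture[y1][x1]
    return (1 - fy) * top + fy * bot
```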

SLIDE 27

texture minification

compute the average of the texture pixels projected onto each view pixel

[figure: texture vs. rendered image] [MIT OpenCourseware]

SLIDE 28

texture minification

remember: point sampling introduces artifacts
need the average of the texture below a pixel

[MIT OpenCourseware]

SLIDE 29

mip-mapping

approximate algorithm for computing filtered texture values
store textures at different resolutions
look up the appropriate image based on its projected size

[MIT OpenCourseware]
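A minimal sketch of the two mip-mapping steps, assuming a square power-of-two texture (function names are mine):

```python
import math

def build_mipmaps(texture):
    # Build a pyramid by repeatedly averaging 2x2 texel blocks.
    # Assumes a square, power-of-two, single-channel texture.
    levels = [texture]
    while len(levels[-1]) > 1:
        prev = levels[-1]
        n = len(prev) // 2
        levels.append([[(prev[2*i][2*j] + prev[2*i][2*j+1] +
                         prev[2*i+1][2*j] + prev[2*i+1][2*j+1]) / 4.0
                        for j in range(n)] for i in range(n)])
    return levels

def select_level(levels, texels_per_pixel):
    # Pick the level whose texel size best matches the pixel footprint:
    # level 0 when one texel covers one pixel, coarser as more texels
    # project into a single pixel.
    level = max(0, round(math.log2(max(texels_per_pixel, 1.0))))
    return min(level, len(levels) - 1)
```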

SLIDE 30

3d solid texturing

define a 3D field of values, indexed by the 3D point P
in-memory array: too much memory
procedurally: hard to define

  • often add noise-like details on top of 2D images

[Wolfe / SG97 Slide set]

SLIDE 31

ptex

uses no explicit UV assignment; instead, each quad has its own texture
works with Catmull-Clark subdivision
allows artists to get the detail they need where they need it
well-defined filtering

[(c) Disney]

SLIDE 32

ptex

212k faces, 4x4 texels/face auto-sized, 1 ptex file, 3.4m texels [(c) Disney]

SLIDE 33

types of mapping

SLIDE 34

texture mapping material parameters

diffuse coefficient

SLIDE 35

texture mapping material parameters

specular coefficient

SLIDE 36

displacement mapping

variations of surface positions, and thus normals
requires fine tessellation of the object geometry

SLIDE 37

displacement mapping

update the position by displacing points along the normal
recompute normals by evaluating derivatives
no closed-form solution: do it numerically

P′ = P + h N
N′ = ∂P′/∂u × ∂P′/∂v
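The numerical normal computation can be sketched with central differences; P, N, and h are callables here, a hypothetical interface I chose for illustration:

```python
def displaced_normal(P, N, h, u, v, eps=1e-4):
    # Normal of the displaced surface P'(u, v) = P(u, v) + h(u, v) N(u, v),
    # computed numerically as N' = dP'/du x dP'/dv via central differences.
    def Pd(uu, vv):
        p, n = P(uu, vv), N(uu, vv)
        s = h(uu, vv)
        return [p[k] + s * n[k] for k in range(3)]

    du = [(a - b) / (2 * eps) for a, b in zip(Pd(u + eps, v), Pd(u - eps, v))]
    dv = [(a - b) / (2 * eps) for a, b in zip(Pd(u, v + eps), Pd(u, v - eps))]
    # cross product du x dv (unnormalized)
    return [du[1] * dv[2] - du[2] * dv[1],
            du[2] * dv[0] - du[0] * dv[2],
            du[0] * dv[1] - du[1] * dv[0]]
```

With zero displacement on a flat patch, this recovers the original normal.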

SLIDE 38

bump mapping

variation of surface normals
apply a normal perturbation without updating positions

SLIDE 39

bump mapping

simple example: bump mapping a plane

P′(u, v) = P(u, v) + h(u, v) z = u x + v y + h(u, v) z
N′ = ∂P′/∂u × ∂P′/∂v = (x + (∂h/∂u) z) × (y + (∂h/∂v) z) = z − (∂h/∂u) x − (∂h/∂v) y
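A minimal check of this result in code, taking the height-field derivatives numerically (my function name; the returned normal is unnormalized, matching the formula):

```python
def bump_normal_plane(h, u, v, eps=1e-4):
    # For the plane P(u, v) = u x + v y with height field h(u, v),
    # the perturbed normal is N' = z - (dh/du) x - (dh/dv) y.
    dh_du = (h(u + eps, v) - h(u - eps, v)) / (2 * eps)
    dh_dv = (h(u, v + eps) - h(u, v - eps)) / (2 * eps)
    return [-dh_du, -dh_dv, 1.0]
```

For the tilted height field h(u, v) = u, the normal leans back along −x, as expected.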

SLIDE 40

bump vs. displacement mapping

bump map displacement map

SLIDE 41

bump vs. displacement mapping

bump map displacement map

SLIDE 42

combining map types

combine multiple maps to achieve realistic effects

SLIDE 43

lighting effects using texture mapping

SLIDE 44

shadow mapping

the graphics pipeline does not allow shadow queries
we can use texturing and a multipass algorithm
project a color texture; "project" a depth texture [NVIDIA/Everitt et al.]

SLIDE 45

shadow mapping algorithm

pass 1: render the scene from the light view; copy the depth buffer into a new texture
pass 2: render the scene from the camera view; transform each pixel to light space; compare its depth to the shadow-buffer value
if the pixel's light-space depth is greater than the stored depth, the pixel is in shadow
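The pass-2 comparison can be sketched as follows (my function name; the caller is assumed to have transformed the pixel into light-space uv and depth, and the comparison direction assumes larger depth = farther from the light):

```python
def shadow_test(depth_buffer, light_uv, light_depth, bias=1e-3):
    # Look up the stored light-view depth for this pixel and compare.
    # The bias term is the "epsilon" that keeps surfaces from
    # shadowing themselves.
    u, v = light_uv
    rows, cols = len(depth_buffer), len(depth_buffer[0])
    i = min(int(v * rows), rows - 1)
    j = min(int(u * cols), cols - 1)
    return light_depth > depth_buffer[i][j] + bias  # True => in shadow
```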

SLIDE 46

shadow mapping algorithm

camera view light view shadow buffer [NVIDIA/Everitt et al.]

SLIDE 47

shadow mapping algorithm

camera view light distance projected shadow buffer [NVIDIA/Everitt et al.]

SLIDE 48

shadow mapping limitations

not enough resolution: blocky shadows
pixels in the shadow buffer are too large when projected

[Fernando et al., 2002]

biasing: surfaces shadow themselves
remember the epsilon in raytracing
made much worse by the resolution limitation

SLIDE 49

environment mapping

the graphics pipeline does not allow reflections
we can use texturing and a multipass algorithm [Wolfe / SG97 Slide set]

SLIDE 50

environment mapping algorithm

pass 1: render the scene 6 times from the object center; store the images onto a cube
pass 2: render the scene from the camera view; use the cube projection to look up values
a variation of this also works for refraction
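The pass-2 cube lookup can be sketched as follows (my function and face names; the per-face uv orientation is one possible convention, real APIs fix their own):

```python
def cube_lookup(cube, d):
    # Look up a reflection direction d in a cube map: pick the face with
    # the largest |component|, project onto that face, then fetch a texel.
    # cube maps face names '+x','-x','+y','-y','+z','-z' to 2D texel grids.
    x, y, z = d
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        face, u, v, m = ('+x' if x > 0 else '-x'), (-z if x > 0 else z), y, ax
    elif ay >= az:
        face, u, v, m = ('+y' if y > 0 else '-y'), x, (-z if y > 0 else z), ay
    else:
        face, u, v, m = ('+z' if z > 0 else '-z'), (x if z > 0 else -x), y, az
    # remap face coordinates from [-1, 1] to [0, 1]
    u, v = (u / m + 1) / 2, (v / m + 1) / 2
    grid = cube[face]
    rows, cols = len(grid), len(grid[0])
    i = min(int(v * rows), rows - 1)
    j = min(int(u * cols), cols - 1)
    return grid[i][j]
```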

SLIDE 51

environment map limitations

incorrect reflections

  • objects in incorrect positions: better for distant objects

"rays" go through objects
inefficient: need one map for each object

SLIDE 52

light effects take home message

the pipeline is not well suited to lighting computations
algorithms are complex to implement and not robust: lots of tricks and special cases, but fast
interactive graphics: use pipeline algorithms
high-quality graphics: use the pipeline for the view, raytracing for lighting
