Week 9 - Friday


SLIDE 1

Week 9 - Friday

SLIDE 2

• What did we talk about last time?
  • Bump mapping
  • Radiometry
  • Photometry
  • Colorimetry
  • Lighting with shader code
    • Ambient
    • Directional (diffuse and specular)
SLIDE 3

SLIDE 4

SLIDE 5

• Adding a specular component to the diffuse shader requires incorporating the view vector
• The view vector will be included in the shader file and set as a parameter in the C# code

SLIDE 6

• The camera location is added to the declarations
• As are specular colors and a shininess parameter

float4x4 World;
float4x4 View;
float4x4 Projection;
float4x4 WorldInverseTranspose;
float3 Camera;
static const float PI = 3.14159265f;
float4 AmbientColor = float4(1, 1, 1, 1);
float AmbientIntensity = 0.1;
float3 DiffuseLightDirection;
float4 DiffuseColor = float4(1, 1, 1, 1);
float DiffuseIntensity = 0.7;
float Shininess = 20;
float4 SpecularColor = float4(1, 1, 1, 1);
float SpecularIntensity = 0.5;

SLIDE 7

• The output adds a normal so that the reflection vector can be computed in the pixel shader
• A world position lets us compute the view vector to the camera

struct VertexShaderInput
{
    float4 Position : POSITION0;
    float3 Normal : NORMAL0;
};

struct VertexShaderOutput
{
    float4 Position : POSITION0;
    float4 Color : COLOR0;
    float3 Normal : NORMAL0;
    float4 WorldPosition : POSITIONT;
};

SLIDE 8

• The same computations as the diffuse shader, but we store the normal and the transformed world position in the output

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    float4 worldPosition = mul(input.Position, World);
    output.WorldPosition = worldPosition;
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);
    float3 normal = normalize(mul(input.Normal, (float3x3)WorldInverseTranspose));
    float lightIntensity = dot(normal, normalize(DiffuseLightDirection));
    output.Color = saturate(DiffuseColor * DiffuseIntensity * lightIntensity);
    output.Normal = normal;
    return output;
}

SLIDE 9

• Here we finally have a real computation because we need to use the pixel normal (which is averaged from vertices) in combination with the view vector
• The technique is the same

float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
    float3 light = normalize(DiffuseLightDirection);
    float3 normal = normalize(input.Normal);
    float3 reflect = normalize(2 * dot(light, normal) * normal - light);
    float3 view = normalize(Camera - (float3)input.WorldPosition);
    float dotProduct = dot(reflect, view);
    float4 specular = (8 + Shininess) / (8 * PI) * SpecularIntensity *
        SpecularColor * pow(saturate(dotProduct), Shininess);
    return saturate(input.Color + AmbientColor * AmbientIntensity + specular);
}
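The (8 + Shininess) / (8π) factor is the normalization for the Phong specular lobe, keeping the total reflected energy roughly constant as Shininess grows. The same reflect-and-power arithmetic can be sketched outside HLSL; this Python version is only an illustration (the helper names are mine, not part of the shader):

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def phong_specular(light, normal, view, shininess, intensity):
    """Normalized Phong specular term, mirroring the pixel shader math."""
    l = normalize(light)
    n = normalize(normal)
    # reflect = 2 (l . n) n - l, exactly as in the shader
    r = normalize(tuple(2 * dot(l, n) * nc - lc for nc, lc in zip(n, l)))
    d = max(dot(r, normalize(view)), 0.0)  # saturate
    norm_factor = (8 + shininess) / (8 * math.pi)
    return norm_factor * intensity * d ** shininess

# Light straight down the normal, viewer along the reflection direction:
s = phong_specular((0, 0, 1), (0, 0, 1), (0, 0, 1), 20, 0.5)
```

With the viewer exactly on the reflection vector, only the normalization factor and intensity remain; off-axis viewers fall off as the Shininess power of the dot product.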

SLIDE 10

SLIDE 11

• Point lights model omni lights at a specific position
  • They generally attenuate (get dimmer) over a distance and have a maximum range
  • DirectX has a constant attenuation, a linear attenuation, and a quadratic attenuation
  • You can choose attenuation levels through shaders
• They are more computationally expensive than directional lights because a light vector has to be computed for every pixel
• It is possible to implement point lights in a deferred shader, lighting only those pixels that actually get used
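The classic Direct3D-style falloff combines those three attenuation terms by dividing by a polynomial in the distance. A minimal sketch of that formula (parameter names are illustrative, not from the slides' code):

```python
def attenuation(distance, constant=1.0, linear=0.0, quadratic=0.0):
    """Classic Direct3D-style attenuation: 1 / (kc + kl*d + kq*d^2)."""
    return 1.0 / (constant + linear * distance + quadratic * distance ** 2)

# Purely quadratic falloff: doubling the distance quarters the intensity.
near = attenuation(1.0, constant=0.0, quadratic=1.0)
far = attenuation(2.0, constant=0.0, quadratic=1.0)
```

A nonzero constant term keeps the denominator from blowing up near the light; the quadratic term matches the physical inverse-square law.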

SLIDE 12

• We add a light position and radius

float4x4 World;
float4x4 View;
float4x4 Projection;
float4x4 WorldInverseTranspose;
float3 LightPosition;
float LightRadius = 100;
float3 Camera;
static const float PI = 3.14159265f;
float4 AmbientColor = float4(1, 1, 1, 1);
float AmbientIntensity = 0.1;
float4 DiffuseColor = float4(1, 1, 1, 1);
float DiffuseIntensity = 0.7;
float Shininess = 20;
float4 SpecularColor = float4(1, 1, 1, 1);
float SpecularIntensity = 0.5;

SLIDE 13

• We no longer need color in the output
• We do need the vector to the camera from the location
• We keep the world location at that fragment

struct VertexShaderInput
{
    float4 Position : POSITION0;
    float3 Normal : NORMAL0;
};

struct VertexShaderOutput
{
    float4 Position : POSITION0;
    float3 Normal : NORMAL0;
    float4 WorldPosition : POSITIONT;
};

SLIDE 14

• We compute the normal and the world position

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    float4 worldPosition = mul(input.Position, World);
    output.WorldPosition = worldPosition;
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);
    float3 normal = normalize(mul(input.Normal, (float3x3)WorldInverseTranspose));
    output.Normal = normal;
    return output;
}

SLIDE 15

• Lots of junk in here

float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
    float3 lightDirection = LightPosition - (float3)input.WorldPosition;
    float3 normal = normalize(input.Normal);
    float intensity = pow(1 - saturate(length(lightDirection) / LightRadius), 2);
    lightDirection = normalize(lightDirection);
    float3 view = normalize(Camera - (float3)input.WorldPosition);
    float diffuseColor = dot(normal, lightDirection) * intensity;
    float3 reflect = normalize(2 * diffuseColor * normal - lightDirection);
    float dotProduct = dot(reflect, view);
    float4 specular = (8 + Shininess) / (8 * PI) * SpecularIntensity *
        SpecularColor * pow(saturate(dotProduct), Shininess) * intensity;
    return saturate(diffuseColor + AmbientColor * AmbientIntensity + specular);
}
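The intensity computation in the pixel shader uses a radius-based falloff, (1 - saturate(d / LightRadius))², which smoothly reaches zero exactly at the light's radius. A quick Python check of that curve (illustrative only, not the shader itself):

```python
def point_light_intensity(distance, radius):
    """(1 - saturate(d / r))^2 falloff, as in the point-light pixel shader."""
    t = min(max(distance / radius, 0.0), 1.0)  # saturate
    return (1.0 - t) ** 2

full = point_light_intensity(0.0, 100.0)      # at the light itself
half = point_light_intensity(50.0, 100.0)     # halfway to the radius
edge = point_light_intensity(100.0, 100.0)    # exactly at LightRadius
beyond = point_light_intensity(150.0, 100.0)  # clamped, stays at zero
```

Because the saturate clamps t to 1, pixels beyond LightRadius receive no diffuse or specular contribution at all, which is what gives the light its maximum range.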

SLIDE 16

SLIDE 17

SLIDE 18

• The bidirectional reflectance distribution function (BRDF) is a function that describes the ratio of outgoing radiance to incoming irradiance
• This function changes based on:
  • Wavelength
  • Angle of light to surface
  • Angle of viewer from surface

• For point or directional lights, we do not need differentials and can write the lighting equation with the BRDF:

    L_o(v) = f(l, v) ⊗ E_L cos θ_i

SLIDE 19

• We've been talking about lighting models
  • Lambertian, specular, etc.
• A BRDF is an attempt to model the physics slightly better
• A big difference is that different wavelengths are absorbed and reflected differently by different materials
• Rendering models in real time with (more) accurate BRDFs is still an open research problem

SLIDE 20

• They also have global lighting (shadows and reflections)
• Taken from www.kevinbeason.com

SLIDE 21

• The BRDF is supposed to account for all the light interactions we discussed in Chapter 5 (reflection and refraction)
• We can see the similarity to the lighting equation from Chapter 5, now with a BRDF:

    L_o(v) = Σ (k = 1 to n) f(l_k, v) ⊗ E_{L_k} cos θ_{i_k}

SLIDE 22

• If the subsurface scattering effects are great, the size of the pixel may matter
• Then, a bidirectional surface scattering reflectance distribution function (BSSRDF) is needed
• Or if the surface characteristics change in different areas, you need a spatially varying BRDF
• And so on…

SLIDE 23

• Helmholtz reciprocity:
  • f(l, v) = f(v, l)
• Conservation of energy:
  • Outgoing energy cannot be greater than incoming energy
• The simplest BRDF is Lambertian shading
• We assume that energy is scattered equally in all directions
• Integrating over the hemisphere gives a factor of π
• Dividing by π gives us exactly what we saw before:

    L_o(v) = (c_diff / π) ⊗ Σ (k = 1 to n) E_{L_k} cos θ_{i_k}
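The factor of π comes from integrating cos θ over the hemisphere above the surface: ∫ cos θ dω = π. A straightforward numerical check in Python (a midpoint-rule quadrature in spherical coordinates, purely for illustration):

```python
import math

def hemisphere_cosine_integral(steps=400):
    """Numerically integrate cos(theta) over the hemisphere.

    Uses dω = sin(θ) dθ dφ with θ in [0, π/2] and φ in [0, 2π).
    The exact answer is π, which is where the Lambertian 1/π comes from.
    """
    total = 0.0
    dtheta = (math.pi / 2) / steps
    dphi = (2 * math.pi) / steps
    for i in range(steps):
        theta = (i + 0.5) * dtheta  # midpoint rule in θ
        for j in range(steps):
            total += math.cos(theta) * math.sin(theta) * dtheta * dphi
    return total

value = hemisphere_cosine_integral()  # approaches π as steps grows
```

Since the integral of the cosine-weighted lobe is π, dividing the diffuse reflectance c_diff by π makes the Lambertian BRDF conserve energy.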

SLIDE 24

SLIDE 25

• We'll start with our specular shader for directional light and add textures to it

SLIDE 26

• The texture for the ship is below:

SLIDE 27

• We add a Texture2D variable called ModelTexture
• We also add a SamplerState structure that specifies how to filter the texture

Texture2D ModelTexture;
SamplerState ModelTextureSampler
{
    MinFilter = Linear;
    MagFilter = Linear;
    MipFilter = Linear;
    AddressU = Clamp;
    AddressV = Clamp;
};
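The Clamp address mode pins texture coordinates outside [0, 1] to the edge texel instead of wrapping around. A tiny Python sketch of that addressing rule (the function and texel math are illustrative, not MonoGame's implementation):

```python
def clamp_address(u, size):
    """Clamp addressing: coordinates outside [0, 1] are pinned to the
    nearest edge texel rather than wrapping like Wrap mode would."""
    texel = int(u * size)
    return min(max(texel, 0), size - 1)

edge = clamp_address(1.5, 8)     # off the right edge -> last texel (7)
inside = clamp_address(0.5, 8)   # a normal in-range lookup
```

Wrap mode would instead take the coordinate modulo 1, tiling the texture; Clamp is the safer default when the texture is a one-off skin like the ship's.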

SLIDE 28

• We add a texture coordinate to the input and the output of the vertex shader

struct VertexShaderInput
{
    float4 Position : POSITION0;
    float3 Normal : NORMAL0;
    float2 Texture : TEXCOORD0;
};

struct VertexShaderOutput
{
    float4 Position : POSITION0;
    float4 Color : COLOR0;
    float3 Normal : NORMAL0;
    float4 WorldPosition : POSITIONT;
    float2 Texture : TEXCOORD0;
};

SLIDE 29

• Almost nothing changes here except that we copy the input texture coordinate into the output

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    float4 worldPosition = mul(input.Position, World);
    output.WorldPosition = worldPosition;
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);
    float3 normal = normalize(mul(input.Normal, (float3x3)WorldInverseTranspose));
    float lightIntensity = dot(normal, normalize(DiffuseLightDirection));
    output.Color = saturate(DiffuseColor * DiffuseIntensity * lightIntensity);
    output.Normal = normal;
    output.Texture = input.Texture;
    return output;
}

SLIDE 30

• We have to pull the color from the texture and set its alpha to 1
• Then scale the components of the color by the texture color

float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
    float3 light = normalize(DiffuseLightDirection);
    float3 normal = normalize(input.Normal);
    float3 reflect = normalize(2 * dot(light, normal) * normal - light);
    float3 view = normalize(Camera - (float3)input.WorldPosition);
    float dotProduct = dot(reflect, view);
    float4 textureColor = ModelTexture.Sample(ModelTextureSampler, input.Texture);
    textureColor.a = 1;
    float4 specular = (8 + Shininess) / (8 * PI) * SpecularIntensity *
        SpecularColor * pow(saturate(dotProduct), Shininess);
    return saturate(textureColor * input.Color + AmbientColor * AmbientIntensity + specular);
}

SLIDE 31

• To use a texture, we naturally have to load a texture
• We also set the texture as a value on the effect

texture = Content.Load<Texture2D>("shipdiffuse");
effect.Parameters["ModelTexture"].SetValue(texture);

SLIDE 32

• It's easiest to do bump mapping in MonoGame using a normal map
• Of course, a normal map is hard to create by hand
• What's more common is to create a height map and then use a tool for creating a normal map from it
• xNormal is a free utility to do this
  • http://www.xnormal.net/Downloads.aspx
SLIDE 33

• The conversion from a grayscale height map to a normal map looks like this
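A tool like xNormal essentially takes finite differences of neighboring heights and turns the resulting slope into a normal vector (which the tool then packs into RGB as (n + 1) / 2). A minimal sketch of that conversion, assuming heights in [0, 1] and a strength parameter of my own choosing:

```python
import math

def height_to_normal(height, x, y, strength=1.0):
    """Approximate the surface normal at (x, y) from a 2D height grid
    using central differences, the way height-to-normal-map tools do."""
    h, w = len(height), len(height[0])
    # Central differences, clamping indices at the borders
    dx = height[y][min(x + 1, w - 1)] - height[y][max(x - 1, 0)]
    dy = height[min(y + 1, h - 1)][x] - height[max(y - 1, 0)][x]
    n = (-dx * strength, -dy * strength, 1.0)
    length = math.sqrt(sum(c * c for c in n))
    return tuple(c / length for c in n)

# A ramp rising along x: the normal tilts away from the +x direction.
ramp = [[0.0, 0.25, 0.5, 0.75] for _ in range(4)]
n = height_to_normal(ramp, 1, 1)
```

Flat regions of the height map yield the straight-up normal (0, 0, 1), which packs to the familiar uniform blue of normal maps.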

SLIDE 34

• We have a normal to a surface, but there are also tangent directions
• We call these the tangent and the binormal
  • Apparently serious mathematicians think it should be called the bitangent
  • The binormal is tangent to the surface and orthogonal to the other tangent
• We distort the normal with weighted sums of the tangent and binormal (stored in our normal map)
• We have to tell MonoGame to generate tangent frames on our model to have these

[Figure: the normal, tangent, and binormal directions]

SLIDE 35

• We add another Texture2D variable called NormalTexture
• We could add another SamplerState structure or use the first one
• We add a BumpConstant value to say how strong the bump effect should be

Texture2D NormalTexture;
float BumpConstant = 1;
SamplerState NormalTextureSampler
{
    MinFilter = Linear;
    MagFilter = Linear;
    MipFilter = Linear;
    AddressU = Clamp;
    AddressV = Clamp;
};

SLIDE 36

• We add tangent and binormal information to our vertex shader input and output

struct VertexShaderInput
{
    float4 Position : POSITION0;
    float3 Normal : NORMAL0;
    float3 Tangent : TANGENT0;
    float3 Binormal : BINORMAL0;
    float2 Texture : TEXCOORD0;
};

struct VertexShaderOutput
{
    float4 Position : POSITION0;
    float4 WorldPosition : POSITIONT;
    float2 Texture : TEXCOORD0;
    float3 Normal : TEXCOORD1;
    float3 Tangent : TEXCOORD2;
    float3 Binormal : TEXCOORD3;
};

SLIDE 37

• The vertex shader gets simpler because we're going to defer lighting to the pixel shader
• However, we have to compute the tangent and binormal in world space

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    float4 worldPosition = mul(input.Position, World);
    output.WorldPosition = worldPosition;
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);
    output.Normal = normalize(mul(input.Normal, (float3x3)WorldInverseTranspose));
    output.Tangent = normalize(mul(input.Tangent, (float3x3)WorldInverseTranspose));
    output.Binormal = normalize(mul(input.Binormal, (float3x3)WorldInverseTranspose));
    output.Texture = input.Texture;
    return output;
}

SLIDE 38

• A lot goes on in the pixel shader, but essentially, all the work is the same except that the normal has been altered by the NormalTexture

float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
    float3 bump = BumpConstant *
        (NormalTexture.Sample(NormalTextureSampler, input.Texture) - (0.5, 0.5, 0.5));
    float3 bumpNormal = input.Normal + (bump.x * input.Tangent + bump.y * input.Binormal);
    bumpNormal = normalize(bumpNormal);
    float diffuseIntensity = dot(normalize(DiffuseLightDirection), bumpNormal);
    if (diffuseIntensity < 0)
        diffuseIntensity = 0;
    float3 light = normalize(DiffuseLightDirection);
    float3 reflect = normalize(2 * dot(light, bumpNormal) * bumpNormal - light);
    float3 view = normalize(Camera - (float3)input.WorldPosition);
    float dotProduct = dot(reflect, view);
    float4 textureColor = ModelTexture.Sample(ModelTextureSampler, input.Texture);
    textureColor.a = 1;
    float4 specular = (8 + Shininess) / (8 * PI) * SpecularIntensity *
        SpecularColor * pow(saturate(dotProduct), Shininess);
    return saturate(textureColor * diffuseIntensity + AmbientColor * AmbientIntensity + specular);
}
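The heart of the shader is the perturbation step: the sampled bump vector offsets the interpolated normal along the tangent and binormal, bumpNormal = N + bump.x·T + bump.y·B, before renormalizing. A small Python sketch of just that step (the vectors and function are illustrative):

```python
import math

def perturb_normal(normal, tangent, binormal, bump, bump_constant=1.0):
    """Offset the surface normal along the tangent and binormal by the
    (already unpacked) bump-map sample, then renormalize."""
    bx, by = (bump_constant * c for c in bump[:2])
    n = tuple(nc + bx * tc + by * bc
              for nc, tc, bc in zip(normal, tangent, binormal))
    length = math.sqrt(sum(c * c for c in n))
    return tuple(c / length for c in n)

# A zero bump sample leaves the normal unchanged; a nonzero x component
# tilts the normal toward the tangent direction.
flat = perturb_normal((0, 0, 1), (1, 0, 0), (0, 1, 0), (0.0, 0.0))
tilted = perturb_normal((0, 0, 1), (1, 0, 0), (0, 1, 0), (0.5, 0.0))
```

Raising bump_constant (the shader's BumpConstant) scales the tilt, exaggerating or flattening the apparent surface detail without touching the geometry.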

SLIDE 39

• As with an image texture, we have to load the normal map
• We also set the normal map as a value on the effect

normal = Content.Load<Texture2D>("shipnormal");
effect.Parameters["NormalTexture"].SetValue(normal);

SLIDE 40

SLIDE 41

SLIDE 42

• Area lighting
• Environmental lighting

SLIDE 43

• Read Chapter 8
• Work on Project 2
  • Due next Friday by midnight