Week 9 - Friday
What did we talk about last time?
- Bump mapping
- Radiometry, photometry, and colorimetry
- Lighting with shader code
  - Ambient
  - Directional (diffuse and specular)
Adding a specular component to the diffuse shader requires incorporating the view vector. It is included in the shader file and set as a parameter in the C# code.
The camera location is added to the declarations, as are specular colors and a shininess parameter:

float4x4 World;
float4x4 View;
float4x4 Projection;
float4x4 WorldInverseTranspose;
float3 Camera;
static const float PI = 3.14159265f;
float4 AmbientColor = float4(1, 1, 1, 1);
float AmbientIntensity = 0.1;
float3 DiffuseLightDirection;
float4 DiffuseColor = float4(1, 1, 1, 1);
float DiffuseIntensity = 0.7;
float Shininess = 20;
float4 SpecularColor = float4(1, 1, 1, 1);
float SpecularIntensity = 0.5;
The output adds a normal so that the reflection vector can be computed in the pixel shader. A world position lets us compute the view vector to the camera.

struct VertexShaderInput
{
    float4 Position : POSITION0;
    float3 Normal : NORMAL0;
};

struct VertexShaderOutput
{
    float4 Position : POSITION0;
    float4 Color : COLOR0;
    float3 Normal : NORMAL0;
    float4 WorldPosition : POSITIONT;
};
The same computations as the diffuse shader, but we store the normal and the transformed world position in the output:

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    float4 worldPosition = mul(input.Position, World);
    output.WorldPosition = worldPosition;
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);
    float3 normal = normalize(mul(input.Normal, (float3x3)WorldInverseTranspose));
    float lightIntensity = dot(normal, normalize(DiffuseLightDirection));
    output.Color = saturate(DiffuseColor * DiffuseIntensity * lightIntensity);
    output.Normal = normal;
    return output;
}
Here we finally have a real computation, because we need to use the pixel normal (interpolated from the vertex normals) in combination with the view vector. The technique is the same:

float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
    float3 light = normalize(DiffuseLightDirection);
    float3 normal = normalize(input.Normal);
    float3 reflect = normalize(2 * dot(light, normal) * normal - light);
    float3 view = normalize(Camera - (float3)input.WorldPosition);
    float dotProduct = dot(reflect, view);
    float4 specular = (8 + Shininess) / (8 * PI) * SpecularIntensity *
        SpecularColor * pow(saturate(dotProduct), Shininess);
    return saturate(input.Color + AmbientColor * AmbientIntensity + specular);
}
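The same per-pixel specular math can be sketched outside of HLSL. Below is a small Python sketch (the `specular_term` name is illustrative, not part of the shader) of the reflect-vector computation and the normalized specular factor (8 + s) / (8π) used above:

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def specular_term(light, normal, view, shininess, intensity):
    # reflect = 2 * (n . l) * n - l, the same formula as the HLSL above
    n_dot_l = dot(light, normal)
    reflect = tuple(2 * n_dot_l * n - l for n, l in zip(normal, light))
    reflect = normalize(reflect)
    # Normalization factor (8 + s) / (8 * pi), then pow(saturate(r . v), s)
    d = max(dot(reflect, view), 0.0)
    return (8 + shininess) / (8 * math.pi) * intensity * d ** shininess

# Example: light, surface normal, and viewer all along the z axis
light = normalize((0, 0, 1))
normal = normalize((0, 0, 1))
view = normalize((0, 0, 1))
print(specular_term(light, normal, view, 20, 0.5))
```

With all three vectors aligned, the pow term is 1 and only the normalization factor and intensity remain.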
Point lights model omni lights at a specific position
- They generally attenuate (get dimmer) over distance and have a maximum range
- DirectX supports constant, linear, and quadratic attenuation
- You can choose attenuation levels through shaders

They are more computationally expensive than directional lights because a light vector has to be computed for every pixel. It is possible to implement point lights in a deferred shader, lighting only those pixels that actually get used.
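The attenuation idea can be sketched numerically. The following Python sketch (the `point_light_falloff` name is illustrative) mirrors the quadratic falloff the point-light pixel shader uses below, dimming to zero at the light's maximum range:

```python
def saturate(x):
    # Clamp to [0, 1], like HLSL's saturate()
    return max(0.0, min(1.0, x))

def point_light_falloff(distance, radius):
    # Quadratic falloff to zero at the light's maximum range, matching
    # pow(1 - saturate(length(lightDirection) / LightRadius), 2)
    return (1 - saturate(distance / radius)) ** 2

print(point_light_falloff(0, 100))    # full intensity at the light
print(point_light_falloff(50, 100))   # quarter intensity halfway out
print(point_light_falloff(100, 100))  # zero at the maximum range
```

Note that this is one possible attenuation curve; constant, linear, and quadratic terms can be mixed for other falloff shapes.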
We add light position and radius:

float4x4 World;
float4x4 View;
float4x4 Projection;
float4x4 WorldInverseTranspose;
float3 LightPosition;
float LightRadius = 100;
float3 Camera;
static const float PI = 3.14159265f;
float4 AmbientColor = float4(1, 1, 1, 1);
float AmbientIntensity = 0.1;
float4 DiffuseColor = float4(1, 1, 1, 1);
float DiffuseIntensity = 0.7;
float Shininess = 20;
float4 SpecularColor = float4(1, 1, 1, 1);
float SpecularIntensity = 0.5;
We no longer need color in the output. We do need the vector to the camera from the location. We keep the world location at that fragment.

struct VertexShaderInput
{
    float4 Position : POSITION0;
    float3 Normal : NORMAL0;
};

struct VertexShaderOutput
{
    float4 Position : POSITION0;
    float3 Normal : NORMAL0;
    float4 WorldPosition : POSITIONT;
};
We compute the normal and the world position:

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    float4 worldPosition = mul(input.Position, World);
    output.WorldPosition = worldPosition;
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);
    float3 normal = normalize(mul(input.Normal, (float3x3)WorldInverseTranspose));
    output.Normal = normal;
    return output;
}
Lots of junk in here

float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
    float3 lightDirection = LightPosition - (float3)input.WorldPosition;
    float3 normal = normalize(input.Normal);
    float intensity = pow(1 - saturate(length(lightDirection) / LightRadius), 2);
    lightDirection = normalize(lightDirection);
    float3 view = normalize(Camera - (float3)input.WorldPosition);
    float diffuseIntensity = dot(normal, lightDirection) * intensity;
    float3 reflect = normalize(2 * diffuseIntensity * normal - lightDirection);
    float dotProduct = dot(reflect, view);
    float4 specular = (8 + Shininess) / (8 * PI) * SpecularIntensity *
        SpecularColor * pow(saturate(dotProduct), Shininess) * intensity;
    return saturate(DiffuseColor * DiffuseIntensity * diffuseIntensity +
        AmbientColor * AmbientIntensity + specular);
}
The bidirectional reflectance distribution function (BRDF) is a function that describes the ratio of outgoing radiance to incoming irradiance.
This function changes based on:
- Wavelength
- Angle of light to surface
- Angle of viewer from surface
For point or directional lights, we do not need differentials and can write the BRDF equation:

L(v) = f(l, v) ⊗ E_L cos θ_i
We've been talking about lighting models
- Lambertian, specular, etc.
A BRDF is an attempt to model the physics slightly better. A big difference is that different wavelengths are absorbed and reflected differently by different materials. Rendering models in real time with (more) accurate BRDFs is still an open research problem.
They also have global lighting (shadows and reflections). (Images taken from www.kevinbeason.com)
The BRDF is supposed to account for all the light interactions
we discussed in Chapter 5 (reflection and refraction)
We can see the similarity to the lighting equation from
Chapter 5, now with a BRDF:
L(v) = ∑_{k=1}^{n} f(l_k, v) ⊗ E_{L_k} cos θ_{i_k}
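The summed lighting equation can be sanity-checked numerically. Here is a hedged Python sketch evaluating the sum for a constant (Lambertian) BRDF; the `outgoing_radiance` name and the light list are illustrative:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def outgoing_radiance(brdf, lights, normal):
    # L(v) = sum over k of f(l_k, v) * E_{L_k} * cos(theta_{i_k}),
    # with cos(theta) clamped to zero for lights below the horizon
    total = 0.0
    for direction, irradiance in lights:
        cos_theta = max(dot(normal, direction), 0.0)
        total += brdf * irradiance * cos_theta
    return total

# Two unit-irradiance directional lights and a constant BRDF of 1/pi
lights = [((0.0, 0.0, 1.0), 1.0), ((0.0, 1.0, 0.0), 1.0)]
print(outgoing_radiance(1 / math.pi, lights, (0.0, 0.0, 1.0)))
```

The second light grazes the surface (cos θ = 0), so only the overhead light contributes.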
If the subsurface scattering effects are significant, the size of the pixel may matter
Then, a bidirectional surface scattering reflectance
distribution function (BSSRDF) is needed
Or if the surface characteristics change in different areas, you
need a spatially varying BRDF
And so on…
Helmholtz reciprocity:
- f(l,v) = f(v,l)
Conservation of energy:
- Outgoing energy cannot be greater than incoming energy
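Both properties can be checked numerically for the simplest BRDF, Lambertian shading with f = albedo/π. A Python sketch (illustrative names; midpoint-rule integration over the hemisphere is an assumption of this sketch, not something from the slides):

```python
import math

def lambertian_brdf(albedo, l, v):
    # Constant in both directions, so f(l, v) == f(v, l) trivially
    return albedo / math.pi

def hemisphere_reflectance(brdf_value, steps=200):
    # Integrate f * cos(theta) over the hemisphere:
    # integral of f * cos(theta) * sin(theta) dtheta dphi,
    # theta in [0, pi/2], phi contributing a factor of 2*pi
    total = 0.0
    dtheta = (math.pi / 2) / steps
    for i in range(steps):
        theta = (i + 0.5) * dtheta
        total += brdf_value * math.cos(theta) * math.sin(theta) * dtheta * 2 * math.pi
    return total

l, v = (0, 0, 1), (1, 0, 0)
assert lambertian_brdf(0.8, l, v) == lambertian_brdf(0.8, v, l)  # reciprocity
print(hemisphere_reflectance(lambertian_brdf(0.8, l, v)))        # close to the 0.8 albedo
```

Because the integral of cos θ over the hemisphere is π, dividing by π keeps the total outgoing energy at exactly the albedo, which is at most 1.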
The simplest BRDF is Lambertian shading. We assume that energy is scattered equally in all directions. Integrating over the hemisphere gives a factor of π. Dividing by π gives us exactly what we saw before:

L(v) = (c_diff / π) ⊗ ∑_{k=1}^{n} E_{L_k} cos θ_{i_k}
We'll start with our specular shader for directional light and add textures to it. The texture for the ship is below:

We add a Texture2D variable called ModelTexture. We also add a SamplerState structure that specifies how to filter the texture.

Texture2D ModelTexture;
SamplerState ModelTextureSampler
{
    MinFilter = Linear;
    MagFilter = Linear;
    MipFilter = Linear;
    AddressU = Clamp;
    AddressV = Clamp;
};
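The Linear filters above blend the four texels nearest the sample point. Here is a hedged Python sketch of bilinear sampling with clamp addressing; `sample_bilinear` is an illustrative name (not a MonoGame API), and mapping [0, 1] directly onto the texel grid is a simplifying assumption of this sketch:

```python
def sample_bilinear(texture, u, v):
    # texture: 2D list of grayscale texels; (u, v) in [0, 1] texture space
    h, w = len(texture), len(texture[0])
    # Clamp addressing, like AddressU/AddressV = Clamp in the sampler state
    x = min(max(u * (w - 1), 0.0), w - 1)
    y = min(max(v * (h - 1), 0.0), h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    # Blend the four nearest texels by their fractional distances
    top = texture[y0][x0] * (1 - fx) + texture[y0][x1] * fx
    bottom = texture[y1][x0] * (1 - fx) + texture[y1][x1] * fx
    return top * (1 - fy) + bottom * fy

tex = [[0.0, 1.0],
       [0.0, 1.0]]
print(sample_bilinear(tex, 0.5, 0.5))  # halfway between the texel columns
```

Point (nearest-neighbor) filtering would instead snap to a single texel, producing blocky results under magnification.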
We add a texture coordinate to the input and the output of the vertex shader:

struct VertexShaderInput
{
    float4 Position : POSITION0;
    float3 Normal : NORMAL0;
    float2 Texture : TEXCOORD0;
};

struct VertexShaderOutput
{
    float4 Position : POSITION0;
    float4 Color : COLOR0;
    float3 Normal : NORMAL0;
    float4 WorldPosition : POSITIONT;
    float2 Texture : TEXCOORD0;
};
Almost nothing changes here except that we copy the input texture coordinate into the output:

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    float4 worldPosition = mul(input.Position, World);
    output.WorldPosition = worldPosition;
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);
    float3 normal = normalize(mul(input.Normal, (float3x3)WorldInverseTranspose));
    float lightIntensity = dot(normal, normalize(DiffuseLightDirection));
    output.Color = saturate(DiffuseColor * DiffuseIntensity * lightIntensity);
    output.Normal = normal;
    output.Texture = input.Texture;
    return output;
}
We have to pull the color from the texture and set its alpha to 1. Then we scale the components of the color by the texture color.

float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
    float3 light = normalize(DiffuseLightDirection);
    float3 normal = normalize(input.Normal);
    float3 reflect = normalize(2 * dot(light, normal) * normal - light);
    float3 view = normalize(Camera - (float3)input.WorldPosition);
    float dotProduct = dot(reflect, view);
    float4 textureColor = ModelTexture.Sample(ModelTextureSampler, input.Texture);
    textureColor.a = 1;
    float4 specular = (8 + Shininess) / (8 * PI) * SpecularIntensity *
        SpecularColor * pow(saturate(dotProduct), Shininess);
    return saturate(textureColor * input.Color + AmbientColor * AmbientIntensity + specular);
}
To use a texture, we naturally have to load a texture. We also set the texture as a value on the effect.

texture = Content.Load<Texture2D>("shipdiffuse");
effect.Parameters["ModelTexture"].SetValue(texture);
It's easiest to do bump mapping in MonoGame using a normal map. Of course, a normal map is hard to create by hand. What's more common is to create a height map and then use a tool to create a normal map from it.
- xNormal is a free utility to do this
- http://www.xnormal.net/Downloads.aspx

The conversion from a grayscale height map to a normal map looks like this:
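Tools like xNormal derive the normals from the slopes of the height field. Here is a hedged Python sketch of the idea using central differences; the `height_to_normal` name and the `strength` parameter are illustrative, and real tools handle edges, scaling, and encoding more carefully:

```python
import math

def height_to_normal(height, x, y, strength=1.0):
    # Central differences on the grayscale height field give the slope;
    # the normal tilts against the slope and is then renormalized
    h, w = len(height), len(height[0])
    xl, xr = height[y][max(x - 1, 0)], height[y][min(x + 1, w - 1)]
    yu, yd = height[max(y - 1, 0)][x], height[min(y + 1, h - 1)][x]
    nx = (xl - xr) * strength
    ny = (yu - yd) * strength
    nz = 1.0
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    return (nx / length, ny / length, nz / length)

# A flat height map yields the unperturbed normal (0, 0, 1)
flat = [[0.5] * 4 for _ in range(4)]
print(height_to_normal(flat, 1, 1))
```

For storage in a normal map, each component would then be remapped from [-1, 1] to the [0, 1] color range.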
We have a normal to a surface, but there are also tangent directions. We call these the tangent and the binormal.
- Apparently serious mathematicians think it should be called the bitangent
- The binormal is tangent to the surface and orthogonal to the other tangent

We distort the normal with weighted sums of the tangent and binormal (stored in our normal map). We have to tell MonoGame to generate tangent frames on our model to have these.

[Figure: the normal, tangent, and binormal vectors]
We add another Texture2D variable called NormalTexture. We could add another SamplerState structure or use the first one. We add a BumpConstant value to say how strong the bump effect should be.

Texture2D NormalTexture;
float BumpConstant = 1;
SamplerState NormalTextureSampler
{
    MinFilter = Linear;
    MagFilter = Linear;
    MipFilter = Linear;
    AddressU = Clamp;
    AddressV = Clamp;
};
We add tangent and binormal information to our vertex shader input and output:

struct VertexShaderInput
{
    float4 Position : POSITION0;
    float3 Normal : NORMAL0;
    float3 Tangent : TANGENT0;
    float3 Binormal : BINORMAL0;
    float2 Texture : TEXCOORD0;
};

struct VertexShaderOutput
{
    float4 Position : POSITION0;
    float4 WorldPosition : POSITIONT;
    float2 Texture : TEXCOORD0;
    float3 Normal : TEXCOORD1;
    float3 Tangent : TEXCOORD2;
    float3 Binormal : TEXCOORD3;
};
The vertex shader gets simpler because we're going to defer lighting to the pixel shader. However, we have to compute the tangent and binormal in world space.

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
    float4 worldPosition = mul(input.Position, World);
    output.WorldPosition = worldPosition;
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);
    output.Normal = normalize(mul(input.Normal, (float3x3)WorldInverseTranspose));
    output.Tangent = normalize(mul(input.Tangent, (float3x3)WorldInverseTranspose));
    output.Binormal = normalize(mul(input.Binormal, (float3x3)WorldInverseTranspose));
    output.Texture = input.Texture;
    return output;
}
A lot goes on in the pixel shader, but essentially all the work is the same, except that the normal has been altered by the NormalTexture:

float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
    float3 bump = BumpConstant *
        ((float3)NormalTexture.Sample(NormalTextureSampler, input.Texture) - 0.5);
    float3 bumpNormal = input.Normal + (bump.x * input.Tangent + bump.y * input.Binormal);
    bumpNormal = normalize(bumpNormal);
    float diffuseIntensity = dot(normalize(DiffuseLightDirection), bumpNormal);
    if (diffuseIntensity < 0)
        diffuseIntensity = 0;
    float3 light = normalize(DiffuseLightDirection);
    float3 reflect = normalize(2 * dot(light, bumpNormal) * bumpNormal - light);
    float3 view = normalize(Camera - (float3)input.WorldPosition);
    float dotProduct = dot(reflect, view);
    float4 textureColor = ModelTexture.Sample(ModelTextureSampler, input.Texture);
    textureColor.a = 1;
    float4 specular = (8 + Shininess) / (8 * PI) * SpecularIntensity *
        SpecularColor * pow(saturate(dotProduct), Shininess);
    return saturate(textureColor * diffuseIntensity + AmbientColor * AmbientIntensity + specular);
}
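The key step in that shader is bending the interpolated normal along the tangent frame. Here is a Python sketch of just that step; the `perturb_normal` name is illustrative, and it assumes the normal-map sample has already been shifted so a "flat" texel is centered on zero:

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def perturb_normal(normal, tangent, binormal, bump, bump_constant=1.0):
    # bump holds the sampled normal-map value shifted from [0, 1] to be
    # centered on zero; its x and y components bend the surface normal
    # along the tangent and binormal directions, as in the shader above
    bx, by = bump[0] * bump_constant, bump[1] * bump_constant
    bent = tuple(n + bx * t + by * b
                 for n, t, b in zip(normal, tangent, binormal))
    return normalize(bent)

n, t, b = (0, 0, 1), (1, 0, 0), (0, 1, 0)
print(perturb_normal(n, t, b, (0.0, 0.0, 0.5)))  # flat sample: normal unchanged
```

Raising `bump_constant` exaggerates the tilt, which is exactly what the BumpConstant shader parameter controls.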
As with an image texture, we have to load the normal map. We also set the normal map as a value on the effect.

normal = Content.Load<Texture2D>("shipnormal");
effect.Parameters["NormalTexture"].SetValue(normal);
- Area lighting
- Environmental lighting
Read Chapter 8
Work on Project 2
- Due next Friday by midnight