

SLIDE 1

Week 3 - Monday

SLIDE 2

 What did we talk about last time?
 Graphics rendering pipeline
  • Rasterizer Stage
SLIDE 3-7

 The final screen data containing the colors for each pixel is stored in the color buffer
 The merging stage is responsible for merging the colors from each of the fragments from the pixel shading stage into a final color for a pixel
 Deeply linked with merging is visibility: the final color of the pixel should be the one corresponding to a visible polygon (and not one behind it)

SLIDE 8

 To deal with the question of visibility, most modern systems use a Z-buffer or depth buffer
 The Z-buffer keeps track of the z-values for each pixel on the screen
 As a fragment is rendered, its color is put into the color buffer only if its z-value is closer than the current value in the Z-buffer (which is then updated)
 This is called a depth test
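Since the course code is in C#/XNA, the following is only a framework-neutral sketch of the depth test in Python; the buffers and the `depth_test` helper are made-up names, and smaller z is taken to mean closer.

```python
# Minimal z-buffer sketch: smaller z means closer to the camera here.
# A real GPU does this per pixel in hardware; all names are illustrative.

def depth_test(color_buffer, z_buffer, x, y, frag_z, frag_color):
    """Write the fragment only if it is closer than what is stored."""
    if frag_z < z_buffer[y][x]:          # fragment is in front
        z_buffer[y][x] = frag_z          # update the depth buffer
        color_buffer[y][x] = frag_color  # and the color buffer
        return True
    return False                         # fragment is hidden: discard it

# Tiny 1x1 "screen", with depth initialized to the far plane
color = [["black"]]
depth = [[float("inf")]]

depth_test(color, depth, 0, 0, 0.8, "red")    # passes: buffer was empty
depth_test(color, depth, 0, 0, 0.9, "green")  # fails: behind the red fragment
depth_test(color, depth, 0, 0, 0.3, "blue")   # passes: closer than red
```

For opaque geometry this is why draw order does not matter: whichever fragment is closest wins, regardless of when it arrives.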

slide-9
SLIDE 9

 Pros

  • Polygons can usually be rendered in any order
  • Universal hardware support is available

 Cons

  • Partially transparent objects must be rendered in back-to-front order (painter's algorithm)
  • Completely transparent values can mess up the Z-buffer unless they are checked
  • Z-fighting can occur when two polygons have the same (or nearly the same) z-values
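The back-to-front requirement for transparency can be sketched with a simple sort; this is an illustrative Python fragment, not course code, and the object records are invented.

```python
# Sketch of back-to-front ordering (painter's algorithm) for partially
# transparent objects; larger z means farther from the camera here.

def painters_order(transparent_objects):
    """Sort objects so the farthest is drawn first and the nearest last."""
    return sorted(transparent_objects, key=lambda obj: obj["z"], reverse=True)

objects = [{"name": "window", "z": 2.0},
           {"name": "smoke",  "z": 5.0},
           {"name": "glass",  "z": 3.5}]

order = [o["name"] for o in painters_order(objects)]
# Farthest first: smoke, then glass, then window
```

Opaque geometry can rely on the depth test instead, which is why only the transparent objects need this sorting pass.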

slide-10
SLIDE 10

 A stencil buffer can be used to record a rendered polygon
  • This stores the part of the screen covered by the polygon and can be used for special effects
 Frame buffer is a general term for the set of all buffers
 Different images can be rendered to an accumulation buffer and then averaged together to achieve special effects like blurring or antialiasing
 A back buffer allows us to render off screen to avoid popping and tearing
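As a rough illustration of what an accumulation buffer does, here is a hypothetical Python sketch that averages several grayscale renders pixel by pixel:

```python
# Sketch of accumulation-buffer averaging: render the scene several times
# (e.g. with slightly jittered camera positions) and average the results.

def accumulate(frames):
    """Average a list of equal-sized grayscale 'frames' pixel by pixel."""
    n = len(frames)
    height, width = len(frames[0]), len(frames[0][0])
    return [[sum(f[y][x] for f in frames) / n for x in range(width)]
            for y in range(height)]

# Three 1x2 renders of a moving edge; averaging softens the transition
frames = [[[0, 255]], [[0, 255]], [[255, 255]]]
result = accumulate(frames)
```

Averaging slightly jittered renders smooths edges (antialiasing); averaging renders taken at slightly different times gives motion blur.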

SLIDE 11

 This pipeline is focused on interactive graphics

  • Micropolygon pipelines are usually used for film production
  • Predictive rendering applications usually use ray tracing renderers

 The old model was the fixed-function pipeline, which gave little control over the application of shading functions
 The book focuses on programmable GPUs, which allow all kinds of tricks to be done in hardware

SLIDE 12-13

 An effect says how things should be rendered on the screen
 We can specify this in detail using shader programs
 The BasicEffect class gives you the ability to do effects without creating a shader program
  • Less flexibility, but quick and easy
 The BasicEffect class has properties for:

  • World transform
  • View transform
  • Projection transform
  • Texture to be applied
  • Lighting
  • Fog
SLIDE 14

 Vertices can be stored in many different formats depending on the data you want to keep
  • Position is pretty standard
  • Normals are optional
  • Color is optional
 We will commonly use the VertexPositionColor type to hold vertices with color
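Conceptually, a vertex format is just a record whose fields you choose. Here is a minimal Python sketch of the idea (the real XNA type is a struct with packed layout information, which this ignores):

```python
# Sketch of a vertex format: which fields you store is up to the format.
# XNA's VertexPositionColor keeps exactly a position and a color; other
# formats add normals, texture coordinates, and so on.
from dataclasses import dataclass

@dataclass
class VertexPositionColor:
    position: tuple  # (x, y, z)
    color: tuple     # (r, g, b)

v = VertexPositionColor(position=(0.0, 1.0, 0.0), color=(255, 0, 0))
```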

SLIDE 15

 The GPU holds vertices in a buffer that can be indexed into
 Because it is special-purpose hardware, it has to be accessed in special ways
 It seems cumbersome, but we will often create an array of vertices, create an appropriately sized vertex buffer, and then store the vertices into the buffer

VertexPositionColor[] vertices = new VertexPositionColor[3]
{
    new VertexPositionColor(new Vector3(0, 1, 0), Color.Red),
    new VertexPositionColor(new Vector3(+0.5f, 0, 0), Color.Green),
    new VertexPositionColor(new Vector3(-0.5f, 0, 0), Color.Blue)
};

vertexBuffer = new VertexBuffer(GraphicsDevice, typeof(VertexPositionColor),
    3, BufferUsage.WriteOnly);
vertexBuffer.SetData<VertexPositionColor>(vertices);

SLIDE 16

 In order to draw a vertex buffer, you have to:

  • Set the basic effect to have the appropriate transformations
  • Set the vertex buffer on the device as the current one being drawn
  • Loop over the passes in the basic effect, applying them
  • Draw the appropriate kind of primitives

effect.World = world;
effect.View = view;
effect.Projection = projection;
effect.VertexColorEnabled = true;

GraphicsDevice.SetVertexBuffer(vertexBuffer);
foreach (EffectPass pass in effect.CurrentTechnique.Passes)
{
    pass.Apply();
    GraphicsDevice.DrawPrimitives(PrimitiveType.TriangleList, 0, 1);
}

SLIDE 17

 Sometimes a mesh will repeat many vertices
 Instead of repeating those vertices, we can give a list of indices into the vertex list

short[] indices = new short[3] { 0, 1, 2 };

indexBuffer = new IndexBuffer(GraphicsDevice, typeof(short),
    3, BufferUsage.WriteOnly);
indexBuffer.SetData<short>(indices);

SLIDE 18

 Once you have the index buffer, drawing with it is very similar to drawing without it
 You simply have to set it on the device
 Then call DrawIndexedPrimitives() instead of DrawPrimitives()

basicEffect.World = world;
basicEffect.View = view;
basicEffect.Projection = projection;

GraphicsDevice.SetVertexBuffer(vertexBuffer);
GraphicsDevice.Indices = indexBuffer;
foreach (EffectPass pass in basicEffect.CurrentTechnique.Passes)
{
    pass.Apply();
    GraphicsDevice.DrawIndexedPrimitives(PrimitiveType.TriangleList, 0, 0, 1);
}

SLIDE 19

 An icosahedron has 20 sides, but it only has 12 vertices
 By using an index buffer, we can use only 12 vertices and 60 indices
 Check out the XNA tutorial on RB Whitaker's site for the data:
  • http://rbwhitaker.wikidot.com/index-and-vertex-buffers
 There are some minor changes needed to make the code work
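The icosahedron numbers can be checked with quick arithmetic; this is a throwaway Python sketch using Euler's formula V - E + F = 2 for convex polyhedra.

```python
# Quick check of the icosahedron numbers.
faces = 20
vertices = 12
edges = 30  # each of the 20 triangles has 3 edges, each shared by 2 faces

assert faces * 3 // 2 == edges        # 60 triangle-edge slots / 2 shared
assert vertices - edges + faces == 2  # Euler's formula: V - E + F = 2

indices_needed = faces * 3      # 60 indices into the 12 shared vertices
unindexed_vertices = faces * 3  # 60 full vertex records without indexing
```

The win is that an index is much smaller than a full vertex record, and each of the 12 vertices is transformed once instead of five times.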

SLIDE 20

 It is very common to define primitives in terms of lists and strips
 A list gives all the vertex indices for each of the shapes drawn
  • 2n indices to draw n lines
  • 3n indices to draw n triangles
 A strip gives only the needed information to draw a series of connected primitives
  • n + 1 indices to draw a connected series of n lines
  • n + 2 indices to draw a connected series of n triangles
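The counts above can be captured in two small functions; this Python sketch is only illustrative (a GPU takes these counts as draw-call parameters rather than computing them).

```python
# Index counts for lists versus strips.

def list_indices(n, verts_per_shape):
    """A list spells out every vertex of every primitive."""
    return n * verts_per_shape        # 2n for lines, 3n for triangles

def strip_indices(n, verts_per_shape):
    """A strip reuses previous vertices, adding one new index per shape."""
    return verts_per_shape + (n - 1)  # n + 1 for lines, n + 2 for triangles

line_list  = list_indices(10, 2)   # 20 indices for 10 separate lines
line_strip = strip_indices(10, 2)  # 11 indices for 10 connected lines
tri_list   = list_indices(10, 3)   # 30 indices for 10 separate triangles
tri_strip  = strip_indices(10, 3)  # 12 indices for 10 connected triangles
```

The catch is that a strip only works when each primitive shares vertices with the previous one, which is why both forms stay in common use.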
SLIDE 21-23

 GPU stands for graphics processing unit
 The term was coined by NVIDIA in 1999 to differentiate the GeForce 256 from chips that did not have hardware vertex processing
 Dedicated 3D hardware was just becoming the norm, and many enthusiasts used an add-on board in addition to their normal 2D graphics card
  • Voodoo2
SLIDE 24

 Modern GPUs are generally responsible for the geometry and rasterization stages of the overall rendering pipeline
 The following shows the color-coded functional stages inside those stages
  • Red is fully programmable
  • Purple is configurable
  • Blue is not programmable at all

Vertex Shader → Geometry Shader → Clipping → Screen Mapping → Triangle Setup → Triangle Traversal → Pixel Shader → Merger

SLIDE 25-26

 GPU architecture

  • Vertex shading
  • Geometry shading
  • Pixel shading
SLIDE 27

 Keep reading Chapter 3
 Keep working on Assignment 1, due this Friday by 11:59
 Keep working on Project 1, due Friday, September 27 by 11:59
 CS Club
  • Tonight from 4-6 p.m. in The Point 113!