Week 2 - Friday: Graphics rendering pipeline - PowerPoint PPT Presentation
SLIDE 1

Week 2 - Friday

SLIDE 2

 What did we talk about last time?
 Graphics rendering pipeline

  • Geometry Stage
SLIDE 3
SLIDE 4
SLIDE 5
SLIDE 6
SLIDE 7

 We're going to start by drawing a 3D model

  • Eventually, we'll go back and create our own primitives

 Like other MonoGame content, the easiest way to manage it is to add it to your Content folder and load it through the Content Management Pipeline

  • MonoGame can load (some) .fbx, .x, and .obj files
  • Note that getting just the right kind of files (with textures or not) is sometimes challenging

SLIDE 8

 First, we declare a member variable to hold the model
 Then we load the model in the LoadContent() method

Model model;

model = Content.Load<Model>("Ship");

SLIDE 9

 To draw anything in 3D, we need a world matrix, a view matrix, and a projection matrix
 Since you'll need these repeatedly, you could store them as members

Matrix world = Matrix.CreateTranslation(new Vector3(0, 0, 0));
Matrix view = Matrix.CreateLookAt(new Vector3(0, 0, 7), new Vector3(0, 0, 0), Vector3.UnitY);
Matrix projection = Matrix.CreatePerspectiveFieldOfView(0.9f, (float)GraphicsDevice.Viewport.Width / GraphicsDevice.Viewport.Height, 0.1f, 100.0f);

SLIDE 10

 The world matrix controls how the model is translated, scaled, and rotated with respect to the global coordinate system
 This code makes a matrix that moves the model 0 units in x, 0 units in y, and 0 units in z

  • In other words, it does nothing

Matrix world = Matrix.CreateTranslation(new Vector3(0, 0, 0));
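To make the "does nothing" claim concrete, here is a small Python sketch (illustrative only, not MonoGame code) of the row-vector translation-matrix convention XNA/MonoGame uses: the offset sits in the bottom row, and a point (x, y, z, 1) multiplies the matrix on the left.

```python
def create_translation(tx, ty, tz):
    # 4x4 translation matrix in the XNA/MonoGame row-vector convention:
    # the offset lives in the bottom row.
    return [
        [1, 0, 0, 0],
        [0, 1, 0, 0],
        [0, 0, 1, 0],
        [tx, ty, tz, 1],
    ]

def transform(point, m):
    # Treat the 3D point as (x, y, z, 1) and multiply it on the left.
    v = (point[0], point[1], point[2], 1)
    return tuple(sum(v[i] * m[i][j] for i in range(4)) for j in range(3))

# Translating by (0, 0, 0) leaves every point unchanged: the slide's
# world matrix really is a no-op.
print(transform((1, 2, 3), create_translation(0, 0, 0)))   # (1, 2, 3)
print(transform((1, 2, 3), create_translation(5, 0, -2)))  # (6, 2, 1)
```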

SLIDE 11

 The view matrix sets up the orientation of the camera
 The easiest way to do so is to give

  • Camera location
  • What the camera is pointed at
  • Which way is up
  • This camera is at (0, 0, 7), looking at the origin, with positive y as up

Matrix view = Matrix.CreateLookAt(new Vector3(0, 0, 7), new Vector3(0, 0, 0), Vector3.UnitY);
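The look-at construction can be sketched outside MonoGame as well; the Python below (an illustration of the math, not the library's implementation) builds the right-handed camera basis the same way and maps a world point into camera space.

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def look_at(eye, target, up):
    # Right-handed camera basis, as in Matrix.CreateLookAt:
    # the camera looks down its local -z axis.
    zaxis = normalize(tuple(e - t for e, t in zip(eye, target)))
    xaxis = normalize(cross(up, zaxis))
    yaxis = cross(zaxis, xaxis)

    def to_camera_space(p):
        # Rotate into the camera basis and subtract the eye's contribution.
        return (dot(p, xaxis) - dot(eye, xaxis),
                dot(p, yaxis) - dot(eye, yaxis),
                dot(p, zaxis) - dot(eye, zaxis))
    return to_camera_space

# The slide's camera: at (0, 0, 7), looking at the origin, with +y as up.
view = look_at((0, 0, 7), (0, 0, 0), (0, 1, 0))
print(view((0, 0, 0)))  # (0.0, 0.0, -7.0): the origin is 7 units in front of the camera
```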

SLIDE 12

 The projection matrix determines how the scene is projected into 2D

 It can be specified with

  • Field of view in radians
  • Aspect ratio of screen (width / height)
  • Near plane location
  • Far plane location

Matrix projection = Matrix.CreatePerspectiveFieldOfView(0.9f, (float)GraphicsDevice.Viewport.Width / GraphicsDevice.Viewport.Height, 0.1f, 100.0f);
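The near and far plane locations determine how camera-space z maps to the [0, 1] depth range. The Python sketch below reproduces just the z-related terms of an XNA-style right-handed perspective matrix (an illustration of the math, not the library code) and checks the two endpoints.

```python
def perspective_depth(z_camera, near, far):
    # The z-related terms of an XNA-style right-handed perspective matrix.
    # The camera looks down -z; depths come out in [0, 1].
    m33 = far / (near - far)          # z-scale entry
    m43 = near * far / (near - far)   # z-translation entry
    z_clip = z_camera * m33 + m43
    w_clip = -z_camera                # the perspective divide uses -z
    return z_clip / w_clip

near, far = 0.1, 100.0
print(perspective_depth(-near, near, far))  # ~0.0 at the near plane
print(perspective_depth(-far, near, far))   # ~1.0 at the far plane
```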

SLIDE 13

 Drawing the model is done by drawing all the individual meshes that make it up
 Each mesh has a series of effects

  • Effects are used for texture mapping, visual appearance, and other things
  • They need to know the world, view, and projection matrices

foreach (ModelMesh mesh in model.Meshes)
{
    foreach (BasicEffect effect in mesh.Effects)
    {
        effect.World = world;
        effect.View = view;
        effect.Projection = projection;
    }
    mesh.Draw();
}

SLIDE 14

 I did not properly describe an important optimization done in the Geometry Stage: backface culling
 Backface culling removes all polygons that are not facing toward the screen
 A simple dot product is all that is needed
 This step is done in hardware in MonoGame and OpenGL
 You just have to turn it on
 Beware: If you screw up your normals, polygons could vanish
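A minimal Python sketch of that dot-product test (illustrative, not the hardware implementation; it assumes counter-clockwise winding, as seen from the front, marks the front face):

```python
def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def is_back_facing(v0, v1, v2, view_dir):
    # Face normal from the winding order of the vertices:
    # counter-clockwise as seen from the front is assumed here.
    edge1 = tuple(q - p for p, q in zip(v0, v1))
    edge2 = tuple(q - p for p, q in zip(v0, v2))
    normal = cross(edge1, edge2)
    # The "simple dot product": back-facing when the normal
    # points along the viewing direction.
    return dot(normal, view_dir) >= 0

view_dir = (0, 0, -1)  # camera looking down the -z axis
front = ((0, 0, 0), (1, 0, 0), (0, 1, 0))  # CCW from the camera's side
print(is_back_facing(*front, view_dir))                        # False: keep it
print(is_back_facing(front[0], front[2], front[1], view_dir))  # True: cull it
```

Flipping the winding (or the normals) makes the same triangle "vanish", exactly as the slide warns.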

SLIDE 15

 For API design, practical top-down problem solving, hardware design, and efficiency, rendering is described as a pipeline
 This pipeline contains three conceptual stages:

  • Application: produces material to be rendered
  • Geometry: decides what, how, and where to render
  • Rasterizer: renders the final image

SLIDE 16
SLIDE 17
SLIDE 18

 The goal of the Rasterizer Stage is to take all the transformed geometric data and set colors for all the pixels in the screen space
 Doing so is called:

  • Rasterization
  • Scan Conversion

 Note that the word pixel is actually a portmanteau of "picture element"

SLIDE 19

 As you should expect, the Rasterization Stage is also divided into a pipeline of several functional stages:

  • Triangle Setup
  • Triangle Traversal
  • Pixel Shading
  • Merging

SLIDE 20

 Data for each triangle is computed
 This could include normals
 This is boring anyway because fixed-operation (non-customizable) hardware does all the work

SLIDE 21

 Each pixel whose center is overlapped by a triangle must have a fragment generated for the part of the triangle that overlaps the pixel
 The properties of this fragment are created by interpolating data from the vertices
 Again, boring, fixed-operation hardware does this
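A Python sketch of the idea (simple linear interpolation with barycentric weights; real hardware additionally does perspective-correct interpolation, which is omitted here):

```python
def barycentric(p, a, b, c):
    # Barycentric weights of screen-space point p in triangle (a, b, c).
    denom = (b[1]-c[1])*(a[0]-c[0]) + (c[0]-b[0])*(a[1]-c[1])
    wa = ((b[1]-c[1])*(p[0]-c[0]) + (c[0]-b[0])*(p[1]-c[1])) / denom
    wb = ((c[1]-a[1])*(p[0]-c[0]) + (a[0]-c[0])*(p[1]-c[1])) / denom
    return wa, wb, 1.0 - wa - wb

def fragment_at(pixel_center, tri, values):
    # Generate a fragment only if the pixel center is covered by the
    # triangle, then interpolate a per-vertex attribute (color, depth, ...).
    w = barycentric(pixel_center, *tri)
    if min(w) < 0:
        return None  # pixel center not overlapped: no fragment
    return sum(wi * vi for wi, vi in zip(w, values))

tri = ((0.0, 0.0), (4.0, 0.0), (0.0, 4.0))
print(fragment_at((1.0, 1.0), tri, (0.0, 4.0, 8.0)))  # 3.0
print(fragment_at((5.0, 5.0), tri, (0.0, 4.0, 8.0)))  # None (no coverage)
```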

SLIDE 22

 This is where the magic happens
 Given the data from the other stages, per-pixel shading (coloring) happens here
 This stage is programmable, allowing for many different shading effects to be applied
 Perhaps the most important effect is texturing or texture mapping

SLIDE 23

 Texturing is gluing a (usually) 2D image onto a polygon
 To do so, we map texture coordinates onto polygon coordinates
 Pixels in a texture are called texels
 This is fully supported in hardware
 Multiple textures can be applied in some cases
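A minimal nearest-neighbor texel lookup shows the core of that mapping; this Python sketch ignores the filtering, wrapping modes, and mipmaps that real hardware provides.

```python
def sample_nearest(texture, u, v):
    # Nearest-neighbor texture lookup: map (u, v) in [0,1]^2 to a texel.
    # texture is a row-major grid of texel values.
    height = len(texture)
    width = len(texture[0])
    x = min(int(u * width), width - 1)   # clamp so u == 1.0 stays in range
    y = min(int(v * height), height - 1)
    return texture[y][x]

# A tiny 2x2 checkerboard texture of grey levels.
tex = [[0, 255],
       [255, 0]]
print(sample_nearest(tex, 0.1, 0.1))  # 0 (top-left texel)
print(sample_nearest(tex, 0.9, 0.1))  # 255 (top-right texel)
```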

SLIDE 24

 The final screen data containing the colors for each pixel is stored in the color buffer
 The merging stage is responsible for merging the colors from each of the fragments from the pixel shading stage into a final color for a pixel
 Deeply linked with merging is visibility: The final color of the pixel should be the one corresponding to a visible polygon (and not one behind it)

SLIDE 25

 To deal with the question of visibility, most modern systems use a Z-buffer or depth buffer
 The Z-buffer keeps track of the z-values for each pixel on the screen
 As a fragment is rendered, its color is put into the color buffer only if its z value is closer than the current value in the z-buffer (which is then updated)
 This is called a depth test
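The depth test is easy to sketch in Python; here smaller z means closer, and the fragments are hypothetical (x, y, z, color) tuples rather than any MonoGame type.

```python
def render_fragments(fragments, width, height):
    # Sketch of the merging stage's depth test. Each fragment is an
    # (x, y, z, color) tuple; smaller z means closer to the camera here.
    color = [[None] * width for _ in range(height)]
    depth = [[float("inf")] * width for _ in range(height)]
    for x, y, z, c in fragments:
        if z < depth[y][x]:      # the depth test
            depth[y][x] = z      # ...and the z-buffer update
            color[y][x] = c
    return color

frags = [(0, 0, 5.0, "red"), (0, 0, 2.0, "blue"), (0, 0, 9.0, "green")]
print(render_fragments(frags, 1, 1))  # [['blue']] regardless of draw order
```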

SLIDE 26

 Pros

  • Polygons can usually be rendered in any order
  • Universal hardware support is available

 Cons

  • Partially transparent objects must be rendered in back-to-front order (painter's algorithm)
  • Completely transparent values can mess up the z-buffer unless they are checked
  • z-fighting can occur when two polygons have the same (or nearly the same) z values
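The first con can be illustrated with a tiny Python sketch of painter's-algorithm alpha blending: partially transparent fragments must be composited back to front, an ordering requirement the z-buffer alone cannot remove.

```python
def blend_back_to_front(fragments, background):
    # Each fragment is (z, color, alpha), with larger z = farther away.
    # Partially transparent fragments must be composited back to front,
    # so sort on depth first before alpha-blending.
    result = background
    for z, color, alpha in sorted(fragments, key=lambda f: -f[0]):
        result = alpha * color + (1 - alpha) * result
    return result

# Two 50%-transparent panes over a black background (grey levels 0-255).
frags = [(1.0, 255.0, 0.5), (5.0, 100.0, 0.5)]
print(blend_back_to_front(frags, 0.0))  # 152.5
```

Blending the same two fragments front to back instead would give a different (wrong) color, which is why draw order matters here and not for opaque geometry.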

SLIDE 27

 A stencil buffer can be used to record a rendered polygon

  • This stores the part of the screen covered by the polygon and can be used for special effects

 Frame buffer is a general term for the set of all buffers
 Different images can be rendered to an accumulation buffer and then averaged together to achieve special effects like blurring or antialiasing
 A back buffer allows us to render off screen to avoid popping and tearing
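A Python sketch of the accumulation-buffer idea, with "images" reduced to flat lists of grey levels: averaging several slightly jittered renders softens edges, which is the basis of this style of antialiasing.

```python
def accumulate(images):
    # Average several renders pixel-by-pixel, as an accumulation buffer
    # does; images are flattened lists of grey levels for simplicity.
    n = len(images)
    return [sum(px) / n for px in zip(*images)]

# Two jittered renders of the same edge: averaging softens the transition.
print(accumulate([[0, 0, 255], [0, 255, 255]]))  # [0.0, 127.5, 255.0]
```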

SLIDE 28

 This pipeline is focused on interactive graphics

  • Micropolygon pipelines are usually used for film production
  • Predictive rendering applications usually use ray tracing renderers

 The old model was the fixed-function pipeline, which gave little control over the application of shading functions
 The book focuses on programmable GPUs, which allow all kinds of tricks to be done in hardware

SLIDE 29
SLIDE 30
SLIDE 31

 GPU architecture

  • Programmable shading
SLIDE 32

 Read Chapter 3
 Start on Assignment 1, due next Friday, September 13 by 11:59
 Keep working on Project 1, due Friday, September 27 by 11:59
 Amazon Alexa Developer meetup

  • Thursday, September 12 at 6 p.m.
  • Here at The Point
  • Hear about new technology
  • There might be pizza…
SLIDE 33

 Want a Williams-Sonoma internship?

  • Visit http://wsisupplychain.weebly.com/

 Interested in coaching 7-18 year old kids in programming?

  • Consider working at theCoderSchool
  • For more information:

    ▪ Visit https://www.thecoderschool.com/locations/westerville/
    ▪ Contact Kevin Choo at kevin@thecoderschool.com
    ▪ Ask me!