Simulation Engines TDA571|DIT030 Multimedia and scenegraphs - PowerPoint PPT Presentation

SLIDE 1

Simulation Engines TDA571|DIT030 Multimedia and scenegraphs

Tommaso Piazza

SLIDE 2

IDC | Interaction Design Collegium

Administrative stuff

  • Student Representatives
  • People with orange or red extension

SLIDE 3

Introduction

  • Multimedia is everything about the external appearance of the game that is not directly related to the real-time rendering

  • More specifically
  • Music
  • MP3 and Ogg audio formats
  • Contextual event-driven music
  • Sound effects
  • File formats
  • 3D sound
  • Environmental audio
  • Video
  • Cutscenes (using video or in-game rendering and scripting)
  • Graphical User Interface (GUI) (Covered in another lecture)

SLIDE 4

Example: Data screens in Doom3

  • Doom3 uses a lot of interesting multimedia
  • In-game computer screens are interactive and show animations that help progress the game

  • Cutscenes are done in-game with the engine

SLIDE 5

Fundamentals of digital audio

  • Audio is a pressure wave in the air
  • Digitally represented by discrete samples
  • Each sample represents the amplitude of the wave at a given time

  • The frequency of samples
  • Sample rate
  • The sample size
  • 8 or 16 bit mostly
  • Number of channels
  • 1 for mono, 2 for stereo
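The sampling model above can be sketched in a few lines of C++: each discrete sample stores the wave's amplitude at one instant, quantized to the sample size (here 16 bits). This is an illustrative sketch; the function name and parameters are made up, not from any audio API.

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>
#include <vector>

// Generate one second of a mono sine tone as discrete samples:
// the amplitude of the wave at each sample instant, quantized to 16 bits.
std::vector<int16_t> sineTone(double freqHz, int sampleRate) {
    const double kPi = 3.14159265358979323846;
    std::vector<int16_t> samples(sampleRate); // 1 second, 1 channel
    for (int i = 0; i < sampleRate; ++i) {
        double t = static_cast<double>(i) / sampleRate;  // time of this sample
        double a = std::sin(2.0 * kPi * freqHz * t);     // amplitude in [-1, 1]
        samples[i] = static_cast<int16_t>(a * 32767.0);  // 16-bit quantization
    }
    return samples;
}
```

Doubling the sample rate or the sample size doubles the storage: raw CD-quality stereo audio costs 44100 × 2 bytes × 2 channels ≈ 172 KB per second.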

SLIDE 6

Sample rates

  • Telephone quality
  • 11kHz, 1 channel
  • Radio quality
  • 22kHz, 1 or 2 channels
  • CD quality
  • 44.1kHz, 2 channels
  • Studio quality
  • > 48kHz, 24 bits, up to 24 tracks
  • The human ear is sensitive up to 20kHz; the human voice has a spectrum of ~4kHz

  • Nyquist-Shannon theorem
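The Nyquist-Shannon theorem is why these rates were chosen: a signal can only be reconstructed if the sample rate exceeds twice its highest frequency, and anything above that limit folds back as an alias. A small sketch with illustrative helper names:

```cpp
#include <cassert>
#include <cmath>

// Nyquist-Shannon: to capture frequencies up to fMax without aliasing,
// sample at at least twice fMax.
double minSampleRate(double fMaxHz) { return 2.0 * fMaxHz; }

// Frequency a pure tone appears at after sampling: components above the
// Nyquist limit (sampleRate / 2) fold back into the representable band.
double aliasedFrequency(double toneHz, double sampleRate) {
    double f = std::fmod(toneHz, sampleRate);
    return (f <= sampleRate / 2.0) ? f : sampleRate - f;
}
```

This is why 44.1kHz covers the ~20kHz range of human hearing, and why ~8kHz telephone-grade sampling is enough for the ~4kHz voice spectrum.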

SLIDE 7

Windows WaveOut

  • Standard component on all flavors of Windows
  • Provides basic functionality for playing raw sound data
  • Easy to use and works on most hardware
  • Corresponding WaveIn also part of the WaveForm API
  • How to use
  • 1. Open the desired sound device with waveOutOpen()
  • 2. Allocate a memory buffer and a WAVEHDR structure
  • 3. Call waveOutPrepareHeader() to prepare for playback
  • 4. Fill in the buffer with sound data
  • 5. Call waveOutWrite() to play it
  • 6. Call waveOutReset() to force the device to release all buffers
  • 7. Call waveOutUnprepareHeader() to release resources
  • 8. Deallocate the buffer memory
  • 9. Call waveOutClose() to release the sound device
  • Typically not used in games except for basic fallback
  • Slow, not optimized for games and restricted to the Windows platform

SLIDE 8

3D and environmental sound

  • Throughout the years, sound playback has been improved with support for
  • 3D sound (positioning)
  • Environmental sound (effects that emulate the size, shape and properties of the surrounding area)
  • Sources
  • Physical sound sources. Includes information about 3D position, velocity and direction
  • Listeners
  • Global listener instance. Represents the 3D position and velocity of the player receiving the sound
  • Environment
  • Information about the properties of the environment, including its size, shape and other physical attributes. Often tied to the listener

SLIDE 9

3D sound – Speaker configuration

  • 3D sound must take into consideration the hardware configuration of the listener
  • There are many standard setups and formats
  • Headphones
  • Stereo systems
  • Dolby Digital 5.1
  • DTS
  • ...
  • Usually this is abstracted away by the sound API

SLIDE 10

Creative EAX

  • Creative's API for environmental audio
  • One of the first with commercial support
  • Handles environmental effects
  • 3D is handed off to DirectSound, OpenAL, etc
  • 3.0 was called EAX Advanced HD; the latest version, on the X-Fi, is 5.0 with 128 voices and up to 4 effects on each voice
  • Multiple environments
  • Instead of a single set of environment properties tied to the listener, later EAX versions support concurrent environments that affect sound differently

  • Morphing
  • Morphing of environmental properties as the listener moves
  • Panning
  • Advanced control of the movement of sound in a 3D acoustic space
  • Reflection
  • Functionality for simulating reflections and echoes of sound against surfaces in 3D

SLIDE 11

DirectSound

  • The sound component of the DirectX SDK
  • Optimized for real-time applications
  • Better than WaveOut/In but less hardware support
  • DirectSound3D gives support for 3D positioning and environmental effects
  • Eats more CPU but can be hardware accelerated
  • Slowly being replaced by XAudio2 and XACT
  • XP, Vista and the Xbox 360 (part of XNA)
  • DirectSound on Vista lacks support for hardware acceleration

SLIDE 12

OpenAL

  • Designed to be the “OpenGL of audio”
  • Provides a standardized API for accessing 3D audio
  • Uses extensions for adding future functionality (same as OpenGL)
  • No standardized support for environmental effects, but available with extensions

SLIDE 13

OpenAL: Fundamentals

  • OpenAL uses a simple model of 3D acoustics
  • Three kinds of entities
  • Sources
  • A source of sound in the world. Linked to one or more buffers containing sound data. Also contains information about position and velocity.
  • Listener
  • An “ear” in the world representing the player, including position and velocity.
  • Buffers
  • Memory buffers for the storage of sound data for playback. Controlled by sources and never manipulated directly. In many cases they are transferred to the sound card prior to playback and are thus immutable.

SLIDE 14

OpenAL: Buffer creation & loading

  • After initializing OpenAL we need to create our buffers

ALuint buffer;
alGenBuffers(1, &buffer);

  • Each buffer is denoted by a handle (integer)
  • Fill buffers with data with

alBufferData(buffer, format, data, size, freq);

  • OpenAL does not in itself contain loaders for common file formats

SLIDE 15

ALUT

  • Luckily, OpenAL, like OpenGL, comes with a simple portable API called ALUT with support for a lot of common tasks

ALenum format;
ALsizei size, freq;
ALboolean loop;
ALvoid *data;
alutLoadWAVFile(“sound.wav”, &format, &data, &size, &freq, &loop);

SLIDE 16

OpenAL: Source creation

  • To create a source

ALuint source;
alGenSources(1, &source);

  • Bind the buffer to a source

alSourcei(source, AL_BUFFER, buffer);

  • Set 3D properties

alSourcefv(source, AL_POSITION, sourcePos);
alSourcefv(source, AL_VELOCITY, sourceVel);
alSourcefv(source, AL_DIRECTION, sourceDir);

SLIDE 17

OpenAL: Listeners

  • Since there can only be one listener, it does not need to be created
  • Configure its 3D position

alListenerfv(AL_POSITION, listenerPos);
alListenerfv(AL_VELOCITY, listenerVel);
alListenerfv(AL_ORIENTATION, listenerOri);

  • listenerOri is an array of 6 floats defining both the “forward” and “up” vectors

SLIDE 18

OpenAL: Playing sounds

  • Playing operations

alSourcePlay(source);
alSourceStop(source);
alSourceRewind(source);
alSourcePause(source);

  • To clean up

alDeleteSources(NUM_SOURCES, source);
alDeleteBuffers(NUM_BUFFERS, buffer);

SLIDE 19

FMOD

  • FMOD is a cross-platform audio library supporting the same functionality on a wide range of hardware and platforms
  • Includes support for software sound mixing, 3D audio, music playback, Dolby support, etc
  • Free for non-commercial use (no sources)
  • $1000-4000 per platform for commercial use
  • $100 “shareware/hobbyist” license exists as well

SLIDE 20

Music

  • Sound effects help suspend disbelief
  • Music helps with immersion
  • Extremely important
  • Older games used CD audio
  • Sometimes even for voiceovers (Loom)
  • Gives little opportunity for processing

SLIDE 21

MP3

  • Developed by the Fraunhofer Institute in 1997
  • ISO-MPEG Audio Layer 3, MP3
  • Non-audible frequency principle
  • Sound sources next to other sound sources with higher intensity are masked. Example: a bird singing next to a running car is hard to hear and can be eliminated
  • Principle of less important signals
  • The parts of the sound spectrum less important to fidelity can be eliminated
  • Commonly used in games, but the format is proprietary and jealously guarded by Fraunhofer, so using it requires a substantial licensing fee

SLIDE 22

Ogg Vorbis

  • Open source equivalent to MP3
  • Freely available
  • Arguably better sound quality than MP3 at equivalent bitrates
  • There are free libraries available for use with Ogg Vorbis

SLIDE 23

MOD music

  • Short for “module”
  • Devised for the Amiga computers
  • Capable of playing four simultaneous sounds
  • Stores both samples and the actual melodies
  • Generally smaller than MP3/OGG
  • Relatively easy to manipulate
  • Change pitch
  • Change tempo
  • Inject tunes

SLIDE 24

Event-driven music

  • Instead of using a single linear song, the music can be structured into smaller segments, controlled by events in the game
  • Allows for immersive effects
  • One of the first was Michael Z. Land of LucasArts with the iMUSE system (used in X-Wing)
  • Two components are needed
  • Control logic
  • Commands and scripts that control the flow of music, including the transition between segments
  • Segments
  • Audio segments that can be combined into in-game music
  • We want support for both simultaneous and end-to-end combinations

SLIDE 25

Event-driven music

SLIDE 26

Cutscenes

  • A common storytelling tool where the story is progressed by non- or semi-interactive sequences of events shown to the player
  • Good examples: the Wing Commander series
  • $12 million budget (WC4), most of it cutscenes
  • In the past, cutscenes have been pre-rendered
  • Nowadays, they are mostly done in the game's engine in real-time

SLIDE 27

Cutscenes

SLIDE 28

Video

  • If they are made from film footage or are pre-rendered animations, cutscenes are stored as video

  • MNG
  • Sibling of the PNG-format, but for animations. Good for simple sequences.
  • Video for Windows
  • Uses AVI-files and can use a multitude of codecs (DivX, Xvid, etc)
  • MPEG
  • Standard format. MPEG-2 is the DVD-format.
  • Quicktime
  • Apple's video format. Uses .MOV-files and supports multiple codecs.
  • DirectShow (ActiveMovie)
  • Microsoft's API for video programming for DirectX
  • Smacker and Bink
  • Commercial SDK and codec specifically designed for games programming

SLIDE 29

In-game cutscenes

  • Current 3D engines are powerful enough to allow for in-game cutscenes
  • Provides for more dynamic control of the contents
  • Maintains the immersion of the game
  • Some requirements for this
  • Powerful scripting engine
  • Artists must be given tools for scripting entities, cameras, characters, etc using scripting languages or other tools.
  • Facial animation
  • Cutscenes typically involve dialogue. The engine needs to be capable of rendering characters in closeup.
  • The art of creating real-time rendered films is referred to as “machinima”

SLIDE 30

Facial animation

  • Believable facial animation is extremely difficult to achieve
  • 5 categories according to Watt & Policarpo
  • Simple movement of the head
  • Simple movement of the eyes
  • Complex simultaneous deformation of different parts of the face to form expressions
  • Speech – complex deformation/movement of the lips, jaws and tongue
  • Appropriate gestures, particularly of, but not restricted to, arms and hands
  • In practice this is often achieved through a combination of skeletal animations and morph targets

SLIDE 31

Tool design

  • There are multiple ways to generate in-game cutscenes; here are a few
  • Demo editing
  • Human “actors” control the characters using regular in-game facilities and the action is recorded. Requires a large “cast” and probably a lot of post-processing.
  • In-map scripting
  • Designers and artists add the scripted behavior to each map using the normal level editor. Requires tool support and a strong scripting language.
  • Independent scripting
  • In-game scripting tools help the designers control the cutscene as if they were directors on a set, controlling placement of cameras and actors.

SLIDE 32

Uses for in-game cutscenes

  • Dialogue
  • Allows for natural conversations in the games
  • Introduction of plot elements
  • Single shots or sequences of shots that introduce new locations, creatures, etc.
  • Plot development and mission briefings
  • Guide the players towards what they are intended to do in the next interactive portion of the game (Command & Conquer series)
  • Introductions and conclusions
  • Currently, most games still pre-render these, but that is about to change

SLIDE 33

The language of film

  • Framing
  • The positioning of characters, camera and objects within a shot
  • Cinematic shot placement
  • Establishing shot
  • Used to establish the location of a scene
  • Wide shot
  • A shot of all the members of a conversation, to establish their positions
  • Long shot
  • Full body of a character, used to introduce a character
  • Medium close-up
  • Common conversation shot - just above or below the upper chest to above the head
  • Close-up
  • Just above or below the shoulders, used for conversation; adds more weight to the statements or reactions

SLIDE 34

The language of film

  • Over the shoulder shot
  • A medium or wider shot that focuses on one character while having part of the other character in shot
  • Two-shot
  • A closer version of the wide shot that allows some emotion to be shown while establishing position
  • Conversational camerawork
  • The line
  • Between two characters having an active dialogue there lies an invisible line over which the camera can rarely cross without disorienting the audience and making them feel uncomfortable
  • Talking room
  • Any action on the part of a character has a certain amount of space on-screen beyond the point from which that character emerges, to “balance” that action
  • Shot sequence
  • Non-linear editing of the various shots into a coherent sequence

SLIDE 35

For more information

  • Try out the tools available in Half-life 2
  • Check out http://www.machinima.com

SLIDE 36

Break

http://xkcd.com

SLIDE 37

Scene graphs

  • Scenes could be contained in flat lists, but this would be inefficient for complex scenes
  • Instead we use hierarchical data structures called scene graphs
  • Hierarchy based on spatial locality
  • Based on trees
  • Polymorphic nodes
  • A tree is based on nodes
  • One parent
  • A number of children
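The node structure described above (polymorphic nodes, one parent, any number of children) can be sketched as a minimal C++ class. This is an illustrative skeleton, not any particular engine's API:

```cpp
#include <cassert>
#include <memory>
#include <vector>

// Minimal polymorphic scene-graph node: one parent, any number of children.
struct Node {
    Node* parent = nullptr;
    std::vector<std::unique_ptr<Node>> children;

    virtual ~Node() = default; // polymorphic: subclasses add geometry, lights, ...

    Node* addChild(std::unique_ptr<Node> child) {
        child->parent = this; // exactly one parent per node
        children.push_back(std::move(child));
        return children.back().get();
    }
};
```

Because each node owns its children, deleting a subtree is just deleting its root, which is one reason hierarchical scenes are easier to manage than flat lists.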

SLIDE 38

Why scene graphs?

  • Locality of reference
  • Events of interest to a player tend to occur in the same spatial region. Scene graphs allow for quick elimination of peripheral regions.
  • Content management
  • 3D content is typically created and assembled in pieces by artists and modelers. A hierarchical graph makes life easier when managing these objects.
  • Hierarchical objects
  • Many objects are naturally modeled with a hierarchy (such as humanoid characters)

SLIDE 39

Why scene graphs?

  • Persistence
  • Hierarchical organization makes saving and restoring the state of the world simple.
  • Heterogeneity
  • A polymorphic data structure like a scene graph promotes heterogeneous node types, allowing for lots of different node types, not just basic geometry objects.
  • While this is also possible in a flat object list, the spatial nature of the scene graph makes this approach very natural.

SLIDE 40

Node representation

  • Each node needs the following information
  • Transforms
  • Linear transformation, usually represented by a matrix
  • Bounding volume
  • Bounding volume fully containing the geometry of the node and its children
  • Render state
  • State flags for the 3D renderer (texture, material, color, depth tests, etc)

  • Animation state
  • Time-varying node data

SLIDE 41

Node type examples

  • Group
  • Static 3D geometry
  • Dynamic 3D geometry
  • Terrain patch
  • Skybox/skydome
  • Spotlight/pointlight
  • User interface node
  • Animation node
  • 3D sound source
  • 3D text
  • ...

SLIDE 42

Node: Transforms

  • All nodes should contain their linear transformation relative to their parent
  • Local coordinate system
  • Final transformation by concatenating transforms from root to current node
  • How the transformation is stored varies, but the base representation tends to be 4x4 matrices
  • Quaternions
  • Separate translation, etc
  • May cache the world transformation
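Root-to-node concatenation can be illustrated with plain 4x4 matrices: a node's world transform is its parent's world transform multiplied by the node's local transform. A minimal row-major sketch (the helper names are made up for illustration):

```cpp
#include <array>
#include <cassert>

using Mat4 = std::array<double, 16>; // row-major 4x4 matrix

Mat4 identity() {
    Mat4 m{};
    for (int i = 0; i < 4; ++i) m[i * 4 + i] = 1.0;
    return m;
}

Mat4 translation(double x, double y, double z) {
    Mat4 m = identity();
    m[3] = x; m[7] = y; m[11] = z; // last column holds the translation
    return m;
}

// world = parent * local: applied repeatedly from the root down to a node.
Mat4 multiply(const Mat4& a, const Mat4& b) {
    Mat4 r{};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j)
            for (int k = 0; k < 4; ++k)
                r[i * 4 + j] += a[i * 4 + k] * b[k * 4 + j];
    return r;
}
```

Caching the result per node avoids redoing this multiplication chain every frame when nothing above the node has moved.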

SLIDE 43

Node: Bounding volumes

  • In order to exploit spatial locality we need bounding volumes on all nodes
  • A bounding volume is a convex 3D shape containing all the geometry of the node and its children
  • Can be used for
  • Collision detection
  • View frustum culling
  • Bounding volumes may need to be recomputed
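A common concrete choice is the axis-aligned bounding box (AABB): a parent's volume is the merge of its children's boxes, which is why volumes are recomputed bottom-up when geometry moves. A sketch with illustrative names:

```cpp
#include <algorithm>
#include <cassert>

// Axis-aligned bounding box: a simple convex bounding volume.
struct AABB {
    double min[3];
    double max[3];
};

// The smallest box containing both inputs: used when a parent node
// recomputes its bounding volume from its children's volumes.
AABB merge(const AABB& a, const AABB& b) {
    AABB r;
    for (int i = 0; i < 3; ++i) {
        r.min[i] = std::min(a.min[i], b.min[i]);
        r.max[i] = std::max(a.max[i], b.max[i]);
    }
    return r;
}
```

Testing a frustum or ray against this box first lets the traversal skip the whole subtree on a miss.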

SLIDE 44

Node: Render state

  • Maintaining render states in a hierarchical fashion has many benefits
  • Lights which only affect certain regions, etc.
  • When performing a render traversal of a scene graph, the renderer collects the render state through simple “push”/“pop” operations
  • Can minimize the amount of costly render state changes
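The push/pop collection can be sketched with a stack: entering a node pushes a copy of the inherited state with the node's own overrides applied, and leaving the node pops it. This is an illustrative sketch, not a real renderer's API:

```cpp
#include <cassert>
#include <stack>
#include <string>

// One piece of render state; an empty field means "inherit from parent".
struct RenderState {
    std::string texture;
};

// Stack used during render traversal: push on entering a node, pop on leaving.
class StateStack {
    std::stack<RenderState> stack_;
public:
    StateStack() { stack_.push(RenderState{}); } // default (root) state
    void push(const RenderState& s) {
        RenderState merged = stack_.top();
        if (!s.texture.empty()) merged.texture = s.texture; // override only what is set
        stack_.push(merged);
    }
    void pop() { stack_.pop(); }
    const RenderState& current() const { return stack_.top(); }
};
```

Because children inherit the merged state, a texture or material set once on a decorator node applies to a whole subtree without reissuing the state change per object.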

SLIDE 45

Animation state

  • Animation state is node data that changes as a function of time
  • Often with interpolations between various linear transformations

  • Can be accomplished with controllers
  • A lot can be animated
  • Scale, position, rotation, texture, fog, etc
  • Hierarchical animation structures can be useful
  • Synchronize animations on characters, etc

SLIDE 46

Node types

  • Three classes of nodes in a scene graph
  • Grouping nodes
  • Internal nodes in the tree with any number of children. Spatially and semantically collect groups of nodes
  • Leaf nodes
  • The business end of the scene graph. Usually 3D primitives and meshes with a visual representation.
  • Decorator nodes
  • Grouping nodes with only one child, used to invisibly “decorate” a subtree. Often used for render and animation state changes.

SLIDE 47

Grouping nodes

SLIDE 48

Leaf nodes

SLIDE 49

Decorator nodes

SLIDE 50

Traversals

  • The purpose of a scene graph is not only to store the contents of a scene
  • Update traversal
  • Computes world transformation matrices, animation data, bounding volumes, etc
  • Render traversal
  • Walks through the scene graph and draws 3D objects if they are visible
  • Pick traversal
  • Decides which object a user has clicked on

SLIDE 51

Update traversal

  • The scene graph contains a certain amount of volatile state that changes as the scene graph changes
  • World transformations
  • Bounding volumes
  • Animation state
  • Most changes only affect a node and its children
  • Update traversals are typically performed once per render cycle
  • Recursive traversal for most things
  • Bounding volumes are computed on the upward return call

SLIDE 52

Update traversal

# Update the node N given the world matrix M
function update(N : node, M : matrix) : bool
    # Calculate the absolute matrix of this node
    Matrix W = N.calcWorldMatrix(M)
    bool bv_changed = false
    # Visit the children
    for each child c in N:
        # Update this child
        bv_changed |= update(c, W)
    # Did the bounding volume change?
    if (bv_changed):
        # Recompute the bounding volumes of this node
        bv_changed = N.calcBoundingVolume()
    # Return the result
    return bv_changed

SLIDE 53

Render traversals

# Render the node N using the abstract renderer R
procedure render(N : node, R : renderer)
    # Set the current transformation matrix into the renderer
    R.loadMatrix(N.getWorldMatrix())
    # Set renderer state (store old)
    R.pushState()
    R.setState(N.getRenderState())
    # Visit the children
    for each child c in N:
        # Check the bounding volume of this child
        if (!R.isVisible(c)):
            continue
        # Render this child
        render(c, R)
    # Restore old render state
    R.popState()

SLIDE 54

Render traversal

  • The most basic scene graph traversal
  • Draws primitives according to render state
  • Can perform view frustum culling on nodes
  • Node entirely in view frustum -> Draw graphics for all children
  • Node partially in view frustum -> Recursively test children
  • Node not in frustum -> Do nothing

SLIDE 55

Pick traversal

  • Used when the user clicks on the screen and we need to deduce which 3D object has been clicked on
  • Uses the current view frustum and builds a ray that starts from the user's viewpoint and passes through the viewport at the coordinates clicked upon
  • Recursively test the scene graph and sort the hits by distance from the viewport
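The ray test at the heart of picking can be illustrated against a bounding sphere; the returned distance along the ray is what the hits are sorted by. An illustrative helper, assuming a normalized ray direction:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };
static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }

// Ray-sphere intersection against a node's bounding sphere.
// o: ray origin, d: normalized ray direction, c: sphere center, r: radius.
// On a hit in front of the origin, writes the distance into *t.
bool raySphere(Vec3 o, Vec3 d, Vec3 c, double r, double* t) {
    Vec3 oc = sub(c, o);
    double proj = dot(oc, d);                 // closest approach along the ray
    double dist2 = dot(oc, oc) - proj * proj; // squared distance ray-to-center
    if (dist2 > r * r) return false;          // ray misses the sphere
    double half = std::sqrt(r * r - dist2);
    *t = proj - half;                         // first intersection
    return *t >= 0.0;
}
```

A cheap test like this against the bounding volume decides whether the traversal recurses into a node's children at all; exact per-triangle tests are only needed on the leaves that survive.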

SLIDE 56

Pick traversal

# Perform 3D picking of the node N using the ray R
procedure pick(N : node, R : ray, L : node_list)
    # Adapt ray to current world matrix
    R.setTransform(N.getWorldMatrix())
    # Visit the children
    for each child c in N:
        # Intersect ray with this bounding volume
        if (!R.intersects(c)):
            continue
        # Perform picking for this child
        pick(c, R, L)

SLIDE 57

Existing scene graph APIs

  • SGI Performer
  • “Industrial-strength” scene graph API developed by Silicon Graphics for OpenGL.
  • OpenSG
  • Freely available scene graph API with multiprocessing support. More geared towards 3D visualization than games, but possible to use with games.
  • Java3D
  • 3D API for the Java platform
  • NVSG
  • Nvidia's scene graph. Modern and has strong support for programmable shaders using CgFX.

SLIDE 58

Scene graphs in Ogre3D

  • Ogre3D supports a simple scene graph hierarchy based entirely on graphical objects
  • Read up on
  • Ogre::Node
  • Ogre::SceneNode
  • Ogre::Entity
  • All SceneManagers in Ogre3D contain a root SceneNode to which all visible objects must be attached

  • There are plenty of different scene graphs for Ogre3D
  • OctTree
  • Paging landscape
  • ...

SLIDE 59

Scene graphs in Ogre3D

...
// Create the node and entity
SceneManager *mSceneMgr = mRoot->getSceneManager(ST_EXTERIOR_CLOSE);
SceneNode *node = mSceneMgr->getRootSceneNode()->createChildSceneNode("Node1");
Entity *entity = mSceneMgr->createEntity("Mesh", "test.mesh");
node->attachObject(entity);

// Transform the node
node->translate(Vector3(10, 0, 10));
node->scale(0.5, 1, 2);
node->yaw(Degree(-90));
...
