SLIDE 1

Simulation Engines TDA571|DIT030

Miscellaneous, input, collision detection ...

Tommaso Piazza

SLIDE 2

IDC | Interaction Design Collegium

Administrative stuff

  • Tech Demo presentation
  • A few slides presenting the overall concept
  • 1-2 slides for each extension
  • Screen shots and diagrams are welcome
  • The demo itself
  • 18 December in von Neumann, 10.00 sharp

SLIDE 3

Administrative stuff

  • Group project presentation Friday, 18 December at 10.00 sharp

  • Presentation using e.g. PowerPoint
  • The extensions made by every member of the group
  • The simulation engine that has been put together
  • Tech demo
  • Use terms that make it easy for people from the outside to understand your work

  • Use plenty of screenshots
  • Diagram of system architecture
  • Show movie if applicable
  • Walkthrough of the tech demo
  • 15 minutes including questions

SLIDE 4

Administrative stuff

  • Group report
  • The group report should describe the work done as well as the final result

  • 14-18 pages in total
  • Filename
  • simEngines-g#-GroupPresentation
  • in your group’s folder on Google Docs
  • Deadline
  • Monday, 11 January 24.00

SLIDE 5

Administrative stuff: Contents of group report

  • Introduction
  • Purpose
  • Goals
  • Pre-study
  • How does the system look today?
  • What did you want to make better?
  • Project plan and division of themes
  • Specification of demands (for each extension)
  • Functional demands (functionality)
  • Non-functional demands (properties)
  • Analysis
  • Conceptual model
  • Architecture
  • Design (for each extension)
  • Classes
  • Interaction
  • Implementation (for each extension)
  • Integration and testing
  • Tech demo
  • Description
  • Design
  • Implementation
  • Results
  • Screenshots
  • Performance
  • Conclusions
  • List of sources
  • Appendixes: Extension proposals

  • Link to the code repository on Google Code

SLIDE 6

Administrative stuff: Individual report

  • 6-8 pages
  • Contents
  • Description of work tasks assigned to you and your role in the group's work
  • Description of your own contributions to the end result
  • Evaluation of your own work
  • Evaluation of the group's work
  • Description of the experiences and knowledge you have acquired in planning, designing, group dynamics and technical knowledge

  • Project diary as an appendix
  • File name
  • simEngines-g#-YourName-Report
  • in your group’s folder on Google Docs
  • Deadline
  • Monday, 11 January 24.00

SLIDE 7

Administrative stuff: Other stuff to do

  • Source code and data
  • Should be on Google Code; put the link in your group report

  • Put a readme.txt in the root of the file area
  • How to compile your code
  • How to run your tech demo
  • Anything else you can think of to aid me in accessing your work

SLIDE 8

Input management

  • Input management is the process of accepting input data from the player and transforming this into actions in the context of the game world

  • The challenge of input management is the plenitude of input devices that exist in the gaming market today

  • Input management is not strictly about input
  • Force feedback
  • Touch information
  • Etc

SLIDE 9

Virtual devices

  • Virtual devices (Wallace, 1976) come from the field of human-computer interaction

  • Not entirely relevant for us, but it is an interesting concept
  • Each physical input device supports one or several of
  • Button
  • Binary indication of choice (example: joystick and mouse buttons)
  • Keyboard
  • Alphanumeric strings (example: keyboard and voice)
  • Picker
  • Selection of graphical objects (example: light pen)
  • Locator
  • Screen coordinate specification (example: mouse)
  • Valuator
  • Floating point generation (example: joystick throttle, analog joystick)

SLIDE 10

Input management

  • Input arrives in our engine in the form of input events
  • Records containing information about the input
  • A few examples of event data
  • Position
  • Screen position (mouse)
  • Delta
  • Movement delta since last update (mouse, joystick)
  • Button mask
  • Button status (mouse buttons, keyboard)
  • Value
  • Floating-point value (joystick throttle)
  • Events are triggered by e.g.
  • Button press/release
  • A button or key was pressed or released

  • Motion
  • An input device was moved
  • Update
  • A value input device was changed

SLIDE 11

Input events

  • Input is usually handled by triggering discrete input events
  • Events are dispatched to an input handler, which translates input events into actions
  • The event class is usually abstract and subclassed with specific functionality

  • Joystick events
  • Mouse events
  • ...
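The abstract-event-plus-subclasses pattern above can be sketched as follows. This is a minimal illustration, not any particular engine's API; the names InputEvent, MouseMotionEvent, ButtonEvent and InputHandler are invented for the example.

```cpp
#include <string>
#include <vector>

// Abstract base event; subclasses carry device-specific data.
struct InputEvent {
    virtual ~InputEvent() = default;
};

struct MouseMotionEvent : InputEvent {
    int dx, dy;                        // movement delta since last update
    MouseMotionEvent(int dx, int dy) : dx(dx), dy(dy) {}
};

struct ButtonEvent : InputEvent {
    int button;                        // which button or key
    bool pressed;                      // press or release
    ButtonEvent(int b, bool p) : button(b), pressed(p) {}
};

// The handler translates raw input events into game-world actions.
class InputHandler {
public:
    std::vector<std::string> actions;  // recorded actions, for illustration

    void dispatch(const InputEvent& e) {
        if (auto* m = dynamic_cast<const MouseMotionEvent*>(&e)) {
            if (m->dx != 0)
                actions.push_back(m->dx > 0 ? "turn_right" : "turn_left");
        } else if (auto* b = dynamic_cast<const ButtonEvent*>(&e)) {
            if (b->button == 0 && b->pressed)
                actions.push_back("fire");
        }
    }
};
```

A real engine would typically avoid the dynamic_cast (e.g. with a type tag or double dispatch), but the translation step itself is the same.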

SLIDE 12

DirectInput

  • DirectInput is the input management component of the DirectX SDK
  • Traditionally, input management in Windows has been done using the internal Windows message system in the event handling loop

  • Incurs a lot of overhead
  • DirectInput ignores the Windows event queue altogether and gives developers direct access to the input devices connected to the computer

  • DirectInput can be used more or less independently of the rest of DirectX

  • The structure of DirectInput usage will typically be the following:
  • 1. Initialize DirectInput
  • 2. Initialize each input device that is to be used
  • 3. Retrieve input data from each device every loop and modify the game world accordingly
  • 4. Once done, clean up DirectInput

SLIDE 13

DirectInput: Cooperation and device state

  • An important thing to note for DirectInput is the cooperative level of the input devices we are configuring for our application
  • Tells Windows how we want the input device to cooperate with other concurrently running applications
  • We do this with SetCooperativeLevel()
  • In many cases we need to acquire exclusive access to a device in order to use it fully (e.g. for force feedback)

  • Once every frame, we read the device state and act upon it

  • GetDeviceState() is passed a pointer to a device-dependent state structure, which it fills in depending on the device type

SLIDE 14

DirectInput: Force feedback

  • Force feedback devices not only support user input, but can also provide tactile output to the user
  • Can create special effects like the stick vibrating, shaking, jolting, etc.
  • DirectInput contains rich functionality for controlling force feedback devices and provides many kinds of effects
  • Check the DirectX tutorials for more information

SLIDE 15

DirectInput vs XInput

  • DirectInput has not seen any major changes since DirectX 8
  • XInput was introduced in a late DirectX 9 SDK
  • Neither had any major updates in DirectX 10
  • XInput is slightly easier to use, but does not work on legacy devices
  • Primarily for XBOX 360 controllers
  • Both APIs have features that the other does not

SLIDE 16

Collision detection

  • Collision detection is the process of determining whether objects in the game world intersect with each other as well as with the static geometry of the world itself
  • Collision detection can be seen as a special case of the physical simulation we discussed in the previous lecture
  • Note that collision detection is a purely geometric problem, and collision detection algorithms are optimized for this purpose

SLIDE 17

Whether, when and where

  • In collision detection, we are interested in three questions
  • Whether two objects collided with each other
  • When in time the two objects collided
  • Where the surfaces of the two objects collided
  • These questions are increasingly CPU-intensive to answer, and algorithms for collision detection are often organized into phases where the questions are answered sequentially and only if necessary
  • Performing a simple intersection test in a system of N bodies naively (just answering the first question) is of order O(N²), which can be expensive for large values of N
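The naive O(N²) "whether" test can be sketched as follows, here with bounding spheres as a stand-in volume (all names are illustrative). The double loop over all pairs is exactly what makes the cost quadratic in N.

```cpp
#include <cstddef>
#include <utility>
#include <vector>

struct Sphere { float x, y, z, r; };

// "Whether" test for a single pair: compare squared center distance
// with the squared sum of radii (avoids a square root).
bool intersects(const Sphere& a, const Sphere& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    float rsum = a.r + b.r;
    return dx * dx + dy * dy + dz * dz <= rsum * rsum;
}

// Naive all-pairs test: N(N-1)/2 intersection checks, i.e. O(N^2).
std::vector<std::pair<std::size_t, std::size_t>>
collidingPairs(const std::vector<Sphere>& bodies) {
    std::vector<std::pair<std::size_t, std::size_t>> pairs;
    for (std::size_t i = 0; i < bodies.size(); ++i)
        for (std::size_t j = i + 1; j < bodies.size(); ++j)
            if (intersects(bodies[i], bodies[j]))
                pairs.emplace_back(i, j);
    return pairs;
}
```

The phase-based algorithms on the following slides exist precisely to avoid running this inner test on every pair.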

SLIDE 18

Collision response

  • When performing collision detection, we do not worry about physics
  • However, we must be able to interface with the physics system to communicate collisions
  • A common way is to use callback functions that are supplied with parameters like contact point, time, surface normal, etc.

SLIDE 19

Overview of collision detection algorithms

  • We will cover two different types of algorithms
  • Broad phase/narrow phase
  • Two phases are used
  • One broad for culling away objects that cannot possibly collide
  • One narrow for accurate collision detection
  • Pre-defined game-specific collision groups
  • Broad phase using OBB-trees (RAPID)
  • Single phase
  • Use the same partitioning scheme for both phases, i.e. perform no initial broad culling

  • BSP trees for collision detection

SLIDE 20

Representation

  • Most collision detection algorithms deal with convex polyhedra only
  • Non-convex polyhedra are split into a set of convex polyhedra
  • In the culling phase, we use bounding volumes as representations

SLIDE 21

Bounding volume hierarchies

  • To get the most benefit out of our bounding volumes, we superimpose them on top of the scene graph and build bounding volume hierarchies
  • Leaf nodes have normal bounding volumes
  • Internal nodes are aggregations of the bounding volumes of their children
  • Note that we use the same technique for view frustum culling
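The aggregation of child volumes into internal nodes can be sketched like this, assuming axis-aligned boxes for simplicity (the names BVHNode and refit are invented for the example; a production engine would refit incrementally rather than recursing over the whole tree every frame).

```cpp
#include <algorithm>
#include <cstddef>
#include <memory>
#include <vector>

struct AABB {
    float min[3], max[3];
};

// The aggregate of two boxes is the smallest box enclosing both.
AABB merge(const AABB& a, const AABB& b) {
    AABB out;
    for (int i = 0; i < 3; ++i) {
        out.min[i] = std::min(a.min[i], b.min[i]);
        out.max[i] = std::max(a.max[i], b.max[i]);
    }
    return out;
}

struct BVHNode {
    AABB volume;
    std::vector<std::unique_ptr<BVHNode>> children;  // empty => leaf
};

// Recompute internal volumes bottom-up: a leaf keeps its own volume,
// an internal node becomes the merge of its children's volumes.
void refit(BVHNode& node) {
    if (node.children.empty()) return;
    refit(*node.children.front());
    node.volume = node.children.front()->volume;
    for (std::size_t i = 1; i < node.children.size(); ++i) {
        refit(*node.children[i]);
        node.volume = merge(node.volume, node.children[i]->volume);
    }
}
```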

SLIDE 22

OBB-trees

  • The RAPID collision detection system is based on work done by Gottschalk et al. in 1996, which uses a data structure called an OBB-tree
  • In this approach, the object is recursively partitioned into two halfspaces, and an OBB is computed for each halfspace
  • The OBB is calculated by fitting the triangles with a Gaussian distribution using principal component analysis (PCA)

SLIDE 23

Broad/narrow phase

  • Broad phase/narrow phase algorithms are based around two stages: we first cull away object pairs which cannot possibly collide, and then use exact collision detection on the remaining pairs
  • Different algorithms for the two phases for optimal performance
  • The broad phase can use different strategies for culling object pairs, including time coherence, spatial partitioning, or simply the bounding volumes of the objects
  • The narrow phase performs exact collision detection at the primitive level (usually triangles or planes)
  • There exist a number of algorithms, and we will not go into details on these

SLIDE 24

Broad Phase Collision Detection using OBB-Trees

  • Gottschalk (1996) uses the OBB-trees defined earlier to implement a broad phase collision detection system (RAPID is an implementation of this)
  • Each object is represented by an OBB-tree
  • In order to check for intersection between two such trees, each tree is traversed recursively
  • If our simple intersection test tells us that the OBBs in the respective trees intersect, we recurse to the two children of each tree node
  • Objects that do not intersect at all can be discarded quickly
  • The intersection test is slightly more complex than usual
  • Instead of naively performing edge-face tests for the two bounding boxes (which would have resulted in 12 edges × 6 faces × 2 boxes = 144 tests), we project the boxes onto 15 different axes and check their intersection (i.e. only 15 intersection tests needed)
  • See the RAPID webpage for more information and downloads
  • http://www.cs.unc.edu/~geom/OBB/OBBT.html
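The core of the axis-projection idea can be sketched in isolation: project both vertex sets onto a candidate axis and check whether the resulting 1-D intervals overlap; if any axis yields disjoint intervals, the boxes cannot intersect. This is a generic separating-axis sketch under that assumption, not RAPID's actual (heavily optimized) implementation, and the names are illustrative.

```cpp
#include <algorithm>
#include <limits>
#include <vector>

struct Vec3 { float x, y, z; };

float dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

struct Interval { float lo, hi; };

// Project a vertex set onto an axis, yielding a 1-D interval.
Interval project(const std::vector<Vec3>& verts, const Vec3& axis) {
    Interval iv{std::numeric_limits<float>::max(),
                std::numeric_limits<float>::lowest()};
    for (const Vec3& v : verts) {
        float p = dot(v, axis);
        iv.lo = std::min(iv.lo, p);
        iv.hi = std::max(iv.hi, p);
    }
    return iv;
}

// True if some axis in `axes` separates the two vertex sets.
bool separated(const std::vector<Vec3>& a, const std::vector<Vec3>& b,
               const std::vector<Vec3>& axes) {
    for (const Vec3& axis : axes) {
        Interval ia = project(a, axis), ib = project(b, axis);
        if (ia.hi < ib.lo || ib.hi < ia.lo) return true;  // disjoint intervals
    }
    return false;  // no separating axis among the candidates
}
```

For two OBBs, the 15 candidate axes are the 3 + 3 box face normals plus the 9 cross products of their edge directions.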

SLIDE 25

Single phase

  • Single phase collision detection discards the notion of having two phases
  • Instead, the same representation and strategy is used for both stages, and collision detection is done immediately
  • The most common way of doing this is through BSP trees

SLIDE 26

BSP trees for collision detection

  • The BSP tree for an object can be seen as an iterative bounding volume that shrinks the 3D space until it completely encloses the object
  • To test intersection, we simply merge their BSP trees
  • The first object's BSP tree is used to partition the second object

SLIDE 27

Break

  • 15 minutes

SLIDE 28

Alternative platforms

  • In this course, we have focused on development for Windows
  • The reality of the game development industry is that the PC platform is a relatively small part of the market
  • Already in 2003, console games outsold computer games in the US by about 380%

  • Console games
  • Nintendo Gamecube & Wii
  • Sony Playstation 2 & 3
  • Sony PSP
  • Nintendo DS
  • Microsoft XBOX & XBOX 360
  • PDAs
  • Browser-based games
  • Mobile phones

SLIDE 29

Console games

  • Console games are primarily created through cross-platform development
  • Not done on the console itself, but rather on a development platform and then compiled for the console later on
  • Gold describes cross-platform development in Chapter 6
  • Remember the build tools from the early lectures?
  • SCons (cross-platform substitute for the classic Make)

SLIDE 30

Mobile games

  • The extremely high availability of mobile phones has made games on mobile phones an interesting business
  • There are two primary APIs
  • Java 2 Micro Edition (J2ME)
  • A smaller version of the Java platform with very good support on modern mobile phones
  • Binary Runtime Environment for Wireless (BREW)
  • A complete set of APIs that enable software development and applications in C, C++ and Java, developed by Qualcomm
  • Also, the iPhone SDK has a dedicated game part

SLIDE 31

2D graphics

  • Of particular interest to mobile platforms
  • Mobile phones, Nintendo DS, PSP, etc
  • Modern mobile phones have support for 3D, but in practice, 2D is often better than 3D when used with limited input and display size

  • Also used in plenty of Windows games
  • Suitable APIs
  • OpenGL, OpenGL ES, DirectDraw, Java2D, etc

SLIDE 32

GUIs

  • Practically all games need some kind of GUI
  • Since games are often meant to be a departure from reality, we rarely want to use the standard Windows UI
  • GUIs have traditionally been one of the main research areas for object-orientation
  • A lot of the design and architectural patterns we talked about in an earlier lecture are inspired by this research
  • Model-View-Controller, Composite, Decorator, Observer, etc...

SLIDE 33

Example: CEGUI

  • Crazy Eddie’s GUI System (http://www.cegui.org.uk) is a free library providing windowing and widgets for graphics APIs/engines where such functionality is not natively available, or severely lacking
  • The library is object-oriented, written in C++, and targeted at game developers who should be spending their time creating games, not building GUI sub-systems

SLIDE 34

CEGUI: Details

  • CEGUI is cross-platform and can thus be used on several different platforms
  • Developed and released under the LGPL license
  • In order to support various kinds of platforms, CEGUI requires a renderer module which implements the necessary drawing operations for the current platform
  • Currently, CEGUI supports both DirectX and OpenGL, and also has complete bindings for Ogre3D
  • CEGUI is often experienced by users to be quite complex and difficult to get working with Ogre
  • In many cases, simple projects will not need the full functionality that CEGUI provides, but can instead make do with the simple input management and overlays provided by Ogre

SLIDE 35

Resource management

  • Gold outlines a number of problems with the traditional use of file I/O in C/C++ applications

  • Standard file I/O is too slow
  • Limitations on file naming
  • File I/O calls are blocking
  • No control over memory management
  • We need a robust resource management system with the following properties

  • Load files with little or no overhead
  • Load files in bulk with reduced overhead
  • Load files asynchronously to the game
  • Manage memory efficiently

SLIDE 36

Virtual file system

  • One common solution for resource management is to introduce a VFS
  • Abstracts away the real properties of whatever file system is running on the host machine
  • Instead of littering the physical file system with a thousand different files for our project, we can bundle all these together into a single, large file, possibly zipped using zlib
  • We design our VFS to handle these zip files transparently
  • Our resource manager can be designed to keep track of in-memory instances of objects and files and garbage collect them automatically when they are no longer used
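One way to sketch that automatic reclamation is with C++ shared/weak pointers: the manager caches weak references, so a resource lives exactly as long as someone in the game still holds it. The ResourceManager name, the stand-in load, and the file name used below are assumptions for illustration; a real manager would perform actual (and ideally asynchronous) file I/O.

```cpp
#include <map>
#include <memory>
#include <string>

struct Resource {
    std::string name;
    explicit Resource(std::string n) : name(std::move(n)) {}
};

class ResourceManager {
    // weak_ptr entries do not keep resources alive by themselves.
    std::map<std::string, std::weak_ptr<Resource>> cache_;
public:
    std::shared_ptr<Resource> load(const std::string& name) {
        if (auto existing = cache_[name].lock())
            return existing;                            // reuse live instance
        auto fresh = std::make_shared<Resource>(name);  // stand-in for real file I/O
        cache_[name] = fresh;
        return fresh;
    }
    // True while at least one client still holds the resource.
    bool loaded(const std::string& name) const {
        auto it = cache_.find(name);
        return it != cache_.end() && !it->second.expired();
    }
};
```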

SLIDE 37

Scripting

  • One of the most important tools of any game engine is the scripting language
  • A simple programming language on a higher abstraction level than the actual programming language used in the rest of the engine
  • Many attractive features
  • Decouples behavior and content from game code
  • Allows for rapid prototyping and testing, perhaps even dynamically
  • Simpler and more convenient programming features than a “real” programming language
  • Language bindings to in-game concepts such as characters, goals, missions, and levels instead of memory, files, triangles, and algorithms
  • Vital tool for design-driven control

SLIDE 38

Example: UnrealScript

  • UnrealScript is the internal scripting language of the Unreal engine
  • Runs on top of a virtual machine (the Unreal Virtual Machine)
  • Each script defines a class which inherits from another class or a superclass in the engine itself
  • Methods are either pre-defined and called by the engine, or called by other scripts
  • According to Tim Sweeney, the design goals of UnrealScript were
  • To support the major concepts of time, state, properties, and networking which traditional programming languages don’t address
  • To provide Java-style programming simplicity, object-orientation, and compile-time error checking

SLIDE 39

Example: UnrealScript

  • A pointer-less environment with automatic garbage collection
  • A simple single-inheritance class graph
  • Strong compile-time type checking
  • A safe client-side execution ”sandbox”
  • The familiar look and feel of C/C++/Java code
  • To enable rich, high-level programming in terms of game objects and interactions rather than bits and pixels

SLIDE 40

Example: UnrealScript

  class TriggerLight extends Light;

  var() float ChangeTime;   // Time light takes to change from on to off.
  var() bool bInitiallyOn;  // Whether it’s initially on.
  var() bool bDelayFullOn;  // Delay then go full-on.
  ..

  function BeginPlay()
  {
      // Remember initial light type and set new one.
      Disable( 'Tick' );
      InitialType = LightType;
      InitialBrightness = LightBrightness;
      if( bInitiallyOn )
      {
          Alpha = 1.0;
          Direction = 1.0;
      }
      else
      {
          LightType = LT_None;
          Alpha = 0.0;
          Direction = -1.0;
      }
  }
  ...

SLIDE 41

Existing languages

  • Perl
  • The mother of all scripting languages; certainly not the first scripting language, but probably one of the most “hardcore” scripting languages out there
  • Python
  • Object-oriented scripting language with a very clean and elegant design - relatively easy to embed into an existing application
  • Java
  • Not really a scripting language, but sufficiently high-level to be feasible to use with scripting
  • Lua
  • A powerful light-weight programming language designed for extending applications – fairly common in games
  • Ruby
  • A relatively modern object-oriented scripting language
  • Roll your own
  • You can always create your own in the tradition of UnrealScript

SLIDE 42

Functional paradigm

  • Regardless of which language we choose, we have to consider the functional paradigm of our scripting language
  • C++ – used for performance-critical game code
  • Scripting language – intermediate and high-level game-specific code
  • Gold describes three different execution models for our scripting engine
  • Linear control – normal script code executed sequentially
  • Event-driven control – script code is executed upon certain in-game events
  • Task-based control – concurrently executing tasks (multitasking operating system)
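The event-driven model above can be sketched as follows: "script" code is registered against named in-game events and runs only when the engine fires them. C++ lambdas stand in for script functions here, and all names (ScriptEngine, on, fire, the event name) are invented for the example.

```cpp
#include <functional>
#include <map>
#include <string>
#include <vector>

class ScriptEngine {
    // Each named event keeps a list of registered handlers.
    std::map<std::string, std::vector<std::function<void()>>> handlers_;
public:
    // Register a handler ("script") for an event.
    void on(const std::string& event, std::function<void()> script) {
        handlers_[event].push_back(std::move(script));
    }
    // Called by the engine when the event occurs in the game world.
    void fire(const std::string& event) {
        for (auto& script : handlers_[event]) script();
    }
};
```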

SLIDE 43

Language bindings

  • The bindings in the language are the interface to the game engine that the scripting language exposes to the script programmer
  • It is obviously not sufficient to simply provide a scripting language inside our game engine if the inputs and outputs of the language are not connected to the actual inputs and outputs of the game engine
  • We have to consider the following
  • File I/O – reading and writing files on the (real or virtual) file system
  • AI interface – maybe the scripting language should be used to manipulate the AI subsystem?
  • 3D world – game scripts must be able to modify the 3D world state
  • Times and events – most game scripts will use time as a major factor, and we need to provide functionality for this
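A minimal sketch of such a binding table, assuming a toy interpreter that looks up engine functions by name when a script calls them (the Bindings class, the signature used, and the function name below are all invented for illustration; real embeddings use stack-based calling conventions or generators like SWIG).

```cpp
#include <functional>
#include <map>
#include <stdexcept>
#include <string>

class Bindings {
    // Named engine entry points that scripts are allowed to call.
    std::map<std::string, std::function<double(double)>> fns_;
public:
    void bind(const std::string& name, std::function<double(double)> fn) {
        fns_[name] = std::move(fn);
    }
    // What the script interpreter would invoke when it sees a call.
    double call(const std::string& name, double arg) const {
        auto it = fns_.find(name);
        if (it == fns_.end()) throw std::runtime_error("unbound: " + name);
        return it->second(arg);
    }
};
```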

SLIDE 44

Language bindings: SWIG

  • In many cases, it is a good idea to take a look at the SWIG (Simplified Wrapper and Interface Generator) project
  • Standardized and easy way to embed standard scripting languages
  • Supports lots of languages
  • Perl, PHP, Python, Tcl, Ruby, C#, Common Lisp, Java, Modula-3, OCAML, etc...
  • http://www.swig.org

SLIDE 45

Final words

  • Play games, build games and have fun!
  • Project presentations on the 18th, 10.00 sharp, next week!
