
Simulation Engines TDA571|DIT030: Miscellaneous, input, collision detection



  1. Simulation Engines TDA571|DIT030: Miscellaneous, input, collision detection (Tommaso Piazza)

  2. Administrative stuff
     - Tech demo presentation
       - A few slides presenting the overall concept
       - 1-2 slides for each extension
       - Screenshots and diagrams are welcome
     - The demo itself
       - 18 December in von Neumann, 10.00 sharp

  3. Administrative stuff
     - Group project presentation: Friday, 18 December at 10.00 sharp
       - Presentation using e.g. PowerPoint
       - The extensions made by every member of the group
       - The simulation engine that has been put together
       - Tech demo
       - Use terms that make it easy for people from the outside to understand your work
       - Use plenty of screenshots
       - Diagram of the system architecture
       - Show a movie if applicable
       - Walkthrough of the tech demo
       - 15 minutes including questions

  4. Administrative stuff
     - Group report
       - The group report should describe the work done as well as the final result
       - 14-18 pages in total
     - Filename: simEngines-g#-GroupPresentation, in your group's folder on Google Docs
     - Deadline: Monday, 11 January, 24.00

  5. Administrative stuff: Contents of group report
     - Introduction
       - Purpose
       - Goals
     - Pre-study
       - How does the system look today?
       - What did you want to make better?
     - Project plan and division of themes
     - Specification of demands (for each extension)
       - Functional demands (functionality)
       - Non-functional demands (properties)
     - Analysis
       - Conceptual model
       - Architecture
     - Design (for each extension)
       - Classes
       - Interaction
     - Implementation (for each extension)
       - Integration and testing
     - Tech demo
       - Description
       - Design
       - Implementation
     - Results
       - Screenshots
       - Performance
     - Conclusions
     - List of sources
     - Appendixes: extension proposals, link to the code repository on Google Code

  6. Administrative stuff: Individual report
     - 6-8 pages
     - Contents
       - Description of the work tasks assigned to you and your role in the group's work
       - Description of your own contributions to the end result
       - Evaluation of your own work
       - Evaluation of the group's work
       - Description of the experiences and knowledge you have acquired in planning, designing, group dynamics and technical knowledge
       - Project diary as an appendix
     - Filename: simEngines-g#-YourName-Report, in your group's folder on Google Docs
     - Deadline: Monday, 11 January, 24.00

  7. Administrative stuff: Other stuff to do
     - Source code and data
       - Should be on Google Code; put the link in your group report
     - Put a readme.txt in the root of the file area
       - How to compile your code
       - How to run your tech demo
       - Anything else you can think of to aid me in accessing your work

  8. Input management
     - Input management is the process of accepting input data from the player and transforming it into actions in the context of the game world
     - The challenge of input management is the multitude of input devices on the gaming market today
     - Input management is not strictly about input
       - Force feedback
       - Touch information
       - etc.

  9. Virtual devices
     - Virtual devices (Wallace, 1976) come from the field of human-computer interaction
       - Not entirely relevant for us, but it is an interesting concept
     - Each physical input device supports one or several of:
       - Button: binary indication of choice (example: joystick and mouse buttons)
       - Keyboard: alphanumeric strings (example: keyboard and voice)
       - Picker: selection of graphical objects (example: light pen)
       - Locator: screen coordinate specification (example: mouse)
       - Valuator: floating-point value generation (example: joystick throttle, analog joystick)
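
  As a small illustration (mine, not from the slides), the five virtual device types can be modelled as capability flags that a physical device advertises; all names below are hypothetical.

      #include <cstdint>
      #include <cstdio>

      // Hypothetical capability flags for the virtual device types above.
      enum VirtualDevice : std::uint32_t {
          VD_BUTTON   = 1 << 0,  // binary indication of choice
          VD_KEYBOARD = 1 << 1,  // alphanumeric strings
          VD_PICKER   = 1 << 2,  // selection of graphical objects
          VD_LOCATOR  = 1 << 3,  // screen coordinate specification
          VD_VALUATOR = 1 << 4   // floating-point value generation
      };

      int main() {
          // A mouse typically acts as locator + buttons; a joystick adds valuators.
          std::uint32_t mouse    = VD_LOCATOR | VD_BUTTON;
          std::uint32_t joystick = VD_BUTTON | VD_VALUATOR;

          if (mouse & VD_LOCATOR)     std::printf("mouse can specify screen coordinates\n");
          if (joystick & VD_VALUATOR) std::printf("joystick can generate floating-point values\n");
          return 0;
      }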

  10. Input management
     - Input arrives in our engine in the form of input events: records containing information about the input
     - Events are triggered by, for example:
       - Button press/release: a button or key was pressed or released
       - Motion: an input device was moved
       - Update: a value input device was changed
     - A few examples of event data (see the sketch below):
       - Position: screen position (mouse)
       - Delta: movement delta since the last update (mouse, joystick)
       - Button mask: button status (mouse buttons, keyboard)
       - Value: floating-point value (joystick throttle)
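
  A minimal sketch (an assumption, not the course's actual engine code) of an input event record carrying the kinds of data listed above; the struct and field names are made up for illustration.

      #include <cstdint>

      // Hypothetical input event record combining the event data listed above.
      enum class InputEventType { ButtonPress, ButtonRelease, Motion, Update };

      struct InputEvent {
          InputEventType type;        // what triggered the event
          float          x, y;        // position: screen coordinates (mouse)
          float          dx, dy;      // delta: movement since the last update (mouse, joystick)
          std::uint32_t  buttonMask;  // button status (mouse buttons, keyboard)
          float          value;       // floating-point value (joystick throttle)
      };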

  11. Input events
     - Input is usually handled by triggering discrete input events
     - Events are dispatched to an input handler which translates input events into actions
     - The event class is usually abstract and subclassed with specific functionality
       - Joystick events
       - Mouse events
       - ...
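
  One common way to realise this, sketched here under assumed class names (not taken from any particular engine), is an abstract event base class with device-specific subclasses and a handler that maps them to game actions.

      #include <cstdio>

      // Abstract base event, subclassed with device-specific functionality.
      struct InputEvent {
          virtual ~InputEvent() = default;
      };

      struct MouseEvent : InputEvent {
          float x = 0.0f, y = 0.0f;   // screen position
          int   button = -1;          // pressed button, -1 for pure motion
      };

      struct JoystickEvent : InputEvent {
          float throttle = 0.0f;      // valuator-style input
      };

      // The input handler translates input events into game-world actions.
      struct InputHandler {
          void dispatch(const InputEvent& e) {
              if (const MouseEvent* m = dynamic_cast<const MouseEvent*>(&e))
                  std::printf("aim at (%.1f, %.1f)\n", m->x, m->y);
              else if (const JoystickEvent* j = dynamic_cast<const JoystickEvent*>(&e))
                  std::printf("set engine power to %.2f\n", j->throttle);
          }
      };

      int main() {
          InputHandler handler;
          MouseEvent click;                      // normally produced by the input layer
          click.x = 320.0f; click.y = 240.0f; click.button = 0;
          handler.dispatch(click);
          return 0;
      }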

  12. DirectInput
     - DirectInput is the input management component of the DirectX SDK
     - Traditionally, input management in Windows has been done using the internal Windows message system in the event handling loop
       - Incurs a lot of overhead
     - DirectInput ignores the Windows event queue altogether and gives developers direct access to the input devices connected to the computer
     - DirectInput can be used more or less independently of the rest of DirectX
     - The structure of DirectInput usage will typically be the following (see the sketch below):
       1. Initialize DirectInput
       2. Initialize each input device that is to be used
       3. Retrieve input data from each device every loop and modify the game world accordingly
       4. Once done, clean up DirectInput
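
  A rough sketch of steps 1-4 for a keyboard device, using the standard DirectInput 8 API (DirectInput8Create, CreateDevice, GetDeviceState); error handling is omitted and the wrapper function names are my own.

      // Link against dinput8.lib and dxguid.lib.
      #define DIRECTINPUT_VERSION 0x0800
      #include <dinput.h>

      LPDIRECTINPUT8       g_pDI       = nullptr;
      LPDIRECTINPUTDEVICE8 g_pKeyboard = nullptr;

      void InitInput(HINSTANCE hInst, HWND hWnd) {
          // 1. Initialize DirectInput
          DirectInput8Create(hInst, DIRECTINPUT_VERSION, IID_IDirectInput8,
                             (void**)&g_pDI, nullptr);
          // 2. Initialize each input device that is to be used
          g_pDI->CreateDevice(GUID_SysKeyboard, &g_pKeyboard, nullptr);
          g_pKeyboard->SetDataFormat(&c_dfDIKeyboard);
          g_pKeyboard->SetCooperativeLevel(hWnd, DISCL_FOREGROUND | DISCL_NONEXCLUSIVE);
          g_pKeyboard->Acquire();
      }

      void PollInput() {
          // 3. Retrieve input data from the device every loop and act on it
          BYTE keys[256] = {};
          if (SUCCEEDED(g_pKeyboard->GetDeviceState(sizeof(keys), keys))) {
              if (keys[DIK_ESCAPE] & 0x80) {
                  // e.g. leave the game
              }
          } else {
              g_pKeyboard->Acquire();   // the device may have been lost; try to reacquire
          }
      }

      void ShutdownInput() {
          // 4. Once done, clean up DirectInput
          if (g_pKeyboard) { g_pKeyboard->Unacquire(); g_pKeyboard->Release(); }
          if (g_pDI)       g_pDI->Release();
      }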

  13. DirectInput: Cooperation and device state
     - An important thing to note for DirectInput is the cooperative level of the input devices we are configuring for our application
       - Tells Windows how we want the input device to cooperate with other concurrently running applications
       - We do this with SetCooperativeLevel()
       - In many cases we need to acquire exclusive access to a device in order to use it fully (e.g. for force feedback)
     - Once every frame, we read the device state and act upon it
       - GetDeviceState() is passed a pointer to a device-dependent state structure, which it fills depending on the device type
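
  Continuing the sketch above, this is roughly how a mouse could be configured with a chosen cooperative level and then read once per frame with GetDeviceState(); the wrapper functions and the choice of flags are assumptions for illustration.

      #define DIRECTINPUT_VERSION 0x0800
      #include <dinput.h>

      extern LPDIRECTINPUTDEVICE8 g_pMouse;   // created earlier via IDirectInput8::CreateDevice

      void ConfigureMouse(HWND hWnd, bool exclusive) {
          g_pMouse->SetDataFormat(&c_dfDIMouse);
          // Tell Windows how this device should cooperate with other applications.
          DWORD level = DISCL_FOREGROUND | (exclusive ? DISCL_EXCLUSIVE : DISCL_NONEXCLUSIVE);
          g_pMouse->SetCooperativeLevel(hWnd, level);
          g_pMouse->Acquire();
      }

      void ReadMouse(float& dx, float& dy, bool& leftDown) {
          // GetDeviceState fills a device-dependent structure; DIMOUSESTATE for mice.
          DIMOUSESTATE state = {};
          if (SUCCEEDED(g_pMouse->GetDeviceState(sizeof(state), &state))) {
              dx = (float)state.lX;                       // movement delta since the last read
              dy = (float)state.lY;
              leftDown = (state.rgbButtons[0] & 0x80) != 0;
          } else {
              g_pMouse->Acquire();                        // reacquire if access was lost
          }
      }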

  14. DirectInput: Force feedback
     - Force feedback devices not only support user input, but can also provide tactile output to the user
       - Can create special effects like the stick vibrating, shaking, jolting, etc.
     - DirectInput contains rich functionality for controlling force feedback devices and provides many kinds of effects
     - Check the DirectX tutorials for more information
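
  As a hedged sketch in the spirit of the DirectX force feedback tutorials (the wrapper function and parameter values are illustrative, and the device is assumed to be acquired with exclusive access), a constant-force jolt can be created and played roughly like this:

      #define DIRECTINPUT_VERSION 0x0800
      #include <dinput.h>

      LPDIRECTINPUTEFFECT CreateJolt(LPDIRECTINPUTDEVICE8 pDevice) {
          DWORD axes[1]      = { DIJOFS_X };        // apply the force along the X axis
          LONG  direction[1] = { 0 };
          DICONSTANTFORCE cf = { DI_FFNOMINALMAX }; // full-strength constant force

          DIEFFECT eff = {};
          eff.dwSize                = sizeof(DIEFFECT);
          eff.dwFlags               = DIEFF_CARTESIAN | DIEFF_OBJECTOFFSETS;
          eff.dwDuration            = 200000;       // 0.2 s, in microseconds
          eff.dwGain                = DI_FFNOMINALMAX;
          eff.dwTriggerButton       = DIEB_NOTRIGGER;
          eff.cAxes                 = 1;
          eff.rgdwAxes              = axes;
          eff.rglDirection          = direction;
          eff.cbTypeSpecificParams  = sizeof(DICONSTANTFORCE);
          eff.lpvTypeSpecificParams = &cf;

          LPDIRECTINPUTEFFECT pEffect = nullptr;
          pDevice->CreateEffect(GUID_ConstantForce, &eff, &pEffect, nullptr);
          if (pEffect) pEffect->Start(1, 0);        // play the effect once
          return pEffect;
      }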

  15. DirectInput vs XInput
     - DirectInput has not seen any major changes since DirectX 8
     - XInput was introduced in a late DirectX 9 SDK
     - Neither had any major updates in DirectX 10
     - XInput is slightly easier to use, but does not work on legacy devices
       - Primarily for Xbox 360 controllers
     - Both APIs have features that the other does not
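
  For comparison, polling an Xbox 360 controller with XInput is considerably shorter; the wrapper below is a sketch (controller index 0 is assumed to be connected, link against an XInput import library).

      #include <windows.h>
      #include <xinput.h>

      bool PollPad(float& leftStickX, bool& aPressed) {
          XINPUT_STATE state = {};
          if (XInputGetState(0, &state) != ERROR_SUCCESS)
              return false;                              // controller not connected

          // Thumbstick values are signed 16-bit; normalize roughly to [-1, 1].
          leftStickX = state.Gamepad.sThumbLX / 32767.0f;
          aPressed   = (state.Gamepad.wButtons & XINPUT_GAMEPAD_A) != 0;

          // Rumble is XInput's counterpart to force feedback.
          XINPUT_VIBRATION vib = {};
          vib.wLeftMotorSpeed = aPressed ? 30000 : 0;    // motor speed 0..65535
          XInputSetState(0, &vib);
          return true;
      }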

  16. Collision detection
     - Collision detection is the process of determining whether objects in the game world intersect with each other as well as with the static geometry of the world itself
     - Collision detection can be seen as a special case of the physical simulation we discussed in the previous lecture
     - Note that collision detection is a purely geometric problem, and collision detection algorithms are optimized for this purpose

  17. Whether, when and where
     - In collision detection, we are interested in three questions
       - Whether two objects collided with each other
       - When in time the two objects collided
       - Where the surfaces of the two objects collided
     - These questions are increasingly CPU-intensive to answer, so collision detection algorithms are often organized into phases where the questions are answered sequentially and only if necessary
     - Naively performing a simple intersection test in a system of N bodies (just answering the first question) is of order O(N²), which can be expensive for large values of N (see the sketch below)
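
  To make the O(N²) figure concrete, here is a naive "whether" test over N bounding spheres; the sphere representation is my own choice for illustration.

      #include <cstdio>
      #include <vector>

      struct Sphere { float x, y, z, r; };

      // Two spheres overlap if the distance between their centres is at most
      // the sum of their radii (compared squared to avoid a square root).
      bool Intersects(const Sphere& a, const Sphere& b) {
          float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
          float rsum = a.r + b.r;
          return dx * dx + dy * dy + dz * dz <= rsum * rsum;
      }

      int main() {
          std::vector<Sphere> bodies = { {0, 0, 0, 1}, {1.5f, 0, 0, 1}, {10, 0, 0, 1} };
          // Every pair is tested once: N(N-1)/2 tests, i.e. O(N²).
          for (size_t i = 0; i < bodies.size(); ++i)
              for (size_t j = i + 1; j < bodies.size(); ++j)
                  if (Intersects(bodies[i], bodies[j]))
                      std::printf("bodies %zu and %zu collide\n", i, j);
          return 0;
      }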

  18. Collision response
     - When performing collision detection, we do not worry about physics
     - However, we must be able to interface with the physics system to communicate collisions
     - A common way is to use callback functions that are supplied with parameters like contact point, time, surface normal, etc.
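
  A minimal sketch of such a callback interface (all names are assumptions, not a specific engine's API): the collision system invokes a registered function with the contact data, and the physics system decides how to respond.

      // Hypothetical contact record handed to the physics system.
      struct Contact {
          float point[3];    // where the surfaces touched
          float normal[3];   // surface normal at the contact
          float time;        // when within the simulated step the contact occurred
          int   bodyA, bodyB;
      };

      typedef void (*ContactCallback)(const Contact& contact, void* userData);

      struct CollisionWorld {
          ContactCallback onContact = nullptr;
          void*           userData  = nullptr;

          // Called by the collision detection code for every detected contact.
          void ReportContact(const Contact& c) {
              if (onContact) onContact(c, userData);
          }
      };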

  19. Overview of collision detection algorithms
     - We will cover two different types of algorithms
     - Broad phase/narrow phase
       - Two phases are used: one broad phase for culling away objects that cannot possibly collide, and one narrow phase for accurate collision detection
       - Pre-defined game-specific collision groups
       - Broad phase using OBB trees (RAPID)
     - Single phase
       - Use the same partitioning scheme for both phases, i.e. perform no initial broad culling
       - BSP trees for collision detection
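
  A sketch of the broad phase/narrow phase idea, assuming axis-aligned bounding boxes (AABBs) for the broad phase and leaving the narrow phase as a placeholder (e.g. an OBB-tree test as in RAPID):

      #include <utility>
      #include <vector>

      struct AABB { float min[3], max[3]; };

      // Broad phase test: two boxes overlap unless they are separated on some axis.
      bool Overlap(const AABB& a, const AABB& b) {
          for (int i = 0; i < 3; ++i)
              if (a.max[i] < b.min[i] || b.max[i] < a.min[i])
                  return false;
          return true;
      }

      // Placeholder for an accurate narrow phase test (e.g. against OBB trees).
      bool NarrowPhase(int /*bodyA*/, int /*bodyB*/) { return true; }

      std::vector<std::pair<int, int>> FindCollisions(const std::vector<AABB>& boxes) {
          std::vector<std::pair<int, int>> hits;
          for (size_t i = 0; i < boxes.size(); ++i)
              for (size_t j = i + 1; j < boxes.size(); ++j)
                  if (Overlap(boxes[i], boxes[j]) &&      // broad phase culls most pairs
                      NarrowPhase((int)i, (int)j))        // narrow phase only for survivors
                      hits.emplace_back((int)i, (int)j);
          return hits;
      }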

  20. Representation
     - Most collision detection algorithms deal with convex polyhedra only
       - Non-convex polyhedra are split into a set of convex polyhedra
     - In the culling phase, we use bounding volumes as representations (see the sketch below)
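
  As an example of a bounding volume for the culling phase, a simple (valid but not minimal) bounding sphere can be computed from a convex piece's vertices; the types below are assumptions.

      #include <cmath>
      #include <vector>

      struct Vec3 { float x, y, z; };
      struct BoundingSphere { Vec3 center; float radius; };

      // Centre the sphere on the vertex centroid and take the largest distance
      // to any vertex as the radius (verts is assumed to be non-empty).
      BoundingSphere ComputeBoundingSphere(const std::vector<Vec3>& verts) {
          Vec3 c = {0, 0, 0};
          for (const Vec3& v : verts) { c.x += v.x; c.y += v.y; c.z += v.z; }
          float n = (float)verts.size();
          c.x /= n; c.y /= n; c.z /= n;

          float r2 = 0.0f;
          for (const Vec3& v : verts) {
              float dx = v.x - c.x, dy = v.y - c.y, dz = v.z - c.z;
              float d2 = dx * dx + dy * dy + dz * dz;
              if (d2 > r2) r2 = d2;
          }
          return { c, std::sqrt(r2) };
      }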
