COMP4076 / GV07 - Virtual Environments: Tracking and Interaction



  1. COMP4076 / GV07 - Virtual Environments: Tracking and Interaction Wole Oyekoya Department of Computer Science University College London w.oyekoya@cs.ucl.ac.uk http://www.cs.ucl.ac.uk/teaching/VE

  2. Outline • Introduction • Models of Interaction • Interaction Methods

  3. Introduction • Introduction • Models of Interaction • Interaction Methods

  4. Tracking and Interaction [Slide diagram: User, Real Environment, Input Devices, Tracking, Computer, Synthetic Environment; real interaction happens here]

  5. Basic Components of an Interface • The input devices capture user actions • Transfer functions / control-display mappings / interaction techniques – Map the movement of the device into the movement of the controlled elements – Isomorphic: one-to-one mapping between motions in the physical and virtual worlds – Non-isomorphic: input is shifted, scaled, integrated, … • The display devices present the effects of the input to the user
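
As a rough illustration of the transfer-function idea, the sketch below (plain Python, not from the course material) contrasts an isomorphic mapping with a simple scaled, non-isomorphic one; the function names and the gain value are illustrative assumptions.

```python
# Sketch of control-display mappings (transfer functions).
# Assumptions: positions are 3D tuples; the names and the gain value are illustrative.

def isomorphic_mapping(device_delta):
    """One-to-one: the controlled element moves exactly as the device does."""
    return device_delta

def scaled_mapping(device_delta, gain=2.5):
    """Non-isomorphic: device motion is scaled before being applied."""
    return tuple(gain * d for d in device_delta)

# Example: the tracked device moved 2 cm along x since the last frame.
device_delta = (0.02, 0.0, 0.0)
print(isomorphic_mapping(device_delta))   # (0.02, 0.0, 0.0)
print(scaled_mapping(device_delta))       # (0.05, 0.0, 0.0)
```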

  6. What Are the Basic Interaction Tasks? • Locomotion or Travel – How to move through the space • Selection – How to indicate an object of interest • Manipulation – How to change properties of an object of interest • Symbolic – How to enter text and other parameters • System control – How to change the way the system is behaving (symbolic input and system control won't be covered in this lecture)

  7. Challenges in Designing Metaphors? • Designing interaction metaphors for virtual environments is hard: – Six degrees of freedom • Lack of appropriate input devices – Isolated parts of body tracked – Boxing glove or fishing rod style of interaction metaphors • Divide and conquer the problem by identifying basic models of interaction

  8. Models of Interaction • Introduction • Models of Interaction • Interaction Methods

  9. Models of Interaction • Extended Desktop Model – The computer generates a 3D version of the familiar desktop – The user needs tools to do 3D tasks • Virtual Reality Model – The user’s body is an interface to the world – The system responds to everything they do or say

  10. Extended Desktop Model • The desktop is now in 3D • It can extend beyond the boundaries of the computer screen itself • However, the user: – is not part of the scene – has a “God’s eye view”

  11. Interaction in the Extended Desktop • Focus on analysing a task and creating devices that fit the task • Study ergonomics of the device and applicability/suitability for the role • Special-purpose devices can be developed to support interaction

  12. 2D Interaction in a 3D World: Google Earth

  13. Modelling Package (3D Studio Max)

  14. Types of Device to Enable 3D Interaction: Polhemus Isotrak 3-Ball, 3DConnexion Spacemouse, 3DConnexion Spaceball, Logitech 3D Mouse, Inition 3DiStick, Nintendo Wii Remote, Ascension Wanda

  15. GlobeFish and GlobeMouse

  16. Limitations of the Extended Desktop Model • 3D tasks can be quite complicated to perform • Tasks can become very specialised – Counterintuitive – Requires a lot of user training (Image: Fakespace Cubic Mouse)

  17. Virtual Reality Model • Track the user as they move through a genuine 3D space • Need to track the user precisely and interpret what they do • Focus is on users exploring the environment

  18. Interaction in the Virtual Reality Model • Tension between isomorphic and non-isomorphic movements – Isomorphic: one-to-one mapping between motions in the physical and virtual worlds – Non-isomorphic: input is shifted, scaled, integrated, … • Tension between mundane and magical responses of the environment – Mundane are where the dynamics are governed by the familiar laws of physics (Newtonian mechanics) – Magical are everything else (casting spells, automatic doors, etc…)

  19. Body-Relative Interaction Technique based on Proprioception • Provides – Physical frame of reference in which to work – More direct and precise sense of control – “Eyes off” interaction • Enables – Direct object manipulation (for sense of position of object) – Physical Mnemonics (objects fixed relative to body) – Gestural Actions (invoking commands)
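
A minimal sketch of the physical-mnemonics idea, assuming a tracked torso pose (position plus yaw): a virtual widget is stored in a body-relative frame so it stays, for example, at the user's hip wherever they walk or turn. The frame convention, function name and offsets are illustrative, not taken from Mine et al.

```python
import math

def body_to_world(torso_pos, torso_yaw, offset):
    """Transform a body-relative offset (x right, y up, z forward) into world space."""
    c, s = math.cos(torso_yaw), math.sin(torso_yaw)
    ox, oy, oz = offset
    # Rotate the offset about the vertical axis by the torso's yaw, then translate.
    wx = torso_pos[0] + c * ox + s * oz
    wy = torso_pos[1] + oy
    wz = torso_pos[2] - s * ox + c * oz
    return (wx, wy, wz)

hip_menu_offset = (-0.25, -0.40, 0.0)   # fixed relative to the body (metres)
print(body_to_world((1.0, 1.7, 2.0), 0.0, hip_menu_offset))
print(body_to_world((1.0, 1.7, 2.0), math.pi / 2, hip_menu_offset))  # user turned 90 degrees
```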

  20. Hand-Held Widgets • Hold controls in hands, rather than on objects • Use relative motion of hands to effect widget changes Mine, Brooks Jr, Sequin

  21. Gestural Actions • Head-butt zoom • Look-at menus • Two-handed flying • Over-the-shoulder deletion Mine, Brooks Jr, Sequin

  22. Limitations of the Virtual Reality Model • Can't track the user over very large areas – e.g. some form of locomotion metaphor will be required for long-distance travel (see later) • Physical constraints of systems • Limited precision and tracking points • Lack of physical force feedback

  23. Overcoming Lack of Force Feedback • One way to overcome lack of force feedback is to use a haptic device – Will be discussed in another lecture • Another approach is to exploit visual dominance in the interpretation of cues (Images: CyberForce, CyberGrasp)

  24. Visual Dominance • The real hand is not constrained in space • The virtual hand can be constrained in virtual space • Can the user detect the difference? "The Hand is Slower than the Eye: A quantitative exploration of visual dominance over proprioception" Burns, Whitton, Razzaque, McCallus, Panter, Brooks

  25. Visual Dominance • Task: playing the Simon game • Drift between the virtual and real hand is gradually introduced over time
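
A sketch of how such a drift might be introduced, assuming a tracked real-hand position and an elapsed-time input; the drift rate, axis and function name are illustrative assumptions rather than values from the study.

```python
# Gradually offset the rendered (virtual) hand from the tracked (real) hand.
# Assumptions: positions are (x, y, z) in metres; the drift rate is illustrative.

def virtual_hand_position(real_hand_pos, elapsed_s, drift_rate=0.002):
    """Offset the virtual hand along x by a slowly growing amount."""
    drift = drift_rate * elapsed_s   # grows slowly so it stays below the detection threshold
    return (real_hand_pos[0] + drift, real_hand_pos[1], real_hand_pos[2])

real = (0.3, 1.2, 0.5)
for t in (0, 10, 30, 60):            # seconds into the task
    print(t, virtual_hand_position(real, t))
```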

  26. Summary of Interaction Methods • The extended desktop model: – Desktop extends beyond the physical screen – Interactions and devices designed on a case-by-case basis – Potentially more accurate, but counterintuitive and specialised interaction • The virtual reality model: – User's body is the input to the system – Potentially more intuitive but more general – Greater reliance on natural movement and on the ability to track it – Partially resolved using visual dominance for HMDs

  27. Interaction Methods • Introduction • Models of Interaction • Interaction Methods

  28. Basic Interaction Tasks • Locomotion or Travel – How to effect movement through the space • Selection – How to indicate an object of interest • Manipulation – How to move an object of interest (selection and manipulation are logically grouped together) • Symbolic – How to enter text and other parameters • System control – Change mode of interaction or system state (symbolic input and system control won't be covered in this lecture)

  29. Locomotion • Introduction • Models of Interaction • Interaction Methods – Locomotion or Travel Techniques – Selection and Manipulation

  30. Purpose of Locomotion • Change the pose of the viewpoint (both position and attitude) from some start location A to some end location B • This is the most fundamental task for a virtual environment – Arguably if the user can’t change the pose of the viewpoint, it’s not really a virtual environment at all

  31. Types of Travel Techniques • There are two fundamentally different types: • Virtual techniques: – The user's body remains stationary even though the viewpoint moves • Physical techniques: – The user's physical motion is used to transport the user through the virtual world

  32. Virtual Locomotion Techniques • The user remains stationary even though the viewpoint moves • Techniques must be used to specify – Direction – Speed

  33. Taxonomy of Virtual Locomotion Techniques Bowman, Koller and Hodges

  34. Taxonomy of Virtual Locomotion Techniques Bowman, Koller and Hodges

  35. Controlling Direction by Steering • Continuous specification of direction of motion • Direction can be specified by many means: – Gaze-directed – Pointing – Physical device (steering wheel, flight stick)

  36. Steering Techniques • Gaze-based: – Actually uses head orientation – Cognitively simple – Cannot look around whilst travelling • Pointing-based: – Actually uses hand orientation – Cognitively more complicated – Makes it possible to look in one direction whilst travelling in another – However, you can't hold other objects or manipulate them whilst travelling
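
The sketch below illustrates the shared core of both steering techniques: each frame the viewpoint is advanced along a tracked forward direction, taken from the head for gaze-based steering or from the hand for pointing-based steering. The yaw/pitch convention, the speed and the function names are illustrative assumptions.

```python
import math

def forward_vector(yaw, pitch):
    """Unit forward direction from yaw/pitch angles (radians)."""
    return (math.cos(pitch) * math.sin(yaw),
            math.sin(pitch),
            math.cos(pitch) * math.cos(yaw))

def steer(position, tracker_yaw, tracker_pitch, speed, dt):
    """Move the viewpoint along the tracked orientation (head for gaze-based,
    hand for pointing-based steering)."""
    fx, fy, fz = forward_vector(tracker_yaw, tracker_pitch)
    return (position[0] + fx * speed * dt,
            position[1] + fy * speed * dt,
            position[2] + fz * speed * dt)

pos = (0.0, 1.7, 0.0)
pos = steer(pos, tracker_yaw=math.radians(45), tracker_pitch=0.0, speed=1.5, dt=0.016)
print(pos)
```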

  37. Target-Based Steering • Specify discrete target or goal • Multiple ways to specify the goal: – Point at object – Choose from list – Enter coordinates • Once specified, travel to the target is passive • Convenient way to get from A to B, but inflexible
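
A minimal sketch of the passive travel phase of target-based steering, assuming the goal has already been specified; linear, constant-speed interpolation and the chosen values are illustrative assumptions.

```python
# Once the target is chosen, the system moves the viewpoint towards it each frame
# with no further user input.

def step_towards(position, target, speed, dt):
    """Advance at constant speed towards the target; stop on arrival."""
    dx = [t - p for p, t in zip(position, target)]
    dist = sum(d * d for d in dx) ** 0.5
    if dist <= speed * dt:
        return target, True              # arrived
    scale = speed * dt / dist
    return tuple(p + d * scale for p, d in zip(position, dx)), False

pos, arrived = (0.0, 1.7, 0.0), False
while not arrived:
    pos, arrived = step_towards(pos, (4.0, 1.7, 3.0), speed=2.0, dt=0.1)
print(pos)
```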

  38. Route-Based Steering • Generalisation of the target-based metaphor • User specifies multiple waypoints on a map • The system generates a path and passively moves the user along the path • Placement of waypoints controls granularity of movement
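
A sketch of route-based travel built on the same passive-movement helper as the target-based sketch above: the viewpoint is moved waypoint by waypoint along the user-specified route. The waypoint positions and speeds are illustrative assumptions.

```python
# Follow a sequence of user-placed waypoints; denser waypoints give finer-grained
# control over the path.

def step_towards(position, target, speed, dt):
    """Same helper as in the target-based sketch."""
    dx = [t - p for p, t in zip(position, target)]
    dist = sum(d * d for d in dx) ** 0.5
    if dist <= speed * dt:
        return target, True
    scale = speed * dt / dist
    return tuple(p + d * scale for p, d in zip(position, dx)), False

waypoints = [(2.0, 1.7, 0.0), (2.0, 1.7, 4.0), (-1.0, 1.7, 4.0)]   # user-placed on a map
pos = (0.0, 1.7, 0.0)
for wp in waypoints:
    arrived = False
    while not arrived:
        pos, arrived = step_towards(pos, wp, speed=2.0, dt=0.1)
print(pos)   # ends at the last waypoint
```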

  39. Taxonomy of Virtual Locomotion Techniques Bowman, Koller and Hodges

  40. Two-Handed Velocity / Acceleration Selection • Line between hands defines direction of travel • Distance between hands specifies constant speed • Pinch glove gestures provide “nudges” Mine, Brooks Jr, Sequin
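
A sketch of the two-handed velocity idea, assuming tracked left- and right-hand positions: the vector between the hands gives the direction of travel and their separation scales the speed. The scaling factor and the choice of which hand is the origin are illustrative assumptions, and the pinch-glove "nudges" are not modelled.

```python
# Two-handed flying: direction from the line between the hands, speed from their distance.

def two_handed_velocity(left_hand, right_hand, speed_per_metre=2.0):
    """Velocity vector along the line between the hands, magnitude proportional
    to hand separation."""
    d = [r - l for l, r in zip(left_hand, right_hand)]
    dist = sum(c * c for c in d) ** 0.5
    if dist == 0.0:
        return (0.0, 0.0, 0.0)
    speed = speed_per_metre * dist
    return tuple(c / dist * speed for c in d)

print(two_handed_velocity((-0.2, 1.2, 0.3), (0.2, 1.3, 0.1)))
```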

  41. “Grabbing the Air” • Use hand gestures to move through the world • Metaphor is pulling a rope • Often implemented using pinch gloves • Physically occupies hands • Slow • Fatiguing
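
A sketch of the "grabbing the air" update, assuming a pinch flag and tracked hand positions per frame: while the pinch is held, the world is translated with the hand, so pulling the hand towards the body hauls the world towards the user. The names and values are illustrative.

```python
# "Grabbing the air": while pinching, the world moves with the hand, like pulling a rope.

def grab_the_air(world_offset, pinching, hand_pos, prev_hand_pos):
    """Update the world offset while the user pinches and drags."""
    if pinching and prev_hand_pos is not None:
        delta = [h - p for h, p in zip(hand_pos, prev_hand_pos)]
        # The grabbed world moves with the hand, so pulling the hand in pulls the
        # world towards the user.
        world_offset = tuple(w + d for w, d in zip(world_offset, delta))
    return world_offset

offset, prev = (0.0, 0.0, 0.0), None
frames = [(True, (0.3, 1.2, 0.5)), (True, (0.3, 1.2, 0.3)), (False, (0.3, 1.2, 0.6))]
for pinching, hand in frames:
    offset = grab_the_air(offset, pinching, hand, prev)
    prev = hand
print(offset)
```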
