CSC 2524, Fall 2017: VR Stereo+Optics. Karan Singh. Inspired and adapted from Oliver Kreylos.



SLIDE 1

CSC 2524, Fall 2017 VR Stereo+Optics

Karan Singh

Inspired and adapted from Oliver Kreylos

SLIDE 2

Outline

  • Real-world visual perception.
  • How VR emulates it.
  • Problems and consequences of the emulation in VR.

SLIDE 3

Vision

SLIDE 4

Vision

SLIDE 5

Vision in Room VR

SLIDE 6

Vision in Room VR

SLIDE 7

Vision in Room VR

SLIDE 8

Vision in Room VR

SLIDE 9

Vision in Room VR

SLIDE 10

User Movement

SLIDE 11

Vision in Room VR

SLIDE 12

Vision in Room VR

SLIDE 13

Vision in VR

SLIDE 14

Vision in Room VR

SLIDE 15

Vision in Room VR

SLIDE 16

Vision in Room VR

SLIDE 17

Head-Mounted Displays

SLIDE 18

Head-Mounted Displays

SLIDE 19

Head-Mounted Displays

SLIDE 20

Head-Mounted Displays

SLIDE 21

Optics

SLIDE 22

Accommodation

The eye’s lens changes shape (“accommodates”) to focus at different depths.
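
Accommodation demand is usually quoted in dioptres, simply the reciprocal of the focusing distance in metres; a minimal sketch (the sample distances are illustrative):

```python
def accommodation_demand(distance_m: float) -> float:
    """Lens power change (in dioptres) needed to focus at `distance_m`,
    relative to a relaxed eye focused at infinity (0 D)."""
    return 1.0 / distance_m

# Closer objects demand disproportionately more accommodation:
for d in (0.25, 0.5, 1.0, 2.0):
    print(f"{d:4.2f} m -> {accommodation_demand(d):.1f} D")
```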

SLIDE 23

HMD Optics

SLIDE 24

HMD Optics

SLIDE 25

HMD Optics

SLIDE 26

Head-Mounted Displays

SLIDE 27

Head-Mounted Displays

SLIDE 28

Lens Distortion

SLIDE 29

Lens Distortion

SLIDE 30

Lens Correction
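
The usual software approach, presumably what this slide illustrates, is to pre-warp the rendered image with a radial polynomial that cancels the pincushion distortion the lens introduces; a minimal sketch with made-up coefficients (k1 and k2 here are illustrative, not from any real headset):

```python
def barrel_predistort(x: float, y: float, k1: float = 0.22, k2: float = 0.24):
    """Pre-distort a point (x, y) in lens-centred normalized coordinates
    using the radial polynomial r' = r * (1 + k1*r^2 + k2*r^4).
    Rendering through this warp counteracts the lens's pincushion
    distortion, so the user sees a rectilinear image."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# The lens centre is untouched; points near the edge move outward.
print(barrel_predistort(0.0, 0.0))  # (0.0, 0.0)
print(barrel_predistort(0.5, 0.5))
```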

SLIDE 31

Configuration

SLIDE 32

Configuration

SLIDE 33

Physiognomy

SLIDE 34

Configuration

SLIDE 35

Configuration

SLIDE 36

Configuration

SLIDE 37

How to measure your IPD

SLIDE 38

Misconfiguration

SLIDE 39

Misconfiguration

SLIDE 40

Misconfiguration

SLIDE 41

Misconfiguration (depth inaccuracy)

SLIDE 42

What VR Needs

  • Good screens and lenses
  • Good internal calibration
  • High-precision head tracking
  • Good user calibration
  • Ideally eye tracking
  • Low end-to-end latency
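
To make the latency item concrete: motion-to-photon latency is the sum of every pipeline stage between a head movement and light leaving the display, and roughly 20 ms is the commonly cited comfort target. The per-stage times below are assumptions for illustration, not measurements:

```python
# Assumed per-stage latencies in milliseconds; real numbers vary
# by tracker, GPU, application, and display hardware.
stages_ms = {
    "tracking + sensor fusion": 2.0,
    "application + render": 11.0,   # roughly one frame at 90 Hz
    "scanout + pixel switching": 6.0,
}

total = sum(stages_ms.values())
print(f"motion-to-photon: {total:.1f} ms "
      f"({'within' if total <= 20.0 else 'over'} a 20 ms budget)")
```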

SLIDE 43

End-to-end Latency

SLIDE 44

End-to-end Latency

SLIDE 45

End-to-end Latency

SLIDE 46

End-to-end Latency

SLIDE 47

What Else Can Go Wrong?

  • Artificial locomotion
  • Mismatch between “seen” and “felt” motion
  • Vection-vestibular conflict

SLIDE 48

Accommodation and Vergence Conflict

Why do virtual objects close to my face appear blurry when wearing a VR headset? My vision is fine! And why does the real world look strange immediately after a long VR session?

SLIDE 49

Vergence

SLIDE 50

Accommodation-Vergence Coupling

How do our eyes “accommodate” or determine lens focus?

A blurriness reflex: foveal vision is clearer than peripheral, so the lens adjusts until the fovea sees a sharp image. And because we have two eyes, fixation depth is also signalled by “vergence”, the rotation of the eyes toward a common fixation point.
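
The coupling can be made concrete: the vergence angle follows directly from the interpupillary distance and the fixation depth, while in an HMD accommodation stays pinned at the optics’ fixed focal distance. A minimal sketch; the 63 mm IPD and 1.5 m focal distance are assumed values for illustration:

```python
import math

IPD_M = 0.063  # assumed interpupillary distance: 63 mm

def vergence_angle_deg(distance_m: float) -> float:
    """Angle between the two eyes' lines of sight when fixating a
    point `distance_m` straight ahead."""
    return math.degrees(2.0 * math.atan2(IPD_M / 2.0, distance_m))

# In an HMD the screen's virtual image sits at one fixed optical
# distance (assume ~1.5 m), so accommodation is constant while
# vergence tracks the virtual object; the conflict grows as virtual
# objects approach the face.
for d in (0.25, 0.5, 1.5, 10.0):
    print(f"fixate {d:5.2f} m: vergence {vergence_angle_deg(d):5.2f} deg, "
          f"accommodation fixed at 1.5 m")
```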

SLIDE 51

A-V Conflict Effects

  • Blurry objects
  • Eye strain
  • Accommodation-vergence decoupling
  • Vision feels “off” for a while after using VR
  • Might interfere with vision development in very young children
  • Potential solutions:
      • Lenses that allow different screen distances.
      • True holographic displays.

SLIDE 52

Projects

Tempest projects

  • Storm: fake storm, audience-driven.

idea: estimate audience interest/gaze on stage using some optical technology, e.g. by processing colored ballcaps worn by the audience with a camera on the ceiling, or with a library like DensePose. Use the estimated attention, or lack of it, to produce glitches in a projected audio-visual storm, conveying that it is a manufactured storm. http://densepose.org/

  • Prospero’s brush: Tilt Brush for natural phenomena.

idea: a Tilt Brush-like interface where you paint out a dynamic landscape with trees and waterfalls. For inspiration see https://www.youtube.com/watch?v=uthd5rLJZtg http://www.dgp.toronto.edu/~karan/videos/drive_clip.wmv https://www.youtube.com/watch?v=TckqNdrdbgk https://www.youtube.com/watch?v=GSbkn6mCfXE https://www.youtube.com/watch?time_continue=7&v=qj2XxB2dsco

  • Ariel’s magic: projective painting in real-time.

idea: have a user view their environment via a 360 camera and overlay drawings on a tablet; the drawings are projected back onto the environment in real-time. https://www.youtube.com/watch?v=KYKyqCsmMAU

  • Drawing on surfaces in AR/VR

idea: drawing on an object in 2D is best handled by projecting the sketched 2D points through the screen onto the visible objects. The best way to project an in-air 3D stroke onto 3D objects is not known. A good technique needs to be both intuitive and provide instant feedback, so that a user is able to correctly produce the on-surface strokes they desire. We already have a working prototype for this; it will need to be improved and tested for usability using the Vive in VR. https://www.youtube.com/watch?v=PSD_nISLolY https://www.youtube.com/watch?v=vBos8A_cwSM

SLIDE 53

Projects

  • Facial Animation in VR

idea: use voice and hand gestures to control an animated face in VR. http://jaliresearch.com/

  • Proprioceptive interfaces in AR/VR.

idea: perform a study to understand human proprioceptive zones and design an interface of menus and commands to exploit the zones.

  • Pointing in AR/VR (mirror pointing).

idea: perform a study to understand human pointing at targets and build a data-driven model to predict pointed targets.

  • Interactive 3D acquisition and scanning of large spaces with AR

idea: create a gestural interface for building a 3D model of spaces using AR/VR. https://www.youtube.com/watch?v=Xnp3_eMYXj0

  • Direct manipulation, browsing of linked 360 images and video.

idea: given a number of 360 images of spaces with common features, create a system that allows a user to browse the collection using familiar mobile hand gestures. As an example, here are some 360 images that are not spatially collocated but are shown as hotspots you can switch between using gaze: http://demos.janusvr.com/karan/webvr/hotspots/

SLIDE 54

Projects

  • Developing a cinematic vocabulary for 360 video in VR (shots/cuts/staging).

idea: adapt ideas from 2D cinematography to drive user gaze in 360 video.

  • Augmented Reality for Dance Choreography (with National Ballet).

idea: take clips or key points of dance choreography that can be overlaid live and controlled by a dancer while dancing.

  • Guided Tours in VR.

idea: design a system for creating bots that can guide users through a VR environment. The bot needs to be able to pause the tour based on user interest and focus; the user can choose to leave, join, and catch up with the tour, as well as choose between tours.

  • Immersive platform for language and cultural exchange.

idea: the French department is interested in creating a restaurant scenario that can be used as a setting for language education.