Visualization with Virtual and Augmented Reality, by Tobias Isenberg and Xiyao Wang (PowerPoint presentation)



SLIDE 1

Visualization with Virtual and Augmented Reality

Tobias Isenberg and Xiyao Wang

SLIDE 2

Who are we?

Xiyao Wang

PhD student xiyao.wang@inria.fr http://xiyaowang.net/

Tobias Isenberg

Senior researcher tobias.isenberg@inria.fr http://tobias.isenberg.cc/

SLIDE 3

What are VR and AR?

[Milgram and Kishino 1994] images: Microsoft and Oculus

AR: real environments with virtual data/objects; users can still interact with the real world.
VR: entirely virtual environments; users are isolated from the real world.

SLIDE 4

Compared to traditional screen

Traditional screen: less immersion → good stereo perception only with interaction.
VR/AR: high visual immersion → stereo perception without interaction → people can understand 3D data well.

images: Univ. Groningen, LG, A. Wiebel

SLIDE 5
  • Does not rely on projection

Scientific Visualization with Stereo

SLIDE 6
  • Does not rely on projection
  • Make use of spatial input devices

(direct mapping)

Scientific Visualization with Stereo

image: Daniel F. Keefe

SLIDE 7
  • Immersion helps data understanding

Scientific Visualization with Stereo

[Prabhat et al. 2008]

SLIDE 8
  • Immersion helps data understanding

Scientific Visualization with Stereo

[Laha et al. 2012]

SLIDE 9
  • Immersion helps data understanding

Scientific Visualization with Stereo

[Mirhosseini et al. 2014]

SLIDE 10
  • Fully stereoscopic view

Some examples

[Hurter et al. 2018]

SLIDE 11
SLIDE 12

Some examples

[Taylor, II, et al. 1993]

SLIDE 13

Some examples

image: Daniel F. Keefe

SLIDE 14
  • Multiple views with stereoscopic view + touch input

Some examples

[Coffey et al. 2011/2012]

SLIDE 15
SLIDE 16
  • Multiple views with stereoscopic view + special input

Some examples

image: Daniel F. Keefe

SLIDE 17

Interaction has to be tailored to the view

[Bruder et al. 2013]

SLIDE 18
  • Stereo view + touch input that works

Interaction has to be tailored to the view

[Butkiewicz and Ware 2011]

SLIDE 19
SLIDE 20

Tangible displays & stereoscopy

[López et al. 2016]

  • Mobile interaction with stereo
SLIDE 21
SLIDE 22

Visual representation and interaction

[with S. Bruckner, T. Ropinski, and A. Wiebel, hopefully to appear soon]

SLIDE 23

What do we want to do?

Bring visual immersion to HEP (high-energy physics) data analysis.

SLIDE 24

What do we want to do?

Bring visual immersion to HEP data analysis. Current case: understanding the results of the TrackML challenge.

SLIDE 25

What do we envision?

Two linked views. Each view is interactive, and the pointer can move from one view to the other.
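The linked-views idea can be sketched as a shared selection model that notifies both views; the class and function names below are hypothetical illustrations, not the actual system's API.

```python
# Minimal sketch of two linked views sharing one selection model:
# a selection made in either view is immediately reflected in the other.

class SelectionModel:
    """Shared state: the set of currently selected track IDs."""
    def __init__(self):
        self.selected = set()
        self._listeners = []

    def subscribe(self, callback):
        self._listeners.append(callback)

    def select(self, track_id):
        self.selected.add(track_id)
        for callback in self._listeners:
            callback(self.selected)

class View:
    """Stands in for either the desktop view or the AR view."""
    def __init__(self, name, model):
        self.name = name
        self.highlighted = set()
        model.subscribe(self.on_selection_changed)

    def on_selection_changed(self, selected):
        # Both views highlight the same tracks, keeping them linked.
        self.highlighted = set(selected)

model = SelectionModel()
desktop = View("desktop", model)
ar = View("AR", model)
model.select(42)  # a selection made in one view...
assert desktop.highlighted == ar.highlighted == {42}  # ...appears in both
```

The same observer pattern extends naturally to the shared pointer: its position is just another piece of model state both views subscribe to.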

SLIDE 26

Why do we use such a combination?

Traditional analysis tools: familiar environments; typing, precise input, …
Visual immersion: not separated from the real world; other input forms.

SLIDE 27
  • Interaction can be performed on both sides.
  • Linked views with the same dataset.
  • Similar user interface and functionalities.

Initial design

SLIDE 28
  • Navigation
  • Abstraction of data and detector meshes.
  • Filtering and highlighting.

Prototype

SLIDE 29
  • We performed an observational study.

Study

SLIDE 30
  • The stereoscopic view offers better depth cues, benefiting the spatial understanding of the event, for example when following trajectories and observing their spatial arrangements.
  • The unlimited working space of AR could be useful for complex analysis (while a 2D screen has a fixed size).
  • Being able to walk around the data has huge potential compared to interaction on a PC.

Major feedback

SLIDE 31
  • Keeping the laptop and mouse is useful for precise interaction and for dense information display (thanks to the high resolution).
  • The mouse is not appreciated in AR space; what kind of input device can be used to unify interaction across spaces?
  • Users lose interaction and context while walking around.

Major feedback

SLIDE 32
  • 3D input device to match the stereo view in AR?

Input and interaction techniques

3D Mouse? Tangible devices?

Map the interaction to a physical object’s movement.
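The "direct mapping" idea, applying a tracked tangible object's movement one-to-one to the data, can be sketched as below; the names (`DataTransform`, `map_tangible_to_data`) are hypothetical and stand in for whatever tracking API is actually used.

```python
# Hedged sketch of direct tangible mapping: the tracked object's
# frame-to-frame translation is applied one-to-one to the data transform.

class DataTransform:
    """Position of the visualized dataset in world coordinates."""
    def __init__(self):
        self.position = [0.0, 0.0, 0.0]

    def translate(self, delta):
        self.position = [p + d for p, d in zip(self.position, delta)]

def map_tangible_to_data(prev_pose, curr_pose, data):
    """Apply the tangible device's translation delta directly to the data."""
    delta = [c - p for c, p in zip(curr_pose, prev_pose)]
    data.translate(delta)

data = DataTransform()
# Device moved 10 cm along x and 5 cm backward along z since last frame:
map_tangible_to_data([0.0, 0.0, 0.0], [0.1, 0.0, -0.05], data)
print(data.position)  # [0.1, 0.0, -0.05]
```

A full implementation would map rotation the same way (composing the device's orientation delta onto the data), which is what makes the tangible feel like holding the dataset itself.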

SLIDE 33
  • Study the mismatch of dimensionality between input and output devices.

– Would 2D (3D) input better match 2D (3D) output?

Input and interaction techniques

SLIDE 34
  • 3D Docking task & 3D clipping plane task

Input and interaction techniques

SLIDE 35
  • 3D Docking task & 3D clipping plane task

– The tangible device is more natural for rough 3D manipulation.
– The mouse still gives the most precise results for specific tasks.
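The clipping-plane task can be illustrated with a small sketch: the plane is taken from a tracked pose (a point on the plane plus its unit normal), and data points on the wrong side are discarded. This is a generic geometric illustration, not the study's actual code.

```python
# Sketch of the 3D clipping-plane task: a plane defined by a tracked
# point and unit normal clips away the data points behind it.

def signed_distance(point, plane_point, plane_normal):
    """Signed distance of `point` from the plane (assumes a unit normal;
    positive values lie on the side the normal points toward)."""
    return sum((p - q) * n for p, q, n in zip(point, plane_point, plane_normal))

def clip(points, plane_point, plane_normal):
    """Keep only the points on the normal's side of the plane."""
    return [p for p in points
            if signed_distance(p, plane_point, plane_normal) >= 0.0]

pts = [(0.0, 0.0, 1.0), (0.0, 0.0, -1.0)]
# Plane through the origin with its normal pointing along +z:
kept = clip(pts, (0.0, 0.0, 0.0), (0.0, 0.0, 1.0))
print(kept)  # [(0.0, 0.0, 1.0)]
```

Positioning such a plane is exactly where the input-dimensionality question bites: a tangible device sets the 6-DOF pose in one gesture, while a mouse adjusts it one degree of freedom at a time.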

Input and interaction techniques

SLIDE 36
  • How can we relate 2D graphs/plots and 3D representations?

– Do we go back and forth between them?
– Is one more important than the other, or are all equally important?

Questions

SLIDE 37
  • In addition to scoring the results, how do we work with 3D representations to compare two or more ML-based results?

Questions

SLIDE 38

Visualization with Virtual and Augmented Reality

Xiyao Wang

PhD student xiyao.wang@inria.fr http://xiyaowang.net/

Tobias Isenberg

Senior researcher tobias.isenberg@inria.fr http://tobias.isenberg.cc/