Subjective Evaluation of a Semi-Automatic Optical See-Through Head-Mounted Display Calibration Technique. Kenneth Moser (Mississippi State University), Yuta Itoh (Technical University of Munich), Kohei Oshima (Nara Institute of Science & Technology)


SLIDE 1

Subjective Evaluation of a Semi-Automatic Optical See- Through Head-Mounted Display Calibration Technique

Kenneth Moser

Mississippi State University

Yuta Itoh

Technical University of Munich

Kohei Oshima

Nara Institute of Science & Technology

J. Edward Swan II

Mississippi State University

Gudrun Klinker

Technical University of Munich

Christian Sandor

Nara Institute of Science & Technology

March 26, 2015
IEEE Virtual Reality 2015, Arles, France

SLIDE 2

SLIDE 3

[Diagram: pin-hole camera model of the OST-HMD view: image plane, virtual camera frustum, far clipping plane, wearer's eye, and HMD optical combiner; the OST-HMD view with imagery illustrates improper vs. proper registration]

OST-HMD Calibration

SLIDE 4

[Video taken through a camera set within the display: less accurate vs. more accurate registration]


OST-HMD Calibration

SLIDE 5

Screen Point ↔ World Point

Single Point Active Alignment Method (SPAAM)

(Tuceryan & Navab, 2000)

[Diagram: SPAAM alignment geometry with head (tHead), world (tWorld), and point (tPoint) transforms; each alignment pairs the head-to-point transform tH-P with a screen pixel (x, y), repeated over successive alignments]

Calibration Methods
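The alignment pairs feed a standard Direct Linear Transformation. As a hedged sketch (an assumed numpy implementation, not the authors' code), SPAAM's projection estimate can be written as:

```python
import numpy as np

def spaam_projection(world_pts, screen_pts):
    """Estimate the 3x4 projection matrix from screen-world alignment
    pairs via Direct Linear Transformation (DLT), the solver SPAAM uses.
    world_pts: (n, 3) points in the head/world frame; screen_pts: (n, 2)
    pixels aligned by the user. Needs n >= 6; the study collected 20."""
    rows = []
    for (X, Y, Z), (u, v) in zip(world_pts, screen_pts):
        p = [X, Y, Z, 1.0]
        rows.append(p + [0.0] * 4 + [-u * c for c in p])
        rows.append([0.0] * 4 + p + [-v * c for c in p])
    # Null-space solution: right singular vector of the smallest
    # singular value, reshaped to 3x4 (defined only up to scale).
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    G = Vt[-1].reshape(3, 4)
    G /= np.linalg.norm(G[2, :3])    # fix scale: unit third rotation row
    return G if G[2, 3] > 0 else -G  # fix sign: world point in front
```

With noise-free correspondences this recovers the true matrix up to numerical precision; in practice, user alignment error is exactly what degrades the result.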

SLIDE 6


Calibration Methods

SLIDE 7

[Diagram: eye-tracking camera, world, and eye; tWE is the world-to-eye translation]

Interaction-Free Display Calibration (INDICA) (Itoh & Klinker, 2014). Recycled INDICA updates the calibration matrix with the eye location.

Calibration Methods
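Since the intrinsics are recycled, only the extrinsic part changes as the eye moves. A minimal pin-hole sketch of this update, assuming a simplified model (the function name and inputs are illustrative, not the paper's formulation):

```python
import numpy as np

def recycled_projection(K, R_ws, t_we):
    """Sketch of the Recycled INDICA idea: keep the intrinsic matrix K
    recovered from a prior SPAAM calibration and rebuild the extrinsic
    part from the freshly tracked eye position, with no user alignment.
    K:    3x3 intrinsics reused from SPAAM (display/screen properties).
    R_ws: 3x3 world-to-screen rotation from the head tracker.
    t_we: eye position in world coordinates from the eye tracker.
    Simplified pin-hole model, not the authors' exact update."""
    t = -R_ws @ np.asarray(t_we, float)   # eye position -> extrinsic translation
    Rt = np.hstack([R_ws, t.reshape(3, 1)])
    return K @ Rt                         # updated 3x4 projection
```

Shifting the eye laterally shifts where a fixed world point lands on the screen, which is exactly the registration change the calibration update compensates for.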

SLIDE 8

Swirski, 2012

Calibration Methods

IEEE Virtual Reality 2015 Arles, France

Interaction-Free Display Calibration (INDICA) (Itoh & Klinker, 2014): eye center locations determined through limbus detection

Nitschke, 2013

SLIDE 9
  • SPAAM: 20 screen–world alignments taken over a 1.5 m – 3.0 m distance to the world point
  • Degraded SPAAM: reuse of the SPAAM result after the HMD is removed and replaced
  • Recycled INDICA: reuse of intrinsic values from the SPAAM calibration, combined with an updated eye position for the final result

Calibration Methods Evaluated

SLIDE 10

perceived location error

Subjective Evaluation Metrics


Virtual Objects in RED

SLIDE 11

[Diagram: rating scales (1–5 and 1–4) provided to the subject before the start of each trial set]

Subjective Evaluation Metrics


quality of registration with perceived location

SLIDE 12

variance of SPAAM alignment point reprojection

Reprojection: transformation of the 3D world points back into screen points using the projection matrix produced by the calibration

[Diagram: world points multiplied by the 3×4 projection matrix give reprojected screen points; the spread is measured in horizontal and vertical screen space]

Quantitative Evaluation Metrics
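The variance metric above can be computed directly from the calibration result and the stored alignments. A minimal numpy sketch (illustrative names, not the study's code):

```python
import numpy as np

def reprojection_spread(G, world_pts, screen_pts):
    """Reproject the SPAAM alignment points through the calibration
    result G (3x4) and measure how far they land from the pixels the
    user actually aligned, per screen axis (the slide's metric)."""
    W = np.hstack([world_pts, np.ones((len(world_pts), 1))])
    h = W @ G.T                        # homogeneous screen coordinates
    reproj = h[:, :2] / h[:, 2:3]      # perspective divide
    residual = reproj - screen_pts     # pixel error per alignment
    return residual.var(axis=0)        # (horizontal, vertical) variance
```

Low variance here only says the matrix reproduces its own alignments; as the discussion slides note, it is not by itself a predictor of registration quality.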

SLIDE 13

[Diagram: wearer's eye and the eye-tracking camera, with ΔX and ΔZ offsets between successive eye location estimates]

Quantitative Evaluation Metrics

variance in eye location estimates

SLIDE 14

NVIS ST50 binocular display, used monocularly (left eye); 1280 × 1024 per eye; HFOV 40° / VFOV 32°

System Hardware

Right Eye Piece (Covered)

SLIDE 15

Logitech QuickCam Pro (USB 2.0 interface, auto-focus disabled): world (head) tracking at 640 × 360, 30 fps; eye localization (left eye) at 1280 × 720 (still images)

System Hardware

SLIDE 16

4 × 4 grid of pillars, 4 cm spacing

Real pillar heights: 13.5 – 19.5 cm; virtual height: 15.5 cm

Pillar Evaluation Tasks


verbally state location and registration quality

SLIDE 17

Horizontal Grid Vertical Grid

Cube Evaluation Tasks


20 × 20 grids of squares; virtual cube: 2 cm × 2 cm × 2 cm

verbally state location and registration quality

SLIDE 18

Tracking performed by Ubitrack (Huber et al., 2007)

System Hardware

Within-subjects design: 13 subjects (6 male / 7 female), 22 – 26 years of age, no prior experience with HMDs

SLIDE 19

SPAAM Extrinsic Parameters derived through QR Decomposition

Quantitative Result – Eye Location Estimation
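The decomposition on this slide recovers intrinsics and extrinsics from the estimated 3×4 matrix; for the left 3×3 block this is conventionally an RQ factorization. A hedged sketch using scipy (sign handling simplified, not the authors' code):

```python
import numpy as np
from scipy.linalg import rq

def decompose_projection(G):
    """Split a 3x4 SPAAM result G ~ K [R | t] into intrinsics and
    extrinsics via RQ factorization (the 'QR decomposition' step on
    this slide). Returns K (upper triangular, positive diagonal),
    R (orthogonal), and the translation t."""
    K, R = rq(G[:, :3])
    # RQ is unique only up to signs on K's diagonal; flip so the
    # focal lengths and principal point scale are positive.
    D = np.diag(np.sign(np.diag(K)))
    K, R = K @ D, D @ R
    # t is scale-invariant: G[:, 3] = s*K*t and K here carries s.
    t = np.linalg.solve(K, G[:, 3])
    return K / K[2, 2], R, t
```

The translation t (the eye position relative to the display) is what the slides compare across repeated SPAAM calibrations, where it shows the highest variance along the Z axis.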

SLIDE 20

Horizontal Screen Space

Quantitative Result – Reprojection Variance

SLIDE 21

Vertical Screen Space

Quantitative Result – Reprojection Variance

SLIDE 22


Subjective Result – Location Error

Difference in cm between perceived and actual location:

0: No error (perfect registration)
+X: Virtual perceived to the right / −X: Virtual perceived to the left
+Y: Virtual perceived above / −Y: Virtual perceived below
+Z: Virtual perceived further / −Z: Virtual perceived closer
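Under the slide's sign convention (+X right, +Y above, +Z further), the per-axis error is a signed difference. A trivial sketch with hypothetical values, not data from the study:

```python
import numpy as np

def location_error(perceived_cm, actual_cm):
    """Signed perceived-location error per axis in cm, using the
    slide's convention: +X right / -X left, +Y above / -Y below,
    +Z further / -Z closer; the zero vector is perfect registration."""
    return np.asarray(perceived_cm, float) - np.asarray(actual_cm, float)

err = location_error([1.0, -2.0, 4.0], [0.0, 0.0, 0.0])
# err reads as: 1 cm right, 2 cm below, 4 cm further than intended
```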
SLIDE 23

Difference Between Perceived & Actual Location

Group A Significance in Z

F(2, 24) = 14.011, p < .001

* p ≤ 0.05 Ryan REGWQ post-hoc homogeneous subset test

Group B No Significance*

Subjective Result: Location Error – Pillars

SLIDE 24

Group B No Significance*

* p ≤ 0.05 Ryan REGWQ post-hoc homogeneous subset test **Mauchly’s test indicated non-sphericity, p value adjusted by Huynh – Feldt ε

Group A Significance in Y

F(2, 24) = 10.96, p < .0016, ε = .75**

Difference Between Perceived & Actual Location


Subjective Result: Location Error – Cubes Vertical Grid

SLIDE 25

Group B No Significance*

* p ≤ 0.05 Ryan REGWQ post-hoc homogeneous subset test

Group A Significance in Z

F(2, 24) = 7.37, p < .003


Subjective Result: Location Error – Cubes Horizontal Grid

Difference Between Perceived & Actual Location

SLIDE 26


Subjective Result: Location Error

  • Quantitative measures not a prediction of performance
  • No difference between SPAAM and Degraded SPAAM
  • No difference between all algorithms in X
  • Highest overall error in Z
  • INDICA significantly better in Y/Z

SLIDE 27

Registration Quality with Chosen Location

Group A: No Significance*
Group B: Significance in Quality

ANOVA: F(2, 24) = 5.03, p < .015
Friedman: χ²(2) = 5.45, p < .066
Kruskal-Wallis: χ²(2) = 18.92, p < .001

* p ≤ 0.05 Ryan REGWQ post-hoc homogeneous subset test

Subjective Result: Registration Quality – Pillars

SLIDE 28

Group A No Significance*

Friedman: χ²(2) = .15; Kruskal-Wallis: χ²(2) = .98

* p ≤ 0.05 Ryan REGWQ post-hoc homogeneous subset test

Registration Quality with Chosen Location

Subjective Result: Registration Quality – Cubes Vertical

SLIDE 29

Group B: No Significance*
Group A: Significance in Quality

ANOVA: F(2, 24) = 6.65, p < .013, ε = .71**
Friedman: χ²(2) = 13.06, p < .0015
Kruskal-Wallis: χ²(2) = 21.21, p < .001

* p ≤ 0.05 Ryan REGWQ post-hoc homogeneous subset test **Mauchly’s test indicated non-sphericity, p value adjusted by Huynh – Feldt ε

Registration Quality with Chosen Location

Subjective Result: Registration Quality – Cubes Horizontal

SLIDE 30

Registration Quality with Chosen Location

Subjective Result: Registration Quality

  • Quality values match performance measures
  • INDICA quality is equal to or better than SPAAM
  • INDICA quality significantly better in Z

SLIDE 31

Why the apparent disagreement between quantitative and subjective measures?
Why no significant performance change between SPAAM and Degraded SPAAM?
Poor eye localization for INDICA?

Results Discussion


  • Removal/replacement of the HMD between conditions
  • Reprojection shows closeness to the actual pixels used in alignment
  • HMD-specific properties
  • Resolution of the task not high enough to find significance
  • Eye location values for INDICA show low variance
  • Fit on the user's head
  • Exit pupil location
  • INDICA presumes a simplistic HMD model

SLIDE 32

INDICA – Equal or Superior performance to SPAAM

  • Significantly higher performance in Y/Z
  • Significantly higher quality in Z
  • Recycled INDICA requires SPAAM intrinsics
  • Minimal requirement from user
  • Less time to perform (user preferred)

SPAAM / Degraded SPAAM – almost no difference

  • Removal/Replacement little effect on accuracy
  • Accuracy in X equal to INDICA
  • Less favorable method (exit survey)

Take Away

SLIDE 33

  • Evaluation of Full INDICA
    – Remove induced error from SPAAM intrinsics (Full INDICA)
    – Utilize more robust eye localization (Alex's presentation)
    – Real-time (on-line) update of calibration
  • Binocular task
    – More relevant depth cue
    – Verification of SPAAM Z error
  • Best vs. best
    – Comparison of best possible SPAAM/INDICA calibrations
    – Removal of HMD distortion (Yuta's presentation)
    – Improvements to SPAAM to reduce the impact of user error

Future Work

SLIDE 34
  • Dr. Hirokazu Kato
  • Dr. Goshiro Yamamoto

  • Everyone at the Interactive Media Design Lab
  • All of the Anonymous Subjects

Special Thanks

SLIDE 35

NSF Awards: IIA-1414772, IIS-1320909, IIS-1018413
NASA Mississippi Space Grant Consortium Fellowship
European Union Seventh Framework PITN-GA-2012-316919-EDUSAFE


Support

SLIDE 36

INDICA – Equal or Superior performance to SPAAM

  • Significantly higher performance in Y/Z
  • Significantly higher quality in Z
  • Recycled INDICA requires SPAAM intrinsics
  • Minimal requirement from user
  • Less time to perform (eye measures vs alignment)

SPAAM / Degraded SPAAM – almost no difference

  • Removal/Replacement little effect on accuracy
  • Accuracy in X equal to INDICA
  • Less favorable method (exit survey)

Take Away

SLIDE 37

Calibration Methods

[Diagram: each SPAAM alignment pairs the head-to-point transform tH-P with a screen pixel (x, y)]

Single Point Active Alignment Method (SPAAM)

(Tuceryan & Navab, 2000)

SLIDE 38

Calibration Methods

Recycled Setup vs. Full Setup

Required for both: Translation Eye–Tracker tET; Rotation World–Screen RWS; Rotation World–Tracker RWT; Translation World–Tracker tWT

Remaining parameters (divided between the setups): Intrinsic Calib. Params. KE; Translation World–Eye tWE; Translation World–Screen tWS; Pixel Scaling Factor α(x,y)


Interaction Free Display Calibration (INDICA)

(Itoh & Klinker, 2014)

SLIDE 39

Evaluation Study

Registration Quality Comparison

  • SPAAM
  • Degraded SPAAM:

Reuse of SPAAM result after HMD replacement

  • Recycled INDICA:

Reuse intrinsic values from SPAAM calibration

Algorithms

Perceived vs. Intended Location

SLIDE 40

Evaluation Study

Degraded SPAAM & Recycled INDICA rely on values from SPAAM calibration

SLIDE 41

Evaluation Study

Within-subjects design: 3 algorithms × 2 tasks = 6 conditions; 16 pillar trials / 20 cube trials per condition. Order of Degraded SPAAM and Recycled INDICA, as well as cube/pillar task presentation, distributed such that no two subjects experienced the same sequence.


13 subjects (6 male / 7 female), 22 – 26 years of age, no prior experience with HMDs

SLIDE 42

Experimental Results & Discussion

Quantitative Measures – Reprojection Estimates

Screen Point World Point

SPAAM Calibration Produces Screen (X,Y) and World (X, Y, Z) Correspondence Pairs

SLIDE 43

Discrete 2-axis grids; each square 2 cm × 2 cm; vertical axis: A – Z (A–D); horizontal axis: 1 – 20 (1–4)

Evaluation Tasks

SLIDE 44

Subjective Measures – Registration Quality

Experimental Results & Discussion

Verbal response from the subject (1–4 or 1–5); both scales (pillars/cubes) normalized into 1–4. Quality values are not Likert-scale data: the provided images create a reference for the quality range.

Statistical analysis on quality data:

  • ANOVA
  • Friedman
    – Less power compared to ANOVA
    – Reduces the number of considered data points
  • Kruskal-Wallis
    – More power compared to ANOVA
    – Does not consider the within-subjects design
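The three tests listed above are available in scipy.stats. The sketch below runs them on hypothetical ratings (random stand-in data, not the study's responses); note that scipy's f_oneway is a plain one-way ANOVA without the repeated-measures structure or sphericity correction reported on the results slides:

```python
import numpy as np
from scipy import stats

# Hypothetical quality ratings (1-4) for 13 subjects x 3 algorithms;
# rows are subjects, columns are SPAAM, Degraded SPAAM, Recycled INDICA.
rng = np.random.default_rng(7)
ratings = rng.integers(1, 5, size=(13, 3)).astype(float)

# Friedman treats each row as one subject (within-subjects, rank-based);
# Kruskal-Wallis ranks all values and ignores the pairing.
chi2_f, p_f = stats.friedmanchisquare(*ratings.T)
h_kw, p_kw = stats.kruskal(*ratings.T)
f_a, p_a = stats.f_oneway(*ratings.T)   # between-groups one-way ANOVA
print(f"Friedman chi2(2) = {chi2_f:.2f}, p = {p_f:.3f}")
print(f"Kruskal-Wallis H(2) = {h_kw:.2f}, p = {p_kw:.3f}")
print(f"ANOVA F = {f_a:.2f}, p = {p_a:.3f}")
```

Running all three on the same ratings mirrors the slide's point: the tests trade statistical power against how faithfully they model the within-subjects design.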

SLIDE 45

Performance Summary – Quantitative Measures

Experimental Results & Discussion

INDICA – Stable Eye Estimates / High Reprojection Variance.

  • Manual Limbus Detection Required for best estimates
  • Eye position in Z more consistent
  • Higher reprojection variance

Reprojection is not an indication of result quality; INDICA reprojection shows the actual pixels used (?)

SPAAM – Extrinsic/Reprojection Values Match Previous Findings

  • Extrinsics show Higher variance along Z axis
  • Low reprojection variance

SPAAM result closely reproduces SPAAM alignments
