Realtime Gaze Estimation with Online Calibration
Li Sun, Mingli Song, Zicheng Liu, Ming-Ting Sun



SLIDE 1

Realtime Gaze Estimation with Online Calibration

Li Sun, Mingli Song, Zicheng Liu, Ming-Ting Sun

SLIDE 2

Outline

Introduction Limitations & goals Proposed method Results Demo

SLIDE 3

Gaze Estimation

Diagnostic applications:

  • Eye disease diagnosis
  • Human behavior studies, e.g. visual attention analysis

Interactive applications:

  • Human-computer interaction
  • Gaze-contingent display
SLIDE 4

Gaze Estimation Methods: 2D Regression Based

Pipeline: Appearance/features → Regression → Point of gaze

  • Features: eye images, pupil-glint vector
  • Regression: Support Vector Regression, Gaussian Processes, Neural Network
  • Output: screen coordinates

Drawbacks: difficult to handle head movements; requires many samples for calibration.
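As a concrete instance of this family (a generic polynomial regressor on synthetic data, not the method proposed later in this deck), a least-squares fit from 2D eye features to screen coordinates:

```python
import numpy as np

def poly_features(v):
    # Second-order polynomial features of a 2D feature vector
    # (e.g. a pupil-glint vector).
    x, y = v
    return np.array([1.0, x, y, x * y, x * x, y * y])

def fit_gaze_mapping(vectors, screen_points):
    # Per-axis least-squares fit of the polynomial coefficients.
    A = np.array([poly_features(v) for v in vectors])
    coeffs, *_ = np.linalg.lstsq(A, np.array(screen_points), rcond=None)
    return coeffs

def predict(coeffs, v):
    return poly_features(v) @ coeffs

# Synthetic calibration data generated from a made-up linear screen mapping.
rng = np.random.default_rng(0)
vectors = rng.uniform(-1.0, 1.0, size=(20, 2))
screen = np.array([[100 + 300 * x + 50 * y, 200 + 40 * x + 250 * y]
                   for x, y in vectors])
coeffs = fit_gaze_mapping(vectors, screen)
pred = predict(coeffs, vectors[0])   # recovers screen[0] on this noiseless data
```

With noiseless data inside the model class the fit is exact; real calibration data is noisy, which is why many samples are needed.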

SLIDE 5

Gaze Estimation Methods: 3D Model Based

Pipeline: Image features → 3D model → Line of gaze

  • Features: glints, pupil/iris center
  • Model: the structure of the eye and the geometrical relationship between the eye and the scene
  • Output: visual axis, optical axis

Drawbacks: complex/expensive setup; user calibration for person-specific parameters.

SLIDE 6

Limitations & Goals

Limitations:

  • Intrusion
  • Complex, expensive setup
  • Cannot tolerate head movements
  • Cumbersome calibration
  • Cannot work in realtime
  • Low accuracy (>5°)

Goals:

  • Remote gaze estimation
  • Simple, low-cost setup
  • Free movements
  • Easy calibration / calibration-free
  • Realtime processing speed
  • Accurate (<3°)
  • Broad application

SLIDE 7

Proposed Method

Gaze Feature Extraction:
  • 3D Face Tracking
  • Face & Eye Detection
  • Iris Center Location
  • Eye Corner Location

System Calibration:
  • Screen-Camera Calibration
  • Online (User) Calibration

3D Gaze Estimation:
  • Inputs: screen-camera parameters, subject-specific parameters, and gaze features

SLIDE 8

3D Model

SLIDE 9

3D Gaze Estimation

Input:

  • Gaze features: 𝐒ℎ, 𝐪𝑗, 𝐪𝑏, 𝑨𝑏
  • Screen-camera parameters: 𝐏𝑡, 𝐎𝑡, 𝐎𝑣, 𝐎𝑤, 𝑥𝑡, ℎ𝑡, 𝑥𝑠, ℎ𝑠, 𝑔, 𝐩
  • Personal parameters: 𝑠𝑓, 𝐖𝑏𝑓ᴵ

Output: gaze point 𝐐ℎ and its 2D screen location 𝐪ℎ
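The inputs above can be grouped into small containers to make the data flow explicit. This is only a sketch; the class and field names are ASCII stand-ins for the symbols on this slide, not names from the paper:

```python
from dataclasses import dataclass

@dataclass
class GazeFeatures:
    # Per-frame features from the extraction stage.
    S_h: tuple    # 3x3 head rotation matrix
    q_j: tuple    # 2D iris center location
    q_b: tuple    # 2D anchor point location
    A_b: float    # depth value of the anchor point

@dataclass
class ScreenCameraParams:
    # Fixed parameters from screen-camera calibration.
    P_t: tuple    # screen reference point
    O_t: tuple    # screen plane normal
    O_v: tuple    # screen horizontal axis
    O_w: tuple    # screen vertical axis
    x_t: float    # horizontal screen offset
    h_t: float    # vertical screen offset
    x_s: float    # horizontal screen scale
    h_s: float    # vertical screen scale
    g: float      # camera focal length
    p: tuple      # principal point

@dataclass
class PersonalParams:
    # Subject-specific parameters estimated by (online) calibration.
    s_f: float       # eyeball radius
    W_bf_I: tuple    # offset from anchor point to eyeball center (head frame)
```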

SLIDE 10

3D Gaze Estimation

  1. Compute the 3D position of 𝐐𝑏:

     𝐪𝑏 − 𝐩 = (𝑔/𝑨𝑏)·(𝑦𝑏, −𝑧𝑏)ᵀ

  2. Rotate the offset vector 𝐖𝑏𝑓 and obtain the eyeball center 𝐏𝑓:

     𝐖𝑏𝑓 = 𝐒ℎ·𝐖𝑏𝑓ᴵ
     𝐏𝑓 = 𝐒ℎ·𝐖𝑏𝑓ᴵ + 𝐐𝑏

  3. Find the iris center 𝐐𝑗:

     𝐪𝑗 − 𝐩 = (𝑔/𝑨𝑗)·(𝑦𝑗, −𝑧𝑗)ᵀ,   ‖𝐐𝑗 − 𝐏𝑓‖₂ = 𝑠𝑓

  4. Obtain the gaze direction 𝐎ℎ:

     𝐎ℎ = (𝐐𝑗 − 𝐏𝑓)/𝑠𝑓
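Step 3 amounts to a ray-sphere intersection: back-project the detected iris pixel into a 3D ray and take the ray point at distance 𝑠𝑓 from the eyeball center. A minimal sketch with made-up numbers; it mirrors the constraint ‖𝐐𝑗 − 𝐏𝑓‖₂ = 𝑠𝑓, not the authors' exact solver:

```python
import numpy as np

def iris_center(ray_origin, ray_dir, P_f, s_f):
    # Intersect the back-projection ray with the eyeball sphere |Q - P_f| = s_f
    # and return the nearer intersection (the surface point facing the camera).
    d = ray_dir / np.linalg.norm(ray_dir)
    oc = ray_origin - P_f
    b = np.dot(oc, d)
    disc = b * b - (np.dot(oc, oc) - s_f ** 2)
    if disc < 0:
        raise ValueError("ray misses the eyeball sphere")
    t = -b - np.sqrt(disc)            # nearer root of the quadratic
    return ray_origin + t * d

# Made-up numbers: eyeball center 500 units in front of the camera, radius 12.
P_f = np.array([0.0, 0.0, 500.0])
Q_j = iris_center(np.zeros(3), np.array([0.0, 0.0, 1.0]), P_f, 12.0)
# Q_j is (0, 0, 488): the sphere surface point nearest the camera.
```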

SLIDE 11

3D Gaze Estimation

  5. Obtain the 3D gaze point 𝐐ℎ by intersecting the gaze ray with the screen plane:

     𝐐ℎ = 𝐏𝑓 + λ𝐎ℎ
     λ = ‖𝐏𝑓𝐐ℎ‖₂ = −((𝐏𝑓 − 𝐏𝑡)·𝐎𝑡)/(𝐎ℎ·𝐎𝑡) = −(𝐏𝑓·𝐎𝑡 + 𝑒)/(𝐎ℎ·𝐎𝑡)

  6. Obtain the 2D gaze point 𝐪ℎ = (𝑣ℎ, 𝑤ℎ):

     ((𝐐ℎ − 𝐏𝑡)·𝐎𝑣 − 𝑥𝑡)/𝑥𝑠 − 𝑣ℎ = 0
     ((𝐐ℎ − 𝐏𝑡)·𝐎𝑤 − ℎ𝑡)/ℎ𝑠 − 𝑤ℎ = 0
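Steps 4 to 6 can be sketched directly in code. This is a minimal numeric illustration, not the authors' implementation; the screen geometry, eyeball radius, and all coordinates below are made-up values:

```python
import numpy as np

def gaze_point(P_f, Q_j, s_f, P_t, O_t, O_v, O_w, x_t, h_t, x_s, h_s):
    # Step 4: unit gaze direction from eyeball center to iris center.
    O_h = (Q_j - P_f) / s_f
    # Step 5: distance along the gaze ray to the screen plane
    # (plane through P_t with normal O_t).
    lam = -np.dot(P_f - P_t, O_t) / np.dot(O_h, O_t)
    Q_h = P_f + lam * O_h
    # Step 6: project onto the screen axes to get 2D screen coordinates.
    v_h = (np.dot(Q_h - P_t, O_v) - x_t) / x_s
    w_h = (np.dot(Q_h - P_t, O_w) - h_t) / h_s
    return Q_h, (v_h, w_h)

# Made-up frontal setup: screen plane z = 0, eyeball 500 units away.
P_t = np.zeros(3)
O_t = np.array([0.0, 0.0, 1.0])
O_v = np.array([1.0, 0.0, 0.0])
O_w = np.array([0.0, 1.0, 0.0])
P_f = np.array([0.0, 0.0, 500.0])
d = np.array([100.0, 50.0, -500.0])          # eye looks toward (100, 50, 0)
Q_j = P_f + 12.0 * d / np.linalg.norm(d)     # iris center, eyeball radius 12
Q_h, (v_h, w_h) = gaze_point(P_f, Q_j, 12.0, P_t, O_t, O_v, O_w,
                             0.0, 0.0, 1.0, 1.0)
# Q_h is (100, 50, 0), so (v_h, w_h) = (100, 50).
```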

SLIDE 12

Gaze Feature Extraction

Pipeline: 3D Face Tracking → Face & Eye Detection → Iris Center Location → Eye Corner Location

Gaze features:

  • Rotation matrix 𝐒ℎ
  • Depth value 𝑨𝑏
  • 2D iris location 𝐪𝑗
  • 2D anchor point location 𝐪𝑏

The anchor point is the inner corner of the left eye.

SLIDE 13

Gaze Feature Extraction

  • 3D face tracking: Microsoft face tracking SDK [1]
  • Face & eye detection: OpenCV [2], Visual Context Boosting [3]
  • Eye corner detection: template filtering [4]

SLIDE 14

Online Calibration

Offline calibration: for each calibration point with known screen target (𝑣𝑝, 𝑤𝑝),

𝑔·(𝑦𝑏 + 𝐒𝑦·𝐖𝑏𝑓ᴵ + 𝑠𝑓 𝐎ℎ·𝐖𝑦) + (𝑥/2 − 𝑣𝑝)·(𝑨𝑏 + 𝐒𝑨·𝐖𝑏𝑓ᴵ + 𝑠𝑓 𝐎ℎ·𝐖𝑨) = 0
𝑔·(𝑧𝑏 + 𝐒𝑧·𝐖𝑏𝑓ᴵ + 𝑠𝑓 𝐎ℎ·𝐖𝑧) + (𝑤𝑝 − ℎ/2)·(𝑨𝑏 + 𝐒𝑨·𝐖𝑏𝑓ᴵ + 𝑠𝑓 𝐎ℎ·𝐖𝑨) = 0

where 𝐖𝑦 = (1,0,0)ᵀ, 𝐖𝑧 = (0,1,0)ᵀ, 𝐖𝑨 = (0,0,1)ᵀ, and 𝐒𝑦, 𝐒𝑧, 𝐒𝑨 are the rows of 𝐒ℎ. For T calibration points, there are 2T equations with 4 unknowns (the components of 𝐖𝑏𝑓ᴵ and 𝑠𝑓).

  • Noisy calibration points occur (changes in luminance, head pose, target position, etc.), so the calibration result varies from run to run.
  • How many calibration points are required to ensure a satisfactory calibration result?
  • Too many calibration points have a large impact on the usability of the system.
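The 2T-by-4 system can be solved in one batch step via the normal equations. A sketch on a synthetic system; the coefficient rows and parameter values are made up for illustration (in the real method each pair of rows comes from the two equations above):

```python
import numpy as np

# Offline (batch) solve of the calibration system B y = c via the normal
# equations B^T B y = B^T c. The 4 unknowns are the three components of the
# offset vector W_bf^I plus the eyeball radius s_f.
rng = np.random.default_rng(1)
true_params = np.array([11.0, -15.0, 4.0, 12.5])   # (W_y, W_z, W_A, s_f)

T = 9                                # calibration points -> 2T = 18 equations
B = rng.normal(size=(2 * T, 4))      # stacked coefficient rows (2 per point)
c = B @ true_params                  # noiseless right-hand side

y = np.linalg.solve(B.T @ B, B.T @ c)   # least-squares estimate
```

On noiseless data the estimate matches the true parameters exactly; with real, noisy calibration points it only approaches them as T grows, which motivates the online scheme on the next slide.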
SLIDE 15

Online Calibration

Rewrite the linear system in the form 𝐁𝐲 = 𝐜; the least-squares solution can be obtained from the normal equations 𝐁ᵀ𝐁𝐲 = 𝐁ᵀ𝐜.

Online calibration: given a new calibration point with coefficient matrix 𝐃_{𝑢+1} and column vector 𝐞_{𝑢+1}, the coefficient matrix and column vector are updated as follows:

𝐁_{𝑢+1}ᵀ𝐁_{𝑢+1} = 𝐁_𝑢ᵀ𝐁_𝑢 + 𝐃_{𝑢+1}ᵀ𝐃_{𝑢+1}
𝐁_{𝑢+1}ᵀ𝐜_{𝑢+1} = 𝐁_𝑢ᵀ𝐜_𝑢 + 𝐃_{𝑢+1}ᵀ𝐞_{𝑢+1}

  • During online calibration, the eye parameters are constantly refined, and gaze estimation accuracy increases.
  • Online calibration completes as soon as the updates of the eye parameters converge.
  • Increasing the number of calibration points averages out the impact of noisy calibration points.
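The update rule can be sketched as a small running estimator that never stores the full matrix 𝐁, only the accumulated 𝐁ᵀ𝐁 and 𝐁ᵀ𝐜. All names and numbers below are illustrative, not from the paper:

```python
import numpy as np

class OnlineCalibrator:
    # Accumulates B^T B and B^T c; one cheap update per new calibration point.
    def __init__(self, n_params=4):
        self.BtB = np.zeros((n_params, n_params))
        self.Btc = np.zeros(n_params)

    def add_point(self, D, e):
        # D: coefficient rows for the new point (2 x n_params), e: targets (2,)
        self.BtB += D.T @ D
        self.Btc += D.T @ e

    def solve(self):
        # Current least-squares estimate of the eye parameters.
        return np.linalg.solve(self.BtB, self.Btc)

# Feed synthetic points one at a time; the estimate converges to the
# (made-up) true parameters as points accumulate.
rng = np.random.default_rng(2)
true_params = np.array([11.0, -15.0, 4.0, 12.5])
cal = OnlineCalibrator()
for _ in range(9):
    D = rng.normal(size=(2, 4))
    cal.add_point(D, D @ true_params)
est = cal.solve()
```

Each update is O(n²) in the number of unknowns, so the parameters can be re-estimated after every new calibration point without re-solving the full stacked system.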
SLIDE 16

Results

System setup:

  • Kinect sensor
  • 19" LCD screen
  • Desktop PC: 4 GB memory, Intel Core 2 Quad Q9550 @ 2.83 GHz, NVIDIA GT 310

(Figure: user, Kinect, and screen, showing the line of gaze and the point of gaze.)

SLIDE 17

Results

Online calibration:

(Figure: updates of the eye parameters (unit: mm) vs. number of calibration points.)

SLIDE 18

Results

Online calibration vs. offline calibration

Average gaze estimation error (unit: degree) over the first n calibration points from online calibration, compared with offline calibration using a fixed number (5, 9, 16) of calibration points.

SLIDE 19

Results

Gaze estimation accuracy:

User | Accuracy (degree) | Eyeball radius 𝑠𝑓 | Offset vector 𝐖𝑏𝑓ᴵ (mm): 𝑦, 𝑧, 𝑨
1    | 1.7703            | 11.0              | −16.1, 4.7, 6.1
2    | 1.7276            | 9.4               | −14.3, 2.0, 1.0
3    | 1.9243            | 10.2              | −13.0, 3.3, 3.8
4    | 1.8547            | 10.8              | −14.7, 5.2, 6.4

SLIDE 20

Results

(Figure: gaze estimation accuracy.)

SLIDE 21

Results

Processing speed: computational cost in ms of the major steps of the proposed method.

Step:      Face | Eye  | Blink | Iris  | Eye Corner | Gaze | Total
Time (ms): 5.78 | 4.98 | 0.38  | 18.14 | 0.23       | 0.64 | 30.15

The total of 30.15 ms per frame corresponds to roughly 33 fps.

SLIDE 22

Demo: 3D Chess Game

SLIDE 23

References

[1] Microsoft Kinect SDK, http://www.microsoft.com/en-us/kinectforwindows/develop/, 2013.
[2] OpenCV, http://opencv.org/, 2013.
[3] Z. Sun, M. Song, D. Tao, and X. Li, "Visual-context boosting for eye detection," IEEE Trans. Sys. Man Cyber. Part B, 40(6):1460–1467, 2010.
[4] J. Zhu and J. Yang, "Subpixel eye gaze tracking," in AFGR, 2002.

SLIDE 24

Thank you!