SLIDE 1

Robust Gyroscope-Aided Camera Self-Calibration

Santiago Cortés, Arno Solin, Juho Kannala

Aalto University

July 11, 2018

SLIDE 2
SLIDE 2

Robust gyroscope-aided camera self-calibration. Cortés, Solin, Kannala.

Motivation

◮ Camera sensors are common in smart devices.

◮ Use cases: AR/VR, games, odometry, photography, etc.

◮ But the observed images are distorted.

◮ The distortion can be estimated off-line or be factory-calibrated.

◮ We want to estimate the distortion online.

What the camera sees

SLIDE 3

Idea

SLIDE 4

Camera model

◮ World coordinates (x, y, z) map to image coordinates (u, v).

◮ Pinhole camera model (up to the projective scale \lambda):

  \lambda \begin{pmatrix} u \\ v \\ 1 \end{pmatrix} = K\,E \begin{pmatrix} x \\ y \\ z \\ 1 \end{pmatrix}

◮ where the intrinsic and extrinsic matrices are:

  K = \begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix}
  \quad\text{and}\quad
  E = \begin{pmatrix} R^\mathsf{T} & -R^\mathsf{T} p \end{pmatrix}

◮ Camera pose (R, p): the camera orientation (quaternion) and position.
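As a concrete illustration, the pinhole mapping above can be sketched in a few lines of NumPy. This is a hedged sketch, not the talk's code: the function name, the example intrinsics, and the test point are all made up.

```python
import numpy as np

# Sketch of the pinhole projection: world point -> pixel coordinates.
# E = [R^T | -R^T p] maps world coordinates into the camera frame,
# then K applies the focal lengths and the principal point.
def project(point_w, K, R, p):
    x_c = R.T @ (point_w - p)      # extrinsics: world -> camera frame
    uvw = K @ x_c                  # intrinsics
    return uvw[:2] / uvw[2]        # perspective division by depth

K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)        # camera aligned with the world axes
p = np.zeros(3)      # camera at the origin
uv = project(np.array([0.1, -0.2, 2.0]), K, R, p)   # -> approx. (345, 190)
```

With the identity pose, the extrinsics are a no-op and the result depends only on the intrinsics and the depth division.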
SLIDE 5

Camera model (non-linear)

◮ Lens distortions are typically non-linear.

◮ Radial distortion with coefficients k_1 and k_2:

  \begin{pmatrix} u' \\ v' \end{pmatrix} = \begin{pmatrix} u \\ v \end{pmatrix} (1 + k_1 r^2 + k_2 r^4)

with the radial component given by

  r = \sqrt{ \left( \frac{u - c_x}{f_x} \right)^2 + \left( \frac{v - c_y}{f_y} \right)^2 }
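A literal reading of this formula in plain Python, as a sketch only: the radius is measured from the principal point in focal-length-normalized units, and the scaling is applied as written on the slide. Many libraries instead apply the radial scaling to normalized coordinates before re-applying the intrinsics, so treat the coordinate convention here as an assumption. The numbers in the example are made up.

```python
# Radial distortion sketch: (u', v') = (u, v) * (1 + k1*r^2 + k2*r^4),
# with r^2 computed from the principal point in normalized units.
def distort(u, v, fx, fy, cx, cy, k1, k2):
    r2 = ((u - cx) / fx) ** 2 + ((v - cy) / fy) ** 2
    scale = 1.0 + k1 * r2 + k2 * r2 ** 2
    return u * scale, v * scale

# One normalized focal length to the right of the principal point: r = 1,
# so the scale factor is 1 + k1 + k2.
u_d, v_d = distort(820.0, 240.0, 500.0, 500.0, 320.0, 240.0, 0.1, 0.01)
```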

SLIDE 6

Feature tracking

◮ The dense image is not convenient to work with.

◮ Choose sparse points by a feature detector.

◮ Track the points over frames using a feature tracker.

◮ The measurement data consists of tracks of points over frames.
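The track-points-over-frames idea can be illustrated with a deliberately naive matcher on synthetic frames: for each feature, search a small window in the next frame for the patch with minimum sum-of-squared-differences. This toy is not necessarily the detector/tracker used in the work (a real pipeline would typically use corner detection plus pyramidal KLT tracking); the function and variable names are made up.

```python
import numpy as np

# Toy sparse tracker: brute-force SSD match of a (2*half+1)^2 patch
# within a +/- search window around the previous location.
def track_point(prev, curr, pt, half=2, search=3):
    y, x = pt
    tmpl = prev[y - half:y + half + 1, x - half:x + half + 1]
    best, best_pt = np.inf, pt
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            cand = curr[yy - half:yy + half + 1, xx - half:xx + half + 1]
            ssd = float(((cand - tmpl) ** 2).sum())
            if ssd < best:
                best, best_pt = ssd, (yy, xx)
    return best_pt

# Synthetic data: a single bright pixel that moves one pixel to the right.
prev = np.zeros((20, 20)); prev[10, 10] = 1.0
curr = np.zeros((20, 20)); curr[10, 11] = 1.0
# track_point(prev, curr, (10, 10)) recovers the shifted location.
```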

SLIDE 7

Motion model

◮ The gyroscope drives the orientation dynamics:

  \frac{\mathrm{d}q(t)}{\mathrm{d}t} = \frac{1}{2}\,\Omega(\omega)\,q(t),

  where q(t) is the orientation quaternion at time t and \omega the angular velocity.

◮ The position p(t) = (p_1(t), p_2(t), p_3(t)) is modeled as a Wiener velocity model:

  \frac{\mathrm{d}^2 p_j(t)}{\mathrm{d}t^2} = w(t),

  where w(t) is white noise.
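To make the quaternion dynamics concrete, here is one explicit Euler step of dq/dt = ½ Ω(ω) q. This is a sketch under assumptions: the matrix layout assumes the (w, x, y, z) component order with body-frame rates, and a production implementation would typically use the closed-form matrix exponential over the sampling interval rather than a single Euler step.

```python
import numpy as np

# Omega(w) for quaternion q = (w, x, y, z) and body-frame angular rate.
def omega_matrix(w):
    wx, wy, wz = w
    return np.array([[0.0, -wx, -wy, -wz],
                     [ wx, 0.0,  wz, -wy],
                     [ wy, -wz, 0.0,  wx],
                     [ wz,  wy, -wx, 0.0]])

def integrate(q, w, dt):
    # One Euler step of dq/dt = 0.5 * Omega(w) * q.
    q = q + 0.5 * dt * omega_matrix(w) @ q
    return q / np.linalg.norm(q)   # renormalize to stay on the unit sphere

# Small rotation about the z-axis from the identity orientation.
q_next = integrate(np.array([1.0, 0.0, 0.0, 0.0]), (0.0, 0.0, 0.1), 0.01)
```

The renormalization step compensates for the drift off the unit sphere that any truncated integration scheme introduces.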

SLIDE 8

State estimation

◮ The state variables are:

  x = \begin{pmatrix} c^\mathsf{T} & p^\mathsf{T} & v^\mathsf{T} & q^\mathsf{T} & z^\mathsf{T} \end{pmatrix}^\mathsf{T},

  where c = (f_x, f_y, c_x, c_y, k_1, k_2) are the camera parameters, p the position, v the velocity, q the orientation, and z the feature world coordinates.

◮ State-space model:

  x_k = A_k x_{k-1} + \varepsilon_k, \qquad y_k = h_k(x_k) + \gamma_k,

  where A_k depends on the gyroscope reading \omega_k and y_k = (u_1, v_1, u_2, v_2, \ldots) are the feature image coordinates.

◮ We use an extended Kalman filter (EKF) for inference.
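The EKF predict/update cycle for this kind of model can be sketched as below. The 1D constant-velocity toy is illustrative only, not the talk's full state (which also carries camera parameters, orientation, and feature coordinates); all names are made up. For a nonlinear h, the matrix H passed to the update would be the measurement Jacobian evaluated at the predicted state.

```python
import numpy as np

# Minimal EKF skeleton for x_k = A_k x_{k-1} + eps_k, y_k = h_k(x_k) + gamma_k.
def ekf_predict(x, P, A, Q):
    # Propagate mean and covariance through the (linearized) dynamics.
    return A @ x, A @ P @ A.T + Q

def ekf_update(x, P, y, h, H, R):
    S = H @ P @ H.T + R               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    x = x + K @ (y - h(x))            # correct with the measurement residual
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Toy: position + velocity state, position measured directly.
A = np.array([[1.0, 1.0], [0.0, 1.0]])   # unit time step
Q = 0.01 * np.eye(2)
H = np.array([[1.0, 0.0]])               # h is linear here, so H is exact
R = np.array([[0.1]])
x, P = np.array([0.0, 1.0]), np.eye(2)
x, P_pred = ekf_predict(x, P, A, Q)
x, P = ekf_update(x, P_pred, np.array([1.0]), lambda s: H @ s, H, R)
```

The measurement matches the prediction in this toy run, so the mean is unchanged while the position uncertainty shrinks.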

SLIDE 9

Experiments

The videos are available at: https://youtu.be/ro7TeQKgfT0

SLIDE 10

Real-world experiment

SLIDE 11

Recap

◮ Online estimation of camera parameters

◮ Information fusion:
  ◮ Gyroscope-driven dynamics
  ◮ Feature-track observations

◮ Movement constrained by a Wiener velocity motion model

◮ Inference done by an EKF

SLIDE 12

◮ Link to the code can be found on my homepage!

◮ Homepage: http://arno.solin.fi

◮ Twitter: @arnosolin