Robust Gyroscope-Aided Camera Self-Calibration
Santiago Cortés, Arno Solin, Juho Kannala
Aalto University, July 11, 2018
Motivation
◮ Camera sensors are common in smart devices
◮ Use cases: AR/VR, games, odometry, photography, etc.
◮ But the observed images are distorted
◮ The distortion can be estimated off-line or be factory-calibrated
◮ We want to estimate the distortion online
What the camera sees
Idea
Camera model
◮ Pinhole camera model, mapping world coordinates (x, y, z) to image coordinates (u, v):

  \begin{pmatrix} u \\ v \\ 1 \end{pmatrix} = K \, E \begin{pmatrix} x \\ y \\ z \\ 1 \end{pmatrix}

◮ where the intrinsic and extrinsic matrices are:

  K = \begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix}
  \quad \text{and} \quad
  E = \begin{pmatrix} R^\mathsf{T} & -R^\mathsf{T} p \end{pmatrix}

◮ Camera pose (R, p): the camera orientation (quaternion) and position
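The projection above can be sketched in a few lines of NumPy; the intrinsic values below are illustrative, not from the talk.

```python
import numpy as np

def project(K, R, p, X):
    """Project world points X (N x 3) to pixel coordinates (N x 2).

    K: 3x3 intrinsic matrix; R: camera orientation as a rotation
    matrix; p: camera position.  E = [R^T | -R^T p] as on the slide.
    """
    E = np.hstack([R.T, -R.T @ p.reshape(3, 1)])      # 3x4 extrinsic matrix
    Xh = np.hstack([X, np.ones((X.shape[0], 1))])     # homogeneous world points
    uvw = (K @ E @ Xh.T).T                            # project to image plane
    return uvw[:, :2] / uvw[:, 2:3]                   # perspective divide

# Illustrative intrinsics and an identity pose
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
R, p = np.eye(3), np.zeros(3)
# A point on the optical axis projects to the principal point (320, 240)
print(project(K, R, p, np.array([[0.0, 0.0, 2.0]])))
```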
Camera model (non-linear)
◮ Lens distortions are typically non-linear
◮ Radial distortion with coefficients k1 and k2:

  \begin{pmatrix} u' \\ v' \end{pmatrix} = \begin{pmatrix} u \\ v \end{pmatrix} (1 + k_1 r^2 + k_2 r^4)

  with the radial component given by

  r^2 = \left( \frac{u - c_x}{f_x} \right)^2 + \left( \frac{v - c_y}{f_y} \right)^2
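A minimal sketch of this two-coefficient model, following the slide's formulation (scale applied directly to the pixel coordinates); the numeric values are illustrative.

```python
def distort(u, v, fx, fy, cx, cy, k1, k2):
    """Apply two-coefficient radial distortion to a pixel (u, v)."""
    # Squared radial distance in normalized image coordinates
    r2 = ((u - cx) / fx) ** 2 + ((v - cy) / fy) ** 2
    scale = 1.0 + k1 * r2 + k2 * r2 ** 2
    return u * scale, v * scale

# The principal point is a fixed point of the model (r = 0)
print(distort(320.0, 240.0, 500.0, 500.0, 320.0, 240.0, 0.1, 0.01))
# A point 100 px to the right of the principal point, mild distortion
print(distort(420.0, 240.0, 500.0, 500.0, 320.0, 240.0, 0.1, 0.0))
```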
Feature tracking
◮ The dense image is not convenient to work with
◮ Choose sparse points by a feature detector
◮ Track the points over frames using a feature tracker
◮ Measurement data consists of tracks of points over frames
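One hypothetical way to organize such measurement data is a track table keyed by feature id; the class and method names below are illustrative, not from the paper's code.

```python
from collections import defaultdict

class TrackTable:
    """Collect feature tracks: per feature id, a list of (frame, u, v)."""

    def __init__(self):
        self.tracks = defaultdict(list)

    def add(self, frame, feature_id, u, v):
        """Record one image observation of a tracked feature."""
        self.tracks[feature_id].append((frame, u, v))

    def observations(self, frame):
        """All (feature_id, u, v) measurements visible in a given frame."""
        return [(fid, u, v)
                for fid, obs in self.tracks.items()
                for f, u, v in obs if f == frame]

tt = TrackTable()
tt.add(0, 7, 100.0, 50.0)   # feature 7 detected in frame 0
tt.add(1, 7, 102.5, 51.0)   # the same feature tracked into frame 1
```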
Motion model
◮ The gyroscope drives the orientation dynamics:

  \frac{dq(t)}{dt} = \frac{1}{2} \Omega(\omega) \, q(t),

  where q(t) is the orientation quaternion at time t and ω the angular velocity.
◮ The position p(t) = (p1(t), p2(t), p3(t)) is modeled by a Wiener velocity model:

  \frac{d^2 p_j(t)}{dt^2} = w(t),

  where w(t) is white noise.
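For piecewise-constant ω the quaternion ODE above has a closed-form solution, since Ω(ω)² = −‖ω‖² I. A sketch of one integration step (assuming the q = (qw, qx, qy, qz) convention; the slide does not fix a sign convention for Ω):

```python
import numpy as np

def omega_matrix(w):
    """Quaternion kinematics matrix Omega(w) for q = (qw, qx, qy, qz)."""
    wx, wy, wz = w
    return np.array([[0.0, -wx, -wy, -wz],
                     [wx,  0.0,  wz, -wy],
                     [wy, -wz,  0.0,  wx],
                     [wz,  wy, -wx,  0.0]])

def integrate_quat(q, w, dt):
    """One step of dq/dt = 0.5 * Omega(w) * q for constant w over dt.

    Because Omega(w)^2 = -|w|^2 I, the matrix exponential reduces to
    cosine/sine terms, so the step is exact for constant w.
    """
    n = np.linalg.norm(w)
    if n < 1e-12:
        return q                       # no rotation this step
    a = 0.5 * n * dt                   # half rotation angle
    q_new = (np.cos(a) * np.eye(4) + (np.sin(a) / n) * omega_matrix(w)) @ q
    return q_new / np.linalg.norm(q_new)   # guard against numerical drift

# Rotating at pi rad/s about z for 1 s turns the identity quaternion
# into a half-turn about z
q = integrate_quat(np.array([1.0, 0.0, 0.0, 0.0]), np.array([0.0, 0.0, np.pi]), 1.0)
print(q)
```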
State estimation
◮ The state variables are:

  x = \begin{pmatrix} c^\mathsf{T} & p^\mathsf{T} & v^\mathsf{T} & q^\mathsf{T} & z^\mathsf{T} \end{pmatrix}^\mathsf{T},

  where c = (fx, fy, cx, cy, k1, k2) are the camera parameters, p the position, v the velocity, q the orientation, and z the feature world coordinates.
◮ State-space model:

  x_k = A_k x_{k-1} + \varepsilon_k, \qquad y_k = h_k(x_k) + \gamma_k,

  where A_k depends on ω_k and y_k = (u_1, v_1, u_2, v_2, …) are the feature image coordinates.
◮ We use an Extended Kalman filter for inference.
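A generic EKF iteration for this kind of model (linear dynamics, non-linear measurement) can be sketched as below; the function names and the tiny 1-D example are illustrative, not the paper's implementation.

```python
import numpy as np

def ekf_step(x, P, A, Q, y, h, H_jac, Rm):
    """One EKF iteration: linear prediction, non-linear measurement update.

    A, Q: transition matrix and process-noise covariance;
    h, H_jac: measurement function and its Jacobian;
    Rm: measurement-noise covariance.
    """
    # Predict through the linear dynamics
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # Update: linearize h around the predicted state
    H = H_jac(x_pred)
    S = H @ P_pred @ H.T + Rm              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)    # Kalman gain
    x_new = x_pred + K @ (y - h(x_pred))
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# 1-D sanity check: identity dynamics, direct observation of the state
x_new, P_new = ekf_step(np.array([0.0]), np.eye(1), np.eye(1), np.zeros((1, 1)),
                        np.array([1.0]), lambda x: x, lambda x: np.eye(1), np.eye(1))
print(x_new, P_new)
```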
Experiments
The videos are available at: https://youtu.be/ro7TeQKgfT0
Real-world experiment
Recap
◮ Online estimation of camera parameters
◮ Information fusion:
  ◮ Gyroscope-driven dynamics
  ◮ Feature-track observations
◮ Movement constrained by a Wiener velocity motion model
◮ Inference done by an EKF
◮ A link to the code can be found on my homepage!