Visual SLAM with Multi-Fisheye Camera Systems, Stefan Hinz and Steffen Urban



SLIDE 1

Visual SLAM with Multi-Fisheye Camera Systems

Stefan Hinz, Steffen Urban (Institute of Photogrammetry and Remote Sensing, KIT, Karlsruhe)

SLIDE 2

Contents

  • Application of Visual Multi-fisheye SLAM for Augmented Reality applications

Components of Visual SLAM

  • Calibration and basic image data
  • Initialization using virtual 3D model
  • Egomotion determination, Tracking
  • Integration with image-based measurement system
SLIDE 3

Contents

  • Application of Visual Multi-fisheye SLAM for Augmented Reality applications

Components of Visual SLAM

  • Calibration and basic image data
  • Initialization using virtual 3D model
  • Egomotion determination, Tracking
  • Integration with image-based measurement system
SLIDE 4
  • Support for different planning phases
  • Investigation of different tracks
  • Localization of emergency tunnels
  • etc.

Application: Planning of new underground railway tracks

SLIDE 5

Virtual 3D plans available

SLIDE 6
  • Tunnel model

Different resolutions, different levels of detail

SLIDE 7

GIS as background information

SLIDE 8

Augmented Reality System

  • Development of a mobile AR-System
  • Support of co-operative tunnel/track planning:
  • Overlay of planned and already existing objects
  • Analysis of geometric deviations and missing objects
  • In-situ visualization
  • Documentation

3D-geocoded and annotated images → platform/camera pose needed

SLIDE 9

Augmented Reality System

  • Example: Emergency tunnel

Person equipped with the AR-System

SLIDE 10

Augmented Reality System

  • Example: Emergency tunnel
SLIDE 11


System components: explicit 3D model (collaboration server), multi-fisheye system (mounted on the helmet), tablet camera system

System concept

SLIDE 12
  • Constraints:
  • Indoor/underground → no GPS/GNSS available
  • Bad illumination conditions
  • Narrow and "cluttered" environment
  • Many occlusions

System concept

SLIDE 13

System concept

  • Mobile AR-System
  • Prototype
  • 3 Fisheye cameras
  • Complete coverage
  • Robust estimation of position (and tracking)
  • Visualization unit
SLIDE 14

Contents

  • Application of Visual Multi-fisheye SLAM for Augmented Reality applications

Components of Visual SLAM

  • Calibration and basic image data
  • Initialization using virtual 3D model
  • Egomotion determination, Tracking
  • Integration with image-based measurement system
SLIDE 15

Camera Calibration


  • Basics: model for the fisheye projection of Scaramuzza et al. 2006
  • Extensions
  • Multiple collinearity equations (3 cameras)
  • Robust bundle approach
  • Simultaneous estimation of all parameters

Improvement in speed and geometric quality by a factor of 2-4

[Figure: fisheye projection geometry: object point P (X, Y) is mapped to image point p (u, v) on the sensor, with focal parameter f]
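The projection model of Scaramuzza et al. 2006 relates an image point to a viewing ray through a polynomial in the radial distance. A minimal back-projection sketch in Python/NumPy; the polynomial coefficients below are made-up placeholders, not values from the calibration described in these slides:

```python
import numpy as np

def fisheye_backproject(u, v, poly=(-250.0, 0.0, 1.0e-4, 0.0, 1.0e-8)):
    """Back-project a sensor point (u, v), centred on the principal
    point, to a unit viewing ray using the omnidirectional model of
    Scaramuzza et al. 2006: ray = (u, v, f(rho)) with
    f(rho) = a0 + a1*rho + a2*rho^2 + a3*rho^3 + a4*rho^4,
    where a1 is fixed to 0 by the model. Coefficients are placeholders."""
    rho = np.hypot(u, v)                      # radial distance on sensor
    a0, a1, a2, a3, a4 = poly
    w = a0 + a1 * rho + a2 * rho**2 + a3 * rho**3 + a4 * rho**4
    ray = np.array([u, v, w], dtype=float)
    return ray / np.linalg.norm(ray)          # unit viewing direction
```

In the multi-camera bundle adjustment mentioned above, one such polynomial per camera is estimated simultaneously with the relative orientations.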

SLIDE 16

Validation

  • Calibration using 3D-Model
  • Ground-truth (tachymeter)
  • Accuracy:
  • Position: 0.4-1.5 cm
  • Orientation: 0.35-2.6 mrad
SLIDE 17

Basic image data (1): Multi-Fisheye Panorama

Homography-based stitching no longer applies

Mapping onto cylinder (using the relative orientation)

SLIDE 18

Basic image data (1): Multi-Fisheye Panorama

Homography-based stitching no longer applies

Mapping onto cylinder (using the relative orientation)

Transformation into the coordinate system of the 3D model
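The cylinder mapping used for the multi-fisheye panorama can be sketched as follows; the panorama size and vertical field of view are arbitrary placeholder values, and the ray is assumed to be already rotated into the common panorama frame via the calibrated relative orientation:

```python
import numpy as np

def ray_to_cylinder(ray, width=2048, height=512, v_fov=np.pi / 2):
    """Map a unit viewing ray (in the common panorama frame) onto
    cylindrical panorama pixel coordinates (col, row)."""
    x, y, z = ray
    theta = np.arctan2(x, z)                    # azimuth in (-pi, pi]
    h = y / np.hypot(x, z)                      # height on the unit cylinder
    col = (theta + np.pi) / (2 * np.pi) * width
    row = (h / np.tan(v_fov / 2) + 1) / 2 * height
    return col, row
```

Resampling every fisheye pixel this way (instead of via a homography) produces the stitched panorama shown on the slides.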

SLIDE 19

Panorama trajectory

SLIDE 20

[Figure: stereo pair with camera poses P1, P2 related by the relative orientation R, t]

Basic image data (2): Fisheye Stereo

SLIDE 21

Rectification (via mapping onto cylinder); transformation into epipolar geometry
=> limited accuracy of 3D points
=> useful for an initial 3D description of the imaged environment

Basic image data (2): Fisheye Stereo
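Why the stereo-derived 3D points have limited accuracy can be sketched with the standard triangulation relation on the rectified pair: depth follows from the row disparity as Z = f·B/d, and the first-order depth error grows quadratically with depth. Baseline and focal length below are invented placeholders:

```python
def depth_from_disparity(d_px, baseline_m=0.12, focal_px=400.0):
    """Triangulate depth from a row disparity on the rectified
    (epipolar) image pair: Z = f * B / d. Returns the depth and its
    first-order uncertainty dZ = Z^2 / (f * B) * sigma_d (with
    sigma_d = 1 px), which grows quadratically with depth -- the
    reason these points serve only as an initial scene description.
    Baseline and focal length are invented example values."""
    z = focal_px * baseline_m / d_px
    sigma_z = z ** 2 / (focal_px * baseline_m) * 1.0
    return z, sigma_z
```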

SLIDE 22

Dense stereo via rectification

Fisheye → cylindrical mapping; establishing the epipolar geometry (row disparities)

13.09.2017

SLIDE 23

Contents

  • Application of Visual Multi-fisheye SLAM for Augmented Reality applications

Components of Visual SLAM

  • Calibration and basic image data
  • Initialization using virtual 3D model
  • Egomotion determination, Tracking
  • Integration with image-based measurement system
SLIDE 24

Self-localization / Initialization of AR-System

  • Challenges:
  • no GPS → no absolute position
  • Many potential initial positions → many hypotheses to start tracking inside 3D-model
SLIDE 25

Self-localization / Initialization of AR-System

  • Challenges:
  • no GPS → no absolute position
  • Many potential initial positions → many hypotheses to start tracking inside 3D-model
  • Much clutter and many objects not included in the virtual 3D model
  • Many discrepancies between images and virtual 3D model
SLIDE 26

Self-localization / Initialization of AR-System

  • Challenges:
  • no GPS → no absolute position
  • Many potential initial positions → many hypotheses to start tracking inside 3D-model
  • Much clutter and many objects not included in the virtual 3D model
  • Many discrepancies between images and virtual 3D model
  • Virtual 3D model is not textured => fewer features for matching
  • Real-time requirements

→ Indexing (search trees), GPU processing (rendering), parallel processing

SLIDE 27


Model-based Initialization

("Model" = the virtual 3D model). Task: determine the camera pose within the 3D model.

SLIDE 28


Model-based Initialization

("Model" = the virtual 3D model)

[Flowchart: offline, the 3D model is rendered for multiple initial hypotheses and features are extracted; online, features are extracted from the images; particle filtering (subject to particle degeneration) matches both to estimate the camera pose in the 3D model]

SLIDE 29

Simulation of virtual camera poses

SLIDE 30

Features: Extraction of visible 3D-Model edges

Using already determined fisheye distortion

Rendering with vertex shaders

SLIDE 31

Features: Extraction of visible 3D-Model edges

Using already determined fisheye distortion

Rendering with vertex shaders

SLIDE 32

Example


SLIDE 33

Example: visible edges


SLIDE 34

Self-localization: estimation by particle filtering
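One core ingredient of this estimation, and the particle degeneration problem noted earlier, can be sketched as follows. This is an illustrative effective-sample-size resampling rule, not the actual implementation; in the real system each particle is a pose hypothesis weighted by how well rendered model edges match the extracted image features:

```python
import numpy as np

rng = np.random.default_rng(0)

def resample_if_degenerate(particles, weights, thresh=0.5):
    """Guard against particle degeneration: if the effective sample
    size N_eff = 1 / sum(w_i^2) drops below thresh * N, resample the
    particles proportionally to their (normalised) weights and reset
    the weights to uniform."""
    w = weights / weights.sum()
    n_eff = 1.0 / np.sum(w ** 2)
    if n_eff < thresh * len(w):
        idx = rng.choice(len(w), size=len(w), p=w)  # draw with replacement
        particles = particles[idx]
        w = np.full(len(w), 1.0 / len(w))
    return particles, w
```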

SLIDE 35

Self-localization

SLIDE 36

Self-localization: Examples

SLIDE 37
  • Fine registration
SLIDE 38
  • Fine registration
SLIDE 39

Contents

  • Application of Visual Multi-fisheye SLAM for Augmented Reality applications

Components of Visual SLAM

  • Calibration and basic image data
  • Initialization using virtual 3D model
  • Egomotion determination, Tracking
  • Integration with image-based measurement system
SLIDE 40

Egomotion determination / Tracking

  • Extension of a conventional visual SLAM algorithm (ORB-SLAM)

  • Multi-fisheye cameras
  • Optional: support by the virtual 3D model

[System diagram: hybrid multi-scale 3D models and web services support self-localization; feature extraction, keyframes and the co-visibility graph (with radiometric/geometric analysis) feed model- or point-based tracking; the Augmented Reality layer fuses tablet and fisheye cameras for annotation, documentation and simulation via the co-operation server and web services]

SLIDE 41

Challenges:

  • Fusion of point- and model-based tracking

Egomotion determination / Tracking

SLIDE 42

Challenges:

  • Fusion of point- and model-based tracking

=> Multi-collinearity SLAM with refinement of keyframes

Egomotion determination / Tracking

SLIDE 43

Challenges:

  • Fusion of point- and model-based tracking

=> Multi-collinearity SLAM with refinement of keyframes

  • Feature extraction and self-calibration adapted to fisheye projections

=> mdBRIEF with online learning

Egomotion determination / Tracking

SLIDE 44

Challenges:

  • Fusion of point- and model-based tracking

=> Multi-collinearity SLAM with refinement of keyframes

  • Feature extraction and self-calibration adapted to fisheye projections

=> mdBRIEF with online learning

  • Distinction of static, moving and re-locatable object points

=> Co-visibility graph, robust estimation (not yet: utilization of uncertainties of 3D model)

Egomotion determination / Tracking
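The matching stage that a binary descriptor such as mdBRIEF feeds into can be sketched as brute-force Hamming matching. The mdBRIEF-specific parts (the distortion-adapted test pattern and the online-learned masks) are not reproduced here; only the generic binary-descriptor matching is shown, with an invented acceptance threshold:

```python
import numpy as np

def hamming_match(desc_a, desc_b, max_dist=50):
    """Brute-force nearest-neighbour matching of binary descriptors
    (rows of uint8, e.g. 256-bit like BRIEF/mdBRIEF) by Hamming
    distance. Returns (index_in_a, index_in_b) pairs whose distance
    does not exceed the illustrative threshold max_dist."""
    # XOR every pair of descriptors, then count the differing bits
    xor = desc_a[:, None, :] ^ desc_b[None, :, :]
    dist = np.unpackbits(xor, axis=2).sum(axis=2)
    nn = dist.argmin(axis=1)                      # nearest neighbour in b
    good = dist[np.arange(len(desc_a)), nn] <= max_dist
    return [(int(i), int(nn[i])) for i in np.flatnonzero(good)]
```

In practice such distances are computed with hardware popcount and pruned by the indexing structures mentioned earlier; the dense loop here is only for clarity.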

SLIDE 45

Egomotion determination / Tracking

  • Learning of geometric and radiometric properties for co-visibility graph

[Figure: geometric and radiometric point properties over time]

SLIDE 46
  • Learning of geometric and radiometric properties for co-visibility graph

[Figure: geometric and radiometric point properties over time]

Egomotion determination / Tracking

SLIDE 47
  • Learning of geometric and radiometric properties for co-visibility graph

[Figure: geometric and radiometric point properties over time]

Egomotion determination / Tracking

SLIDE 48
  • Weighting of points (geometric restrictions)
  • Testing of radiometric invariances
  • Efficient selection and indexing (Wuest et al. 2007)

  • Static: useful
  • Relocatable: temporarily useful
  • Moving: not useful

Egomotion determination / Tracking
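The three-way split of object points can be illustrated with a toy rule on per-frame reprojection residuals. The thresholds and the rule itself are invented for illustration; the actual system uses the learned geometric and radiometric tests on the co-visibility graph described above:

```python
def classify_point(residuals_px, tol=2.0):
    """Toy classification of a map point from its per-frame
    reprojection residuals (in pixels): consistently small residuals
    suggest a static point, consistently large ones a moving point,
    and a mixture a point that was re-located at some time.
    Illustrative rule only, not the learned scheme from the slides."""
    small = [r < tol for r in residuals_px]
    if all(small):
        return "static"        # useful for tracking
    if not any(small):
        return "moving"        # not useful
    return "relocatable"       # temporarily useful
```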

SLIDE 49

Tracking: "MultiCol-SLAM"

SLIDE 50

Tracking: "MultiCol-SLAM"

SLIDE 51

Multi-fisheye SLAM (simultaneous localization and mapping)

SLIDE 52

Tracking: "MultiCol-SLAM"

SLIDE 53
  • Some numbers

Tracking: "MultiCol-SLAM"

SLIDE 54
  • Even more numbers (ATE)

Tracking: "MultiCol-SLAM"
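The ATE numbers referred to above are typically root-mean-square errors between the estimated and the ground-truth trajectory after alignment. A minimal sketch with translation-only alignment (full evaluations also align rotation and, for monocular systems, scale, e.g. via Umeyama alignment):

```python
import numpy as np

def ate_rmse(est, gt):
    """Root-mean-square absolute trajectory error (ATE) between an
    estimated and a ground-truth trajectory (both N x 3 positions).
    Only the translational offset is aligned here by centring both
    trajectories on their mean position."""
    est = est - est.mean(axis=0)   # translation alignment
    gt = gt - gt.mean(axis=0)
    return float(np.sqrt(np.mean(np.sum((est - gt) ** 2, axis=1))))
```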

SLIDE 55

Contents

  • Application of Visual Multi-fisheye SLAM for Augmented Reality applications

Components of Visual SLAM

  • Calibration and basic image data
  • Initialization using virtual 3D model
  • Egomotion determination, Tracking
  • Integration with image-based measurement system
SLIDE 56

Appendix: Fusion with Tablet-System

  • Image to image matching
  • In-situ analysis
  • Documentation
  • Annotation

TP-E: in-situ analyses

SLIDE 57

Correction of distortion and matching

SLIDE 58
SLIDE 59

Result

SLIDE 60

Thank you for your attention…

SLIDE 61

Thank you for your attention… and many thanks to Dr. Steffen Urban

Further readings:
  • Urban, Leitloff, Hinz (2015): Multi-fisheye camera calibration. ISPRS Journal
  • Urban, Leitloff, Wursthorn, Hinz (2016): Multi-fisheye tracking. Int. Journal of Computer Vision
  • Urban, Weinmann, Hinz (2017, to appear): mdBrief… Computer Vision and Image Understanding