SLIDE 1

Collaborative Visual SLAM Framework for a Multi-Robot System

Nived Chebrolu, David Marquez-Gamez and Philippe Martinet

7th Workshop on Planning, Perception and Navigation for Intelligent Vehicles Hamburg, Germany

28th September, 2015

1 / 27

SLIDE 2

Motivation for a collaborative system

Multi-robot system for disaster relief operations [1]

[1] Picture taken from project SENEKA, Fraunhofer IOSB

SLIDE 3

Contribution of this paper

System For Collaborative Visual SLAM

SLIDE 4

Perception sensor

Monocular Camera

SLIDE 5

Main components of the system

- Monocular Visual SLAM
- Place Recognition System
- Merging Maps
- Collaborative SLAM Framework

SLIDE 6

Monocular visual SLAM

Goal: Given a sequence of images, obtain the trajectory of the camera and the structure/model of the environment.

MonoSLAM, PTAM, DTAM
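The goal above, recovering the camera trajectory and scene structure from images alone, amounts to repeatedly inverting the pinhole projection model. A minimal sketch of that forward model, with hypothetical intrinsics chosen only for illustration:

```python
import numpy as np

# Pinhole camera intrinsics (hypothetical values for illustration).
K = np.array([[525.0,   0.0, 320.0],
              [  0.0, 525.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project(point_world, R, t):
    """Project a 3-D world point into the image given camera pose (R, t)."""
    p_cam = R @ point_world + t   # world frame -> camera frame
    uvw = K @ p_cam               # camera frame -> homogeneous pixel
    return uvw[:2] / uvw[2]       # perspective division

# A point 2 m straight ahead of a camera at the origin lands at the
# principal point (320, 240).
pixel = project(np.array([0.0, 0.0, 2.0]), np.eye(3), np.zeros(3))
```

Monocular SLAM estimates the unknowns on the right-hand side (poses and 3-D points) so that the projections agree with what the images actually show; with a single camera the overall scale remains unobservable.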

SLIDE 7

Large-Scale Direct Visual SLAM

LSD-SLAM Output

SLIDE 8

Monocular SLAM: System Overview

- Tracking
- Depth Estimation
- Map Optimization
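The tracking component in LSD-SLAM-style direct methods aligns frames by minimizing a photometric error over pixels with known depth, rather than by matching features. A much-simplified sketch of that residual (nearest-pixel lookup, no robust weighting, and no Gauss-Newton loop, which the real system adds on top):

```python
import numpy as np

def photometric_error(I_ref, D_ref, I_new, K, T, pixels):
    """Sum of squared intensity differences after warping reference
    pixels (with known depth) into the new frame under 4x4 pose T."""
    K_inv = np.linalg.inv(K)
    err = 0.0
    for (u, v) in pixels:
        z = D_ref[v, u]                           # depth of reference pixel
        p = z * (K_inv @ np.array([u, v, 1.0]))   # back-project to 3-D
        q = T[:3, :3] @ p + T[:3, 3]              # move into new camera frame
        uvw = K @ q                               # re-project
        u2 = int(round(uvw[0] / uvw[2]))
        v2 = int(round(uvw[1] / uvw[2]))
        if 0 <= v2 < I_new.shape[0] and 0 <= u2 < I_new.shape[1]:
            err += (float(I_ref[v, u]) - float(I_new[v2, u2])) ** 2
    return err
```

Tracking then searches for the pose T that drives this error to a minimum; for identical images and the identity transform the error is exactly zero.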

SLIDE 9

- Monocular Visual SLAM
- Place Recognition System
- Merging Maps
- Collaborative SLAM Framework

SLIDE 10

Place Recognition System: Context

Where?

Is the place already visited?

SLIDE 11

FAB-MAP Approach

Overlap Detection Scheme

SLIDE 12

Experimental Results - A Simple Scenario

Image Num.   P(Seen)   P(New)
1            0.991     0.001
2            0.085     0.910
3            0.922     0.002
4            0.991     0.001
5            0.911     0.131
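FAB-MAP itself models visual-word co-occurrence with a Chow-Liu tree; probabilities of the kind shown above can be illustrated with a far cruder naive-Bayes stand-in over binary bag-of-words vectors. All parameter values below are hypothetical:

```python
import numpy as np

def place_posterior(z, places, p_new=0.1, p_match=0.9, p_rand=0.3):
    """Posterior over 'which place produced observation z' for binary
    bag-of-words vectors. Naive-Bayes stand-in for FAB-MAP: each known
    place emits its own words with prob p_match and others with p_rand;
    a 'new place' hypothesis emits every word at background rate p_rand."""
    z = np.asarray(z, dtype=bool)
    likelihoods = []
    for m in places:
        m = np.asarray(m, dtype=bool)
        p_word = np.where(m, p_match, p_rand)            # P(word | place)
        likelihoods.append(np.prod(np.where(z, p_word, 1.0 - p_word)))
    # New-place hypothesis: every word appears with probability p_rand.
    likelihoods.append(np.prod(np.where(z, p_rand, 1.0 - p_rand)))
    prior = np.array([(1 - p_new) / max(len(places), 1)] * len(places)
                     + [p_new])
    post = prior * np.array(likelihoods)
    return post / post.sum()   # last entry = P(new place | z)
```

An observation matching a stored place yields a high P(Seen) for that place, while an observation unlike every stored place pushes mass onto the final new-place entry, mirroring the P(Seen)/P(New) columns above.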

SLIDE 13

- Monocular Visual SLAM
- Place Recognition System
- Merging Maps
- Collaborative SLAM Framework

SLIDE 14

Merging Maps: Context

What is the transformation between two views?

SLIDE 15

Procedure for Merging Maps

1. Initial Estimate Using Horn's Method
2. Refine Estimate Using Direct Image Alignment
3. Final Refinement Using ICP
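Horn's method solves the absolute-orientation problem in closed form. The sketch below uses the SVD (Kabsch-style) formulation, which yields the same rigid result as Horn's quaternion derivation; note that merging monocular maps in practice also requires estimating a scale factor, which this sketch omits:

```python
import numpy as np

def absolute_orientation(P, Q):
    """Closed-form rigid alignment (SVD variant of Horn's method):
    find R, t minimizing sum_i || R @ P[:, i] + t - Q[:, i] ||^2
    for two corresponding 3xN point sets P and Q."""
    cp = P.mean(axis=1, keepdims=True)           # centroid of source set
    cq = Q.mean(axis=1, keepdims=True)           # centroid of target set
    H = (P - cp) @ (Q - cq).T                    # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) solution.
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    t = cq - R @ cp
    return R, t.ravel()
```

With exact correspondences this recovers the true transform, which is why it serves well as the initial estimate that direct image alignment and ICP then refine.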

SLIDE 16

Experimental Results: Input

(a) First Image (b) Second Image (c) Depth map for first image (d) Depth map for second image

SLIDE 17

Experimental Results: Output

(e) Before Applying Transformation (f) After Applying Transformation

SLIDE 18

- Monocular Visual SLAM
- Place Recognition System
- Merging Maps
- Collaborative SLAM Framework

SLIDE 19

Overall Scheme

Figure: Overall scheme of our collaborative SLAM system

SLIDE 20

Experimental Results

Case Study: Experimental Settings
- Robotic Platform: Two TurtleBots
- Sensor: uEye monocular camera with wide-angle lens
- Images: 640 × 480 pixels @ 30 Hz
- Environment: 20 m × 20 m indoor area (semi-industrial)
- Computation: Core 2 Duo laptop
- Software: ROS, OpenCV, g2o library

SLIDE 21

At Instance 1

(a) Robot R1 (b) Robot R2

SLIDE 22

At Instance 2

(c) Robot R1 (d) Robot R2

SLIDE 23

At Instance 3

(e) Robot R1 (f) Robot R2

SLIDE 24

Global Map

(g) Combined Trajectory  (h) Combined Depth Map
Global map computed at the central server.

SLIDE 25

Summary

A collaborative visual SLAM framework with:

1. Monocular SLAM process for each robot.
2. Detection of scene overlap amongst several robots.
3. Global map computation fusing measurements from all robots.
4. Feedback mechanism for global information to be communicated back to each robot.

SLIDE 26

Scope For Future Work

1. Investigate the advantage due to feedback in terms of localization accuracy and map quality.
2. Towards a decentralized system: direct robot-to-robot communication.
3. Adapting for a hybrid team of robots (e.g. UAVs and ground robots).

SLIDE 27

Thank you

Thank you very much for your attention!

Q&A
