SLAM (PowerPoint presentation) by Dr.-Ing. Ahmad Kamal Nasir



SLIDE 1

SLAM

  • Dr.-Ing. Ahmad Kamal Nasir

SLIDE 2

The SLAM Problem

  • Given
    – Robot controls
    – Nearby measurements
  • Estimate
    – Robot state (position, orientation)
    – Map of world features

SLIDE 3

Motivation and Challenges

  • Advantages: faster objective completion time; tasks can be
    re-assigned if a robot fails; tasks become feasible that are beyond
    the capability of a single robot
  • Challenges:
    – Map merging in large, dynamic, sparse outdoor environments
    – Controlling and managing a multi-robot system is demanding because
      it requires handling multiple robots with heterogeneous capabilities
    – A standard software architecture is needed to avoid re-implementation
      of basic communication and non-interoperability
  • Application: multi-robot map building in the absence of an a priori
    map, e.g. in sea ports or destroyed nuclear plants…

SLIDE 4

Types of Sensors

  • Odometry
  • Light Detection and Ranging (LIDAR)
  • Acoustic (sonar, ultrasonic)
  • Radar
  • Vision (monocular, stereo etc.)
  • GPS
  • Gyroscopes, Accelerometers (Inertial Navigation)
  • Etc.
SLIDE 5

Sensor Characteristics

  • Noise
  • Dimensionality of Output

    – LIDAR: 3D point
    – Vision: bearing only (2D ray in space)

  • Range
  • Frame of Reference

    – Most in robot frame (vision, LIDAR, etc.)
    – GPS: Earth-centered coordinate frame
    – Accelerometers/gyros: inertial coordinate frame

SLIDE 6
Problem Statement (Probabilistic Approach)

  • Formalize p(x_t, m | z_t, u_t, c_t) for a team of mobile robots, where
    – x_t is the state of the robots at time step t
    – m is the map
    – z_t are the robots' measurements
    – u_t are the control inputs
    – c_t is the data association function

SLIDE 7

Full vs. Online SLAM

  • Full SLAM calculates the robot trajectory and map over all time up to time t:

    p(x_{1:t}, m | z_{1:t}, u_{1:t})

  • Online SLAM calculates the robot state and map for the current time t only, integrating out the past poses:

    p(x_t, m | z_{1:t}, u_{1:t}) = ∫∫…∫ p(x_{1:t}, m | z_{1:t}, u_{1:t}) dx_1 dx_2 … dx_{t-1}
SLIDE 8

Full vs. Online SLAM

    Full SLAM:    p(x_{1:t}, m | z_{1:t}, u_{1:t})

    Online SLAM:  p(x_t, m | z_{1:t}, u_{1:t}) = ∫∫…∫ p(x_{1:t}, m | z_{1:t}, u_{1:t}) dx_1 dx_2 … dx_{t-1}

SLIDE 9

Two Example SLAM Algorithms

  • Extended Kalman Filter (EKF) SLAM

– Solves online SLAM problem – Uses a linearized Gaussian probability distribution model

  • FastSLAM

– Solves full SLAM problem – Uses a sampled particle filter distribution model

SLIDE 10

Extended Kalman Filter SLAM

  • Solves the Online SLAM problem using a

linearized Kalman filter

  • One of the first probabilistic SLAM algorithms
  • Not used frequently today but mainly shown

for its explanatory value

SLIDE 11

Kalman Filter Components

Linear discrete-time dynamic system (motion model):

    x_t = F_t x_{t-1} + B_t u_t + G_t w_t

where x is the state, u the control input, and w the process noise with covariance Q; F_t is the state transition function, B_t the control input function, and G_t the noise input function.

Measurement equation (sensor model):

    z_t = H_t x_t + n_t

where z is the sensor reading, H_t the sensor function, and n the sensor noise with covariance R.

SLIDE 12

EKF Equations

Propagation (motion model):

    x̂_{t/t-1} = F_t x̂_{t-1/t-1} + B_t u_t
    P_{t/t-1} = F_t P_{t-1/t-1} F_t^T + G_t Q_t G_t^T

Update (sensor model):

    r_t = z_t - ẑ_t = z_t - H_t x̂_{t/t-1}
    S_t = H_t P_{t/t-1} H_t^T + R_t
    K_t = P_{t/t-1} H_t^T S_t^{-1}
    x̂_{t/t} = x̂_{t/t-1} + K_t r_t
    P_{t/t} = P_{t/t-1} - P_{t/t-1} H_t^T S_t^{-1} H_t P_{t/t-1}
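The propagation and update equations can be sketched directly with NumPy. Matrix names follow the slide; in EKF SLAM, F and H would be the Jacobians of the nonlinear motion and sensor models evaluated at the current estimate:

```python
import numpy as np

def ekf_predict(x, P, u, F, B, G, Q):
    """Propagation: push state and covariance through the motion model."""
    x_pred = F @ x + B @ u
    P_pred = F @ P @ F.T + G @ Q @ G.T
    return x_pred, P_pred

def ekf_update(x_pred, P_pred, z, H, R):
    """Update: correct the prediction with a measurement."""
    r = z - H @ x_pred                       # innovation
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x_pred + K @ r
    P = P_pred - K @ S @ K.T                 # equals P - P H^T S^-1 H P
    return x, P
```

A 1-D run with F = B = G = H = 1 reproduces the familiar scalar filter: the posterior variance after an update is always smaller than the predicted one.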

SLIDE 13

EKF Example

t=0

  • Initial State and Uncertainty
  • Using Range Measurements
SLIDE 14

EKF Example

t=1

  • Predict robot pose and uncertainty at time 1

SLIDE 15

EKF Example

t=1

  • Correct pose and pose uncertainty
  • Estimate new feature uncertainties
SLIDE 16

EKF Example

t=2

  • Predict pose and uncertainty of pose at time 2
  • Predict feature measurements and their uncertainties

SLIDE 17

EKF Example

t=2

  • Correct pose and mapped features
  • Update uncertainties for mapped features

  • Estimate uncertainty of new features
SLIDE 18

Implementation

SLIDE 19

SLAM

Effect of odometric errors on robot uncertainty
Feature-based SLAM to reduce robot uncertainty

SLIDE 20

Feature based SLAM

3D plane map using Kinect
2D line-feature-based SLAM using laser scanner

SLIDE 21

Occupancy Grid based SLAM

Grid-based SLAM experiment on H-F0
Grid-based SLAM experiment on H-F1

SLIDE 22

Mapping Results

Original map
Line feature map
Grid map
Planned trajectory
Map using Hough transform
Map using RANSAC

SLIDE 23

SLAM Formulation

Robot state: x_R = (x, y, θ)^T
Line features: m_l = (r, α)^T
Plane features: m_p = (r, θ, φ)^T

Map (robot states + features):

    X = [ x_{R1} ; x_{R2} ; … ; m_{l1} ; … ; m_{p1} ; … ]

Map covariance (block matrix of robot-robot, robot-feature, and feature-feature cross-covariances):

    P = [ P_{R1R1}   P_{R1R2}   …  P_{R1,l1}   …  P_{R1,p1}   …
          P_{R2R1}   P_{R2R2}   …  P_{R2,l1}   …  P_{R2,p1}   …
          ⋮          ⋮          ⋱
          P_{l1,R1}  P_{l1,R2}  …  P_{l1,l1}   …  P_{l1,p1}   …
          ⋮                        ⋮            ⋱
          P_{p1,R1}  P_{p1,R2}  …  P_{p1,l1}   …  P_{p1,p1}   … ]

SLIDE 24

Methodology

Feature Map

SLIDE 25

Core CSLAM Modules

  • Prediction
  • Clustering/Segmentation
  • Feature Extraction
  • Correspondence / Data association
  • Map Update
  • New Feature Augmentation
  • Map Management
SLIDE 26

Prediction

g(x_R, u_t, w_t): robot kinematic motion model
m_l: the set of all existing line features
m_p: the set of all existing plane features

    G_{R1} = ∂g/∂x_{R1}   (Jacobian w.r.t. the robot pose)
    G_w = ∂g/∂w           (Jacobian w.r.t. the noise)
    Q = covariance of the noise input

State prediction (features are stationary and pass through unchanged):

    X_{t+1} = [ g(x_{R1}, u_t, w_t) ; g(x_{R2}, u_t, w_t) ; m_l ; m_p ]

Covariance prediction (shown for robot 1; robot 2 is handled analogously): the predicted robot's own block and its cross-covariances are transformed by G_{R1}, while pure feature blocks are unchanged:

    P_{R1R1} ← G_{R1} P_{R1R1} G_{R1}^T + G_w Q G_w^T
    P_{R1,·} ← G_{R1} P_{R1,·}   (cross-covariances with the other robot and with m_l, m_p)
    P_{ll}, P_{lp}, P_{pp} unchanged

SLIDE 27

Clustering / Segmentation

Dynamic clustering threshold (Lee segmentation, IEPF segmentation): two consecutive scan returns with ranges r_i, r_{i+1} separated by the angular increment Δθ belong to the same cluster if their Euclidean gap stays below a range-dependent threshold:

    E(r_i, r_{i+1}) = sqrt( r_i² + r_{i+1}² - 2 r_i r_{i+1} cos(Δθ) )

    E_th = C_0 + C_1 · min(r_i, r_{i+1})

    C_1 = sqrt( 2 (1 - cos(Δθ)) ) = E(r_i, r_{i+1}) / r_i   for r_i = r_{i+1}
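The dynamic-threshold rule can be sketched over a scan given as a list of ranges at a fixed angular increment d_theta. The constant C0 and its default value here are illustrative, not from the slides:

```python
import numpy as np

def segment_scan(ranges, d_theta, c0=0.05):
    """Split a range scan into clusters with a dynamic distance threshold.

    Consecutive returns r_i, r_{i+1} stay in one cluster while their
    Euclidean gap is below C0 + C1 * min(r_i, r_{i+1}), where
    C1 = sqrt(2 * (1 - cos(d_theta))) accounts for the angular spread.
    Returns a list of clusters, each a list of beam indices.
    """
    c1 = np.sqrt(2.0 * (1.0 - np.cos(d_theta)))
    clusters, current = [], [0]
    for i in range(len(ranges) - 1):
        r0, r1 = ranges[i], ranges[i + 1]
        gap = np.sqrt(r0**2 + r1**2 - 2.0 * r0 * r1 * np.cos(d_theta))
        if gap <= c0 + c1 * min(r0, r1):
            current.append(i + 1)
        else:
            clusters.append(current)
            current = [i + 1]
    clusters.append(current)
    return clusters
```

Because the threshold grows with range, distant points on the same wall are not split merely because adjacent beams diverge.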

SLIDE 28

Line Feature Extraction

Line in Hessian normal form (r, α), fitted to cluster points (x_i, y_i) with centroid (x̄, ȳ):

    α = ½ · atan2( -2 Σ_{i=0}^{n} (x̄ - x_i)(ȳ - y_i),  Σ_{i=0}^{n} [ (ȳ - y_i)² - (x̄ - x_i)² ] )

    r = x̄ cos(α) + ȳ sin(α)

Covariance of the extracted line, propagated from the range noise σ_{ρi} of each scan point:

    P_{αr} = [ σ_α²   σ_αr
               σ_rα   σ_r²  ]

    σ_α² = Σ_{i=0}^{n} (∂α/∂ρ_i)² σ_{ρi}²
    σ_r² = Σ_{i=0}^{n} (∂r/∂ρ_i)² σ_{ρi}²
    σ_αr = σ_rα = Σ_{i=0}^{n} (∂α/∂ρ_i) · (∂r/∂ρ_i) · σ_{ρi}²
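The (r, α) estimate is a total-least-squares fit over the cluster's Cartesian points; a minimal sketch (covariance propagation omitted for brevity):

```python
import numpy as np

def fit_line_polar(xs, ys):
    """Total-least-squares line fit in Hessian normal form (r, alpha):
    every point on the line satisfies x*cos(alpha) + y*sin(alpha) = r."""
    xs, ys = np.asarray(xs, float), np.asarray(ys, float)
    xm, ym = xs.mean(), ys.mean()
    sxx = np.sum((xs - xm) ** 2)
    syy = np.sum((ys - ym) ** 2)
    sxy = np.sum((xs - xm) * (ys - ym))
    alpha = 0.5 * np.arctan2(-2.0 * sxy, syy - sxx)
    r = xm * np.cos(alpha) + ym * np.sin(alpha)
    if r < 0:                       # keep r non-negative; flip the normal
        r, alpha = -r, alpha + np.pi
    return r, alpha
```

For the horizontal wall y = 2 the fit returns r = 2, α = π/2, i.e. the normal points along +y, as expected.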

SLIDE 29

Correspondence / Data association

Mahalanobis distance criterion: a measured feature is associated with an expected map feature j if

    ν_j^T S_j^{-1} ν_j < g²

    S_j = Σ_e + R,    R = [ σ_r²   σ_rα
                            σ_αr   σ_α² ]

where ν_j is the innovation, S_j the covariance of the innovation, g the gate threshold, Σ_e the covariance of the expected feature, and R the covariance of the measured feature.
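A nearest-neighbour association with this Mahalanobis gate can be sketched as follows (the gate value 9.21, roughly the 99% chi-square threshold for 2 degrees of freedom, is an illustrative choice):

```python
import numpy as np

def associate(z, z_preds, S_list, gate=9.21):
    """Nearest-neighbour data association with a Mahalanobis gate.

    z       : measured feature, e.g. np.array([r, alpha])
    z_preds : predicted feature measurements (one per map candidate)
    S_list  : innovation covariance S_j for each candidate
    gate    : chi-square threshold (9.21 ~ 99% for 2 DOF)
    Returns the index of the best candidate, or None if all fail the gate.
    """
    best_idx, best_d2 = None, gate
    for j, (zp, S) in enumerate(zip(z_preds, S_list)):
        nu = z - zp                                  # innovation
        d2 = float(nu @ np.linalg.inv(S) @ nu)       # squared Mahalanobis distance
        if d2 < best_d2:
            best_idx, best_d2 = j, d2
    return best_idx
```

Gating with S_j rather than Euclidean distance means a candidate with large predicted uncertainty tolerates a larger innovation before being rejected.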

SLIDE 30

Sensor Observation Model

The update step of the SLAM process with a heterogeneous set of features differs per feature type: line features update only the portion of the map containing the robot poses and line features, and plane features likewise. New information enters through the sensor's observation model.

Sensor's observation model (expected measurement of global line feature j = (r_{g,j}, α_{g,j}) from robot pose (x_R, y_R, θ_R)):

    h(x_R, z_{g,j}) = [ r_{e,j} ; α_{e,j} ]
                    = [ r_{g,j} - x_R cos(α_{g,j}) - y_R sin(α_{g,j}) ; α_{g,j} - θ_R ]

Innovation (measured minus expected):

    ν_j = z_{m,j} - z_{e,j} = [ r_{m,j} - r_{e,j} ; α_{m,j} - α_{e,j} ]

Jacobian of the sensor model w.r.t. the robot pose:

    H_R = [ -cos(α_e)   -sin(α_e)    0
             0           0          -1 ]

Jacobian of the sensor model w.r.t. the feature:

    H_l = [ 1   x_R sin(α_e) - y_R cos(α_e)
            0   1 ]

SLIDE 31

Map Update

    S_j = H_j P H_j^T + R_j      (covariance of the innovation)

    R_j = [ σ_r²   σ_rα
            σ_αr   σ_α² ]        (measured feature covariance)

Kalman gain: H_j is nonzero only in the robot and matched-feature columns, so the gain is built from the corresponding block columns of P, yet it spreads the innovation over the whole map through the cross-covariances:

    K_j = [ P_{:,R}  P_{:,l_j} ] · [ H_R^T ; H_l^T ] · S_j^{-1}

Map update (map uncertainty is reduced):

    X ← X + K_j ν_j
    P ← P - K_j S_j K_j^T
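A sketch of the map update, with H already assembled so that only the robot and matched-feature columns are nonzero (the toy two-state numbers in the comments are illustrative):

```python
import numpy as np

def map_update(x, P, nu, H, R):
    """EKF map update: one matched feature corrects the whole map, because
    the gain K = P H^T S^-1 spreads the innovation through the
    cross-covariances stored in P."""
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain over the full state
    x_new = x + K @ nu                  # map update
    P_new = P - K @ S @ K.T             # map uncertainty reduced
    return x_new, P_new
```

With a correlated two-entry map and a measurement of only the first entry, the second entry still moves and its variance still shrinks; this coupling through the off-diagonal blocks is the heart of EKF SLAM.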

SLIDE 32

Inverse sensor Observation Model

Inverse sensor observation model: a newly observed line (r_{l,n+1}, α_{l,n+1}) in the robot frame is lifted into the map frame,

    g(x_R, z_{l,n+1}) = [ r_{g,n+1} ; α_{g,n+1} ]
                      = [ r_{l,n+1} + x_R cos(α_{l,n+1} + θ_R) + y_R sin(α_{l,n+1} + θ_R) ; α_{l,n+1} + θ_R ]

with Jacobians w.r.t. the robot pose and the measurement:

    G_R = [ cos(α_l + θ_R)   sin(α_l + θ_R)   y_R cos(α_l + θ_R) - x_R sin(α_l + θ_R)
            0                0                1 ]

    G_z = [ 1   y_R cos(α_l + θ_R) - x_R sin(α_l + θ_R)
            0   1 ]

Covariance of the new feature (S is the covariance of the measurement; M denotes the existing feature blocks):

    P ← [ P_RR        P_RM        P_RR G_R^T
          P_MR        P_MM        P_MR G_R^T
          G_R P_RR    G_R P_RM    G_R P_RR G_R^T + G_z S G_z^T ]
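The inverse observation model as a small helper; symbols follow the slide: robot pose (x_R, y_R, θ_R), line measured in the robot frame as (r_local, alpha_local):

```python
import numpy as np

def augment_feature(x_r, y_r, theta_r, r_local, alpha_local):
    """Inverse observation model: lift a line feature measured in the
    robot frame (r_local, alpha_local) into the global map frame."""
    alpha_g = alpha_local + theta_r
    r_g = r_local + x_r * np.cos(alpha_g) + y_r * np.sin(alpha_g)
    return r_g, alpha_g
```

Sanity check: a robot at (1, 0) facing along x that sees a wall 2 m ahead (r = 2, α = 0 in its own frame) should map it as the global line x = 3.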

SLIDE 33

Map Management

State vector (map):

    X = [ Robot1 (x y θ)^T ; Robot2 (x y θ)^T ;
          LineFeature_1 (r α x_1 y_1 x_2 y_2)^T ; … ; LineFeature_n (r α x_1 y_1 x_2 y_2)^T ]

Line-segments fusion / overlapped line-segments fusion: to test overlap, project endpoint D onto segment BC using the projection parameter

    BD = (D_x - B_x, D_y - B_y)
    BC = (C_x - B_x, C_y - B_y)
    t_1 = (BD_x · BC_x + BD_y · BC_y) / (BC_x² + BC_y²)
