Humanoid Robotics: 6D Localization for Humanoid Robots, Maren Bennewitz (PowerPoint PPT Presentation)



SLIDE 1

Humanoid Robotics 6D Localization for Humanoid Robots

Maren Bennewitz

SLIDE 2

Motivation

§ To perform useful service, a robot needs to know its pose within the environment model
§ Motion commands are only executed inaccurately
§ Estimate the pose of a robot in a given model of the environment
§ Based on sensor data and odometry measurements
§ Sensor data: depth image or laser range readings

SLIDE 3

Task

§ Estimate the pose of a robot in a given model of the environment
§ Based on its sensor data and odometry measurements

SLIDE 4

Recap: Basic Probability Rules

If x and y are independent: p(x, y) = p(x) p(y)

Bayes' rule: p(x | y) = p(y | x) p(x) / p(y)

Often written as: p(x | y) = η p(y | x) p(x)

The denominator p(y) is a normalizing constant (η = 1/p(y)) that ensures that the posterior on the left-hand side sums up to 1 over all possible values of x.

In the case of background knowledge z, Bayes' rule turns into: p(x | y, z) = p(y | x, z) p(x | z) / p(y | z)

SLIDE 5

Recap: Basic Probability Rules

Law of total probability:
continuous case: p(x) = ∫ p(x | y) p(y) dy
discrete case: p(x) = Σ_y p(x | y) p(y)

SLIDE 6

Markov Assumption

x_t: state of a dynamical system
z_t: observations
u_t: actions

Markov assumption: the state is complete, i.e., z_t depends only on x_t, and x_t depends only on x_{t-1} and u_t.

SLIDE 7

State Estimation

§ Estimate the state x_t given the observations z_{1:t} and odometry measurements/actions u_{1:t}
§ Goal: calculate the distribution p(x_t | z_{1:t}, u_{1:t})
§ Apply the recursive Bayes' filter

SLIDE 8

Recursive Bayes Filter 1

Definition of the belief: bel(x_t) = p(x_t | z_{1:t}, u_{1:t}), conditioned on all data up to time t

SLIDE 9

Recursive Bayes Filter 2

Bayes' rule applied to bel(x_t):
bel(x_t) = p(x_t | z_{1:t}, u_{1:t}) = η p(z_t | x_t, z_{1:t-1}, u_{1:t}) p(x_t | z_{1:t-1}, u_{1:t})

SLIDE 10

Recursive Bayes Filter 3

Markov assumption: p(z_t | x_t, z_{1:t-1}, u_{1:t}) = p(z_t | x_t), giving
bel(x_t) = η p(z_t | x_t) p(x_t | z_{1:t-1}, u_{1:t})

SLIDE 11

Recursive Bayes Filter 4

Law of total probability, integrating over the previous state x_{t-1}:
bel(x_t) = η p(z_t | x_t) ∫ p(x_t | x_{t-1}, z_{1:t-1}, u_{1:t}) p(x_{t-1} | z_{1:t-1}, u_{1:t}) dx_{t-1}

SLIDE 12

Recursive Bayes Filter 5

Markov assumption: p(x_t | x_{t-1}, z_{1:t-1}, u_{1:t}) = p(x_t | x_{t-1}, u_t), giving
bel(x_t) = η p(z_t | x_t) ∫ p(x_t | x_{t-1}, u_t) p(x_{t-1} | z_{1:t-1}, u_{1:t}) dx_{t-1}

SLIDE 13

Recursive Bayes Filter 6

Since u_t carries no information about the earlier state x_{t-1}:
p(x_{t-1} | z_{1:t-1}, u_{1:t}) = p(x_{t-1} | z_{1:t-1}, u_{1:t-1})

SLIDE 14

Recursive Bayes Filter 7

bel(x_t) = η p(z_t | x_t) ∫ p(x_t | x_{t-1}, u_t) bel(x_{t-1}) dx_{t-1}
The term bel(x_{t-1}) inside the integral is the recursive term.

SLIDE 15

Recursive Bayes Filter 7

motion model: p(x_t | x_{t-1}, u_t)
observation model: p(z_t | x_t)
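Putting the derivation together, the filter alternates a prediction with the motion model and a correction with the observation model. A minimal sketch for a discrete state space, with made-up toy models (the function names and the three-cell world below are illustrative, not from the slides):

```python
def bayes_filter_step(belief, u, z, motion_model, observation_model):
    """One prediction + correction step of the discrete recursive Bayes filter.

    belief: dict state -> probability, representing bel(x_{t-1})
    motion_model(x, x_prev, u): p(x | x_prev, u)
    observation_model(z, x): p(z | x)
    """
    states = list(belief.keys())
    # Prediction: law of total probability over the previous state
    predicted = {x: sum(motion_model(x, xp, u) * belief[xp] for xp in states)
                 for x in states}
    # Correction: weight by the observation likelihood, then normalize (eta)
    unnormalized = {x: observation_model(z, x) * predicted[x] for x in states}
    eta = 1.0 / sum(unnormalized.values())
    return {x: eta * p for x, p in unnormalized.items()}
```

For example, in a cyclic three-cell world with a deterministic shift motion and a noisy cell sensor, one step concentrates the initially uniform belief on the observed cell.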
SLIDE 16

Probabilistic Motion Model

§ Robots execute motion commands only inaccurately
§ The motion model specifies the probability p(x_t | x_{t-1}, u_t) that action u_t moves the robot from x_{t-1} to x_t
§ Defined for each robot type individually

SLIDE 17

Observation Model for Range Readings

§ Sensor data z_t consists of individual measurements z_t^k
§ The individual measurements are independent given the robot's pose
§ "How well can the distance measurements be explained given the pose and the map?"

SLIDE 18

Recap Observation Model: Simplest Ray-Cast Model

§ Compare the actually measured distance with the expected distance to the obstacle
§ Consider the first obstacle along the ray in the map
§ Use a Gaussian to evaluate the difference between the measured distance and the distance to the first object in the map
SLIDE 19

Recap Observation Model: Beam-Endpoint Model

Evaluate the distance of the hypothetical beam end point to the closest obstacle in the map with a Gaussian

Courtesy: Thrun, Burgard, Fox
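The beam-endpoint model above reduces to one nearest-obstacle lookup and one Gaussian evaluation per beam. A minimal sketch, using 2D points and an explicit obstacle list as illustrative stand-ins for the 3D map:

```python
import math

def beam_endpoint_likelihood(endpoint, obstacles, sigma):
    """Beam-endpoint model: likelihood of a single range reading, evaluated as
    a zero-mean Gaussian over the distance from the hypothetical beam endpoint
    to the closest obstacle in the map (2D here; the idea is the same in 3D).

    endpoint: (x, y) of the beam endpoint predicted from the particle's pose
    obstacles: list of (x, y) obstacle points standing in for the map
    """
    # Distance to the closest obstacle in the map
    d = min(math.dist(endpoint, obstacle) for obstacle in obstacles)
    # Zero-mean Gaussian over that distance
    return math.exp(-0.5 * (d / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))
```

An endpoint that lands near a mapped obstacle gets a high likelihood; one that lands in free space far from any obstacle gets a low one.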

SLIDE 20

Recap: Particle Filter

§ One implementation of the recursive Bayes' filter
§ Non-parametric framework (not restricted to Gaussians)
§ Arbitrary models can be used as motion and observation models
SLIDE 21

Key Idea: Samples

Use a set of weighted samples to represent arbitrary distributions

SLIDE 22

Particle Set

Set of weighted samples: X_t = {⟨x_t^[i], w_t^[i]⟩}, i = 1, ..., N
x_t^[i]: state hypothesis
w_t^[i]: importance weight

SLIDE 23

Particle Filter

§ The set of weighted particles approximates the belief about the robot's pose
§ Prediction: sample from the motion model (propagate particles forward)
§ Correction: weigh the samples based on the observation model

SLIDE 24

Monte Carlo Localization

§ Each particle is a pose hypothesis § Prediction: For each particle, sample a

new pose from the the motion model

§ Correction: Weigh samples according to

the observation model

§ Resampling: Draw sample with

probability and repeat times ( =#particles)
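The steps above can be sketched as a single update function; `sample_motion` and `likelihood` are placeholders for the robot-specific motion model and the sensor-specific observation model:

```python
import random

def mcl_step(particles, u, z, sample_motion, likelihood):
    """One Monte Carlo localization step over a list of pose hypotheses.

    sample_motion(x, u): draws x' ~ p(x' | x, u)  (assumed, robot-specific)
    likelihood(z, x): observation model p(z | x)  (assumed, sensor-specific)
    """
    # Prediction: sample a new pose for each particle from the motion model
    moved = [sample_motion(x, u) for x in particles]
    # Correction: weigh each particle by the observation model
    weights = [likelihood(z, x) for x in moved]
    # Resampling: draw N particles with probability proportional to the weights
    return random.choices(moved, weights=weights, k=len(moved))
```

With a likelihood sharply peaked around the true pose, one step already concentrates the particle set near it.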

SLIDE 25

MCL – Correction Step

Image Courtesy: S. Thrun, W. Burgard, D. Fox

SLIDE 26

MCL – Resampling & Prediction

Image Courtesy: S. Thrun, W. Burgard, D. Fox

SLIDE 27

MCL – Correction Step

Image Courtesy: S. Thrun, W. Burgard, D. Fox

SLIDE 28

MCL – Resampling & Prediction

Image Courtesy: S. Thrun, W. Burgard, D. Fox

SLIDE 29

Summary – Particle Filters

§ Particle filters are non-parametric, recursive Bayes filters
§ The belief about the state is represented by a set of weighted samples
§ The motion model is used to draw the samples for the next time step
§ The weights of the particles are computed using the observation model
§ Resampling is carried out based on the weights
§ Also called: Monte Carlo localization (MCL)

SLIDE 30

Localization for Humanoids

3D environments require a 6D pose estimate: the 2D position, the height, and the yaw, pitch, and roll angles
Estimate the 6D torso pose

SLIDE 31

Localization for Humanoids

§ Recursively estimate the belief about the robot's pose using MCL
§ The probability distribution is represented by pose hypotheses (particles)
§ Needed: motion model and observation model

SLIDE 32

Motion Model

§ The odometry estimate u_t corresponds to the incremental motion of the torso
§ u_t is computed by forward kinematics (FK) from the current stance foot while walking

SLIDE 33

Kinematic Walking Odometry

Frames: F_rfoot (frame of the current stance foot), F_torso (frame of the torso), F_odom (odometry frame)
The transform from F_rfoot to F_torso can be computed with FK over the right leg
Keep track of the transform to the current stance foot

SLIDE 34

Kinematic Walking Odometry

Both feet remain on the ground: compute the transform from F_rfoot to the frame of the left foot, F_lfoot, with FK

SLIDE 35

Kinematic Walking Odometry

The left leg becomes the stance leg, and F_lfoot is the new reference frame to compute the transform to the torso frame

SLIDE 36

Kinematic Walking Odometry

§ Using FK, the poses of all joints and sensors can be computed relative to the stance foot at each time step
§ The transform from the odometry frame to the stance foot is updated whenever the swing foot becomes the new stance foot
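The bookkeeping above can be sketched with plain pose composition; as a simplifying assumption, poses are 2D (x, y, theta) tuples standing in for the full 6D transforms, and the FK results are passed in as given:

```python
import math

def compose(a, b):
    """Compose two 2D poses a ⊕ b, each given as an (x, y, theta) tuple
    (a 2D stand-in for the 6D frame transforms on the slides)."""
    ax, ay, ath = a
    bx, by, bth = b
    return (ax + math.cos(ath) * bx - math.sin(ath) * by,
            ay + math.sin(ath) * bx + math.cos(ath) * by,
            ath + bth)

def torso_in_odom(odom_to_stance, stance_to_torso):
    # T_odom_torso = T_odom_stance ⊕ T_stance_torso (the latter from FK)
    return compose(odom_to_stance, stance_to_torso)

def support_exchange(odom_to_stance, stance_to_new_stance):
    # Swing foot becomes the new stance foot: update the odometry-to-stance
    # transform via the FK transform between the two feet
    return compose(odom_to_stance, stance_to_new_stance)
```

Chaining `support_exchange` at every step keeps the torso pose in the odometry frame consistent across changes of the stance foot.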

SLIDE 37

Odometry Estimate

Odometry estimate u_t computed from two consecutive torso poses

SLIDE 38

Odometry Estimate

§ The incremental motion u_t of the torso is computed from the kinematics of the legs
§ Typical error sources: slippage on the ground and backlash in the joints
§ Accordingly, we have only noisy odometry estimates while walking, and the drift accumulates over time
§ The particle filter has to account for that noise within the motion model

SLIDE 39

Motion Model

§ Noise modelled as a Gaussian with a systematic drift on the 2D plane
§ The prediction step samples a new pose for each particle according to
x_t^[i] = x_{t-1}^[i] ⊕ (M u_t + ε), with ε ~ N(0, Σ),
where Σ is the covariance matrix, M the calibration matrix, and ⊕ the motion composition operator
§ Σ and M are learned with least squares optimization using ground truth data (as in Ch. 2)

SLIDE 40

Sampling from the Motion Model

We need to draw samples from a Gaussian

SLIDE 41

How to Sample from a Gaussian

§ Drawing from a 1D Gaussian can be done in closed form

Example with 10^6 samples

SLIDE 42

How to Sample from a Gaussian

§ Drawing from a 1D Gaussian can be done in closed form
§ If we consider the individual dimensions of the motion as independent, we can draw each dimension using a 1D Gaussian

SLIDE 43

Sampling from the Odometry

§ We sample an individual motion u_t^[i] for each particle
§ We then incorporate the sampled motion into the pose of particle i
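A sketch of this per-particle sampling, drawing each dimension from an independent 1D Gaussian; the function names and the 2D pose composition are illustrative stand-ins for the calibrated 6D motion model:

```python
import math
import random

def sample_motion(u, sigmas):
    """Draw a noisy motion (dx, dy, dtheta): each dimension is sampled
    independently from a 1D Gaussian centered on the odometry estimate u."""
    return tuple(random.gauss(mean, sigma) for mean, sigma in zip(u, sigmas))

def apply_motion(pose, motion):
    """Incorporate a sampled motion into a particle pose (2D composition)."""
    x, y, th = pose
    dx, dy, dth = motion
    return (x + math.cos(th) * dx - math.sin(th) * dy,
            y + math.sin(th) * dx + math.cos(th) * dy,
            th + dth)

def predict(particles, u, sigmas):
    """Prediction step: one individually sampled motion per particle."""
    return [apply_motion(p, sample_motion(u, sigmas)) for p in particles]
```

Because every particle receives its own noise draw, the particle cloud spreads out exactly as the motion uncertainty dictates.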

SLIDE 44

Motion Model

Figure: odometry (uncalibrated) vs. ground truth, with the particle distribution according to the uncalibrated motion model
§ Resulting particle distribution (2000 particles)
§ Nao robot walking straight for 40 cm

SLIDE 45

Motion Model

Figure: odometry (uncalibrated) vs. ground truth
Particle distribution according to the uncalibrated vs. the calibrated motion model: the calibrated model properly captures the drift and is closer to the ground truth

SLIDE 46

Observation Model

§ The observation z_t consists of independent components:
§ range measurements,
§ the torso height (computed from the values of the joint encoders),
§ and roll and pitch measurements (from the IMU)
§ Observation model: p(z_t | x_t) = p(range | x_t) · p(height | x_t) · p(roll | x_t) · p(pitch | x_t)

SLIDE 47

Observation Model

SLIDE 48

Observation Model

§ Range data: Ray-casting or endpoint model in 3D map

SLIDE 49

Observation Model

§ Range data: ray-casting or endpoint model in a 3D map
§ Torso height: compare the measured value from kinematics to the predicted height (according to the motion model)
§ IMU data: compare the measured roll and pitch to the predicted angles
§ Use individual Gaussians to evaluate the differences

SLIDE 50

Likelihoods of Measurements

Each likelihood (height, roll, pitch) is a Gaussian distribution over the difference between the measured and the predicted value
The standard deviations correspond to the noise characteristics of the joint encoders and the IMU
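A sketch of how the individual Gaussians combine into one observation likelihood; the dictionary keys and standard deviations below are illustrative, and the range term is omitted for brevity:

```python
import math

def gaussian(diff, sigma):
    """Zero-mean Gaussian likelihood of a measured-minus-predicted difference."""
    return math.exp(-0.5 * (diff / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def observation_likelihood(measured, predicted, sigmas):
    """p(z | x) as a product of independent Gaussians over the differences in
    height, roll, and pitch; a range term would be multiplied in the same way.

    measured/predicted/sigmas: dicts keyed by 'height', 'roll', 'pitch'
    (sigmas reflect the noise characteristics of joint encoders and IMU)."""
    p = 1.0
    for key in ('height', 'roll', 'pitch'):
        p *= gaussian(measured[key] - predicted[key], sigmas[key])
    return p
```

A particle whose predicted height and attitude match the measurements receives a higher weight than one that disagrees with them.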

SLIDE 51

SLIDE 52

Localization Evaluation

§ Trajectory of 5 m, 10 runs each
§ Ground truth from an external motion capture system
§ Ray-casting results in a significantly smaller error
§ The calibrated motion model requires fewer particles

SLIDE 53

Trajectory and Error Over Time

2D Laser Range Finder

SLIDE 54

Comparison Laser – Depth Camera

SLIDE 55

Summary

§ Estimation of a humanoid's 6D torso pose in a given 3D model
§ Motion estimate from kinematic walking odometry
§ Sample a motion for each particle individually using the motion model
§ Use individual Gaussians in the observation model to evaluate the differences between measured and expected values
§ The particle filter allows us to both locally track and globally estimate the robot's pose

SLIDE 56

Literature

§ Book: Probabilistic Robotics, S. Thrun, W. Burgard, and D. Fox
§ Chapter 3 of Humanoid Robot Navigation in Complex Indoor Environments, Armin Hornung, PhD thesis, Univ. Freiburg, 2014

SLIDE 57

Acknowledgment

§ Previous versions of parts of these slides were created by Wolfram Burgard, Armin Hornung, and Cyrill Stachniss