Acoustic/Lidar Sensor Fusion for Car Tracking in City Traffic Scenarios

Hamma Tadjine, Daniel Goehring
IAV GmbH / Freie Universität Berlin, 08. September 2015



Motivation

  • Direction-based object detection: what is possible with cost-efficient microphone arrays, e.g. from the Kinect?
  • Fusion of multiple non-synchronized Kinect audio sensors and evaluation against data from Lidar sensors
  • Application of the solution in real-world traffic scenarios

Contribution

  • Main components:

– audio-based detection of objects for a single Kinect microphone array
– creation of a representation for the belief distribution of object directions
– combination of the belief distributions of two Kinect microphone arrays
– implementation on a real autonomous car using the OROCOS framework
– synchronization and evaluation of the algorithm with Lidar point clouds from Ibeo Lux sensors


Test platform

  • Vehicle: VW Passat Variant, modified by VW
  • Drive- and Steer-by-Wire, CAN
  • Positioning system: Applanix POS LV 510

– IMU, odometer, correction data via UMTS

  • Camera systems:

– 4 wide-angle cameras
– 2 INKA cameras (Hella Aglaia)
– 2 Guppy cameras for traffic light detection
– Continental lane detection

  • Laser scanner:

– IBEO Lux 6-Fusion System
– 3D laser scanner: Velodyne HDL-64E

  • Radar systems:

– 2 short-range (BSD, 24 GHz)
– 4 long-range (ACC, 77 GHz)
– 1 SMS (24 GHz)


Kinect sensor (Schematic)

  • 4 microphones; only the left and right outer microphones were used in our approach (gray circles)
  • outer microphone distance: approx. 22 cm


Signal shift calculation via cross-correlation

  • For continuous signals f and g, the cross-correlation is (f ⋆ g)(τ) = ∫ f(t) · g(t + τ) dt
  • For discrete signals f and g, it is (f ⋆ g)[n] = Σ_m f[m] · g[m + n]
  • We are interested in the delay n between the two discrete microphone signals, i.e. the shift n that maximizes (f ⋆ g)[n]
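A minimal sketch of this shift estimation in Python (NumPy's `correlate` stands in for the discrete sum; the function name and synthetic signals are ours, not from the slides):

```python
import numpy as np

def estimate_delay(f, g):
    """Delay of signal g relative to f, in samples, taken as the
    lag that maximizes the discrete cross-correlation."""
    corr = np.correlate(g, f, mode="full")   # correlation at every lag
    lags = np.arange(-(len(f) - 1), len(g))  # lag value for each entry
    return int(lags[np.argmax(corr)])

# Synthetic check: a tone delayed by 5 samples.
f = np.sin(0.2 * np.arange(200))
g = np.concatenate([np.zeros(5), f])[:200]   # f shifted right by 5 samples
print(estimate_delay(f, g))                  # → 5
```

With real microphone data one would run this over short windows, since the delay changes as the sound source moves.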


Time delay between two microphones

  • the two microphones provide audio signals with a sampling rate of 16.8 kHz
  • for a measured shift of n samples, the time difference between the two microphone signals is n / 16.8 kHz, which translates, given the speed of sound (340 meters per second), into a distance difference of n · 340 m/s / 16.8 kHz ≈ n · 2 cm


Time delay between two microphones (contd.)

  • the two microphones have a distance of 0.22 meters (base distance)
  • given the base distance and the signal shift, for an assumed distance to the object (far away, e.g. 25 m) we have a defined triangle
  • we can calculate the angle to the object w.r.t. the symmetry axis
  • on a plane, two solutions remain
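Under this far-field assumption the shift fixes the path-length difference, and the angle to the symmetry axis follows from the triangle. A sketch with the constants from the slides (the function name is ours; the two mirrored plane solutions are not resolved here):

```python
import numpy as np

SAMPLE_RATE = 16800.0    # Hz, Kinect audio
SPEED_OF_SOUND = 340.0   # m/s
BASE = 0.22              # m, outer-microphone distance

def shift_to_angle(shift_samples):
    """Far-field angle (radians) between the incoming sound direction
    and the array's symmetry axis for a given cross-correlation shift."""
    delta_d = shift_samples * SPEED_OF_SOUND / SAMPLE_RATE  # path difference, m
    # clip: at the extreme shifts |delta_d| can slightly exceed BASE
    return np.arcsin(np.clip(delta_d / BASE, -1.0, 1.0))

print(np.degrees(shift_to_angle(0)))   # broadside → 0.0
print(np.degrees(shift_to_angle(5)))   # ≈ 27.4 degrees
```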

Distribution of possible angles to object

  • the sampling frequency is limited to 16.8 kHz
  • for a given base distance of the two microphones (0.22 m) and a given signal shift, we can have only 2 · 0.22 m · 16.8 kHz / (340 m/s) ≈ 22 possible discrete outcomes for angular directions → approx. 46 different angular segments (with symmetry)
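The arithmetic behind these counts, spelled out with the values from the slides:

```python
SAMPLE_RATE = 16800.0   # Hz
SPEED_OF_SOUND = 340.0  # m/s
BASE = 0.22             # m, microphone base distance

# distance difference covered by one sample of signal shift
meters_per_sample = SPEED_OF_SOUND / SAMPLE_RATE
print(round(meters_per_sample, 4))   # → 0.0202 (about 2 cm per sample)

# number of integer shifts whose path difference fits in [-BASE, +BASE]
n_outcomes = 2 * BASE / meters_per_sample
print(round(n_outcomes))             # → 22

# mirroring each direction about the microphone axis roughly doubles
# this, giving the ~46 angular segments quoted in the slides
```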


Angular segment distribution

  • the different segments (46) cover different angular intervals
  • each segment can be interpreted as a belief cell for an object in an angular direction interval
  • the radius of each segment represents the cross-correlation value (belief)


Combination of Kinect sensors

  • Symmetry disambiguation on a plane can be achieved with two Kinects (each with 2 microphones)
  • Both devices are rotated by 90 degrees towards each other
  • Kinect 1 can distinguish between left and right, but not between front and rear
  • Kinect 2 can distinguish between front and rear, but not between left and right


Combination of Kinect sensors

  • Symmetry disambiguation on a plane can be achieved with two Kinect microphone pairs, oriented by 90 degrees towards each other
  • For fusion, we subsampled the two non-equally spaced histograms into two equally spaced histograms
  • The value of each non-uniform belief cell is assigned to (split between) the uniform belief cells it fully or partially covers
  • Combination of both Kinect belief distributions via cell-wise multiplication
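A sketch of this resampling-and-fusion step (the edge arrays and the 64-cell target are illustrative; rotating the second Kinect's histogram into the common frame is assumed to have happened beforehand):

```python
import numpy as np

TWO_PI = 2.0 * np.pi

def to_uniform(edges, values, n_uniform=64):
    """Resample a non-uniform angular belief histogram (cell edges in
    radians over [0, 2*pi)) onto n_uniform equally spaced cells; each
    non-uniform cell's value is split among the uniform cells it
    covers, proportional to the overlap."""
    uni = np.zeros(n_uniform)
    width = TWO_PI / n_uniform
    for lo, hi, v in zip(edges[:-1], edges[1:], values):
        for k in range(n_uniform):
            overlap = max(0.0, min(hi, (k + 1) * width) - max(lo, k * width))
            if overlap > 0.0:
                uni[k] += v * overlap / (hi - lo)
    return uni

def fuse(belief_a, belief_b):
    """Cell-wise multiplication of two uniform belief histograms,
    renormalized so the cells sum to 1."""
    fused = belief_a * belief_b
    return fused / fused.sum()

# Toy example: a coarse two-cell histogram resampled onto 4 cells;
# the lower semicircle's belief is split over the two cells it covers.
print(to_uniform([0.0, np.pi, TWO_PI], [1.0, 0.0], n_uniform=4))
```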


Subsampling of belief cells and fusion for two Kinect sensors

  • 46 non-uniform segments are resampled into 64 uniform segments
  • Kinect 1: orientation forward
  • Kinect 2: orientation sideways


Traffic Example

Kinect facing to the front; the length of each (non-uniform) angular segment represents the angular belief. Passing car, Lidar data.


Traffic Example, Step by Step

Kinect facing to the front: front/rear symmetry (yellow axis). Kinect facing to the side: left/right symmetry (orange axis).


Traffic Example, Step by Step

non-uniform segments resampled into uniform segments


Traffic Example, Step by Step

non-uniform segments resampled into uniform segments; after data fusion, no symmetries remain


Object Direction calculation

  • What is the angle to the object?

– After fusion, the angular segment with the highest value wins (maximum likelihood)
– Drawback: only one direction possible
– For multiple objects, it would be possible to search for multiple large angular segments (not too close to each other)
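The winner-take-all step, plus the suggested multi-object extension as a greedy peak search (the cell count and minimum-separation parameter are illustrative, not from the slides):

```python
import numpy as np

def best_direction(fused):
    """Maximum-likelihood direction: the center angle (radians) of the
    uniform segment with the highest fused belief."""
    k = int(np.argmax(fused))
    return (k + 0.5) * 2.0 * np.pi / len(fused)

def top_directions(fused, n_peaks=2, min_sep=4):
    """Greedy variant for multiple sources: repeatedly pick the largest
    cell, then suppress all cells within min_sep cells of it, so the
    next peak is 'not too close' to the previous one."""
    belief = np.asarray(fused, dtype=float).copy()
    n = len(belief)
    peaks = []
    for _ in range(n_peaks):
        k = int(np.argmax(belief))
        peaks.append((k + 0.5) * 2.0 * np.pi / n)
        for d in range(-min_sep, min_sep + 1):
            belief[(k + d) % n] = -np.inf   # circular suppression window
    return peaks
```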

Experimental setup

  • The approach was tested in our autonomous car in a real traffic situation
  • driving the car created too much wind noise → the car was parked on the side of the road and passing vehicles were detected

Experimental evaluation

  • A Lidar setup from Ibeo (6 Lux scanners) was used to evaluate the accuracy of the sound source localization
  • Idea: compare the angle calculated from audio data with the closest angle of a moving obstacle from the Lidar
  • Lidar objects were clustered and tracked from point cloud data


Demo: Video 1 and 2


Experimental results

Angular error over time w.r.t. Lidar data; angular error standard deviation: 10.3 degrees


Experimental results (contd.)

Angular error over distances w.r.t. Lidar data


Experimental results (contd.)

Angular error over distances


Experimental results (contd.)

  • for close objects it is usually hard to tell the exact angle due to their size – therefore the error for more distant objects was often smaller than for close ones
  • other inaccuracies were caused by sound reflections off houses and trees close to the street
  • errors caused by the limited speed of sound in combination with high vehicle velocities could not be measured → city traffic, 50 km/h


Conclusion

  • we presented an approach to calculating angles to objects using acoustic data from 2 Kinect microphone pairs
  • we showed how data from two non-synchronized devices can be combined using subsampled, uniformly spaced angular interval segments
  • detected angles were assigned and compared to real-world Lidar data
  • the approach was implemented on a real autonomous car using a modular robotics framework (OROCOS) and tested in a real-world traffic situation


Future Work

  • Current challenge: wind noise while driving
  • How to keep track of multiple sources
  • How to handle sound reflections, e.g. from buildings, trees, etc.


Future Work (contd.)

  • How can the signal intensity be used for distance estimation?
  • Band-pass filters / FFT can help to select specific signals, e.g. emergency vehicles with characteristic signal-horn frequencies and signal patterns – furthermore, try to detect the alternation between two frequencies
  • Use tracking in combination with the Doppler effect to estimate velocities while a vehicle is passing (change in frequency)
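For the Doppler idea, the passing speed can in principle be recovered from the frequencies heard before and after the pass, without knowing the emitted frequency; a sketch under the simplifying assumptions of a stationary observer and motion along the line of sight (function name and example values are ours):

```python
SPEED_OF_SOUND = 340.0  # m/s

def speed_from_doppler(f_approach, f_recede):
    """Source speed from the observed frequencies while the vehicle
    approaches (f_a = f0*c/(c - v)) and recedes (f_r = f0*c/(c + v));
    the unknown emitted frequency f0 cancels out of the ratio."""
    return SPEED_OF_SOUND * (f_approach - f_recede) / (f_approach + f_recede)

# Example: a car at 50 km/h (≈ 13.9 m/s) emitting an (unknown) 440 Hz tone.
c, v, f0 = SPEED_OF_SOUND, 13.9, 440.0
print(speed_from_doppler(f0 * c / (c - v), f0 * c / (c + v)))  # ≈ 13.9
```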