ME400 Intelligent Vehicles and Intelligent Transportation Systems (ITS)
Week 4: Vehicle Perception and Map Building
Denis Gingras, January 2015
D Gingras – ME470 IV course, CalPoly

Course outline

• Week 1: Introduction to intelligent vehicles, context, applications and motivations
• Week 2: Vehicle dynamics and vehicle modelling
• Week 3: Positioning and navigation systems and sensors
• Week 4: Vehicular perception and map building
• Week 5: Multi-sensor data fusion techniques
• Week 6: Object detection, recognition and tracking
• Week 7: ADAS systems and vehicular control
• Week 8: VANETs and connected vehicles
• Week 9: Multi-vehicular scenarios and collaborative architectures
• Week 10: The future: toward autonomous vehicles and automated driving

(Final exam)


Week 4 outline

• Brainstorming and introduction
• Context and importance of vehicle perception and map building
• Basic architectures of perception and map-building systems
• Scene analysis
• Range sensors: radar, lidar, sonar
• Vision-based systems: cameras; mono vision (2D scene analysis); stereo vision (3D scene analysis); night vision
• Landmarks in automated guided vehicle applications

Brainstorming: open questions and introductory discussion

• What is perception?
• What are the two main purposes of perception in the context of intelligent vehicles?
• Does perception provide a unique interpretation?
• Name a few meaningful structures we want to extract from the vehicle environment.
• Name a few sensors for vehicular perception.
• Name a few intelligent vehicle applications that use perception sensors and data.
• What kinds of objects in the road infrastructure are we interested in?
• What physical quantities can we use to detect objects in the perception process?
• Which frequency bands are typically used in vehicular perception?


Introduction

• We can ask the following question: "Given the sensory reading I am getting, what was the world like to make the sensor give me this reading?"
• This is what is done in computer vision, for example, where:
  • the sensor (a camera) provides a great deal of information (for example, 512 × 512 = 262,144 pixels of black & white, gray levels, or color), and
  • we need to compute what those pixels correspond to in the real world (a vehicle? a pedestrian?).

• Sensors do not provide state or symbols, just signals.
• A great deal of computation may be required to convert the signal from a sensor into useful state information for the vehicle.
• This process bridges the areas of electronics, signal processing, and computation.
• Sensory data processing is challenging and can be computationally intensive and time consuming.
• This means the intelligent vehicle needs a "brain" to do this processing.


Modern approach to perception

• Perception in the context of action and the task at hand: action-oriented perception.
• Expectation-based perception uses a priori knowledge about the world as constraints on sensor interpretation.
• Focus-of-attention methods provide constraints on where to look.
• Perceptual classes partition the world into useful categories.

Local perception and its use in intelligent vehicle applications



Environment representation

Source: Jan Becker, ME Dept., Stanford University, 2014



Source: Moras J et al., A lidar Perception Scheme for Intelligent Vehicle Navigation. 11th Int. Conference on Control, Automation, Robotics and Vision, Dec 2010, Singapour.


Sensing obstacles: obstacles on the road

• Obstacle detection is much more difficult than vehicle detection: obstacles can be small, non-metallic, and much harder to see.
• Obstacles can be stationary or moving (e.g., a deer running across the road).
• For a passenger car at highway speeds, obstacles need to be detected 100 m ahead. For trucks, the distance is even longer.
• Obstacle detection is one of the most challenging tasks for an intelligent vehicle.
• State DOTs report cleaning up construction debris, fuel spills, car parts, tire carcasses, and so forth.
• State highway patrols receive reports of washing machines, other home appliances, ladders, pallets, deer, etc.
• A survey commissioned by a company that builds litter-retrieval machines reports 185 million pieces of litter per week.
• Rural states report that up to 35% of all rural crashes involve animals, mostly deer but also moose and elk as well as farm animals.
• A non-scientific survey indicates that people have hit tire carcasses, mufflers, deer, dogs, even a toilet!


General perception process

This figure shows roughly how raw energy detected from the environment is converted into a situational understanding of the world around the vehicle. It also shows where noise and errors can be introduced into the system, and the modeling assumptions that specify how the information is altered at each step. Categories of errors marked with * are generally referred to as artifacts.

General perception architecture

This figure shows the architecture of a typical perception system. It is divided according to the world model into three subsystems: 1) a road estimation subsystem, which generates information about the road structure; 2) a tracking subsystem, which is responsible for generating dynamic obstacle hypotheses; and 3) a static obstacle estimation subsystem, which estimates the location of static obstacles and builds the local static map.

Source (both figures): Darms M. et al., Obstacle Detection and Tracking for the Urban Challenge, IEEE Transactions on ITS, Vol. 10, No. 3, Sept. 2009


Example: a perception system architecture for the DARPA Urban Challenge

Source: John Leonard et al., A Perception-Driven Autonomous Urban Vehicle, Journal of Field Robotics, 1–48 (2008), Wiley Periodicals, Inc.

Sensor features evolution



Sensors using wave reflection.

Figure: directed reflection, bad reflection, diffuse reflection.

Reflectivity of selected objects typically found in vehicular perception
(a: depends heavily on the angle; R: through retro-reflectors)

Source: Jan Becker, ME Dept., Stanford University, 2014


Radar

Radar: RAdio Detection And Ranging

Definition: Radar is an object detection system that uses electromagnetic waves to identify the range, altitude, direction, or speed of both moving and fixed objects. By illuminating a portion of space with an EM wave, we detect and analyze the reflected waves to extract characteristics of the reflecting object.

Principle:
• A transmitter emits radio waves.
• When hitting an object, the radar waves are scattered in all directions.
• The signal is thus partly reflected back, with a slight change of wavelength (and thus frequency) if the target is moving.
• A receiver is usually (but not always) in the same location as the transmitter. Although the returned signal is usually very weak, it can be amplified through electronic techniques in the receiver and in the antenna configuration.

RADAR wavelengths



Radar brief history

• First device invented and patented in Germany in 1904 (demonstrated over 3 km in Cologne): http://upload.wikimedia.org/wikipedia/commons/1/11/DE165546.pdf
• 1917: Nikola Tesla first established principles regarding frequency and power level for the first primitive radar units: "[...] by their [standing electromagnetic waves] use we may produce at will, from a sending station, an electrical effect in any particular region of the globe; [with which] we may determine the relative position or course of a moving object, such as a vessel at sea, the distance traversed by the same, or its speed."
• Further development for anti-aircraft purposes during WW II.
• First attempts at vehicle distance radar in the USA (1960s) and Germany (1970s, 35 GHz).
• 1980s: European publicly funded projects drive major radar development in Europe.
• 1999: first automotive radar systems, made by A.D.C. (Mercedes S-Class) and Bosch (BMW).

Based on waveform, we encounter two types of radars:

• Continuous Wave (CW) radar: emits electromagnetic energy continuously in time. Non-modulated CW radars can provide reliable radial relative speed (Doppler) of the targets as well as their angular position. Range information, however, cannot be obtained without some sort of modulation, usually FM (FMCW).
• Pulsed radar (PR): emits energy pulses during very short time intervals. Pulsed radars come in different forms according to Pulse Repetition Frequency (PRF): low-PRF, medium-PRF, and high-PRF radars. Low-PRF radars are mainly used to estimate range when the obstacles are not in motion and the Doppler shift is not considered.


Radar equation

$$P_r = P_e \cdot \frac{G}{4\pi d^2} \cdot \sigma \cdot \frac{1}{4\pi d^2} \cdot \frac{G\,\lambda^2}{4\pi} = \frac{P_e\, G^2 \lambda^2 \sigma}{(4\pi)^3\, d^4}$$

where:
• $P_e$ = emitted power, $P_r$ = received power
• $\sigma$ = scattering cross section of the target
• $\lambda$ = wavelength
• $G$ = gain of the sending/receiving antenna (depending on angle)
• $d$ = distance to the target
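As a quick numeric check of the radar equation, the sketch below illustrates the steep 1/d⁴ range dependence of the received power; the gain and cross-section values are made up for illustration, not taken from any real sensor.

```python
import math

# Radar equation: Pr = Pe * G^2 * lambda^2 * sigma / ((4*pi)^3 * d^4).
# Hypothetical values: 77 GHz (lambda ~ 3.9 mm), G = 1000, sigma = 10 m^2.
def received_power(pe_w: float, gain: float, wavelength_m: float,
                   sigma_m2: float, d_m: float) -> float:
    return pe_w * gain**2 * wavelength_m**2 * sigma_m2 / ((4 * math.pi)**3 * d_m**4)

p50 = received_power(1.0, 1000.0, 3.9e-3, 10.0, 50.0)    # target at 50 m
p100 = received_power(1.0, 1000.0, 3.9e-3, 10.0, 100.0)  # target at 100 m
print(p100 / p50)  # 0.0625: doubling the range divides the echo power by 16
```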

Radar distance measurement: measurement of transit time

$$t_{of} = \frac{2d}{c} \quad\Longleftrightarrow\quad d = \frac{c\, t_{of}}{2}$$

where $t_{of}$ = time of flight and $c$ = propagation velocity (radar ≈ 300,000 km/s for electromagnetic waves; sound ≈ 340 m/s in air).

Example:
• d = 150 m → tof,radar ≈ 1 µs (tof,sound ≈ 0.9 s)
• d = 0.15 m → tof,radar ≈ 1 ns (tof,sound ≈ 0.9 ms)
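The transit-time relation is a one-liner; a minimal sketch reproducing the slide's 150 m example:

```python
# Radar time of flight: d = c * tof / 2 (factor 2 for the round trip).
C = 3.0e8  # propagation speed of EM waves, m/s

def distance_from_tof(tof_s: float) -> float:
    """Range in metres from a measured round-trip time of flight."""
    return C * tof_s / 2.0

def tof_from_distance(d_m: float) -> float:
    """Expected round-trip time of flight for a target at range d."""
    return 2.0 * d_m / C

print(tof_from_distance(150.0))  # 1e-06: a 150 m target echoes after ~1 us
print(distance_from_tof(1e-9))   # 0.15: a 1 ns echo means a 15 cm target
```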


Doppler effect: the phase of the return wave changes; a change in phase over time leads to a change in frequency:

$$\Delta f = \frac{2\, v_{rel}}{c}\, f_0$$

Example: f0 = 76.5 GHz (base/carrier frequency), Δvrel = 0.25 m/s → Δf ≈ 127 Hz.
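The Doppler example above can be reproduced directly from Δf = 2·v_rel·f0/c:

```python
# Doppler shift of a radar echo: delta_f = 2 * v_rel / c * f0.
C = 3.0e8  # speed of light, m/s

def doppler_shift(v_rel_mps: float, f0_hz: float) -> float:
    """Frequency shift (Hz) of the echo from a target with radial speed v_rel."""
    return 2.0 * v_rel_mps / C * f0_hz

# Slide example: 76.5 GHz carrier, 0.25 m/s relative speed.
print(doppler_shift(0.25, 76.5e9))  # 127.5 Hz (the slide rounds to 127 Hz)
```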

Pulse-Echo-Doppler radar

• Very short radar pulses; objects reflect the pulse.
• Direct distance measurement through time of flight.
• An oscillator (e.g. 24 GHz) drives the sending and receiving paths.
• Sending path: the pulse is modulated, then sent.
• Receiving path:
  • a delay generates a reference signal;
  • the received pulse echo is mixed with the oscillator signal (coherent signal); coherence means the phase of the sent signal is contained in the reference signal;
  • a Doppler filter extracts the frequency shift.
• Typical measurement range is 20–50 m (depending on the size and reflection characteristics of the object).
• Measurement accuracy is high (~1 cm).
• Ability to distinguish between objects: good, 1–2 m.


Radar angular measurement

In which lanes are the detected vehicles?

FMCW measurement principle

• Indirect distance (time of flight) measurement: the frequency difference between the sent and received signals is used instead of a direct time-of-flight measurement.
• Linear variation of the sending frequency:
  • a Voltage Controlled Oscillator (VCO) varies the sending frequency fs with slope m = df/dt;
  • the return signal is received after tof = 2d/c;
  • during this time the sending frequency has changed by fD = tof · m;
  • thus the frequency difference can be used to measure tof, giving the distance d.
• The frequency difference is determined through a mixer and low-pass filtering:
  • the signal is digitized and transformed to a spectrum by the Fast Fourier Transform (FFT);
  • a peak in the spectrum at fD corresponds to an object at distance d = fD · c / (2m).
• The modulation bandwidth determines the degree of separation:
  • e.g., a modulation hub of 100 MHz results in a line spacing of Δd = 1.5 m;
  • the resolution (separation) is approx. 4–5 m.
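The spectrum-peak relation d = fD·c/(2m) can be sketched as follows; the sweep and beat-frequency numbers are hypothetical, chosen only to illustrate the arithmetic:

```python
# FMCW range from the beat frequency: d = f_beat * c / (2 * m),
# where m = df/dt is the chirp slope of the linear frequency ramp.
C = 3.0e8  # m/s

def fmcw_range(f_beat_hz: float, sweep_bw_hz: float, sweep_time_s: float) -> float:
    m = sweep_bw_hz / sweep_time_s  # chirp slope df/dt
    return f_beat_hz * C / (2.0 * m)

# Hypothetical: 200 MHz sweep over 1 ms, 100 kHz measured beat frequency.
print(fmcw_range(100e3, 200e6, 1e-3))  # 75.0 m
```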


FMCW measurement principle

From the frequency differences measured in the rising slope (Δf1) and the falling slope (Δf2):

$$d = \frac{c\,T_m}{2 f_h}\cdot\frac{\Delta f_1 + \Delta f_2}{2}, \qquad v_{rel} = \frac{c}{2 f_0}\cdot\frac{\Delta f_2 - \Delta f_1}{2}$$

where $T_m$ is the modulation period, $f_h$ the frequency hub (sweep bandwidth) and $f_0$ the carrier frequency.

Modulation:
• FMCW uses a linear modulation; the frequency hub of the modulation is e.g. 200 MHz for a carrier frequency of 76.5 GHz.
• An object with the same velocity (vrel = 0) reflects the signal; the reflected signal is received with a time delay T = 2d/c.
• This results in a frequency difference: a lower frequency in the rising slope, a higher frequency in the falling slope.
• The frequency difference is a direct measure of the distance.


Modulation  Approaching object (vrel < 0) reflects the signal  An approaching object results in a positive shift ∆fD in the rising and falling slope.  Frequency difference in the rising slope ∆f1  Frequency difference in the falling slope ∆f2  Addition of ∆f1 and ∆f2 is a measure for the distance  Subtraction of ∆f1 and ∆f2 is a measure for the relative velocity.


CWFM Automotive Radar Block Diagram example

Source: Altera, Implementing Digital Processing for Automotive Radar Using SoCs, White paper, WP-01183-1.3.


Example of commercial radars and vehicle integration

Continental ARS 300 long-range radar, 77 GHz (source: Continental); Bosch LRR3, 2009 (source: Bosch).

Using landmarks: radar-reflective surfaces

• Collision-avoidance radar can be used for lateral control with modified lane-marking tape.
• Frequency-dependent tape properties can provide distance and other information: conventional lane-marking tape (3M Corp.) punched with a specific hole pattern provides frequency-selective retro-reflection, helping echo discrimination.

Figure: radar-reflective stripe under (a) low-frequency and (b) high-frequency illumination.

Polarimetric radar

• Radar can be polarized in the same way as light.
• Just as polarized sunglasses help reduce light reflected from shallow angles (glare), polarized radar transmitters and receivers can separate the returns from different polarization directions; this provides cues to distinguish horizontal surfaces from vertical surfaces.
• Polarimetric radars are usually better than ordinary radars at separating small obstacles from ground clutter.
• There is also some evidence that polarimetric radar gives different returns for wet or snowy roads, providing some information on road conditions.


Assume that a continuous wave (CW) radar emits at frequency f0. The wave hits an obstacle (vehicle) moving away at relative speed va. The radar receives, at an angle θ, a reflected wave at frequency f = f0 + fd, where fd is the Doppler frequency shift, given by

$$f_d = \frac{2\, v_a \cos(\theta)}{c}\, f_0$$

The radar is installed at the front of vehicle A, the obstacle being vehicle B, as shown in the figure. Knowing the emitting frequency and measuring f and θ at the radar receiver, we can estimate the relative speed of vehicle B with respect to vehicle A.

Example: Typically, automotive radars have a range of about 150 m with a solid angle of view of about 12°. They can measure the relative speed of a target (moving obstacle) up to 60 m/s (216 km/h), with speed and angular accuracies < 0.2 km/h and < 0.3°, respectively. For intelligent vehicle applications, radars typically use emitting frequencies of 24 GHz or 77 GHz.
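Inverting the Doppler relation gives the target's relative speed from the measured frequencies; a sketch, with hypothetical received-frequency numbers:

```python
import math

# CW radar speed estimate: v_a = f_d * c / (2 * f0 * cos(theta)),
# where f_d = f - f0 is the measured Doppler shift.
C = 3.0e8

def relative_speed(f_received_hz: float, f0_hz: float, theta_rad: float) -> float:
    f_d = f_received_hz - f0_hz  # measured Doppler shift
    return f_d * C / (2.0 * f0_hz * math.cos(theta_rad))

# Hypothetical: 77 GHz radar, target straight ahead (theta = 0), 5133 Hz shift.
print(relative_speed(77.0e9 + 5133.0, 77.0e9, 0.0))  # ~10 m/s
```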

Remarks on radar use:

• By measuring the Doppler effect, CW radars can measure the radial speed of the target fairly accurately. Waveforms must be modulated to obtain both radial speed and range.
• Radars are robust to variations of ambient illumination; they can operate in daylight or at night.
• Range is fairly good.
• Sensitive to EM interference (e.g., from other vehicles' radar emissions) and to cluttered environments.

Radar main components

Block diagram: antenna, EM-wave duplexing, receiver, transmitter, modulator, exploitation (signal processing) and synchronization.

Types of waveform design for radar

• Pulsed radar: emitted pulse of a given width, echo received after a delay.
• Pulsed Doppler radar: pulse train with period Te; delay Δt and Doppler shift fshift of the echo.
• Frequency modulated continuous wave (FMCW) radar: emitted and received frequency ramps separated by Δτ, with frequency difference ΔF.
• Frequency shift keying (FSK) radar: two emitted frequencies F1 and F2; Doppler effect on the returns.

Information that can be measured with radar

Remote detection of objects:
• Object present or not, depending on the radar return signal (echoes).

Range:
• d = ½ · c · tof in a time-of-flight measurement (c = 3 × 10⁸ m/s);
• alternatively, a frequency-domain trick (FMCW).

Relative radial speed:
• v = −½ fd λ = −½ (fd / f0) c, by frequency analysis using the Doppler effect;
• range can also be derived (after tracking) if the time of flight is measured.

Angle:
• given by the antenna (or beam) orientation during detection.

Lidars and laser sensors

LAser Detection And Ranging, or ladar (also named lidar, for "light radar"), is also used for the remote detection and characterization of objects. The principle of operation is similar to radar, but uses much shorter wavelengths, usually in the near infrared (~1.2 µm), whereas radar wavelengths are in the mm range. Like radar, it is an active system, composed of a transmitter (laser source), a receiver (photodetector, IR camera) and a synchronization system.

The lidar sends an amplitude-modulated laser beam and measures the echoes coming back. The phase delay is computed using a time-of-flight technique. With a fixed modulation frequency f and an object located at distance d, c being the speed of light, the phase delay is given by

$$\Delta\phi = \frac{2\pi f\,(2d)}{c} = \frac{4\pi f d}{c}$$

and the distance between the lidar and the object is then

$$d = \frac{c\,\Delta\phi}{4\pi f}$$
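The phase-to-distance relation above is easy to sketch; the 10 MHz modulation frequency below is hypothetical:

```python
import math

# AMCW lidar: phase delay of the amplitude modulation over the round trip,
# delta_phi = 4*pi*f*d / c, inverted to d = c*delta_phi / (4*pi*f).
C = 3.0e8

def phase_from_range(d_m: float, f_mod_hz: float) -> float:
    return 4.0 * math.pi * f_mod_hz * d_m / C

def range_from_phase(delta_phi_rad: float, f_mod_hz: float) -> float:
    return C * delta_phi_rad / (4.0 * math.pi * f_mod_hz)

# Hypothetical 10 MHz modulation. Note the phase wraps every c/(2f) = 15 m,
# which bounds the unambiguous range of this scheme.
phi = phase_from_range(3.0, 10e6)             # ~1.257 rad for a target at 3 m
print(round(range_from_phase(phi, 10e6), 9))  # 3.0
```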


Lidar and laser sensor wavelengths

Figure: laser operating range on the wavelength axis, roughly 0.2 × 10⁻⁶ to 1.4 × 10⁻⁶ m (up to the near infrared).

Source: Moras J. et al., A Lidar Perception Scheme for Intelligent Vehicle Navigation. 11th Int. Conference on Control, Automation, Robotics and Vision, Dec. 2010, Singapore.

Basic lidar sensor model: bird's-eye view of the real scene; multiple echoes received by the sensor are used to compute the obstacles' range.


Typically, laser sensors are used in a scanning mode. The scanning mechanism uses a 360-degree rotating mirror (galvanometer). In this way, the whole surrounding space can be scanned at a typical range of 200–300 m. Lidars have a much better angular resolution than radar; their range measurement is also much better (cm range) because of their much shorter wavelength.

Source: http://upload.wikimedia.org/wikipedia/commons/c/c0/LIDAR-scanned-SICK-LMS-animation.gif

With proper optics, any non-light-absorbing object reflects part of the incident light back to the lidar, which then measures the delay between the emitted and received signals. Problems may occur with textured surfaces, which deviate the reflected beam.

Lidar: Light Detection and Ranging

Lidar principle:
• Distance measurement: time of flight.
• Angular measurement:
  • scanning lidar: a laser pulse is reflected by a rotating mirror, giving distance measurements at specified angle intervals;
  • non-scanning lidar: multi-beam, multi-receiver, or none.


Some historical notes on lasers

• 1917: Albert Einstein determined and predicted the necessary conditions for the generation of stimulated emission of light.
• 1950: Alfred Kastler, in France, realized the first stimulated emission of light by exciting atoms using optical pumping.
• 1958: Design parameters for constructing a LASER were set by Charles H. Townes and Arthur L. Schawlow.
• 1960: The first LASER was realized by Theodore Maiman.

LASER: Light Amplification by the Stimulated Emission of Radiation.

Spontaneous emission: an atom decays from an excited state to the fundamental state, emitting a photon of energy E1 − E0 = hν (h: Planck constant = 6.626 × 10⁻³⁴ J·s; ν: frequency).

Stimulated emission: an excited atom hit by an incoming photon emits a photon with the same characteristics (frequency, amplitude, direction and phase) as the incident photon, hence the generation of "coherent" light. Using a resonance cavity allows the light to be amplified (multiplying the number of coherent photons).

Optical pumping: exciting the atom (a transfer of energy) to reach a "population inversion", bringing the atom into its excited state. This transfer of energy is called optical pumping and can be achieved using electrical fields, chemical reactions, etc.

Source: Darms M. et al., Obstacle Detection and Tracking for the Urban Challenge, IEEE Transactions on ITS, Vol. 10, No. 3, Sept. 2009

Typical radar/laser sensor setups and characteristics for object detection and tracking on R&D vehicles for automated driving. These setups are currently far too expensive for commercial automotive applications.

Source: Moras J. et al., A Lidar Perception Scheme for Intelligent Vehicle Navigation. 11th Int. Conference on Control, Automation, Robotics and Vision, Dec. 2010, Singapore.

Laser scanner angular resolution as a function of the spatial scanning and of the scanning frequency.


Lidar commercial variants

• Modulation: practically all lidar sensors use pulse modulation.
• Angular measurement, three types: fixed multi-beam sensors; mechanically scanning mirrors (galvos); moving optics.
• Velocity measurement: all.

Examples shown: Ibeo, Denso, Hella, Continental, Omron.

Lidar data example: map building with the Velodyne HDL-64E lidar

Source: Stanford Racing Team, DARPA Challenge


Source: Moras J. et al., A Lidar Perception Scheme for Intelligent Vehicle Navigation. 11th Int. Conference on Control, Automation, Robotics and Vision, Dec. 2010, Singapore.

Occupancy grid: the scan grid projected in a Cartesian frame; white cells are occupied, black are free, and gray are unknown.

Raster scan laser sensor. Source: ACAS Program, final report, executive summary, 1998.

Advantages / drawbacks of lidars

Advantages:
• High accuracy both in range and in angular resolution.
• Extended view of the local environment (through scanning); can cover 360 degrees.

Drawbacks:
• Still very expensive.
• Moving components; fragile.
• Sensitive to water (snow, mist, rain).
• Glowing (glare from other light sources).
• Sensitive to perturbations from other instrumented vehicles.
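The occupancy-grid idea from the caption can be sketched with a toy ray-trace; the integer cell codes below are my own convention, not the figure's color coding:

```python
import math

# Toy occupancy grid: cells start UNKNOWN; each lidar return marks the hit
# cell OCCUPIED and the cells crossed along the ray FREE.
UNKNOWN, FREE, OCCUPIED = -1, 0, 1

def update_grid(grid, origin, angle_rad, hit_range, cell_size=1.0):
    """Ray-trace one lidar return into a 2D grid (list of rows)."""
    ox, oy = origin
    steps = int(hit_range / cell_size)
    for i in range(steps + 1):
        r = i * cell_size
        cx = int(ox + r * math.cos(angle_rad))
        cy = int(oy + r * math.sin(angle_rad))
        if 0 <= cy < len(grid) and 0 <= cx < len(grid[0]):
            grid[cy][cx] = OCCUPIED if i == steps else FREE

grid = [[UNKNOWN] * 10 for _ in range(10)]
update_grid(grid, (0, 0), 0.0, 6.0)  # beam along +x hits a target at 6 m
print(grid[0][:8])  # [0, 0, 0, 0, 0, 0, 1, -1]
```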


Machine vision systems

Typical vision system for lane tracking. The detected position of the solid line is shown by the blue dots; the detected dashed line by dark and light blue dots. Overlaid on the image is data from other sensors, showing the location of radar targets: yellow X for right lane, red X for current lane. Experimenter interface shown at bottom.

Machine vision wavelength ranges: 1) visible, 2) infrared, 3) ultraviolet.


Optics used in imaging

Figure: thin-lens imaging geometry with focal points F and F′, focal length f, and object/image distances D and D′.

Camera sensor

• Imaging sensor: a 2D array of integrated photosensitive cells (called photosites) on a silicon substrate.
• Each photosite corresponds to a pixel in the resulting digital image.
• Photosites use the photoelectric effect: some (photosensitive) materials, when hit by a photon, transform the energy to produce electrons.

Figure: photosite cross-section (electrode at V > 0, SiO2 isolation, P-doped Si substrate, photons generating electrons).


Stereo vision for range (depth) estimation

Stereo works by finding the same point in two or more cameras. Intersecting the lines of view from the cameras gives the 3D location of the object.
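For a rectified stereo pair, the intersection reduces to the standard disparity-to-depth relation Z = f·B/d (not given on the slide): f is the focal length in pixels, B the baseline between the cameras, and d the disparity. The rig parameters below are hypothetical:

```python
# Rectified stereo: depth from disparity, Z = f * B / d.
# f: focal length (pixels), B: camera baseline (m), d: disparity (pixels).
def depth_from_disparity(f_px: float, baseline_m: float, disparity_px: float) -> float:
    if disparity_px <= 0:
        raise ValueError("zero disparity: point at infinity or invalid match")
    return f_px * baseline_m / disparity_px

# Hypothetical rig: 800 px focal length, 0.5 m baseline, 20 px disparity.
print(depth_from_disparity(800.0, 0.5, 20.0))  # 20.0 m
```

Note the 1/d dependence: depth accuracy degrades quickly with range, which is why the slides describe vision as relatively inaccurate for 3D range estimation.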

Stereo guided segmentation

• Low-resolution stereo for detection and recognition of nearby objects, used for side-looking sensors on a bus.
• Left: original image. Center: depth map from stereo; brighter is closer. Right: "blobs" of pixels at the same distance. The overlays on the original image show the detected objects: two pedestrians and a car.
• Further processing can examine each blob to separate people from fixed obstructions, and generate appropriate driver warnings.

Long-range stereo vision-based system

Top: one of three images from a stereo set. The objects on the road are 15 cm tall at a range of 100 m from the camera. Bottom: detected objects in black. Besides the obstacles on the road, the system has found the person, the sign, grass along the road, and a distant dip in the road.

Perception at night

Video Lane detection at night

Source: US DOT NHTSA ACAS Program, final report, 2000

Dynamic environment modeling based on moving on-vehicle cameras plays an important role in intelligent vehicles. However, this is extremely challenging due to the combined effects of ego-motion, blur, ambient light changes etc. Therefore, traditional methods for gradual illumination change, small motion

  • bjects, such as background subtraction, do not work well any more,

particularly at night, even those that have been widely used in surveillance

  • applications. Consequently, more and more approaches try to handle these
  • issues. Unfortunately, it is still an open problem to reliably model and update

visual background in scene interpretation.


Problem: glare from the headlights of oncoming cars


Imaging process (CCD)

3D scene: emitted / reflected light
Optics: projection onto the image plane (3D -> 2D)
CCD sensor array: conversion of photons into electrons; 2D image light intensity (no phase); sequential stream of charges
Signal shaping: mixing, filtering and signal acquisition
Output: composite signal (analog) or digital data array
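The projection stage of this pipeline is the classic pinhole model, u = f·X/Z + cx, v = f·Y/Z + cy. A minimal sketch (focal length and principal point values are illustrative):

```python
import numpy as np

def project_pinhole(points_3d, focal_px, cx, cy):
    """Project 3D camera-frame points onto the image plane
    (the 3D -> 2D step of the imaging pipeline):
        u = f * X / Z + cx,   v = f * Y / Z + cy
    points_3d: (N, 3) array of [X, Y, Z] with Z > 0 in metres.
    Returns (N, 2) pixel coordinates.
    """
    pts = np.asarray(points_3d, dtype=float)
    u = focal_px * pts[:, 0] / pts[:, 2] + cx
    v = focal_px * pts[:, 1] / pts[:, 2] + cy
    return np.stack([u, v], axis=1)

# A point 1 m right of the optical axis at 10 m depth, with an 800 px
# focal length and the principal point at (320, 240), lands 80 px
# right of the image centre.
print(project_pinhole([[1.0, 0.0, 10.0]], 800.0, 320.0, 240.0))
```

The loss of the Z coordinate in this step is why a single camera gives "2D image light intensity (no phase)" only, and why stereo or other range sensors are needed to recover depth.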


 Image processing involves 2D or 3D data sets, so the computational load needed to extract useful information is heavy (example: vehicle positioning relative to lane markings).

 Great variability of illumination; shadow problems. Take this simple image:


AGVs using machine vision

Machine vision systems are often used in automated guided vehicle (AGV) applications, where visual landmarks are installed along fixed routes, such as in airports or specific city centers.


 Advantages:

 Low-cost cameras (CMOS cameras are mainly used in automotive applications because they are cheaper, faster and have a better dynamic range than CCDs).
 Extended view of the surrounding environment.
 Passive systems: no emission from the vehicle.
 Very good for classifying various kinds of objects and for object recognition.
 Required to extract information on road infrastructure.
 Motion detection using multiple frames.


 Disadvantages:

 Sensitive to great variability of illumination
 Heavy computational needs
 Relatively low accuracy for 3D range estimation
 Sensitive to shadows when detecting objects
 Non-unique interpretation of measured scenes


 Example of problems in machine vision systems for vehicular applications: high dynamic range of illumination. Glare from the sun:


Ultrasonic or sonar

Ultrasonic sensors use the propagation of acoustic waves in a medium (usually air or water). The acoustic waves used are in the high-frequency ultrasound range. For automotive applications, the sonars used are active, that is, an ultrasound burst is transmitted and its echo is measured.


SONAR: Sound Navigation And Ranging.


Since the speed of sound in air is known, the time-of-flight technique allows computing the distance between the vehicle and the obstacle off which the acoustic wave bounces. The maximal range for acoustic waves is of the order of a few meters, with an emitted energy of more than 100 dB, so these perception sensors are mainly used for short-range perception tasks, such as in parking-assist systems or dense platooning. Arrays of these low-cost sensors can also be used to obtain angular position information on the obstacles. Ultrasound travels in air at around 340 m/s, like other sounds. The time it takes an ultrasound wave to travel 1 m is approximately 3 ms, as opposed to 3.3 ns for light and radio waves. This allows measurement using low-speed signal processing.
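The time-of-flight computation itself is one line: the echo travels out and back, so the distance is half the round-trip path. A minimal sketch (the echo delay in the example is illustrative):

```python
SPEED_OF_SOUND_M_S = 340.0  # approximate speed of sound in air

def sonar_distance_m(echo_delay_s, c=SPEED_OF_SOUND_M_S):
    """Distance to an obstacle from the round-trip echo delay.

    The ultrasound burst travels out and back, so the one-way
    distance is half the total path length c * t.
    """
    return c * echo_delay_s / 2.0

# An echo returning after ~11.8 ms corresponds to about 2 m --
# roughly the limit of a rear parking sonar.
print(round(sonar_distance_m(11.8e-3), 3))
```

The millisecond-scale delays are what make "low-speed signal processing" sufficient: a microcontroller timer is ample, whereas light-based ranging needs nanosecond timing.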


In the case of a rear sonar, two to four ultrasonic sensors are mounted on the rear bumper to detect an obstacle up to 2 to 2.5 m away. The distance is communicated to the driver in real time using varying buzzer sounds. Even a wire fence can be detected if it is close enough. The main characteristics of ultrasonic sensors for rear sonar are directivity, ringing time, sensitivity and sound pressure. The directivity of an ultrasonic sensor is determined by the size and shape of the vibrating surface (which emits the ultrasound) and by the frequency at which it vibrates.


Advantages  Low cost  Robust to object composition  Robust to weather conditions  Reliable with smooth black surfaces  Low speed (then low cost) signal processing Disadvantages  Short range only


Advantages and drawbacks

Most sources of error in sonar systems are systematic in nature (incorrect alignment or incorrect calibration); these are errors linked to the specifications of the sensor. Usually, sonar errors are a function of the emitted beam width, beam resolution, emitted power, and the sensitivity threshold of the sensor. The 3D geometry of the perceived environment and the acoustic reflectivity of the reflecting surface also influence the performance of the sonar system.


A typical ultrasonic parking assistance system (UPAS), with the implementation of a four-channel ultrasonic distance measurement system using a PSoC.

Source: Munenori Hikita, Murata Corp., White Paper, An introduction to ultrasonic sensors for vehicle parking, www.newelectronics.co.uk


Principle of magnetic landmarks

 A magnetic field is generated in the infrastructure.

 This magnetic field is captured "on the fly" using a magnetometer or an antenna.

 Relative position estimation and noise filtering.

Magnetic landmarks

Landmarks for perception in AGVs

Technologies

 Magnetometer:

 Electric cables (e.g., golf carts from Yamaha)
 Magnets (e.g., vehicle platooning, PATH, UC Berkeley)
 Magnetic marking (e.g., ARCOS, France)

 Active fixed antennas:

 Transponders (e.g., the park of shuttles in Rotterdam)

 Passive antenna:

 Wire-guided system (e.g., the Transmanche shuttle)



Information obtained using landmarks

Detection: detecting the presence of a landmark with known position.

Relative lateral positioning: signal amplitude proportional to magnetic field strength; weighting possible if multiple detectors are used.

Absolute speed: time stamping of landmark detections when the landmarks' positions are known.

Heading estimation: possible if landmark sensors are available at both ends of the vehicle.

Information coding: magnet polarity, recording of landmark detections.
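The absolute-speed idea above can be sketched directly: with landmarks at known positions along the route, speed is the position difference over the timestamp difference between two detections (the positions and timestamps below are illustrative):

```python
def speed_between_landmarks(pos1_m, t1_s, pos2_m, t2_s):
    """Absolute speed from two time-stamped landmark detections.

    pos*_m : known along-track positions of the landmarks (m)
    t*_s   : detection timestamps (s)
    """
    return (pos2_m - pos1_m) / (t2_s - t1_s)

# Landmarks buried 12 m apart, detected 0.6 s apart -> 20 m/s (72 km/h).
print(speed_between_landmarks(100.0, 5.0, 112.0, 5.6))
```

The same differencing applied to front- and rear-mounted sensors crossing one landmark gives the heading estimate mentioned above.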


Field strength as a function of relative distance in cm.

Guidance using electro-magnetic landmarks

 Advantages

 Robust to snow and water
 Very good accuracy (mm to cm range)
 Information coding possible (e.g., using polarity)
 Flexible in implementation

 Drawbacks

 Measurement and processing made in the vehicles
 Sensitive to EM perturbations
 Implementation of landmarks in the infrastructure (expensive!)
 Electrical wiring in the road for transponders


Magnetic field perturbations. Left arrow: high-power line; right arrow: metallic bridge.


Conclusion

 Radars are less sensitive than lidars to air particles, air pollution and humidity (water vapor) thanks to their much longer wavelength of operation, but they have poorer angular resolution and range accuracy.

 Lidar, in its high-resolution scanning formats, is useful for seeing small objects, but is still very expensive.

 Sonar has a relatively small range (a few meters) and is very good for low-speed local maneuvers, such as in parking-assist systems.

 Advanced radar and stereo vision systems may also work as alternatives to expensive lidars, but remain computationally expensive.

 So far, the performance of instrumented vehicles has been good because there is no interference from other similar vehicles on the road. How reliable and robust will those active perception systems (radar, sonar, lidar) be when many instrumented vehicles see each other? This is still an open question.


References

 Clark, J., et al., Data Fusion for Sensory Information Processing Systems, Kluwer, 1990.

 Eskandarian, A. (Ed.), Handbook of Intelligent Vehicles, Chap. 13, Springer, 2012.

 Gavrila, D., et al., "Real-time vision for intelligent vehicles," IEEE Instrumentation & Measurement Magazine, June 2001.

 Kyrynyuk, V., et al., "PSoC – Automotive Ultrasonic Distance Measurement for Park-Assist Systems," Cypress Tech. Report AN76530, 2010.

 Wenger, J., "Automotive Radar – Status and Perspectives," IEEE CSIC Digest, 2005.

 Vu, Trung-Dung, et al., "Object Perception for Intelligent Vehicle Applications: A Multi-Sensor Fusion Approach," IEEE Intelligent Vehicles Symposium, Dearborn, Michigan, 2014.

 Broggi, A., et al., "Sensors technologies for intelligent vehicles perception systems: a comparison between vision and 3D-LIDAR," Proc. 16th International IEEE Conference on Intelligent Transportation Systems (ITSC 2013), The Hague, The Netherlands, 2013.


QUESTIONS?
