2009 AIAA Guidance, Navigation and Control Conference

Real-Time Mosaic-Aided Aerial Navigation: II. Sensor Fusion


SLIDE 1

Real-Time Mosaic-Aided Aerial Navigation: II. Sensor Fusion

August 2009

SLIDE 2

Introduction

  • Camera scanning
  • On-line mosaic construction
  • Image-based motion estimation

Mosaicking improves estimation precision in challenging scenarios: narrow camera FOV, low-texture scenes.

Camera scanning and mosaic construction were treated in Part I; this part treats fusion of the image-based motion estimation with the navigation system.

SLIDE 3

Outline

  • Introduction
  • Relative Motion Measurement Model
  • Fusion with Navigation System
  • Observability Analysis
  • Performance Evaluation
  • Summary

SLIDE 4

  • Image-based motion estimation provides:
    - translation $t^{C_2}_{1\to 2}$ (known up to some scale $\gamma$)
    - rotation $R^{C_2}_{C_1}$
  • In ideal conditions, when there are no navigation errors and assuming perfect translation and rotation motion estimations:

    $Pos^{L}(t_2) - Pos^{L}(t_1) = \gamma\, T^{L}_{C_2}(t_2)\, t^{C_2}_{1\to 2}, \qquad T^{C_2}_{C_1} = R^{C_2}_{C_1}$

  • Notation:
    - $\gamma$ — unknown scale factor
    - $Pos$ — platform position
    - $T^{M}_{N}$ — DCM from system N to system M
  • Coordinate systems: L - Local Level Local North (LLLN), B - Body, C - Camera

Introduction Observability Analysis Fusion with Navigation sys. Measurements Model Performance Evaluation Summary

SLIDE 5

  • In real conditions these constraints do not hold, due to:
    - navigation errors
    - imperfect image-based motion estimations
  • Residual measurements definition:

    $z_{translation} = \left[ Pos^{L}_{Nav}(t_2) - Pos^{L}_{Nav}(t_1) \right] \times \hat{T}^{L}_{C_2}(t_2)\, t^{C_2}_{1\to 2}$

    $z_{rotation}$ — extracted from $\hat{T}^{C_2}_{C_1,Nav} \left( R^{C_2}_{C_1} \right)^T - I$

  • In the ideal case both residuals vanish:

    $\left[ Pos^{L}(t_2) - Pos^{L}(t_1) \right] \times T^{L}_{C_2}(t_2)\, t^{C_2}_{1\to 2} = 0_{3\times 1}, \qquad T^{C_2}_{C_1} \left( R^{C_2}_{C_1} \right)^T = I$
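The residual construction above can be sketched numerically. This is a minimal numpy sketch, not the paper's implementation; the function and argument names (`residuals`, `T_L_C2`, etc.) are hypothetical, and the small-angle extraction of the rotation residual is one common convention:

```python
import numpy as np

def skew(v):
    """Cross-product (skew-symmetric) matrix [v]x."""
    v = np.asarray(v, dtype=float)
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def residuals(pos1_nav, pos2_nav, T_L_C2, t_c2, T_C2_C1_nav, R_est):
    """Translation and rotation residuals of the relative-motion constraints.

    pos1_nav, pos2_nav : navigation positions in LLLN at t1, t2
    T_L_C2             : DCM camera-2 -> LLLN from the navigation attitude
    t_c2               : unit translation direction from image-based estimation
    T_C2_C1_nav        : camera-1 -> camera-2 DCM predicted by navigation
    R_est              : camera-1 -> camera-2 rotation from image-based estimation
    """
    # Cross product is used because the translation is known only up to scale.
    z_tr = skew(pos2_nav - pos1_nav) @ (T_L_C2 @ t_c2)
    D = T_C2_C1_nav @ R_est.T - np.eye(3)          # ~[psi]x for small errors
    z_rot = np.array([D[2, 1], D[0, 2], D[1, 0]])  # small-angle extraction
    return z_tr, z_rot
```

With perfect navigation and perfect motion estimates both residuals are zero vectors, matching the ideal-case identities on the slide.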

SLIDE 6

  • State vector definition:

    $X = \left[ P^T \; V^T \; \Psi^T \; d^T \; b^T \right]^T \in \mathbb{R}^{15\times 1}$

  • Continuous system matrix:

    $\Phi = \begin{bmatrix} 0_{3\times 3} & I_{3\times 3} & 0_{3\times 3} & 0_{3\times 3} & 0_{3\times 3} \\ 0_{3\times 3} & 0_{3\times 3} & A_s & 0_{3\times 3} & T^{L}_{B} \\ 0_{3\times 3} & 0_{3\times 3} & 0_{3\times 3} & -T^{L}_{B} & 0_{3\times 3} \\ 0_{3\times 3} & 0_{3\times 3} & 0_{3\times 3} & 0_{3\times 3} & 0_{3\times 3} \\ 0_{3\times 3} & 0_{3\times 3} & 0_{3\times 3} & 0_{3\times 3} & 0_{3\times 3} \end{bmatrix} \in \mathbb{R}^{15\times 15}$

  • $A_s$ — a skew-symmetric matrix constructed from the accelerometer sensor readings
  • $T^{L}_{B}$ — DCM from Body to Local Level Local North systems
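The block layout above can be assembled directly. A minimal sketch, assuming the standard INS error kinematics the slide's blocks suggest; `error_model_matrix` and its argument names are illustrative, not from the paper:

```python
import numpy as np

def skew(v):
    """Cross-product (skew-symmetric) matrix [v]x."""
    v = np.asarray(v, dtype=float)
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def error_model_matrix(f_lln, T_L_B):
    """Continuous 15x15 error-model matrix for X = [dP dV psi d b]^T.

    f_lln : accelerometer-sensed specific force resolved in LLLN
    T_L_B : DCM from Body to LLLN
    """
    Z, I = np.zeros((3, 3)), np.eye(3)
    A_s = skew(f_lln)                     # skew matrix of accelerometer readings
    return np.block([
        [Z, I, Z,     Z,      Z],         # dP-dot  = dV
        [Z, Z, A_s,   Z,      T_L_B],     # dV-dot  = A_s psi + T_L_B b
        [Z, Z, Z,    -T_L_B,  Z],         # psi-dot = -T_L_B d
        [Z, Z, Z,     Z,      Z],         # d-dot   = 0 (random-constant drift)
        [Z, Z, Z,     Z,      Z],         # b-dot   = 0 (random-constant bias)
    ])
```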

SLIDE 7

  • Measurement equations:

    $\begin{bmatrix} z_{Translation} \\ z_{Rotation} \end{bmatrix} = \begin{bmatrix} 0_{3\times 3} & H^{Tr}_{V} & H^{Tr}_{\Psi} & H^{Tr}_{d} & H^{Tr}_{b} \\ 0_{3\times 3} & 0_{3\times 3} & H^{Rot}_{\Psi} & H^{Rot}_{d} & 0_{3\times 3} \end{bmatrix} \begin{bmatrix} P \\ V \\ \Psi \\ d \\ b \end{bmatrix} + v$

  • Translation terms:

    $H^{Tr}_{V} = -\left[ \hat{T}^{L}_{C_2}(t_2)\, t^{C_2}_{1\to 2} \right]_\times \Delta t$

    $H^{Tr}_{\Psi} = -\tfrac{1}{2} \left[ \hat{T}^{L}_{C_2}(t_2)\, t^{C_2}_{1\to 2} \right]_\times A_s\, \Delta t^2$

    $H^{Tr}_{d} = \tfrac{1}{6} \left[ \hat{T}^{L}_{C_2}(t_2)\, t^{C_2}_{1\to 2} \right]_\times A_s\, T^{L}_{B}\, \Delta t^3$

    $H^{Tr}_{b} = -\tfrac{1}{2} \left[ \hat{T}^{L}_{C_2}(t_2)\, t^{C_2}_{1\to 2} \right]_\times T^{L}_{B}\, \Delta t^2$

  • Rotation terms:

    $H^{Rot}_{\Psi} = \hat{R}^{C_2}_{C_1}\, T^{C_1}_{B}\, T^{B}_{L}(t_1) \left[ T^{C_2}_{B}\, T^{B}_{L}(t_2) \right]^T - I$

    $H^{Rot}_{d} = \hat{R}^{C_2}_{C_1}\, T^{C_1}_{B}\, T^{B}_{L}(t_1)\, \Delta t$

  with $\Delta t = t_2 - t_1$.
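Assembling the translation row of the measurement matrix can be sketched as follows. The $\Delta t$ powers ($\Delta t$, $\tfrac{1}{2}\Delta t^2$, $\tfrac{1}{6}\Delta t^3$) come from integrating the INS error model over $[t_1, t_2]$; `translation_H` and the exact signs are an assumed reconstruction, not verbatim from the paper:

```python
import numpy as np

def skew(v):
    """Cross-product (skew-symmetric) matrix [v]x."""
    v = np.asarray(v, dtype=float)
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def translation_H(T_L_C2, t_c2, A_s, T_L_B, dt):
    """3x15 translation measurement row for states [dP dV psi d b].

    All blocks share the common factor [T^L_C2 t]x, because the translation
    residual is a cross product with the estimated translation direction.
    """
    S = skew(T_L_C2 @ t_c2)
    H_V = -S * dt                          # velocity error integrated over dt
    H_psi = -0.5 * (S @ A_s) * dt**2       # attitude error via specific force
    H_d = (S @ A_s @ T_L_B) * dt**3 / 6.0  # drift feeds attitude, then velocity
    H_b = -0.5 * (S @ T_L_B) * dt**2       # bias feeds velocity, then position
    return np.hstack([np.zeros((3, 3)), H_V, H_psi, H_d, H_b])
```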

SLIDE 8

  • Remarks:
    - The motion parameters may be estimated based on the homography or the fundamental matrix.

SLIDE 9

  • Adaptive translation measurement covariance — the translation measurement noise is

    $v_{Tr} = \left[ Pos^{L}_{Nav}(t_2) - Pos^{L}_{Nav}(t_1) \right]_\times \left( \hat{t}^{L}_{1\to 2} - t^{L}_{1\to 2} \right)$

  • Measurement covariance matrix:

    $R_{Tr} = \left[ Pos^{L}_{Nav}(t_2) - Pos^{L}_{Nav}(t_1) \right]_\times R_{Est} \left[ Pos^{L}_{Nav}(t_2) - Pos^{L}_{Nav}(t_1) \right]_\times^T$

    $R_k = \mathrm{diag}\left( R^{Tr}_k,\; R^{Rot}_k \right)$

  • A measurements-rejection mechanism is used to avoid fusion of low-quality measurements.
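A sketch of the adaptive covariance and a simple rejection gate. `adaptive_translation_cov` follows the $[\,\cdot\,]_\times R_{Est} [\,\cdot\,]_\times^T$ form above; the chi-square gate value and `accept_measurement` are assumed illustrations of a rejection mechanism, not the paper's specific test:

```python
import numpy as np

def skew(v):
    """Cross-product (skew-symmetric) matrix [v]x."""
    v = np.asarray(v, dtype=float)
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def adaptive_translation_cov(dpos_nav, R_est):
    """R_Tr = [dPos]x R_est [dPos]x^T — the residual covariance scales with
    the navigation-derived position difference between t1 and t2."""
    S = skew(dpos_nav)
    return S @ R_est @ S.T

def accept_measurement(z, H, P, R, gate=11.34):
    """Innovation (chi-square) gating of a 3-dim residual.

    gate=11.34 is the 99% chi-square point for 3 DOF (an assumed threshold).
    """
    S = H @ P @ H.T + R                     # innovation covariance
    d2 = float(z @ np.linalg.solve(S, z))   # Mahalanobis distance squared
    return d2 <= gate
```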

SLIDE 10

  • Fictitious Velocity (FV) measurement
    - Unobservable states in $X$ are deteriorated due to imperfections in the image-based motion estimation $\left( t^{C_2}_{1\to 2},\; R^{C_2}_{C_1} \right)$
    - A Fictitious Velocity measurement is introduced
    - Goal — to let the filter "believe" the error along the flight heading is small
    - Implementation: after the KF gain matrix is computed, the FV data is removed

    $H_{FV} = \left[ 0_{1\times 3} \;\; (V^{L})^T \;\; 0_{1\times 3} \;\; 0_{1\times 3} \;\; 0_{1\times 3} \right]$

    $H_{Aug} = \begin{bmatrix} H_{Trans} \\ H_{Rot} \\ H_{FV} \end{bmatrix}, \qquad R_{Aug} = \mathrm{diag}\left( R_{6\times 6},\; R_{FV} \right) \in \mathbb{R}^{7\times 7}$
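The "compute the gain with the FV row, then remove the FV data" step can be sketched as below; `kf_gain_with_fv` is a hypothetical name and the state ordering [dP dV psi d b] is assumed from the state-vector definition:

```python
import numpy as np

def kf_gain_with_fv(P, H_meas, R_meas, V_lln, R_fv):
    """Kalman gain computed with an extra fictitious-velocity (FV) row.

    The FV row projects the velocity error onto the flight heading V^L;
    after the gain is computed, the FV column is dropped, so no fictitious
    data is actually fused into the state.
    """
    n = P.shape[0]
    H_fv = np.zeros((1, n))
    H_fv[0, 3:6] = V_lln                        # (V^L)^T on the velocity states
    H_aug = np.vstack([H_meas, H_fv])
    m = R_meas.shape[0]
    R_aug = np.block([[R_meas, np.zeros((m, 1))],
                      [np.zeros((1, m)), np.array([[R_fv]])]])
    S = H_aug @ P @ H_aug.T + R_aug             # innovation covariance
    K_aug = P @ H_aug.T @ np.linalg.inv(S)
    return K_aug[:, :m]                         # discard the FV column
```

The augmented row still shapes the gain (it tells the filter the along-heading velocity error is small), but only the real residuals are multiplied by the returned gain.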

SLIDE 11

  • Piece-Wise Constant System (PWCS) [Goshen-Meskin & Bar-Itzhack 1992]
    - For each time segment $j = 1,\dots,r$ the system matrices are constant
    - At least $n$ measurements in each segment
    - Observability matrix in each segment
    - Total Observability Matrix (TOM)

    $x(k+1) = F_j\, x(k) + B_j\, u(k), \qquad z(k) = H_j\, x(k)$

    $Q_j = \left[ H_j^T \;\; (H_j F_j)^T \;\; \cdots \;\; (H_j F_j^{\,n-1})^T \right]^T$

    $Q(r) = \left[ Q_1^T \;\; (Q_2 F_1^{\,n-1})^T \;\; \cdots \;\; (Q_r F_{r-1}^{\,n-1} \cdots F_1^{\,n-1})^T \right]^T$

SLIDE 12

  • Discrete form of the TOM:

    $Q_d(r) = \left[ H_1^T \;\; (H_2 \Phi_1)^T \;\; \cdots \;\; (H_r \Phi_{r-1} \cdots \Phi_1)^T \right]^T$

  • In our case:
    - Each segment may have fewer than $n$ measurements
    - The measurement frequency is not as high as desired
  • Examined scenario:
    - Straight and Level (SL) flight + a maneuver phase
    - The maneuver phase is divided into segments
  • Worst case — one measurement per segment:

    $X(k+1) = \Phi^{j}_{d}\, X(k), \qquad Z_{Tr} = H^{j}_{Trans}\, X(k), \qquad Z_{Rot} = H^{j}_{Rot}\, X(k)$
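The PWCS observability machinery above is straightforward to implement. A minimal numpy sketch (function names are illustrative); the number of observable modes is the rank of the TOM, and the unobservable directions span the nullspace of $G = Q^T Q$:

```python
import numpy as np

def segment_obs_matrix(F, H, n):
    """Q_j = [H; H F; ...; H F^(n-1)] for one constant-matrix segment."""
    rows, M = [], np.eye(F.shape[0])
    for _ in range(n):
        rows.append(H @ M)
        M = F @ M
    return np.vstack(rows)

def total_obs_matrix(Fs, Hs):
    """Stack segment observability matrices into the TOM, propagating each
    segment through the transition matrices of the previous segments."""
    n = Fs[0].shape[0]
    blocks, prop = [], np.eye(n)
    for F, H in zip(Fs, Hs):
        blocks.append(segment_obs_matrix(F, H, n) @ prop)
        prop = np.linalg.matrix_power(F, n - 1) @ prop
    return np.vstack(blocks)
```

`np.linalg.matrix_rank(total_obs_matrix(Fs, Hs))` then counts the observable modes, as on the next slide; an SVD of the same matrix exposes the unobservable components.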

SLIDE 13

  • Number of observable modes — rank of $Q(r)$
  • Unobservable mode components — nullspace of the observability Grammian $G = Q(r)^T Q(r)$
  • Analysis results:
    - Position terms are always unobservable
    - After several maneuver segments other states become observable
  • Problematic estimation of some states in realistic scenarios

  [Figure: unobservable-mode components over the 15 states P_N … b_z for each scenario]

    Scenario                    Unobservable modes
    SL                          8
    SL + 1 maneuver segment     6
    SL + 2 maneuver segments    5
    SL + 4 maneuver segments    4
    SL + 6 maneuver segments    3

SLIDE 14

  • Assumed initial navigation errors and IMU errors (all 1σ):

    Description              Value              Units
    Initial position error   (100 100 100)^T    m
    Initial velocity error   (0.3 0.3 0.3)^T    m/s
    Initial attitude error   (0.1 0.1 0.1)^T    deg
    IMU drift                (1 1 1)^T          deg/hr
    IMU bias                 (1 1 1)^T          mg

  • Platform trajectory — straight and level north-heading flight
  • Evaluated configurations: Ideal Measurements, Two-view Aided Navigation, Mosaic Aided Navigation

SLIDE 15

  • Ideal Measurements: ideal relative motion estimations, computed from the platform's true trajectory (not image-based measurements)
  • Represents the best possible performance

SLIDE 16

  • Monte-Carlo results
    - Straight and level north-heading flight
    - Comparison to the inertial scenario

  [Figure: position and velocity errors over 400 s — σ Filter vs. σ Inertial]

SLIDE 17

  • Monte-Carlo results
    - Straight and level north-heading flight
    - Comparison to the inertial scenario

  [Figure: Euler angle errors and drift/bias estimation errors over 400 s — σ Filter vs. σ Inertial]

SLIDE 18

  • Conclusions:
    - Position and velocity errors perpendicular to the flight heading are considerably reduced and nearly nullified, respectively
    - Roll angle error, drift in all axes, and bias in the z axis are estimated
  • Increased observability while performing maneuvers:
    - Pitch angle error estimation
    - Bias estimation in the y axis

SLIDE 19

  • Two-view aided navigation with real imagery: motion estimation based on consecutive camera-captured images
    - The images were acquired from Google Earth
    - Without mosaic image construction

SLIDE 20

  • Wide field-of-view camera
  • Cumulative Distribution Function (CDF) of the translation motion estimation error

  [Figure: CDF [%] vs. error in translation direction [deg], 0–20 deg]
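The CDF curves of the translation-direction error can be reproduced with a few lines; this is a generic sketch of the evaluation metric (the angle between estimated and true translation directions and its empirical CDF), with illustrative function names:

```python
import numpy as np

def direction_error_deg(t_est, t_true):
    """Angle [deg] between estimated and true translation directions."""
    c = np.dot(t_est, t_true) / (np.linalg.norm(t_est) * np.linalg.norm(t_true))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

def empirical_cdf(errors):
    """Sorted errors and their empirical CDF values in percent."""
    x = np.sort(np.asarray(errors, dtype=float))
    return x, 100.0 * np.arange(1, x.size + 1) / x.size
```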

SLIDE 21

  • With Fictitious Velocity (FV) measurement
  • Comparison to:
    - Ideal relative motion measurements
    - Inertial scenario

  [Figure: velocity and Euler angle errors over 400 s — Inertial vs. Image-Based vs. Ideal]

SLIDE 22

  • Fictitious Velocity (FV) measurement influence:
    - Real images, with FV
    - Real images, without FV
  • Drift is not estimated in all cases

  [Figure: velocity and Euler angle errors over 400 s — Image-Based with FV vs. without FV]

SLIDE 23

  • Mosaic Aided Navigation

SLIDE 24

  • Mosaic construction based on images from camera scanning
  • Motion estimation between a newly captured image and the mosaic
    - Downward-looking images only
    - Increased overlapping region

  [Figure: new image vs. mosaic — original and additional overlapping areas]

SLIDE 25

  • Narrow field-of-view (FOV) camera
  • Low-texture scenes (example image acquired from Google Earth)
  • Superior mosaic-based motion estimation precision

  [Figure: CDF of error in translation direction estimation — Mosaic vs. 2-View vs. 2-View Wide FOV]

SLIDE 26

  • Straight and level north-heading trajectory
  • Measurements fusion during 50 ≤ t ≤ 100 s; inertial navigation elsewhere

  [Figure: position and velocity errors — Mosaic-based vs. 2-view vs. Inertial]

SLIDE 27

  • Straight and level north-heading trajectory
  • Measurements fusion during 50 ≤ t ≤ 100 s; inertial navigation elsewhere

  [Figure: Euler angle errors and bias estimation errors — Mosaic-based vs. 2-view vs. Inertial]

SLIDE 28

  • A mosaic-aided navigation method was presented:
    - Camera scanning
    - Mosaic construction
    - Fusion of mosaic-based motion estimation with an INS
  • The method does not require any a-priori information and does not rely on external sensors, apart from the camera
  • The method may also be applied to two-view motion estimation
  • Observability analysis
  • Performance evaluation:
    - Statistical study based on ideal motion estimations
    - Two-view aided navigation for wide-FOV cameras
    - Improved performance of mosaic-aided navigation for narrow-FOV cameras
