SLIDE 1

An On-Board Visual-Based Attitude Estimation System for Unmanned Aerial Vehicle Mapping

  • M. Ridhwan Tamjis¹ and Samsung Lim²

¹School of Civil and Environmental Engineering, UNSW Sydney, NSW 2052, Australia. Email: m.tamjis@student.unsw.edu.au

²School of Civil and Environmental Engineering, UNSW Sydney, NSW 2052, Australia. Email: s.lim@unsw.edu.au

SLIDE 2

Objectives

  • To develop a visual-based attitude estimation system that can be used for unmanned aerial vehicle (UAV) mapping
  • To use the system as a backup to the Inertial Measurement Unit (IMU), providing an accurate platform/camera pose during image acquisition
  • To remove dependence on physical ground control points (GCPs) for an accurate camera pose

SLIDE 3

Rationale

  • Commercial software packages (e.g. ERDAS, PhotoModeler, Australis, Pix4D) are readily available to undertake these tasks; they suit off-board processing, where computing/processing constraints are minimal
  • We aim for on-board attitude estimation and camera self-calibration

SLIDE 4

Methodology

  • Use optical flow to measure the egomotion of the on-board camera and estimate the platform's attitude
  • Integrate optical flow with a key-point detector for attitude estimation and camera self-calibration
  • Find the best detector for the proposed visual-based attitude estimation system

SLIDE 5

Virtual Ground Control Points (VGCPs)

  • VGCPs are key points on the image, extracted from salient features
  • VGCPs should be present in overlapping images
  • VGCPs are crucial to estimate the pose of the camera/platform based on the image plane

SLIDE 6

Image Plane

[Figure: pinhole imaging geometry – camera at (Xc, Yc, Zc); ground point P(X, Y, Z) projects onto image point p(x, y); focal length f; depth Z]

SLIDE 7

Velocity on the image plane

For an image point p(x, y) of a scene point at depth Z, with focal length f, camera translation (Tx, Ty, Tz) and rotation (ωx, ωy, ωz):

$$
\begin{pmatrix} v_x \\ v_y \end{pmatrix}
= \frac{1}{Z}
\begin{pmatrix} -f & 0 & x \\ 0 & -f & y \end{pmatrix}
\begin{pmatrix} T_x \\ T_y \\ T_z \end{pmatrix}
+
\begin{pmatrix} \dfrac{xy}{f} & -\left(f + \dfrac{x^2}{f}\right) & y \\[1ex] f + \dfrac{y^2}{f} & -\dfrac{xy}{f} & -x \end{pmatrix}
\begin{pmatrix} \omega_x \\ \omega_y \\ \omega_z \end{pmatrix}
$$
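The image-plane velocity model on this slide can be sketched numerically as follows; this is a minimal pure-Python sketch assuming the standard pinhole (Longuet-Higgins) sign convention, and the function and variable names are illustrative, not from the slides:

```python
# Image-plane velocity of a point under camera translation T and rotation w,
# in the standard pinhole optical-flow form (sign convention assumed).
def image_velocity(x, y, Z, f, T, w):
    """Return (vx, vy) for image point (x, y) at scene depth Z."""
    Tx, Ty, Tz = T
    wx, wy, wz = w
    vx = (-f * Tx + x * Tz) / Z + (x * y / f) * wx - (f + x * x / f) * wy + y * wz
    vy = (-f * Ty + y * Tz) / Z + (f + y * y / f) * wx - (x * y / f) * wy - x * wz
    return vx, vy

# Pure forward translation along the optical axis: flow radiates outward
# from the image centre, growing with distance from the centre.
vx, vy = image_velocity(x=10.0, y=5.0, Z=100.0, f=50.0, T=(0, 0, 1), w=(0, 0, 0))
```

Note that the translational term scales with 1/Z while the rotational term does not, which is why depth (or VGCPs at known positions) is needed to separate the two motion components.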

SLIDE 8

[Figure: optical flow field]

SLIDE 9

Detection of key points

A series of tests was conducted to determine the optimal key-point detector based on the following criteria:

  • Detection rate
  • Time-to-complete (TTC)
  • Matching rate

SLIDE 10

Aerial Image Acquisition

A fixed-wing UAV was used to acquire 249 overlapping images over Loftus Oval, New South Wales, Australia.

  • Fixed-wing UAVs: Swinglet CAM and eBEE RTK – high-altitude flight (>100 m)
  • Copters: DJI Phantom 3 and Parrot Bebop – low-altitude flight (<100 m) and system-development test bed

SLIDE 11

[Figure: aerial images over Loftus Oval]

SLIDE 12

Key Points Detectors

  • Harris detector
  • Minimum Eigenvalue (MinEig)
  • Scale-Invariant Feature Transform (SIFT)
  • Maximally Stable Extremal Regions (MSER)
  • Speeded-Up Robust Features (SURF)
  • Features from Accelerated Segment Test (FAST)
  • Binary Robust Invariant Scalable Keypoints (BRISK)
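To illustrate the corner-response measures behind the first two detectors on this list, here is a minimal pure-Python sketch; the 2×2 structure tensor values and the Harris constant k = 0.04 are illustrative assumptions, not taken from the slides:

```python
import math

# Corner responses from a 2x2 structure tensor M = [[Sxx, Sxy], [Sxy, Syy]]
# (windowed sums of image-gradient products). Harris scores det(M) - k*trace(M)^2;
# MinEig uses the smaller eigenvalue of M directly. k = 0.04 is a common
# default, assumed here.
def harris_response(Sxx, Sxy, Syy, k=0.04):
    det = Sxx * Syy - Sxy * Sxy
    trace = Sxx + Syy
    return det - k * trace * trace

def min_eigenvalue(Sxx, Sxy, Syy):
    # Smaller root of the characteristic polynomial of M.
    det = Sxx * Syy - Sxy * Sxy
    half_trace = (Sxx + Syy) / 2
    return half_trace - math.sqrt(half_trace ** 2 - det)

# Strong gradients in both directions -> corner-like: both scores are high.
corner = (harris_response(10, 0, 10), min_eigenvalue(10, 0, 10))
# Gradient in one direction only -> edge-like: both scores stay near zero.
edge = (harris_response(10, 0, 0), min_eigenvalue(10, 0, 0))
```

Both measures reward windows whose gradients vary in two directions, which is why Harris and MinEig produce similar key points but, as the later slides show, different counts and matching rates.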

SLIDE 13

Matching metrics

Two matching metrics were used to evaluate each key-point detector:

  • Sum of absolute differences (SAD)
  • Sum of squared differences (SSD)

$$
\mathrm{SSD}(d_1, d_2) = \sum_{i=1}^{n_1}\sum_{j=1}^{n_2}\bigl(f(x+i,\,y+j) - g(x+i+d_1,\,y+j+d_2)\bigr)^2
$$

$$
\mathrm{SAD}(d_1, d_2) = \sum_{i=1}^{n_1}\sum_{j=1}^{n_2}\bigl|\,f(x+i,\,y+j) - g(x+i+d_1,\,y+j+d_2)\,\bigr|
$$
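These two patch metrics can be sketched in a few lines; this is a minimal pure-Python version with illustrative toy images and no bounds checking:

```python
# SSD and SAD between an n x n patch of image f (top-left corner at (x, y))
# and the patch of image g displaced by (d1, d2). Images are plain 2D lists.
def ssd(f, g, x, y, d1, d2, n):
    return sum((f[y + j][x + i] - g[y + j + d2][x + i + d1]) ** 2
               for i in range(n) for j in range(n))

def sad(f, g, x, y, d1, d2, n):
    return sum(abs(f[y + j][x + i] - g[y + j + d2][x + i + d1])
               for i in range(n) for j in range(n))

# Toy example: g_img is f_img shifted right by one pixel, so the horizontal
# displacement d1 = 1 gives a perfect (zero-cost) match.
f_img = [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9, 10, 11], [12, 13, 14, 15]]
g_img = [[9, 0, 1, 2], [9, 4, 5, 6], [9, 8, 9, 10], [9, 12, 13, 14]]
best = min(range(0, 3), key=lambda d1: ssd(f_img, g_img, 0, 0, d1, 0, 2))
```

SSD penalises large residuals more heavily than SAD, which is one reason the two metrics can rank detectors differently in the results that follow.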

SLIDE 14

Detected key points (mean)

[Bar chart: mean number of detected key points per detector (BRISK, FAST, Harris, MinEig, MSER, SIFT, SURF), SSD vs SAD; y-axis 1000–6000]

MinEig > SIFT > Harris > SURF > MSER > FAST > BRISK

SLIDE 15

Matching* key points (mean)

[Bar chart: mean number of matching key points per detector, SSD vs SAD; y-axis 50–350]

* Includes correct and false matching key points

SIFT > MinEig > SURF > MSER > Harris > FAST > BRISK

SLIDE 16

Matching* rate (%)

[Bar chart: matching rate per detector, SSD vs SAD; y-axis 0–14%]

* Includes correct and false matching key points

SIFT > SURF > MSER > FAST > BRISK > MinEig > Harris

SLIDE 17

Time-to-complete (sec)

[Bar chart: time-to-complete per detector, SSD vs SAD; y-axis 0.2–1.6 sec]

SIFT > MinEig > MSER > Harris > SURF > BRISK > FAST

SLIDE 18

Correct* matching rate (%)

[Bar chart: correct matching rate per detector (BRISK, FAST, Harris, MinEig, MSER, SURF), SSD vs SAD; y-axis 0–120%]

Note: SIFT was not evaluated due to the difficulty of visually inspecting its matching points

MinEig > Harris > BRISK > FAST > SURF > MSER

* Depends on metrics

SLIDE 19

SSD vs SAD

  • SSD and SAD do not show consistent statistics, as SURF and MSER do not perform well in terms of SSD-based correct matching rate
  • SURF shows the highest correct matching rate with respect to SAD

SLIDE 20

Performance Analysis

Which is the optimal key-point detector?

  • SIFT provides the highest number of matching key points; however, too many key points increase the processing time (MinEig behaves similarly)
  • SURF shows an optimal processing time (~0.2 sec) with a reasonable number of matching key points

SLIDE 21

Performance Analysis

  • There exists a clustering pattern among matching key points, which can be used for automatic classification of correct and false matching key points

SLIDE 22

Automatic Visual Identification of Correct Matches

  • Use outlier information from the matching key points
  • Apply cross-correlation (cxy) to distinguish between correct and false matching key points in overlapping images

$$
c_{xy}(k) =
\begin{cases}
\dfrac{1}{n}\displaystyle\sum_{t=1}^{\,n-k}(x_t - \bar{x})(y_{t+k} - \bar{y}), & k \ge 0 \\[2ex]
\dfrac{1}{n}\displaystyle\sum_{t=1}^{\,n+k}(y_t - \bar{y})(x_{t-k} - \bar{x}), & k < 0
\end{cases}
$$
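The sample cross-correlation above can be sketched directly from its definition; this is a minimal pure-Python version with illustrative sequence values:

```python
# Sample cross-correlation c_xy(k) between two equal-length sequences:
# positive lags shift y forward, negative lags shift x forward.
def cross_correlation(xs, ys, k):
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    if k >= 0:
        return sum((xs[t] - xbar) * (ys[t + k] - ybar) for t in range(n - k)) / n
    return sum((ys[t] - ybar) * (xs[t - k] - xbar) for t in range(n + k)) / n

# Toy example: ys is xs delayed by one step, so c_xy peaks at lag k = 1.
xs = [1.0, 2.0, 3.0, 4.0, 3.0, 2.0, 1.0, 2.0]
ys = [2.0, 1.0, 2.0, 3.0, 4.0, 3.0, 2.0, 1.0]
peak = max(range(-2, 3), key=lambda k: cross_correlation(xs, ys, k))
```

The lag at which the correlation peaks indicates how the two coordinate sequences are aligned, which is what separates consistently displaced (correct) matches from random (false) ones.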

SLIDE 23

Automatic Visual Identification of Correct Matches

Algorithm: Start → Identify outlier, k → Set value K = α + 1 → Clustering → Cross-correlation → Eliminate false matches → End
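One possible reading of the elimination step in this flowchart is sketched below. The slides do not define α, K, or the clustering step, so this sketch substitutes a simple median-displacement filter for the clustering/cross-correlation stages; all names and the tolerance value are illustrative guesses, not the authors' method:

```python
# Illustrative false-match elimination: matches whose displacement vector
# lies far from the dominant (median) displacement are treated as false
# matches and removed. This stands in for the slide's clustering and
# cross-correlation steps, which are not fully specified.
def eliminate_false_matches(matches, tol=3.0):
    """matches: list of ((x1, y1), (x2, y2)) matched key-point pairs."""
    dxs = [p2[0] - p1[0] for p1, p2 in matches]
    dys = [p2[1] - p1[1] for p1, p2 in matches]
    mx = sorted(dxs)[len(dxs) // 2]  # median displacement = dominant cluster
    my = sorted(dys)[len(dys) // 2]
    return [m for m, dx, dy in zip(matches, dxs, dys)
            if abs(dx - mx) <= tol and abs(dy - my) <= tol]

matches = [((0, 0), (5, 1)), ((10, 10), (15, 11)), ((3, 7), (8, 8)),
           ((2, 2), (40, -9))]  # last pair is a gross outlier
kept = eliminate_false_matches(matches)
```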

SLIDE 24

Automatic Visual Identification for SURF key points

[Chart: matching key points per sample ID]

SLIDE 25

Automatic Visual Identification for MinEig key points

[Chart: matching key points per sample ID]

SLIDE 26

Automatic Visual Identification

Key points statistical analysis:

Detector   Mean error   Min error   Max error   Standard deviation
SURF       0.403        0.00        4.00        0.88
MinEig     1.26         0.00        8.00        1.31

SLIDE 27

Matching key points before Automatic Visual Identification

SLIDE 28

Matching key points after Automatic Visual Identification

SLIDE 29

Concluding Remarks

  • The performance of various key-point detectors was evaluated with respect to detection rate, time-to-complete and matching rate
  • A set of 249 aerial images taken by a fixed-wing UAV's camera was evaluated
  • Assessments were conducted based on the chosen criteria, which aim to fulfill the UAV's on-board application requirements
  • It is concluded that SURF is the optimal key-point detector for on-board attitude estimation