

  1. An On-Board Visual-Based Attitude Estimation System for Unmanned Aerial Vehicle Mapping. M. Ridhwan Tamjis¹ and Samsung Lim². ¹School of Civil and Environmental Engineering, UNSW Sydney, NSW 2052, Australia. Email: m.tamjis@student.unsw.edu.au. ²School of Civil and Environmental Engineering, UNSW Sydney, NSW 2052, Australia. Email: s.lim@unsw.edu.au

  2. Objectives • To develop a visual-based attitude estimation system that can be used for unmanned aerial vehicle (UAV) mapping • To use the system as a backup to the Inertial Measurement Unit (IMU), providing an accurate platform/camera pose during image acquisition • To remove the dependency on physical ground control points (GCPs) for an accurate camera pose

  3. Rationale • Commercial software packages (e.g. ERDAS, PhotoModeler, Australis, Pix4D) are readily available to undertake these tasks; they are well suited to off-board processing, where computing/processing constraints are minimal • We aim for on-board attitude estimation and camera self-calibration

  4. Methodology • Use optical flow to measure the egomotion of the on-board camera and estimate the platform’s attitude • Integrate optical flow with a key point detector for attitude estimation and camera self-calibration • Find the best detector for the proposed visual-based attitude estimation system
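As one concrete illustration of this pipeline (not the authors' implementation), key points detected in one frame can be tracked into the next frame with pyramidal Lucas-Kanade optical flow in OpenCV; the file names and parameter values below are placeholders.

```python
import cv2
import numpy as np

# Track key points from one frame to the next with pyramidal Lucas-Kanade
# optical flow; the resulting per-key-point flow vectors feed the egomotion /
# attitude estimation step. File names and parameters are illustrative only.
prev_img = cv2.imread("frame_000.jpg", cv2.IMREAD_GRAYSCALE)
next_img = cv2.imread("frame_001.jpg", cv2.IMREAD_GRAYSCALE)

prev_pts = cv2.goodFeaturesToTrack(prev_img, maxCorners=2000,
                                   qualityLevel=0.01, minDistance=5)
next_pts, status, err = cv2.calcOpticalFlowPyrLK(prev_img, next_img,
                                                 prev_pts, None)
flow = (next_pts - prev_pts)[status.ravel() == 1]   # flow vectors of tracked points
```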

  5. Virtual Ground Control Points (VGCPs) • VGCPs are key points in the image, extracted from salient features • VGCPs should be present in overlapping images • VGCPs are crucial for estimating the pose of the camera/platform from the image plane

  6. Image plane geometry [diagram: a camera with focal length f and camera coordinates (Xc, Yc, Zc); an image point p(x, y) on the image plane; a ground point P(X, Y, Z) in the ground scene at depth Z]
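The diagram encodes the standard pinhole relation x = f·X/Z, y = f·Y/Z between a ground point P(X, Y, Z) expressed in the camera frame and its image point p(x, y). A minimal sketch, assuming an ideal pinhole camera with no lens distortion (function name and numbers are illustrative, not from the slides):

```python
import numpy as np

def project_point(P, f):
    """Project a 3-D ground point P = (X, Y, Z), given in the camera frame,
    onto the image plane of a pinhole camera with focal length f.
    Returns the image coordinates (x, y) and the depth Z."""
    X, Y, Z = P
    x = f * X / Z          # perspective division by depth
    y = f * Y / Z
    return np.array([x, y]), Z

# Example with illustrative numbers: a ground point 80 m ahead of the camera
p, depth = project_point(np.array([12.0, -5.0, 80.0]), f=4.5e-3)  # f in metres
```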

  7. Velocity on the image plane
$$
\begin{bmatrix} v_x \\ v_y \\ v_z \end{bmatrix}
= \frac{1}{Z}
\begin{bmatrix} -f & 0 & x \\ 0 & -f & y \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} T_x \\ T_y \\ T_z \end{bmatrix}
+
\begin{bmatrix} \dfrac{xy}{f} & -\left(f + \dfrac{x^2}{f}\right) & y \\ f + \dfrac{y^2}{f} & -\dfrac{xy}{f} & -x \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} \omega_x \\ \omega_y \\ \omega_z \end{bmatrix}
$$
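As a sanity check on the notation, here is a minimal NumPy sketch that evaluates the equation above for a single pixel, assuming T = (Tx, Ty, Tz) is the camera translation, ω = (ωx, ωy, ωz) its angular rate, and Z the scene depth at that pixel; the function name and example values are illustrative, not from the slides.

```python
import numpy as np

def image_velocity(x, y, Z, f, T, omega):
    """Evaluate the image-plane velocity equation for a point at (x, y) with
    scene depth Z, focal length f, camera translation T = (Tx, Ty, Tz)
    and angular rate omega = (wx, wy, wz)."""
    A = np.array([[-f, 0.0, x],
                  [0.0, -f, y],
                  [0.0, 0.0, 1.0]])                   # translational part, scaled by 1/Z
    B = np.array([[x * y / f, -(f + x**2 / f),  y],
                  [f + y**2 / f, -x * y / f,   -x],
                  [0.0, 0.0, 1.0]])                   # rotational part (depth-independent)
    return A @ np.asarray(T, float) / Z + B @ np.asarray(omega, float)

# Example: forward translation plus a small yaw rate (illustrative numbers)
v = image_velocity(x=0.002, y=-0.001, Z=100.0, f=0.0045,
                   T=[0.0, 0.0, 15.0], omega=[0.0, 0.0, 0.05])
```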

  8. Optical flow field [figure]

  9. Detection of key points A series of tests has been conducted to determine the optimal key point detector based on the following criteria: • Detection rate • Time-to-complete (TTC) • Matching rate
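A sketch of how these three criteria could be measured for one detector on a pair of overlapping images; `detect` and `match` are placeholder callables (not the authors' implementation) standing in for a concrete key point detector and a SAD/SSD-based matcher.

```python
import time

def evaluate_detector(detect, match, img1, img2):
    """Measure the three criteria from the slide for one detector:
    number of detected key points, time-to-complete (TTC) in seconds,
    and matching rate (matches / detections)."""
    t0 = time.perf_counter()
    kp1 = detect(img1)                    # key points in the first image
    kp2 = detect(img2)                    # key points in the second image
    matches = match(kp1, kp2)             # candidate correspondences
    ttc = time.perf_counter() - t0        # time-to-complete
    detected = min(len(kp1), len(kp2))
    matching_rate = len(matches) / detected if detected else 0.0
    return {"detected": detected, "ttc": ttc, "matching_rate": matching_rate}
```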

  10. Aerial Image Acquisition A fixed-wing UAV was used to acquire 249 overlapping images over Loftus Oval, New South Wales, Australia • Fixed-wing UAVs (Swinglet CAM and eBee RTK) – high-altitude flight (> 100 m) • Copters (DJI Phantom 3 and Parrot Bebop) – low-altitude flight (< 100 m) and system development test bed

  11. Aerial images over Loftus Oval

  12. Key Point Detectors • Harris detector • Minimum Eigenvalue (MinEig) • Scale-Invariant Feature Transform (SIFT) • Maximally Stable Extremal Regions (MSER) • Speeded-Up Robust Features (SURF) • Features from Accelerated Segment Test (FAST) • Binary Robust Invariant Scalable Keypoints (BRISK)
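The slides do not state which implementation was used; purely as an illustration, most of these detectors are available in OpenCV (SURF only in the non-free contrib build), and the Harris and minimum-eigenvalue corners can be obtained from `goodFeaturesToTrack`. The file name below is a placeholder.

```python
import cv2

img = cv2.imread("frame_001.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical file name

# Detectors that return key point objects directly (OpenCV >= 4.4 for SIFT).
detectors = {
    "SIFT":  cv2.SIFT_create(),
    "MSER":  cv2.MSER_create(),
    "FAST":  cv2.FastFeatureDetector_create(),
    "BRISK": cv2.BRISK_create(),
    # SURF requires opencv-contrib built with the non-free module:
    # "SURF": cv2.xfeatures2d.SURF_create(),
}
keypoints = {name: det.detect(img, None) for name, det in detectors.items()}

# Harris and minimum-eigenvalue (Shi-Tomasi) corners via goodFeaturesToTrack.
harris = cv2.goodFeaturesToTrack(img, maxCorners=5000, qualityLevel=0.01,
                                 minDistance=5, useHarrisDetector=True)
min_eig = cv2.goodFeaturesToTrack(img, maxCorners=5000, qualityLevel=0.01,
                                  minDistance=5)   # minimum-eigenvalue criterion
```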

  13. Matching metrics Two matching metrics were used to evaluate each key point detector:
• Sum of absolute differences (SAD)
$$ SAD(d_1, d_2) = \sum_{i=-n_1}^{n_1} \sum_{j=-n_2}^{n_2} \left| f(x+i,\, y+j) - g(x+i-d_1,\, y+j-d_2) \right| $$
• Sum of squared differences (SSD)
$$ SSD(d_1, d_2) = \sum_{i=-n_1}^{n_1} \sum_{j=-n_2}^{n_2} \left( f(x+i,\, y+j) - g(x+i-d_1,\, y+j-d_2) \right)^2 $$
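A direct NumPy transcription of the two metrics, treating (x, y) as array indices into the two images f and g and ignoring boundary handling; a sketch only, not the authors' code.

```python
import numpy as np

def sad(f, g, x, y, d1, d2, n1, n2):
    """Sum of absolute differences between the (2*n1+1) x (2*n2+1) window
    centred at (x, y) in image f and the window displaced by (d1, d2) in g."""
    win_f = f[x - n1 : x + n1 + 1, y - n2 : y + n2 + 1].astype(float)
    win_g = g[x - d1 - n1 : x - d1 + n1 + 1,
              y - d2 - n2 : y - d2 + n2 + 1].astype(float)
    return np.abs(win_f - win_g).sum()

def ssd(f, g, x, y, d1, d2, n1, n2):
    """Sum of squared differences over the same pair of windows."""
    win_f = f[x - n1 : x + n1 + 1, y - n2 : y + n2 + 1].astype(float)
    win_g = g[x - d1 - n1 : x - d1 + n1 + 1,
              y - d2 - n2 : y - d2 + n2 + 1].astype(float)
    return ((win_f - win_g) ** 2).sum()
```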

  14. Detected key points (mean) [bar chart: mean number of detected key points per detector, with SSD and SAD series] Ranking: MinEig > SIFT > Harris > SURF > MSER > FAST > BRISK

  15. Matching* key points (mean) (* includes correct and false matching key points) [bar chart: mean number of matching key points per detector, with SSD and SAD series] Ranking: SIFT > MinEig > SURF > MSER > Harris > FAST > BRISK

  16. Matching* rate (%) (* includes correct and false matching key points) [bar chart: matching rate per detector, with SSD and SAD series] Ranking: SIFT > SURF > MSER > FAST > BRISK > MinEig > Harris

  17. Time-to-complete (sec) [bar chart: time-to-complete per detector, with SSD and SAD series] Ranking: SIFT > MinEig > MSER > Harris > SURF > BRISK > FAST

  18. Correct* matching rate (%) (* depends on the matching metric) [bar chart: correct matching rate per detector, with SSD and SAD series] Note: SIFT was not evaluated due to the difficulty of visually inspecting its matching points. Ranking: MinEig > Harris > BRISK > FAST > SURF > MSER

  19. SSD vs SAD • SSD and SAD do not yield consistent statistics: SURF and MSER perform poorly in terms of the SSD-based correct matching rate • SURF shows the highest correct matching rate with respect to SAD

  20. Performance Analysis Which is the optimal key point detector? • SIFT provides the highest number of matching key points; however, too many key points increase the processing time (as does MinEig) • SURF offers an optimal processing time (~0.2 sec) with a reasonable number of matching key points

  21. Performance Analysis • There exists a clustering pattern among the matching key points, which can be used for automatic classification of correct and false matches

  22. Automatic Visual Identification of Correct Matches • Use outlier information from the matching key points • Apply the cross-correlation c_xy to distinguish between correct and false matching key points in overlapping images
$$
c_{xy}(k) =
\begin{cases}
\dfrac{1}{n} \displaystyle\sum_{t=1}^{n-k} (x_t - \bar{x})(y_{t+k} - \bar{y}), & k = 0, 1, 2, \ldots \\[6pt]
\dfrac{1}{n} \displaystyle\sum_{t=1}^{n+k} (y_t - \bar{y})(x_{t-k} - \bar{x}), & k = -1, -2, \ldots
\end{cases}
$$
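A minimal NumPy sketch of the cross-correlation above, assuming x and y are equally long coordinate sequences taken from the matched key points in the two overlapping images (the slides do not spell out exactly which sequences are correlated, so that pairing is an assumption):

```python
import numpy as np

def cross_correlation(x, y, k):
    """Sample cross-correlation c_xy(k) between two equally long sequences
    x and y at lag k, following the covariance form normalised by n."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    n = len(x)
    xm, ym = x.mean(), y.mean()
    if k >= 0:
        # sum over t = 1 .. n-k of (x_t - mean_x)(y_{t+k} - mean_y)
        return np.sum((x[: n - k] - xm) * (y[k:] - ym)) / n
    k = -k
    # negative lags: sum over t = 1 .. n+k of (y_t - mean_y)(x_{t-k} - mean_x)
    return np.sum((y[: n - k] - ym) * (x[k:] - xm)) / n
```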

  23. Automatic Visual Identification of Correct Matches: Algorithm flow: Start → Identify outlier, k → Set value K = α + 1 → Clustering → Cross-correlation → Eliminate false matches → End

  24. Automatic Visual Identification for SURF key points [plot: matching key points vs. sample ID]

  25. Automatic Visual Identification for MinEig key points [plot: matching key points vs. sample ID]

  26. Automatic Visual Identification: Statistical Analysis
  Keypoints | Mean error | Min error | Max error | Standard deviation
  SURF      | 0.403      | 0.00      | 4.00      | 0.88
  MinEig    | 1.26       | 0.00      | 8.00      | 1.31

  27. Matching key points before Automatic Visual Identification

  28. Matching key points after Automatic Visual Identification

  29. Concluding Remarks • The performance of various key point detectors was evaluated with respect to detection rate, time-to-complete and matching rate • A set of 249 aerial images taken with a fixed-wing UAV’s camera was evaluated • Assessments were conducted based on the chosen criteria, which aim to fulfil the UAV’s on-board application requirements • It is concluded that SURF is the optimal key point detector for on-board attitude estimation
