

  1. Qualcomm Augmented Reality Lecture Series: Visual SLAM with an Event-based Camera. Hanme Kim. Supervisor: Prof. Andrew Davison. Robot Vision Group, Department of Computing, Imperial College London. January 27, 2015.

  2. Robot Vision Group • Principal investigator: Prof. Andrew Davison. • Closely linked to the Dyson Robotics Laboratory. (Images: MonoSLAM, Davison, ICCV 2003; DTAM, Newcombe et al., ICCV 2011; KinectFusion, Newcombe et al., ISMAR 2011; Dyson Robotics Lab.)

  3. Self Introduction • 2nd-year PhD student with about 12 years of prior industrial experience. (Images: CCTV, A/V Mixer, ISP, Sensor Guided Robot, ANPR, SceneLib2.)

  4. Visual SLAM Applications • We are interested in visual SLAM applications (e.g. Dyson 360 Eye, Google Tango, Amazon Prime Air). • They require: ◦ fast control feedback; ◦ high dynamic range; ◦ low power consumption and hardware complexity.

  5. Limited by Conventional Imaging Sensors • They only work well with controlled camera motion and scene conditions. • High power consumption (e.g. a "hands-free" Google Glass with a battery pack held in the hand!). • High hardware complexity (e.g. a powerful GPU requirement). (Image: www.techradar.com)

  6. Sophisticated State-of-the-Art Algorithms • State-of-the-art algorithms mainly reduce their computational burden by selecting and processing only informative data (Semi-Dense VO, Engel, J. et al., ICCV 2013; Semi-Direct VO, Forster, C. et al., ICRA 2014). • They still rely on conventional imaging sensors, so they still suffer from some of their limitations (e.g. blindness between frames, low dynamic range, high power consumption).

  7. Motivation • How can we satisfy these requirements of real-time SLAM applications? • Can bio-inspired silicon retinas from neuromorphic engineering be a solution? (Image: DVS128, iniLabs; background from Nano Retina Inc.)

  8. Event-based Camera (Benosman, R. et al., 2014)

  9. DVS (Dynamic Vision Sensor) Live Demo

  10. Event-based Camera • Advantages: ◦ asynchronous and fast visual measurements with low latency; ◦ high dynamic range; ◦ compressed visual information, requiring lower transmission bandwidth, storage capacity, processing time, and power consumption. • Has the potential to overcome the limitations of conventional imaging sensors. • Requires totally new computer vision algorithms.
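Since the deck describes the event stream only in prose, here is a minimal sketch of how such a stream can be represented, assuming numpy; the structured dtype layout and the 128×128 DVS128 resolution defaults are illustrative, not a vendor format.

```python
import numpy as np

# Each DVS event carries a pixel location, a timestamp, and a polarity.
# This layout is an illustrative assumption, not the camera's wire format.
event_dtype = np.dtype([
    ("x", np.uint16),   # pixel column
    ("y", np.uint16),   # pixel row
    ("t", np.float64),  # timestamp in seconds
    ("p", np.int8),     # polarity: +1 log-intensity increase, -1 decrease
])

def accumulate(events, width=128, height=128):
    """Render a batch of events into a signed count image for visualisation.

    The stream itself is asynchronous: a pixel fires whenever its log
    intensity changes by the contrast threshold, so there are no frames.
    """
    img = np.zeros((height, width), dtype=np.int32)
    np.add.at(img, (events["y"], events["x"]), events["p"])
    return img
```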

  11. e.g. How To Calibrate It?

  12. Related Work: Tracking and Recognition • Tracking algorithms demonstrating the sensor's low-latency capability (Delbruck, T. and Lichtsteiner, P., 2007; DVS Laser Tracker, 2008; Conradt, J. et al., 2009). • Combined with biologically inspired learning approaches (Pérez-Carrasco, J. A. et al., PAMI 2013; Lee, J. et al., ISCAS 2012).

  13. Related Work: SLAM Applications • SLAM with limitations or extra sensors (Weikersdorfer, D. et al., ICVS 2013; Mueggler, E. et al., IROS 2014; Weikersdorfer, D. et al., ICRA 2014; Censi, A. and Scaramuzza, D., ICRA 2014).

  14. What We Want To Achieve • 3D visual SLAM with a single event-based camera: ◦ able to track extremely fast 6-DoF camera motion and reconstruct 3D scenes; ◦ requiring low computational cost, hardware complexity and power consumption; ◦ suitable for real-world applications. (Image: ETAM (Event-based Tracking and Mapping), conceptual drawing.)

  15. Simultaneous Mosaicing and Tracking with an Event Camera • Hanme Kim, Ankur Handa, Ryad Benosman, Sio-Hoi Ieng, Andrew J. Davison. • Published at BMVC (British Machine Vision Conference) 2014. • Oral presentation (7.7% acceptance rate). • Best Industry Paper award.

  16. Proposed Algorithm

  17. Event-based Tracking • The camera rotation is tracked with a set of weighted particles $\{p_1^{(t)}, p_2^{(t)}, p_3^{(t)}, p_4^{(t)}\}$, where $p_i^{(t)} = \{R_i^{(t)} \in SO(3),\ w_i^{(t)}\}$.

  18. Event-based Tracking • Motion model: $R_i^{(t)} = R_i^{(t-\tau)} \exp\left(\sum_{k=1}^{3} n_k G_k\right), \quad n_k \sim \mathcal{N}(0, \sigma_k^2)$, where the $G_k$ are the generators of $SO(3)$.
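A minimal sketch of this motion model, assuming numpy and scipy; the noise scales are illustrative placeholders, not the values used in the paper.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def propagate(R_prev, sigma=(0.01, 0.01, 0.01), rng=None):
    """Perturb one particle's rotation with Gaussian noise in the Lie algebra.

    Implements R(t) = R(t - tau) exp(sum_k n_k G_k) by sampling a rotation
    vector n with n_k ~ N(0, sigma_k^2) and applying the exponential map.
    """
    rng = rng or np.random.default_rng()
    n = rng.normal(0.0, sigma)                           # noise in so(3)
    return R_prev @ Rotation.from_rotvec(n).as_matrix()  # exponential map
```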

  19. Event-based Tracking (figure only)

  20.–24. Event-based Tracking • Per-event weight update, shown for each particle $i = 1, \dots, 4$ in turn across these frames: $w_i^{(t)} = P(z \mid R_i^{(t)})\, w_i^{(t-\tau)}, \quad z = \log(M(p_m^{(t)})) - \log(M(p_m^{(t-\tau_c)}))$, where $M$ is the mosaic and $p_m$ the event's position projected into it.
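A sketch of this weight update applied to all particles at once, assuming numpy; the Gaussian likelihood standing in for $P(z \mid R_i^{(t)})$, the contrast threshold C, and the noise scale sigma_m are assumptions for illustration, and the projection of the event into the mosaic is omitted.

```python
import numpy as np

def update_weights(weights, logM_now, logM_prev, polarity, C=0.2, sigma_m=0.1):
    """One per-event particle weight update.

    logM_now / logM_prev hold log M sampled at the event's back-projection
    p_m under each particle's current and previous rotation hypothesis.
    """
    z = logM_now - logM_prev           # z = log M(p_m^(t)) - log M(p_m^(t - tau_c))
    # A simple Gaussian stands in for P(z | R_i^(t)): a correct rotation should
    # make the map predict a +/-C log-intensity step matching the polarity.
    lik = np.exp(-0.5 * ((z - polarity * C) / sigma_m) ** 2)
    w = weights * lik                  # w_i^(t) = P(z | R_i^(t)) w_i^(t - tau)
    return w / w.sum()                 # renormalise
```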

  25.–28. Event-based Tracking (animation frames, figures only)

  29. Gradient Estimation (figure only)

  30.–34. Pixel-wise EKF Gradient Estimation (animation frames, figures only)

  35. Pixel-wise EKF Gradient Estimation • Measurement $z$ and measurement model $h$: $z^{(t)} = \frac{1}{\tau_c}, \qquad h^{(t)} = \frac{g^{(t)} \cdot v^{(t)}}{C}.$ • Update the gradient vector and its covariance matrix using the standard EKF equations: $g^{(t)} = g^{(t-\tau_c)} + W \nu, \qquad P_g^{(t)} = P_g^{(t-\tau_c)} - W S W^\top,$ with innovation $\nu = z^{(t)} - h^{(t)}$, gain $W = P_g^{(t-\tau_c)} \frac{\partial h}{\partial g}^{\top} S^{-1}$, and innovation covariance $S = \frac{\partial h}{\partial g} P_g^{(t-\tau_c)} \frac{\partial h}{\partial g}^{\top} + R$.
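These equations translate almost line for line into code; a sketch assuming numpy, with the pixel velocity v supplied by the tracker and illustrative values for the contrast threshold C and the measurement noise R.

```python
import numpy as np

def ekf_update(g, P, v, tau_c, C=0.2, R_noise=1e2):
    """Pixel-wise EKF update for a single event at one mosaic pixel.

    State: gradient g (2-vector) with covariance P (2x2).
    Measurement z = 1/tau_c; model h = (g . v) / C, so dh/dg = v^T / C.
    """
    H = (v / C).reshape(1, 2)          # Jacobian dh/dg
    z = 1.0 / tau_c                    # inverse time since the last event here
    h = float(g @ v) / C               # predicted measurement
    nu = z - h                         # innovation nu = z - h
    S = float(H @ P @ H.T) + R_noise   # innovation covariance (a scalar here)
    W = (P @ H.T) / S                  # Kalman gain W = P H^T S^-1, shape (2, 1)
    g_new = g + (W * nu).ravel()       # g(t) = g(t - tau_c) + W nu
    P_new = P - S * (W @ W.T)          # P(t) = P(t - tau_c) - W S W^T
    return g_new, P_new
```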

  36.–41. Pixel-wise EKF Gradient Estimation (animation frames, figures only)

  42.–44. Reconstruction from Gradients in 1D (Agrawal, A. and Raskar, R., 2007; animation frames, figures only)
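The 1D idea these frames illustrate fits in a couple of lines; a minimal sketch, assuming the gradients are simple forward differences:

```python
import numpy as np

def integrate_1d(grad, offset=0.0):
    """Recover a 1D signal from its finite-difference gradients by summation.

    The constant of integration is unobservable from gradients alone, which is
    why the reconstruction is only defined up to a global offset.
    """
    return offset + np.concatenate(([0.0], np.cumsum(grad)))
```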

  45. Reconstruction from Gradients in 2D • Reconstruct the log intensity image $M$ whose gradients $M_x$ and $M_y$ across the whole image domain are close to the estimated gradients $g_x$ and $g_y$ in a least-squares sense (Tumblin, J. et al., 2005): $J(M) = \iint \left( (M_x - g_x)^2 + (M_y - g_y)^2 \right) dx\, dy.$ The Euler-Lagrange equation to minimise $J(M)$ is $\frac{\partial J}{\partial M} - \frac{d}{dx} \frac{\partial J}{\partial M_x} - \frac{d}{dy} \frac{\partial J}{\partial M_y} = 0,$ which leads to the well-known Poisson equation $\nabla^2 M = \frac{\partial g_x}{\partial x} + \frac{\partial g_y}{\partial y}.$ • We use a sine-transform-based method to solve the Poisson equation (Agrawal, A. et al., 2005 and 2006).
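A compact sketch of such a sine-transform Poisson solve, assuming scipy and zero (Dirichlet) boundary values; the divergence stencil and boundary handling are simplified relative to the cited methods.

```python
import numpy as np
from scipy.fft import dstn, idstn

def poisson_solve(gx, gy):
    """Recover M from gradient estimates (gx, gy) by solving the Poisson
    equation laplacian(M) = d(gx)/dx + d(gy)/dy with zero boundaries."""
    H, W = gx.shape
    # Divergence of the gradient field via backward differences.
    div = np.zeros((H, W))
    div[:, 1:] += gx[:, 1:] - gx[:, :-1]
    div[1:, :] += gy[1:, :] - gy[:-1, :]
    # The 2D DST-I diagonalises the Dirichlet Laplacian; divide by its
    # eigenvalues in the transform domain, then invert the transform.
    lam_x = 2.0 * np.cos(np.pi * np.arange(1, W + 1) / (W + 1)) - 2.0
    lam_y = 2.0 * np.cos(np.pi * np.arange(1, H + 1) / (H + 1)) - 2.0
    denom = lam_x[None, :] + lam_y[:, None]   # strictly negative, never zero
    return idstn(dstn(div, type=1) / denom, type=1)
```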

  46. Demo Video

  47. Contributions and Limitations • Contributions: ◦ Track camera rotation while building a mosaic of a scene purely from an event stream, with no additional sensing. ◦ Reconstruct high-resolution, high-dynamic-range scenes by harnessing the characteristics of event cameras. ◦ Show, by reconstructing scenes, that all of the visual information is present in the event stream. • Limitations: ◦ Processing time depends on the number of particles and the map resolution. ◦ No proper bootstrapping.
