

  1. Computer Vision for Mobile Robots in GPS Denied Areas. Michael Berli, 28th of April 2015. Supervisor: Tobias Nägeli

  2. Robots can work in places we as humans can't reach, and they can do jobs we are unable or unwilling to do. [1,2]

  3. Autonomous mobile robots § How do we make robots navigate autonomously? Robots should be able to explore an unknown environment and navigate within it without active human control.

  4. Autonomous mobile robots § Using computer vision for autonomous navigation: Mapless, Map-Based, Map-Building

  5. Robots [3,4,5]

  6. Focus in this talk § Type of robot: Autonomous Ground Vehicles § Environment: indoor environments (rooms, tunnels, warehouses) § Sensors: cameras, wheel sensors

  7. Robot scenarios: Industrial Automation [6]

  8. Robot scenarios: Inspection & Discovery [7]

  9. Robot scenarios: Space operations [8]

  10. The three navigation classes: Mapless, Map-Based, Map-Building

  11. Mapless Navigation § Walk through Paris without colliding [10]

  12. Collision Avoidance

  13. Optical Flow § Describes the motion of patterns in successive images: a point (x, y) in the frame at time t moves with flow (u, v) to (x+dx, y+dy) in the frame at t+1
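
The brightness-constancy idea behind optical flow can be sketched in a few lines of NumPy. The function below is a minimal single-window Lucas-Kanade-style solver, included only as an illustration; the function name, window size, and gradient scheme are assumptions, not the method used in the talk's references:

```python
import numpy as np

def lucas_kanade(frame0, frame1, x, y, win=7):
    """Estimate the flow (u, v) of the pattern around pixel (x, y)
    between two grayscale frames by least-squares over the
    brightness-constancy equation Ix*u + Iy*v + It = 0."""
    half = win // 2
    Iy, Ix = np.gradient(frame0.astype(float))        # spatial gradients
    It = frame1.astype(float) - frame0.astype(float)  # temporal gradient
    sl = (slice(y - half, y + half + 1), slice(x - half, x + half + 1))
    A = np.stack([Ix[sl].ravel(), Iy[sl].ravel()], axis=1)
    b = -It[sl].ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v
```

For a textured pattern shifted roughly one pixel to the right between the two frames, the solver returns u close to 1 and v close to 0.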

  14. Optical Flow [11]

  15. Optical Flow: example frames at t0 and t1 [11]

  16. Optical Flow § Get an understanding of depth in images § Time-To-Contact between a camera and an object [11]

  17. Optical Flow: Time-To-Contact

  18. Optical Flow: Time-To-Contact

  19. Optical Flow: Time-To-Contact § FOE (Focus of Expansion): the image point the camera is heading toward, from which the flow field radiates

  20. Optical Flow: Time-To-Contact § Left, central, and right flow regions, each yielding its own estimate relative to the FOE: TTC_l, TTC_c, TTC_r
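
Under the assumption of pure forward translation, a feature's time-to-contact follows directly from its radial flow away from the FOE: TTC = r / (dr/dt). A minimal sketch (the function name and frame-based units are illustrative assumptions):

```python
def time_to_contact(px, py, foe_x, foe_y, u, v, eps=1e-9):
    """Time-to-contact (in frames) of a feature at (px, py) with
    flow (u, v), assuming pure forward motion: the flow radiates
    from the FOE, and TTC = r / (dr/dt)."""
    rx, ry = px - foe_x, py - foe_y           # offset from the FOE
    r = (rx * rx + ry * ry) ** 0.5
    radial = (u * rx + v * ry) / (r + eps)    # flow along the FOE ray
    return r / (radial + eps)
```

Note that TTC needs no metric depth: a feature twice as far from the FOE expanding twice as fast gives the same time-to-contact.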

  21. Obstacle Avoidance FSM [23]

  22. Inspired by biology

  23. Inspired by biology

  24. Inspired by biology § Maximum of optical flow
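
The biology-inspired balance strategy above can be sketched as a tiny controller that compares the flow magnitudes in the left and right halves of the image and steers away from the side with more flow. The thresholds and state names below are illustrative assumptions, not values from the talk:

```python
def steering_command(left_flow, right_flow, turn_thresh=0.2, stop_thresh=5.0):
    """Balance-strategy controller: steer away from the side with
    the larger optical flow; stop when the overall flow (looming)
    exceeds a safety threshold."""
    if max(left_flow, right_flow) > stop_thresh:
        return "STOP"                 # strong looming: obstacle ahead
    imbalance = left_flow - right_flow
    if imbalance > turn_thresh:
        return "TURN_RIGHT"           # more flow on the left side
    if imbalance < -turn_thresh:
        return "TURN_LEFT"            # more flow on the right side
    return "FORWARD"
```

This is essentially the corridor-centering behaviour observed in honeybees: balanced flow on both sides keeps the robot in the middle.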

  25. Optical Flow: Further applications § Applications for the visually impaired § Image stabilization § Video compression (MPEG) Drawbacks § Hard if the scene lacks texture § Dynamic scenes?

  26. The three navigation classes: Mapless, Map-Based, Map-Building

  27. Map-Based Navigation § Use a map of Paris to navigate to the Champs-Élysées [12]

  28. Map-Based Navigation: Robot Scenario [13]

  29. Map-Based Navigation: Map Representation § Topological Map: graph-based representation of features and their relations, often associated with actions. + simple and compact paths; - no absolute distances; - obstacle avoidance still needed § Metric Map: two-dimensional space in which objects and paths are placed. + very precise; - hard to obtain and to maintain

  30. Map-Based Navigation Example § Build a topological map of the floor § Use the topological map to navigate
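
Once a topological map exists, navigating with it reduces to graph search over places and their connections. A minimal sketch using breadth-first search; the place names are made up for illustration:

```python
from collections import deque

def shortest_route(edges, start, goal):
    """BFS over a topological map: nodes are places, edges are
    traversable connections. Returns the node sequence, or None
    if the goal is unreachable."""
    graph = {}
    for a, b in edges:                       # undirected adjacency list
        graph.setdefault(a, []).append(b)
        graph.setdefault(b, []).append(a)
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None
```

Because a topological map stores no absolute distances, the "shortest" route here means fewest hops between places, matching the map type's limitations listed above.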

  31. Feature Extraction § Feature: an element which can easily be re-observed and distinguished from the environment § Features should be: easily re-observable and distinguishable, plentiful in the environment, stationary

  32. Room Identification § Each room gets a signature, e.g. signature F for room F [14]

  33. Topological Map [14]

  34. Room Searching § Signature matching [P]
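
Signature matching can be sketched as a nearest-neighbour lookup over the stored room signatures (e.g. feature histograms). The L2 distance below is one plausible choice for illustration, not necessarily the measure used in the cited work:

```python
import numpy as np

def match_room(signature, room_signatures):
    """Return the room whose stored signature (e.g. a feature
    histogram) is closest to the observed one in L2 distance."""
    best, best_d = None, float("inf")
    for room, sig in room_signatures.items():
        d = np.linalg.norm(np.asarray(signature, float) - np.asarray(sig, float))
        if d < best_d:
            best, best_d = room, d
    return best
```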

  35. Drawbacks and Extensions § Learning and maintenance are expensive (what happens when a cupboard is removed?) § Use scanner tags or artificial beacons?

  36. The three navigation classes: Mapless, Map-Based, Map-Building [15]

  37. Map-Building Navigation § Leave your hotel in Paris, explore the environment, and return to the hotel afterwards [16]

  38. Map-Building Navigation § Goal: in an unknown environment, the robot can build a map and localize itself within it § Two application categories: § Structure from Motion (offline) § Simultaneous Localization and Mapping (SLAM) (real-time!)

  39. Structure from Motion (Offline) § Robot moves around and captures video frames § Frame-to-frame feature detection § 3D map and trajectory reconstruction § Pros: well studied; very accurate and robust solution § Cons: offline approach; a changing environment requires a new learning phase

  40. Simultaneous Localisation and Mapping (SLAM) § Build a map using dead reckoning and camera readings § We focus on EKF-SLAM (Extended Kalman Filter)

  41. [15]

  42. A map built with SLAM [15]

  43. Dead Reckoning § Motion estimation with data from odometry and heading sensors § Uncertainty grows with every prediction step away from the starting position
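
One dead-reckoning step can be sketched as integrating an odometry reading into the current pose. The simple unicycle-style update below is an illustrative assumption, not a model from the talk:

```python
import math

def dead_reckon(pose, distance, heading_change):
    """One dead-reckoning step: advance by `distance` along the
    current heading, then apply the heading change measured by
    the wheel/heading sensors. Pose is (x, y, theta)."""
    x, y, theta = pose
    x += distance * math.cos(theta)
    y += distance * math.sin(theta)
    theta += heading_change
    return (x, y, theta)
```

Since every step adds sensor noise that is never corrected, the position uncertainty grows without bound, which is exactly why SLAM fuses dead reckoning with camera observations.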

  44. Six steps of map-building (1/2) [17]

  45. Six steps of map-building (2/2) [17]

  46. EKF-SLAM: The system § The system is represented by: § a system state vector § a system covariance matrix

  47. EKF-SLAM: The state vector § x = (x_v, y_1, y_2, y_3, ...), where the robot state is x_v = (x_r, y_r, θ_r) and each landmark is y_i = (x_i, y_i)

  48. EKF-SLAM: The covariance matrix

  49. SLAM Process § Robot moves § PREDICTION of robot position § PREDICTION of observed features § Camera feature extraction § Match predicted and observed features § EKF Fusion § ESTIMATION of updated robot position

  50. Motion model § Estimate the robot's new position after a movement § Motion model: x_v = f_v(x̂_v, u), where x̂_v is the old estimated robot position and u the odometry reading
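
The prediction half of the EKF loop pushes the state through the motion model f_v and the covariance through its Jacobian. A generic sketch; the Jacobian F and process noise Q are passed in, since their concrete form depends on the chosen motion model:

```python
import numpy as np

def ekf_predict(x, P, f, u, F, Q):
    """EKF prediction step: propagate the state through the motion
    model, x <- f(x, u), and inflate the covariance, P <- F P F^T + Q,
    where F is the Jacobian of f and Q the process noise."""
    return f(x, u), F @ P @ F.T + Q
```

Note that the covariance only ever grows in this step; it is the later fusion with camera observations that shrinks it again.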

  51. SLAM Process § Robot moves § PREDICTION of robot position § PREDICTION of observed features § Camera feature extraction § Match predicted and observed features § EKF Fusion § ESTIMATION of updated robot position

  52. Measurement model § Based on the predicted robot position and the map, use a measurement model to predict which features should be in view now

  53. SLAM Process § Robot moves § PREDICTION of robot position § PREDICTION of observed features § Camera feature extraction § Match predicted and observed features § EKF Fusion § ESTIMATION of updated robot position

  54. Data matching § Match predicted and observed features

  55. SLAM Process § Robot moves § PREDICTION of robot position § PREDICTION of observed features § Camera feature extraction § Match predicted and observed features § EKF Fusion § ESTIMATION of updated robot position

  56. EKF Fusion § Residual between prediction and camera observation

  57. EKF Fusion

  58. EKF Update
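
The fusion and update steps can be sketched with the standard EKF equations: form the residual between what the camera observed and what the measurement model predicted, weight it by the Kalman gain, and correct state and covariance. The measurement model h and its Jacobian H are placeholders for the concrete feature model:

```python
import numpy as np

def ekf_update(x, P, z, h, H, R):
    """EKF update step: fuse a measurement z with the prediction.
    The residual is z - h(x); the Kalman gain K weights it by the
    relative confidence of prediction (P) and sensor (R)."""
    y = z - h(x)                      # residual (innovation)
    S = H @ P @ H.T + R               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new
```

With equal prediction and sensor uncertainty, the gain is 0.5 and the estimate lands halfway between prediction and measurement, while the covariance halves.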

  59. SLAM: Research topics § Robustness in changing environments § Multiple-robot mapping

  60. Motion estimation of agile cameras § Real-Time SLAM with a Single Camera (Andrew J. Davison, University of Oxford, 2003) § Parallel Tracking and Mapping for Small AR Workspaces (Georg Klein, David Murray, University of Oxford, 2007) [18]

  61. Motion estimation of agile cameras § No odometry data; fast and unpredictable movements § Use a constant-velocity model instead of odometry: x_v = (x, y, z, v_x, v_y, v_z, α, β, δ), i.e. position, velocity, and orientation

  62. [19]

  63. Motion estimation of agile cameras § Real-Time SLAM with a Single Camera (Andrew J. Davison, University of Oxford, 2003) § Parallel Tracking and Mapping for Small AR Workspaces (Georg Klein, David Murray, University of Oxford, 2007) [19]

  64. Tracking and Mapping for AR Workspaces [20]

  65. [21]

  66. What we have seen § What autonomous mobile robots are used for § How today's mobile robots navigate autonomously § mapless, map-based, map-building § The potential and the challenges of SLAM

  67. References (Papers)
  1. Bonin-Font, Francisco, Alberto Ortiz, and Gabriel Oliver. "Visual navigation for mobile robots: A survey." Journal of Intelligent and Robotic Systems 53.3 (2008): 263-296.
  2. Davison, Andrew J. "Real-time simultaneous localisation and mapping with a single camera." Proceedings of the 9th IEEE International Conference on Computer Vision, 2003.
  3. Klein, Georg, and David Murray. "Parallel tracking and mapping for small AR workspaces." Proceedings of the 6th IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR), 2007.
  4. Davison, Andrew J. "Sequential localisation and map-building for real-time computer vision and robotics." Robotics and Autonomous Systems 36 (2001): 171-183.
  5. Guzel, Mehmed Serdar, and Robert Bicker. "Optical Flow Based System Design for Mobile Robots." Robotics Automation and Mechatronics, 2010.
  6. Mata, M., J-M. Armingol, A. de la Escalera, and M.A. Salichs. "Using learned visual landmarks for intelligent topological navigation of mobile robots." 2003.
