TDDC17 Robotics/Perception I
Piotr Rudol, IDA/AIICS
• In case of problems use chat or microphone
• Feel free to interrupt if you have a question
• I will switch off my webcam during the lecture

Outline
• Introduction
• Camera model & Image formation
• Stereo-vision
• Example for UAV obstacle avoidance
• Optic flow
• Vision-based pose estimation
• Example for indoor UAVs
• Object recognition
• Example for outdoor UAVs
• Kinect: Structured Light
• Laser Range Finder

Definition of a Robot
• The word first appeared in the play R.U.R., published in 1920 by Karel Čapek – "robota" meaning "labor"
• A robot is a mechanical device which performs automated physical tasks
• The task can be carried out according to:
  – Direct human supervision (teleoperation)
  – Pre-defined program (industrial robots)
  – Set of general higher-level goals (using AI techniques)

Definition of a Robot
• Many consider the first robot in the modern sense to be a teleoperated boat, similar to a modern ROV (Remotely Operated Vehicle), devised by Nikola Tesla and demonstrated at an 1898 exhibition in Madison Square Garden.
• Based on his patent 613,809 for "teleautomation", Tesla hoped to develop the "wireless torpedo" into an automated weapon system for the US Navy.

Shakey
• Shakey was one of the first autonomous mobile robots, built at the SRI AI Center during 1966-1972.
• Many techniques present in Shakey's system are still under research today!
• http://www.ai.sri.com/shakey

Why do we use robots?
• The DDD rule: Dangerous, Dirty, and Dull
• Some usage areas:
  – Industrial
  – Military
  – Search and Rescue
  – Space Exploration
  – Research
  – Entertainment
  – ...

Why is (AI) robotics hard?
• Real-life robots have to operate in unknown, dynamic environments
• It is a multi-disciplinary domain – from mechanics to philosophy...
• It involves many practical problems:
  – it is very technical,
  – it takes a long time from an idea to a built system,
  – debugging can be difficult,
  – it is expensive.

Categories of robots
• Industrial robots:
  – Mostly stationary
  – Some mobile, used for transportation
• Mobile robots:
  – Ground robots (UGV) – legged, wheeled
  – Aerial (UAV) – rotor-craft, fixed-wing
  – (Under) water (AUV)
  – Space
• Humanoids:
  – Atlas, Nao, Asimo

Categories of robots: Industrial robots

Categories of robots: Mobile robots (legged)
• BigDog, 2009 – Boston Dynamics: https://www.youtube.com/watch?v=3gi6Ohnp9x8

Categories of robots: Mobile robots (legged)
• SpotMini, 2018 – Boston Dynamics: https://www.youtube.com/watch?v=aFuA50H9uek

Categories of robots: Humanoids
• Monty, 2007 – Anybots Inc.: https://www.youtube.com/watch?v=23kCn51FW3A
• Atlas, 2019 – Boston Dynamics: https://www.youtube.com/watch?v=_sBBaNYex3E

Anatomy of Robots
Robots consist of:
• Motion mechanism – Wheels, belts, legs, propellers
• Manipulators – Arms, grippers
• Computer systems – Microcontrollers, embedded computers
• Sensors

Perception
In order for robots to act in the environment, a suite of sensors is necessary:
• Active
  – Emit energy (sound/light) and measure how much of it comes back and/or with how large a delay.
• Passive
  – Just observers, measuring energy "emitted" by the environment.

Proprioceptive Sensors
Inform the robot about its internal state:
• Shaft encoders
  – Odometry (a measurement of traveled distance)
  – Positions of arm joints
• Inertial sensors
  – Gyroscopes (attitude angles: speed of rotation)
  – Accelerometers
• Magnetic
  – Compass
• Force sensors
  – Torque measurement (how hard is the grip, how heavy is the object)

Position Sensors
Measure placement of a robot in its environment:
• Tactile sensors (whiskers, bumpers, etc.)
• Sonar (ultrasonic transducer)
• Laser range finder
• Radar
• (D)-GPS, A-GPS, RTK
• Motion capture system

Imaging Sensors (Cameras)
Deliver images which can be used by computer vision algorithms to sense different types of stimuli:
• Output data
  – Color
  – Black-and-white (thermal)
  – Dynamic Vision Sensor (DVS)
• Type of sensor (exposure)
  – CCD
  – CMOS
• Configuration
  – Monocular
  – Stereo
  – Omnidirectional
  – Stereo-omnidirectional

CMOS problem
• Image examples – how cameras glitch with photos of propellers:
  http://petapixel.com/2015/11/14/this-is-how-cameras-glitch-with-photos-of-propellers/

Outline (recap) – next: Camera model & Image formation

Image composition
• I(x, y, t) is the intensity at (x, y) at time t
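As a concrete illustration of the I(x, y, t) notation, here is a minimal Python/OpenCV sketch (the video file name and pixel coordinates are made-up placeholders) that treats a video as a stack of grayscale frames and samples the intensity at one pixel over time:

```python
import cv2

# Treat a video as I(x, y, t): a grayscale intensity per pixel and frame.
cap = cv2.VideoCapture("video.mp4")   # placeholder file name
x, y, t = 120, 80, 0                  # example pixel coordinates, frame counter

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    print(f"I({x}, {y}, {t}) = {gray[y, x]}")   # NumPy indexes as [row, col] = [y, x]
    t += 1

cap.release()
```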

Pinhole Camera Model
• P is a point in the scene, with coordinates (X, Y, Z)
• Pʹ is its image on the image plane, with coordinates (x, y)
• x = fX/Z, y = fY/Z – perspective projection by similar triangles (illustrated in the code sketch below)
• Scale/distance is indeterminate!

Lens Distortion
• Happens when light passes through a lens on its way to the CCD element.
• Lens distortion is especially visible for wide-angle lenses and close to the edges of the image.

Omnidirectional Lens

Camera Calibration
Estimate:
• camera constant f,
• image principal point (where the optical axis intersects the image plane),
• lens distortion coefficients,
• pixel size,
from different views of a calibration object.
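To make the projection equations concrete, here is a minimal NumPy sketch of the ideal pinhole model, x = fX/Z and y = fY/Z (the camera constant and scene point below are made-up example values). It also shows why scale/distance is indeterminate: scaling the scene point by any factor leaves the image coordinates unchanged.

```python
import numpy as np

def project_pinhole(P, f):
    """Project scene point P = (X, Y, Z) onto the image plane of an
    ideal pinhole camera with camera constant f (same length unit as P)."""
    X, Y, Z = P
    if Z <= 0:
        raise ValueError("point must lie in front of the camera (Z > 0)")
    return np.array([f * X / Z, f * Y / Z])   # perspective projection by similar triangles

f = 0.008                       # 8 mm camera constant (example value)
P = np.array([0.5, 0.2, 4.0])   # scene point in metres (example value)

print(project_pinhole(P, f))        # image of P
print(project_pinhole(2 * P, f))    # identical image: scale/distance is indeterminate
```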

Lens Distortion 2

Why is computer vision hard?
• Noise and lighting variations disturb images significantly.
• Color perception is difficult.
• In pattern recognition, objects change appearance depending on their pose (occlusions).
• Image understanding involves cognitive capabilities (i.e. AI).
• Real-time requirements + huge amounts of data.
• ...

Outline (recap) – next: Stereo-vision

Stereo vision
• A scene point P(X, Y, Z) is imaged by a left and a right camera with optical centers O_l and O_r, separated by the baseline B; both cameras have focal length f.
• P projects to p_l(x_l, y_l) in the left image plane and p_r(x_r, y_r) in the right image plane.
• Disparity: dx = x_r − x_l; depth: Z = B·f / dx.
• Close objects give a large disparity, far objects a small one.
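A small sketch of the depth-from-disparity relation Z = B·f / dx. The pixel focal length and the example coordinates are assumptions for illustration; the 1 m baseline matches the UAV stereo rig mentioned later in the lecture.

```python
def depth_from_disparity(x_left, x_right, f_pixels, baseline_m):
    """Depth of a point from its horizontal pixel coordinates in a
    rectified stereo pair: Z = B * f / dx, where dx is the disparity."""
    dx = abs(x_right - x_left)        # disparity magnitude in pixels
    if dx == 0:
        return float("inf")           # zero disparity: point at infinity
    return baseline_m * f_pixels / dx

# Example values (assumed): 700-pixel focal length, 1 m baseline, and a feature
# seen at x_l = 420 in the left image and x_r = 385 in the right image.
print(depth_from_disparity(420, 385, f_pixels=700.0, baseline_m=1.0))  # -> 20.0 m
```

A feature with twice that disparity would come out at half the depth, matching the rule on the next slides that larger disparity means a closer object.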

Stereo vision
• A scene is photographed by two cameras: what do we gain?
• CMU CIL Stereo Dataset: Castle sequence

Stereo processing
To determine depth from stereo disparity:
1. Extract the "features" from the left and right images.
2. For each feature in the left image, find the corresponding feature in the right image.
3. Measure the disparity between the two images of the feature.
4. Use the disparity to compute the 3D location of the feature.
(A minimal version of these steps is sketched in the code below.)

Disparity
• Example disparities in the image pair: 2.13 cm and 2.66 cm
• The larger the disparity is, the closer the object is to the cameras.
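The four processing steps above are essentially what off-the-shelf dense stereo matchers implement. Below is a minimal sketch using OpenCV's block matcher; the image file names are placeholders, the images are assumed to be a rectified grayscale pair, and the calibration values are the same assumed ones as in the earlier depth example.

```python
import cv2
import numpy as np

# Rectified grayscale stereo pair (placeholder file names).
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Steps 1-3: match blocks between the images and measure their disparity.
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # StereoBM outputs fixed-point values

# Step 4: disparity -> depth, Z = B * f / dx (assumed calibration values).
f_pixels, baseline_m = 700.0, 1.0
valid = disparity > 0
depth = np.zeros_like(disparity)
depth[valid] = baseline_m * f_pixels / disparity[valid]
```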

Stereo Vision for UAVs
• Stereo pair with a 1 m baseline – cross-shaft for stiffness

Stereo Vision for UAVs at LiU
• Yamaha RMAX

Outline (recap) – next: Optic flow

Optic Flow
• Optical flow methods try to calculate the motion between two image frames taken at times t and t + δt at every pixel position. In practice, it is often done for features instead of for all pixels (a sparse example is sketched below).

Optic flow we all use
• Surface, side view (illustration)
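As a concrete example of the feature-based variant mentioned above, here is a minimal sketch using OpenCV's pyramidal Lucas-Kanade tracker (a standard sparse optical-flow method; the frame file names are placeholders and the parameter values are just reasonable defaults):

```python
import cv2

# Two consecutive grayscale frames taken at t and t + delta_t (placeholder names).
prev = cv2.imread("frame_t.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_t_plus_dt.png", cv2.IMREAD_GRAYSCALE)

# Track features instead of computing flow at every pixel.
p0 = cv2.goodFeaturesToTrack(prev, maxCorners=200, qualityLevel=0.01, minDistance=7)

# Pyramidal Lucas-Kanade: where did each feature move between the frames?
p1, status, err = cv2.calcOpticalFlowPyrLK(prev, curr, p0, None)

# Flow vectors for the successfully tracked features.
good = status.ravel() == 1
flow = (p1[good] - p0[good]).reshape(-1, 2)
print(f"tracked {int(good.sum())} features, mean flow = {flow.mean(axis=0)}")
```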

Optic flow we all use
• Surface, top view (illustration)

Optic Flow
• Optical mouse as Arduino web camera: https://www.youtube.com/watch?v=Ix6532mrIKA
