
TDDC17 Robotics/Perception I

Piotr Rudol, IDA/AIICS

1

  • In case of problems use chat or microphone
  • Feel free to interrupt if you have a question
  • I will switch off my webcam during the lecture

2

Outline

  • Introduction
  • Camera model & Image formation
  • Stereo-vision
  • Example for UAV obstacle avoidance
  • Optic flow
  • Vision-based pose estimation
  • Example for indoor UAVs
  • Object recognition
  • Example for outdoor UAVs
  • Kinect: Structured Light
  • Laser Range Finder

3


Definition of a Robot

4

  • The word first appeared in the play R.U.R., published in 1920 by Karel Čapek; "robota" means "labor"
  • A robot is a mechanical device which performs automated physical tasks
  • The task can be carried out according to:
    – Direct human supervision (teleoperation)
    – A pre-defined program (industrial robots)
    – A set of general higher-level goals (using AI techniques)


Definition of a Robot

5

Many consider the first robot in the modern sense to be a teleoperated boat, similar to a modern ROV (Remotely Operated Vehicle), devised by Nikola Tesla and demonstrated at an 1898 exhibition in Madison Square Garden. Based on his patent 613,809 for "teleautomation", Tesla hoped to develop the "wireless torpedo" into an automated weapon system for the US Navy.


Shakey

6

Shakey was one of the first autonomous mobile robots, built at the SRI AI Center during 1966-1972. Many techniques present in Shakey’s system are still under research today!


http://www.ai.sri.com/shakey

7


Why do we use robots?

8

  • The DDD rule: Dangerous, Dirty, and Dull
  • Some usage areas:
    – Industrial
    – Military
    – Search and Rescue
    – Space Exploration
    – Research
    – Entertainment
    – ...


Why is (AI) robotics hard?

9

  • Real-life robots have to operate in unknown, dynamic environments
  • It is a multi-disciplinary domain – from mechanics to philosophy...
  • It involves many practical problems:
    – it is very technical,
    – it takes a long time from an idea to a built system,
    – debugging can be difficult,
    – it is expensive.


Categories of robots

10

  • Industrial robots:
    – Mostly stationary
    – Some mobile, used for transportation
  • Mobile robots:
    – Ground – legged, wheeled (UGV)
    – Aerial – rotor-craft, fixed-wing (UAV)
    – (Under) water (AUV)
    – Space
  • Humanoids:
    – Atlas, Nao, Asimo


Categories of robots

11

  • Industrial robots:


Categories of robots

12

  • Mobile robots: legged

BostonDynamics: https://www.youtube.com/watch?v=3gi6Ohnp9x8

BigDog, 2009


Categories of robots

13

  • Mobile robots: legged

BostonDynamics: https://www.youtube.com/watch?v=aFuA50H9uek

SpotMini, 2018


Categories of robots

14

  • Humanoids

Anybots Inc.: https://www.youtube.com/watch?v=23kCn51FW3A

Monty, 2007


Categories of robots

15

  • Humanoids

BostonDynamics: https://www.youtube.com/watch?v=_sBBaNYex3E

Atlas, 2019


Anatomy of Robots

16

Robots consist of:

  • Motion mechanism

– Wheels, belts, legs, propellers

  • Manipulators

– Arms, grippers

  • Computer systems

– Microcontrollers, embedded computers

  • Sensors


Perception

17

In order for robots to act in the environment, a suite of sensors is necessary:

  • Active

– Emit energy (sound/light) and measure how much of it comes back and/or with what delay.

  • Passive

– Just observers, measuring energy “emitted” by the environment.


Proprioceptive Sensors

18

Inform the robot about its internal state:

  • Shaft encoders
    – Odometry (a measurement of traveled distance; sketched in code below)
    – Positions of arm joints
  • Inertial sensors
    – Gyroscope (rate of rotation; attitude angles)
    – Accelerometers
  • Magnetic
    – Compass
  • Force sensors
    – Torque measurement (how hard is the grip, how heavy is the object)
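As an illustration of the odometry item, here is a minimal sketch of wheel odometry for a differential-drive robot; the encoder resolution, wheel radius, and wheel base are made-up example values, not those of any robot from this course.

    import math

    TICKS_PER_REV = 2048      # assumed encoder resolution
    WHEEL_RADIUS = 0.05       # assumed wheel radius [m]
    WHEEL_BASE = 0.30         # assumed distance between the wheels [m]

    def odometry_step(x, y, theta, ticks_left, ticks_right):
        """Advance the pose estimate (x, y, theta) by one encoder reading."""
        d_left = 2 * math.pi * WHEEL_RADIUS * ticks_left / TICKS_PER_REV
        d_right = 2 * math.pi * WHEEL_RADIUS * ticks_right / TICKS_PER_REV
        d = (d_left + d_right) / 2.0              # distance traveled by the center
        dtheta = (d_right - d_left) / WHEEL_BASE  # change in heading
        x += d * math.cos(theta + dtheta / 2.0)
        y += d * math.sin(theta + dtheta / 2.0)
        return x, y, theta + dtheta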


Position Sensors

19

Measure the placement of a robot in its environment.

  • Tactile sensors (whiskers, bumpers, etc.)
  • Sonar (ultrasonic transducer)
  • Laser range finder
  • Radar
  • (D)-GPS, A-GPS, RTK
  • Motion Capture System


Imaging Sensors (Cameras)

20

Deliver images which can be used by computer vision algorithms to sense different types of stimuli.

  • Output data
    – Color
    – Black & white (thermal)
    – Dynamic Vision Sensor (DVS)
  • Configuration
    – Monocular
    – Stereo
    – Omnidirectional
    – Stereo-omnidirectional
  • Type of sensor (exposure)
    – CCD
    – CMOS


CMOS problem

21


CMOS problem

22

http://petapixel.com/2015/11/14/this-is-how-cameras-glitch-with-photos-of-propellers/


Image composition

24

I(x, y, t) is the intensity at (x, y) at time t
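In code, a grayscale image sequence is exactly such a function sampled on a pixel grid; a minimal illustration (the array shape and indices are made-up example values):

    import numpy as np

    # A 100-frame grayscale video as an array indexed by (t, y, x).
    video = np.zeros((100, 480, 640), dtype=np.uint8)
    t, x, y = 10, 320, 240
    intensity = video[t, y, x]   # I(x, y, t)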


Pinhole Camera Model

P is a point in the scene, with coordinates (X, Y, Z); P′ is its image on the image plane, with coordinates (x, y). By similar triangles, perspective projection gives:

x = fX/Z, y = fY/Z

Scale/distance is indeterminate!
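A minimal sketch of this projection in NumPy; the point coordinates and focal length are made-up example values, and the last line illustrates the scale ambiguity:

    import numpy as np

    def project(P, f):
        """Pinhole projection of a scene point P = (X, Y, Z) to image coordinates."""
        X, Y, Z = P
        return np.array([f * X / Z, f * Y / Z])   # x = fX/Z, y = fY/Z

    P = np.array([2.0, 1.0, 10.0])   # a point 10 m in front of the camera
    print(project(P, f=0.05))        # [0.01  0.005]
    print(project(2 * P, f=0.05))    # identical image point: scale is indeterminate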

25


Lens Distortion

26

Happens when light passes through a lens on its way to the CCD element. Lens distortion is especially visible with wide-angle lenses and close to the edges of the image.


Omnidirectional Lens

27


Camera Calibration

28

Estimate:

  • camera constant f,
  • image principal point (where the optical axis intersects the image plane),
  • lens distortion coefficients,
  • pixel size,

from different views of a calibration object (sketched in code below)
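A minimal calibration sketch using OpenCV's standard chessboard pipeline; the file-name pattern and the 9×6 board size are assumptions for illustration:

    import glob
    import cv2
    import numpy as np

    pattern = (9, 6)   # inner corners of the chessboard calibration object
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

    obj_points, img_points = [], []
    for fname in glob.glob("calib_*.png"):   # different views of the object
        gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
        found, corners = cv2.findChessboardCorners(gray, pattern)
        if found:
            obj_points.append(objp)
            img_points.append(corners)

    # K holds the focal length and principal point (in pixels);
    # dist holds the lens distortion coefficients.
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, gray.shape[::-1], None, None)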



Lens Distortion 2

29


Why is computer vision hard?

30

  • Noise and lighting variations disturb images significantly.
  • Color perception is difficult.
  • In pattern recognition, objects change appearance depending on their pose (occlusions).
  • Image understanding involves cognitive capabilities (i.e. AI).
  • Real-time requirements + huge amounts of data.
  • ...


Stereo vision

32

(Figure: stereo geometry. A scene point P(X,Y,Z) projects to pl(xl,yl) on the left camera's image plane and to pr(xr,yr) on the right camera's; the optical centers Ol and Or are separated by the baseline B, and both cameras have focal length f.)

Disparity: dx = xl – xr
Depth: Z = f·B / dx

(Left image vs. right image: close points shift more between the views than far points.)


Stereo vision

A scene is photographed by two cameras: what do we gain?

CMU CIL Stereo Dataset: Castle sequence

33


Disparity

35

(Figure: two measured disparities, 2.13 cm and 2.66 cm.) The larger the disparity, the closer the object is to the cameras.


Stereo processing

36

To determine depth from stereo disparity (sketched in code below):

  1. Extract "features" from the left and right images.
  2. For each feature in the left image, find the corresponding feature in the right image.
  3. Measure the disparity between the two images of the feature.
  4. Use the disparity to compute the 3D location of the feature.
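A minimal sketch of this pipeline with OpenCV's block matcher; in this dense, correlation-based variant the "features" are small image blocks, and the file names, focal length, and baseline are made-up example values:

    import cv2
    import numpy as np

    left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
    right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

    # Steps 1-3: match blocks between the images and measure disparity.
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # fixed-point output

    # Step 4: depth from disparity, Z = f*B/dx (valid where disparity > 0).
    f_px, B = 700.0, 0.12   # assumed focal length [px] and baseline [m]
    with np.errstate(divide="ignore"):
        depth = f_px * B / disparity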


Stereo Vision for UAVs

37

Stereo pair with a 1 m baseline – cross-shaft for stiffness


Stereo Vision for UAVs

38

Stereo Vision for UAVs at LiU

Yamaha RMAX

39


Optic Flow

Optical flow methods try to calculate the motion between two image frames taken at times t and t + δt, at every pixel position. In practice, it is often computed for features instead of all pixels, as in the sketch below.
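A minimal sparse optic-flow sketch (Lucas–Kanade on tracked corners) with OpenCV; the frame file names are placeholders:

    import cv2

    prev = cv2.imread("frame0.png", cv2.IMREAD_GRAYSCALE)
    curr = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)

    # Pick good features to track, then find where they moved in the next frame.
    p0 = cv2.goodFeaturesToTrack(prev, maxCorners=200, qualityLevel=0.01,
                                 minDistance=7)
    p1, status, err = cv2.calcOpticalFlowPyrLK(prev, curr, p0, None)

    flow = (p1 - p0)[status.ravel() == 1]   # one motion vector (dx, dy) per feature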

42


Optic Flow

43


Optic flow we all use

44

(Figure: surface, side view.)

Optic flow we all use

45

https://www.youtube.com/watch?v=Ix6532mrIKA

Optical Mouse as Arduino Web Camera


Optic Flow

(Figure: top view.)

46


Optic Flow

47


Optic Flow

48


Obstacle Avoidance Example

49

  • "Combined Optic-Flow and Stereo-Based Navigation of Urban Canyons for a UAV" [Hrabar05]
    – An optic-flow-based technique tested on the USC autonomous helicopter Avatar to navigate urban canyons.


USC Avatar in Urban Canyon

50


USC Avatar in Urban Canyon cont’d

Urban Search and Rescue training site. The helicopter was flown between a tall tower and a railway carriage to see if it could navigate this 'canyon' by balancing the flows (sketched below).

Result: A single successful flight was made between the tower and the railway carriage. The other flights were aborted as the helicopter was blown dangerously close to the obstacles.
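A hedged sketch of the "balancing the flows" idea; this is a simplification for illustration, not the controller from [Hrabar05], and the gain is a made-up value. The robot steers away from the side with the larger average flow magnitude, since closer obstacles produce faster image motion:

    import numpy as np

    def yaw_command(flow, positions, image_width, gain=0.5):
        """flow: (N, 2) per-feature motion vectors; positions: (N, 2) pixel coords."""
        mags = np.linalg.norm(flow, axis=1)
        left = mags[positions[:, 0] < image_width / 2].mean()
        right = mags[positions[:, 0] >= image_width / 2].mean()
        # Positive output turns toward the side with less flow (more clearance).
        return gain * (left - right)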

51


USC Avatar in Urban Canyon cont’d

52

Open field lined by tall trees on one side. The helicopter was set off on a path parallel to the row of trees at the first site to see if the resultant flow would turn it away from the trees.

Result: The optic-flow-based control turned it away from the trees successfully 5/8 times. Although it failed to turn away from the trees on occasion, it never turned towards the trees.


Pose Estimation

54

  • It is not always possible to directly measure a robot's pose in an environment (e.g. no GPS indoors). For a robot to navigate in a reasonable way, some sort of position and attitude information is necessary. This is the goal of pose estimation algorithms.
  • Example: ARToolkit [Hirokazu Kato et al.]
  • The ARToolKit (Plus) video tracking libraries calculate the real camera position and orientation relative to physical markers in real time.


Pose Estimation

55


ARToolkit

56


Pose Estimation

57


Pose Estimation for Indoor Flying Robots

58

Motivation:

  • Indoor-flying robots cannot use the GPS signal to measure their position in the environment.
  • Controlling a flying robot requires a fast update rate of the readings.
  • The operating range should be as large as possible in order to be able to fly – not just hover.
  • Micro-scale robots have very little computational power on board.


LinkMAV

59


Ground robot

60


Ground robot

61

(Figure: the LinkMAV and a mobile ground robot with a camera on a pan/tilt unit. Find: R, T.)


The pattern

62

The cube-shaped structure consists of four faces; only one is required for pose estimation. There is a high-intensity LED in each corner; three colors (RGB) uniquely code the four patterns. During the flight, at least one (and at most two) of the faces is visible to the ground robot.


Image processing

63

  • The four identified diodes from one face of the cube go through the "Robust Pose Estimation from a Planar Target" algorithm to extract the pose of the face (see the sketch below).
  • The video camera operates with a closed shutter for easy and fast diode identification.
  • Knowing which face is visible and the angles of the PTU, the complete pose of the UAV relative to the ground robot can be calculated.
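The system above uses the "Robust Pose Estimation from a Planar Target" algorithm; as a stand-in, the same kind of computation can be sketched with OpenCV's generic planar PnP solver. All coordinates, the face size, and the camera matrix K are made-up example values:

    import cv2
    import numpy as np

    # Corner LEDs of one (assumed 0.2 m) cube face, in the face's own frame [m].
    object_pts = np.array([[0, 0, 0], [0.2, 0, 0], [0.2, 0.2, 0], [0, 0.2, 0]],
                          dtype=np.float32)
    # The same four diodes as detected in the image [px].
    image_pts = np.array([[310, 240], [400, 238], [402, 330], [308, 332]],
                         dtype=np.float32)

    K = np.array([[700, 0, 320], [0, 700, 240], [0, 0, 1]], dtype=np.float32)
    ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, None,
                                  flags=cv2.SOLVEPNP_IPPE)  # planar-target variant
    R, _ = cv2.Rodrigues(rvec)   # rotation R and translation tvec give the face pose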


Control

64

  • Four PID loops are used to control the lateral displacement, yaw, and altitude of the UAV (a sketch of one loop follows).
  • The control signal (desired attitude + altitude) is sent up to the UAV to be used in the inner loop onboard.
  • The pan/tilt unit of the camera mounted on the ground robot is controlled by a simple P controller, which tries to keep the diodes in the center of the image.
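A minimal sketch of one such loop (e.g. altitude), assuming a fixed control period; the gains are made-up example values, not those used on the real system:

    class PID:
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, setpoint, measurement):
            error = setpoint - measurement
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    altitude_pid = PID(kp=0.8, ki=0.1, kd=0.3, dt=0.02)   # assumed 50 Hz update
    thrust_cmd = altitude_pid.update(setpoint=1.5, measurement=1.42)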


Object recognition

69


Object recognition

70

  • Cascade of boosted classifiers working with Haar-like features (a minimal detection sketch follows)
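A minimal detection sketch using OpenCV's cascade classifier with one of its pretrained frontal-face cascades; the slide's UAV detectors would load their own trained cascade file instead, and the image name is a placeholder:

    import cv2

    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.imread("scene.png", cv2.IMREAD_GRAYSCALE)

    # A window is slid over the image at multiple scales; the early stages of
    # the boosted cascade cheaply reject most windows.
    for (x, y, w, h) in cascade.detectMultiScale(gray, scaleFactor=1.1,
                                                 minNeighbors=4):
        print("detection at", (x, y), "size", (w, h))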


Object recognition

71


Human body detection with UAVs

72


DJI Matrice 100



Structured Light

79


Kinect

83

(Figure: the Kinect's IR illuminator, color camera, and IR detector.)


Kinect

85


Structured Light

86


Shadows

88

(Figure: an obstacle between the IR illuminator and the IR detector casts a shadow in the projected pattern.)


Shadows

89


Application - FaceID

The Verge, Apple

90


Application - LiDAR

The Verge, Apple

91


Application - LiDAR - AR

(Figure: the IKEA Place app placing a lamp; top view.)

92


Kinect and UAVs

93


Laser Range Finder – 1D

95

(Figure: emitter and detector.)
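A toy calculation for a pulsed time-of-flight rangefinder, one common 1D design (phase-shift devices also exist): the pulse travels to the target and back, so range = c·Δt/2. The delay below is a made-up value:

    C = 299_792_458.0   # speed of light [m/s]

    def range_from_delay(delay_s):
        """Range from the round-trip delay of a laser pulse."""
        return C * delay_s / 2.0

    print(range_from_delay(66.7e-9))   # ~10 m for a 66.7 ns round trip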


Laser Range Finder - Sweeping

96

(Figure: a rotating mirror sweeps the beam; emitter and detector.)


Hokuyo URG-04LX

97


Laser Range Finder - Sweeping

98

(Figure: LiDAR scan, top view.)


Examples of using a LiDAR for robot localisation and navigation tomorrow!

99


Laser Range Finder

100

(Figure: LiDAR with a mirror, top view.)


3D Laser Scanner On a UAV

101

(Figures: a Yamaha RMAX carrying a Sick LMS 291, and a DJI Matrice 600 Pro carrying a Velodyne Puck.)


Outline

  • Introduction
  • Camera model & Image formation
  • Stereo-vision
  • Example for UAV obstacle avoidance
  • Optic flow
  • Vision-based pose estimation
  • Example for indoor UAVs
  • Object recognition
  • Example for outdoor UAVs
  • Kinect: Structured Light
  • Laser Range Finder
  • Next lecture: Localisation, planning, other hardware...

105