slide-1
SLIDE 1

iCub

a shared platform for research in robotics & AI

Genoa

June 25, 2015

Giorgio Metta & the iCub team

Istituto Italiano di Tecnologia Via Morego, 30 - Genoa, Italy

slide-2
SLIDE 2

we have a dream

6/25/2015 3

slide-3
SLIDE 3

the iCub

6/25/2015 4

price: €250K; 30 iCubs distributed since 2008, about 3-4 iCubs/year

slide-4
SLIDE 4

why is the iCub special?

6/25/2015 5

  • hands: we started the design from the hands

– 5 fingers, 9 degrees of freedom, 19 joints

  • sensors: human-like, e.g. no lasers

– cameras, microphones, gyros, encoders, force, tactile…

  • electronics: flexibility for research

– custom electronics, small, programmable (DSPs)

  • reproducible platform: community designed

– reproducible & maintainable yet evolvable platform
– large software repository (~2M lines of code)

slide-5
SLIDE 5

why humanoids?

6/25/2015 6

  • scientific reasons

– e.g. elephants don’t play chess

  • natural human-robot interaction
  • challenging mechatronics
  • fun!
slide-6
SLIDE 6

why open source?

6/25/2015 7

  • repeatable experiments
  • benchmarking
  • quality

this resonates with industry-grade R&D in robotics

slide-7
SLIDE 7
  • open source

6/25/2015 8

slide-8
SLIDE 8

6/25/2015 9

slide-9
SLIDE 9

series-elastic actuators

6/25/2015 10

  • C spring design
  • 320Nm/rad stiffness
  • features:

– stiffness by design
– no preloading necessary (easy assembly)
– requires only 4 custom mechanical parts
– high-resolution encoder for torque sensing
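The torque-sensing idea above can be sketched directly: with the spring stiffness fixed by the C-spring design (320 Nm/rad here), joint torque follows from the encoder-measured deflection across the spring. A minimal sketch; the constant and function names are illustrative, not from the iCub firmware.

```python
SPRING_STIFFNESS = 320.0  # Nm/rad, fixed by the C-spring geometry

def torque_from_deflection(theta_motor_rad, theta_joint_rad):
    """Joint torque read off the spring deflection measured by the encoder."""
    return SPRING_STIFFNESS * (theta_motor_rad - theta_joint_rad)
```

This is why the high-resolution encoder matters: torque resolution is stiffness times angular resolution, so at 320 Nm/rad a deflection of 0.01 rad already corresponds to 3.2 Nm.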

slide-10
SLIDE 10

Yet Another Robot Platform

6/25/2015 11

  • YARP is an open-source (LGPL)

middleware for humanoid robotics

  • history

– an MIT / Univ. of Genoa collaboration
– born on Kismet, grew on COG, under QNX
– with a major overhaul, now used by the iCub project

  • C++ source code (some 400K lines)
  • IPC & hardware interface
  • portable across OSs and development

platforms
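YARP's core abstraction is named ports that processes connect by name. The toy sketch below mimics that publish/connect model in-process to illustrate the idea; it is NOT the YARP API (real YARP ports cross process and machine boundaries, and class names here are invented for illustration).

```python
class Network:
    """In-process stand-in for a middleware name server (not real YARP)."""
    _ports = {}

    @classmethod
    def register(cls, port):
        cls._ports[port.name] = port

    @classmethod
    def connect(cls, src_name, dst_name):
        """Wire a writer port to a reader port by name."""
        cls._ports[src_name].subscribers.append(cls._ports[dst_name])


class Port:
    """A named endpoint: write() fans out to every connected reader."""

    def __init__(self, name):
        self.name = name
        self.subscribers = []
        self.inbox = []
        Network.register(self)

    def write(self, msg):
        for reader in self.subscribers:
            reader.inbox.append(msg)

    def read(self):
        return self.inbox.pop(0) if self.inbox else None
```

The point of the name-based rendezvous is decoupling: a camera grabber and a viewer never reference each other, only port names, so modules can be swapped or moved across machines without recompiling.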


slide-11
SLIDE 11

exploit diversity: portability

6/25/2015 12

  • operating system portability:

– Adaptive Communication Environment, C++ OS wrapper: e.g. threads, semaphores, sockets

  • development environment portability:

– CMake

  • language portability:

– via Swig: Java (Matlab), Perl, Python, C#


slide-12
SLIDE 12

6/25/2015 13

wiki & manual, SVN & GIT, part lists, drawings

slide-13
SLIDE 13

iCub sensors

6/25/2015 14

slide-14
SLIDE 14

torque control

6/25/2015 15


slide-15
SLIDE 15

learning dynamics

6/25/2015 16

  • learning body dynamics

– compute external forces – implement compliant control

  • so far we did it starting from e.g. the CAD models

– but we’d like to avoid it
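The two bullets above can be made concrete: with a dynamics model (CAD-derived or learned) predicting the internal torques, the external torque is simply the residual, and a toy admittance rule then yields compliant behavior. Function names and the gain below are illustrative, not the iCub controller's.

```python
import numpy as np

def external_torque(tau_measured, tau_model):
    """External joint torques as the residual: measured minus model-predicted."""
    return np.asarray(tau_measured) - np.asarray(tau_model)

def compliant_command(tau_ext, compliance_gain=0.1):
    """Toy admittance rule: command joint motion along the external push."""
    return compliance_gain * np.asarray(tau_ext)
```

The better the dynamics model, the cleaner the residual isolates the human's push, which is why learning the dynamics (next slides) pays off directly in compliance quality.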

slide-16
SLIDE 16
  • our method in 4 easy steps

6/25/2015 17

  • regularized least squares:

$$\min_w J(w) = \tfrac{1}{2}\,\|y - Xw\|^2 + \tfrac{\lambda}{2}\,\|w\|^2 \;\Rightarrow\; w = (X^T X + \lambda I)^{-1} X^T y$$

  • kernelized:

$$f(x) = \sum_{i=1}^{m} c_i\, k(x_i, x), \qquad c = (K + \lambda I)^{-1} y$$

  • approximate kernel with random features:

$$k(x, x') = \mathbb{E}_{\omega}\!\left[z_{\omega}(x)^T z_{\omega}(x')\right] \approx \frac{1}{D} \sum_{d=1}^{D} z_{\omega_d}(x)^T z_{\omega_d}(x'), \qquad z_{\omega}(x) = [\cos(\omega^T x),\, \sin(\omega^T x)]^T$$

so the model becomes linear again: $f(x) = w^T z(x)$ with $w = (Z^T Z + \lambda I)^{-1} Z^T y$

  • make it incremental: maintain $(Z^T Z + \lambda I)$ with a Cholesky rank-1 update per sample

slide-17
SLIDE 17

properties

  • O(1) update complexity w.r.t. # training samples
  • exact batch solution after each update
  • dimensionality of feature mapping trades computation for approximation accuracy
  • O(D²) time and space complexity per update w.r.t. dimensionality of feature mapping
  • easy to understand/implement (few lines of code)
  • not exclusively for dynamics/robotics learning!
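The four steps can be sketched in a few lines, assuming random Fourier features for a Gaussian kernel. Class and parameter names are illustrative, not from the iCub software; for clarity this version re-solves the linear system at each update, whereas a Cholesky rank-1 update would give the O(D²) per-update cost stated above.

```python
import numpy as np

class IncrementalRFRR:
    """Random-feature ridge regression with incremental updates (sketch)."""

    def __init__(self, dim, D=200, sigma=1.0, lam=1e-6, rng=None):
        rng = np.random.default_rng(rng)
        # random frequencies approximating a Gaussian kernel of width sigma
        self.W = rng.normal(scale=1.0 / sigma, size=(D, dim))
        self.A = lam * np.eye(2 * D)   # accumulates Z^T Z + lam * I
        self.b = np.zeros(2 * D)       # accumulates Z^T y
        self.w = np.zeros(2 * D)

    def _z(self, x):
        """Random Fourier feature map z(x) = [cos(Wx), sin(Wx)] / sqrt(D)."""
        p = self.W @ x
        return np.concatenate([np.cos(p), np.sin(p)]) / np.sqrt(len(self.W))

    def update(self, x, y):
        z = self._z(x)
        self.A += np.outer(z, z)       # rank-1 update of the Gram matrix
        self.b += y * z
        # full solve for clarity; a Cholesky rank-1 update makes this O(D^2)
        self.w = np.linalg.solve(self.A, self.b)

    def predict(self, x):
        return self._z(x) @ self.w
```

After each `update` the weights equal the exact batch ridge solution on all samples seen so far, which is the "exact batch solution after each update" property listed above.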

6/25/2015 18

slide-18
SLIDE 18

batch experiments

  • 3 inverse dynamics datasets: Sarcos, Simulated Sarcos, Barrett [Nguyen-Tuong et al., 2009]
  • approximately 15k training and 5k test samples
  • comparison with LWPR, GPR, LGP, Kernel RR
  • RFRR with 500, 1000, 2000 random features
  • hyperparameter optimization by exploiting functional similarity with GPR (log marginal likelihood optimization)

6/25/2015 19

slide-19
SLIDE 19

datasets

6/25/2015 20

slide-20
SLIDE 20

6/25/2015 21

slide-21
SLIDE 21

6/25/2015 22

slide-22
SLIDE 22

6/25/2015 23

slide-23
SLIDE 23

incremental experiments

6/25/2015 24

slide-24
SLIDE 24

incremental experiments

6/25/2015 25

slide-25
SLIDE 25

temperature compensation

6/25/2015 26

slide-26
SLIDE 26

summary

  • Fast prediction and model update of RFRR

– 200 RF: 400μs
– 500 RF: 2ms
– 1000 RF: 7ms

  • Non-stationary: thermal sensor drift in force components
  • Rapid convergence of RFRR
  • No further gain by using additional random features (problem specific)

6/25/2015 27

slide-27
SLIDE 27

experiments & model validation

6/25/2015 28

static configuration:

an additional six-axis F/T sensor is placed at the end effector to measure the external wrenches we

in this experiment we consider the following quantities:

  • joint torques measured by the joint torque sensors: τj
  • joint torques computed from the arm F/T sensor: τft
  • joint torques estimated through the additional F/T sensor located at the end effector: τe = Jᵀwe
  • joint torques predicted by the arm model (no external forces): τm

                    Joint 0    Joint 1    Joint 2    Joint 3
E(τj−τft)           0.127 Nm   0.049 Nm   0.002 Nm   0.032 Nm
σ(τj−τft)           0.186 Nm   0.131 Nm   0.013 Nm   0.042 Nm
E(τj−(τm+τe))       0.075 Nm   0.098 Nm   0.006 Nm   0.006 Nm
σ(τj−(τm+τe))       0.191 Nm   0.173 Nm   0.020 Nm   0.032 Nm
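The quantity τe = Jᵀwe and the E/σ statistics of the validation table can be computed as below; the Jacobian shape and the data in the test are illustrative, not the iCub arm's.

```python
import numpy as np

def joint_torques_from_wrench(J, wrench):
    """tau_e = J^T w_e: map an end-effector wrench to joint torques."""
    return J.T @ wrench

def torque_error_stats(tau_a, tau_b):
    """Per-joint mean and std of torque residuals, as in the E/sigma table."""
    d = np.asarray(tau_a) - np.asarray(tau_b)
    return d.mean(axis=0), d.std(axis=0)
```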
slide-28
SLIDE 28

the robot skin

6/25/2015 29

capacitor

– electrodes: etched on a flexible PCB (parameters: shape, folding, etc.)
– soft material: e.g. silicone (parameters: dielectric constant, mechanical stiffness, etc.)
– ground plane: e.g. conductive fabric (parameters: mechanical properties, impedance, etc.)
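The parallel-plate relation behind the taxel can be sketched numerically: C = ε₀εᵣA/d, so pressing the soft dielectric (reducing its thickness d) raises the capacitance. All values below are illustrative, not the iCub skin's actual geometry.

```python
EPS0 = 8.854e-12  # vacuum permittivity [F/m]

def taxel_capacitance(area_m2, thickness_m, eps_r):
    """Parallel-plate model of one taxel: C = eps0 * eps_r * A / d."""
    return eps_r * EPS0 * area_m2 / thickness_m
```

This is also why the material parameters on the slide matter: εᵣ sets the signal level, while the mechanical stiffness of the dielectric sets how much d changes per unit pressure, i.e. the sensitivity.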

slide-29
SLIDE 29

6/25/2015 30

principle

lots of sensing points structure of the skin

slide-30
SLIDE 30

tests of various materials

6/25/2015 31

[plot: capacitance C [bit] vs load L [kPa] for candidate materials: Soma Foama Hard, PDMS, Soma Foama Soft, Neoprene 2mm, SpongeRubberLycra, NeopreneGrid 2mm, NeoGridNeobulkLycra, NeoGridNeobulkLycra2, Neoprene 2mm Lycra, NeopreneGrid 3mm; others discarded]

slide-31
SLIDE 31

latest implementation

6/25/2015 32

advantages:

  • good performance: gluing is made with industrial machines, no hysteresis due to glue
  • production: automatic and reliable
  • mounting and replacing is easy, easy ground connection
  • protective layer can be of different materials → increased reliability
  • customizations: surface can be printed
slide-32
SLIDE 32

6/25/2015 33

slide-33
SLIDE 33

skin calibration

6/25/2015 34

performed manually by poking the robot with a tool; works iteratively with different datasets taken in different robot positions

  • A. Del Prete, S. Denei, L. Natale, F. Mastrogiovanni, F. Nori, G. Cannata, and G. Metta, “Skin spatial calibration using force/torque measurements”, in Intelligent Robots and Systems (IROS), 2011

slide-34
SLIDE 34

6/25/2015 35

slide-35
SLIDE 35

6/25/2015 36

slide-36
SLIDE 36

6/25/2015 37

slide-37
SLIDE 37

6/25/2015 38

project CoDyCo, Nori et al.

slide-38
SLIDE 38

floating base robots

6/25/2015 39

slide-39
SLIDE 39

6/25/2015 40

slide-40
SLIDE 40

6/25/2015 41

slide-41
SLIDE 41

6/25/2015 42

slide-42
SLIDE 42

6/25/2015 43

slide-43
SLIDE 43

point to point movements

6/25/2015 44

slide-44
SLIDE 44

ipopt

6/25/2015 45

  • quick convergence (<20ms)
  • scalability
  • singularities and joints bound handling
  • tasks hierarchy
  • complex constraints

  • merges joint and Cartesian space trajectories

[slide shows the underlying constrained least-squares reaching problem solved with Ipopt]
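The kind of problem listed above (reach a Cartesian target with bounded joints) can be sketched on a toy 2-link planar arm. For simplicity this uses damped least-squares iterations rather than Ipopt's interior-point method, and all geometry here is illustrative, not the iCub's.

```python
import numpy as np

L1, L2 = 0.3, 0.25              # illustrative link lengths [m]
Q_MIN, Q_MAX = -np.pi, np.pi    # joint bounds

def fk(q):
    """Forward kinematics of a planar 2-link arm."""
    return np.array([L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
                     L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])])

def jacobian(q):
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

def reach(target, q0, damping=1e-3, iters=100):
    """Iteratively minimize ||target - fk(q)||, clipping to joint bounds."""
    q = np.array(q0, dtype=float)
    for _ in range(iters):
        err = target - fk(q)
        J = jacobian(q)
        # damped least-squares step; damping also tames near-singular poses
        dq = J.T @ np.linalg.solve(J @ J.T + damping * np.eye(2), err)
        q = np.clip(q + dq, Q_MIN, Q_MAX)
    return q
```

A full NLP solver like Ipopt additionally handles hard constraints, task hierarchies, and secondary joint-space objectives within the same minimization, which is what the bullets above refer to.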

slide-45
SLIDE 45

6/25/2015 46

slide-46
SLIDE 46

6/25/2015 47

slide-47
SLIDE 47

6/25/2015 48

slide-48
SLIDE 48

6/25/2015 49

“Please put those into the dishwashing machine” “Could you please help me with the TV set?”

The iCub puts the plates into the dishwashing machine

Trueswell, J. C., & Gleitman, L. R. (2007). Learning to parse and its implications for language acquisition. In G. Gaskell (ed.), Oxford Handbook of Psycholinguistics. Oxford: Oxford Univ. Press.

slide-49
SLIDE 49

6/25/2015 50

The iCub puts the plates into the dishwashing machine

actions

  • objects

tools

slide-50
SLIDE 50

6/25/2015 51

actions: learning actions, recognizing actions

objects: learning objects, recognizing objects

tools: learning tools, using tools

slide-51
SLIDE 51

6/25/2015 52

Ranked 2nd at Microsoft Kinect Demonstration Competition, CVPR 2012 Providence, Rhode Island

slide-52
SLIDE 52
  • object recognition

6/25/2015 53

self-supervised strategies: kinematics, motion

Human-robot interaction is a new and natural application for visual recognition. In robotics settings strong cues are often available, so object detectors can be avoided. Recognition serves as a tool for complex tasks: grasping, manipulation, affordances, pose.

  • S.R. Fanello, C. Ciliberto, L. Natale, G. Metta, “Weakly Supervised Strategies for Natural Object Recognition in Robotics”, ICRA 2013
  • C. Ciliberto, S.R. Fanello, M. Santoro, L. Natale, G. Metta, L. Rosasco, “On the Impact of Learning Hierarchical Representations for Visual Recognition in Robotics”, IROS 2013

slide-53
SLIDE 53

6/25/2015 54

slide-54
SLIDE 54

dataset

6/25/2015 55

slide-55
SLIDE 55

iCubWorld dataset (2.0)

6/25/2015 56

  • Growing dataset collecting images from a real robotic setting
  • Provide the community with a tool for benchmarking visual recognition systems in robotics
  • 28 Objects, 7 categories, 4 sessions of acquisition (four different days)
  • 11Hz acquisition frequency.
  • ~50K Images

http://www.iit.it/en/projects/data-sets.html

slide-56
SLIDE 56

methods

6/25/2015 57

slide-57
SLIDE 57

methods

6/25/2015 58

Sparse Coding [Yang et al. ’09], learned on iCubWorld
Overfeat [Sermanet et al. ’14], learned on ImageNet

slide-58
SLIDE 58

some questions

  • Scalability: how does the iCub’s recognition capability decrease as we add more objects to distinguish?
  • Can we use assumptions on physical continuity to make recognition more stable?
  • Incremental learning: how does learning during multiple sessions affect the system’s recognition skills?
  • Generalization: how well does the system recognize objects “seen” under different settings?

6/25/2015 59

slide-59
SLIDE 59

first results

6/25/2015 60

We started by addressing instance recognition

slide-60
SLIDE 60

performance wrt # of objects

6/25/2015 61

slide-61
SLIDE 61

exploiting continuity in time

6/25/2015 62
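One simple way to exploit continuity in time is to smooth per-frame predictions with a sliding majority vote, since the observed object persists across consecutive frames. A minimal sketch; the window size is illustrative and this is not necessarily the exact method used on the iCub.

```python
from collections import Counter, deque

def temporal_vote(frame_predictions, window=5):
    """Sliding majority vote over the last `window` per-frame labels."""
    buf, smoothed = deque(maxlen=window), []
    for label in frame_predictions:
        buf.append(label)
        smoothed.append(Counter(buf).most_common(1)[0][0])
    return smoothed
```

A single misclassified frame in a stream of consistent ones is voted away, which is exactly the stabilizing effect the question on the previous slide asks about.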

slide-62
SLIDE 62

incremental learning

6/25/2015 63

Cumulative learning on the 4 days of acquisition. Tested on:

  • Present: test on current day
  • Causal: test on current and past days
  • Future: test on future days (current not included)
  • Independent: train & test on current day only

[plot: accuracy (avg. over classes) vs. day, for OF present / causal / future / independent]

slide-63
SLIDE 63

what about a week?

6/25/2015 64

[plot: accuracy (avg. over classes) over a 7-day horizon, for OF present / causal / future / independent; later days marked “?”]

slide-64
SLIDE 64

… or a month?

6/25/2015 65

[plot: accuracy (avg. over classes) over a 30-day horizon, for OF present / causal / future / independent; later days marked “??”]

slide-65
SLIDE 65

6/25/2015 66

slide-66
SLIDE 66

6/25/2015 67

slide-67
SLIDE 67

Experiments on affordances

6/25/2015 68

slide-68
SLIDE 68

Experiments on affordances

6/25/2015 69

slide-69
SLIDE 69

Experiments on affordances

6/25/2015 70

slide-70
SLIDE 70

Experiments on affordances

6/25/2015 71

slide-71
SLIDE 71

6/25/2015 72

slide-72
SLIDE 72

3D vision for grasping

6/25/2015 73

Input Segmentation Disparity

slide-73
SLIDE 73

grasping

6/25/2015 74

Gori, I.; Pattacini, U.; Tikhanoff, V.; Metta, G. (submitted 2013). “Ranking the Good Points: A Comprehensive Method for Humanoid Robots to Grasp Unknown Objects”, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2013)

point cloud from stereo → minimum bounding box → segmentation → surface extraction → grasp points from curvature + bias (task dependent) → grasp selection from experience

slide-74
SLIDE 74

6/25/2015 75

slide-75
SLIDE 75

force reconstruction

6/25/2015 76

slide-76
SLIDE 76

raw data

6/25/2015 77

slide-77
SLIDE 77

GP approximation

6/25/2015 78

slide-78
SLIDE 78

6/25/2015 79

slide-79
SLIDE 79

6/25/2015 81 from: Fogassi L., Gallese V., Fadiga L., Luppino G., Matelli M., Rizzolatti G. Coding of peripersonal space in inferior premotor cortex (area F4). Journal of Neurophysiology 76 (1) 1996.

From: Graziano 1999 From: Graziano et al. 2006

slide-80
SLIDE 80

spinal reflexes

6/25/2015 82

– walking behavior: cat rehabilitated to walk after complete spinal cord transection
– wiping reflex: an irritating stimulus elicits a wiping movement precisely directed at the stimulus location

from: Poppele, R., & Bosco, G. (2003). Sophisticated spinal contributions to motor control. Trends in Neurosciences, 26(5), 269-276.

slide-81
SLIDE 81

visuo-tactile fusion

6/25/2015 83

slide-82
SLIDE 82

double touch

6/25/2015 84

From two fixed-base chains to a single floating-base serial chain → 12 DOF

slide-83
SLIDE 83

6/25/2015 85

slide-84
SLIDE 84

6/25/2015 86

slide-85
SLIDE 85

receptive fields

  • Receptive field: a cone extending up to 0.2 m from the skin, with an aperture angle of 40°

6/25/2015 87
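A membership test for such a conical receptive field might look as follows, assuming the taxel frame has its z-axis along the skin normal and taking the 40° as the full aperture; the frame convention and code are illustrative, not the iCub's implementation.

```python
import numpy as np

def in_receptive_field(p, max_dist=0.2, aperture_deg=40.0):
    """True if point p (in the taxel frame, z along the skin normal)
    lies inside the cone of the taxel's receptive field."""
    d = np.linalg.norm(p)
    if d == 0 or d > max_dist:
        return False
    cos_a = np.clip(p[2] / d, -1.0, 1.0)   # clip guards arccos rounding
    return np.degrees(np.arccos(cos_a)) <= aperture_deg / 2
```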

slide-86
SLIDE 86

single taxel model

6/25/2015 88

slide-87
SLIDE 87

no noise

6/25/2015 89

slide-88
SLIDE 88

noisy data

6/25/2015 90

slide-89
SLIDE 89

with double touch

6/25/2015 91

slide-90
SLIDE 90

visual tracker

6/25/2015 92

slide-91
SLIDE 91

external stimulation

6/25/2015 93

slide-92
SLIDE 92

6/25/2015 94

slide-93
SLIDE 93

extending peripersonal space

6/25/2015 95

slide-94
SLIDE 94

6/25/2015 96

slide-95
SLIDE 95

6/25/2015 97

slide-96
SLIDE 96

6/25/2015 98

slide-97
SLIDE 97

what future?

6/25/2015 99

[timeline: iCub (now) → iCub2.0 → iCub3.0, new tech (the future?)]

slide-98
SLIDE 98

6/25/2015 100

slide-99
SLIDE 99

6/25/2015 101

slide-100
SLIDE 100

6/25/2015 102

“How old are you?” she wanted to know. “Thirty-two,” I said. “Then you don’t remember a world without robots. There was a time when humanity faced the universe alone and without a friend. Now he has creatures to help him; stronger creatures than himself, more faithful, more useful, and absolutely devoted to him. Mankind is no longer alone. Have you ever thought of it that way?”

(Isaac Asimov, I, Robot)
slide-101
SLIDE 101

external funding

6/25/2015 103

completed projects:

– RobotCub, grant FP6-004370, http://www.robotcub.org
– CHRIS, grant FP7-215805, http://www.chrisfp7.eu
– ITALK, grant FP7-214668, http://italkproject.org
– Robotdoc, grant FP7-ITN-235065, http://www.robotdoc.org
– Roboskin, grant FP7-231500, http://www.roboskin.eu
– eMorph, grant FP7-231467, http://www.emorph.eu
– Poeticon, grant FP7-215843, http://www.poeticon.eu

new projects:

– Poeticon++, grant FP7-288382, http://www.poeticon.eu
– Xperience, grant FP7-270273, http://www.xperience.org
– EFAA, grant FP7-270490, http://efaa.upf.edu/
– Codyco, grant FP7-600716, http://www.codyco.eu
– Tacman, grant FP7-610967
– Wysiwyd, grant FP7-612139
– Walk-man, grant FP7-611832
– Koroibot, grant FP7-611909

  • more information: http://www.iCub.org