
SLIDE 1

Human-Robot Interaction

Elective in Artificial Intelligence
Lecture 9 – Motion control for HRI

Luca Iocchi, DIAG, Sapienza University of Rome, Italy

Outline

Robot motion: the main feature distinguishing HRI from HCI
  • Human-robot distance control
  • Proxemics
  • Human-aware/social navigation
  • Gesture production
  • Physical HRI

L. Iocchi - Motion Control for HRI
SLIDE 2

Robot Navigation

A well-known problem with several solutions: https://qiao.github.io/PathFinding.js/visual/
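Classic grid-based planners like those in the PathFinding.js visualizer above can be sketched with A*. The grid encoding, 4-connected neighborhood and Manhattan heuristic below are illustrative choices, not taken from the lecture.

```python
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected occupancy grid (0 = free, 1 = obstacle),
    with a Manhattan-distance heuristic.
    Returns the list of cells from start to goal, or None."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_set = [(h(start), 0, start, [start])]   # (f, g, cell, path)
    best_g = {start: 0}
    while open_set:
        _, g, cell, path = heapq.heappop(open_set)
        if cell == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            r, c = cell[0] + dr, cell[1] + dc
            # expand free in-bounds neighbors that improve the known cost
            if (0 <= r < len(grid) and 0 <= c < len(grid[0])
                    and grid[r][c] == 0
                    and g + 1 < best_g.get((r, c), float("inf"))):
                best_g[(r, c)] = g + 1
                heapq.heappush(open_set,
                               (g + 1 + h((r, c)), g + 1, (r, c), path + [(r, c)]))
    return None   # goal unreachable
```

The next slide explains why a planner of this kind, which treats everything in the grid as a static obstacle, is not enough when the "obstacles" are people.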

Robot Navigation

People are not obstacles! Treating people as obstacles leads to:
  • No yielding
  • Blocking people (deadlocks or slow decision making)
  • Collisions due to unexpected behaviors
SLIDE 3
Bodily Communication

Non-verbal (NV) signals [Argyle, M., 1998, Bodily Communication]:
  • Facial expressions
  • Gaze
  • Gestures and body movements
  • Posture
  • Bodily contact
  • Spatial behaviour

Proxemics

Proxemics: the study of the social use and management of space.

In HRI:
  • Human-robot distance is a fundamental variable to control
  • The closer the intrusion, the greater the discomfort
  • Proxemic concepts depend on the robot's morphology and size, on the task, on the situation or status of the interaction, etc.
SLIDE 4
Proxemics

Proxemic zones [Hall, 1966]:
  • public zone: > 3.6 m
  • social zone: 1.2 m – 3.6 m
  • personal zone: 0.45 m – 1.2 m
  • intimate zone: < 0.45 m
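Hall's zone boundaries translate directly into a small distance classifier, useful for example as a trigger for distance-dependent robot behaviors. Thresholds in meters, as on the slide:

```python
def proxemic_zone(distance_m):
    """Classify a human-robot distance into Hall's proxemic zones."""
    if distance_m < 0.45:
        return "intimate"
    if distance_m < 1.2:
        return "personal"
    if distance_m < 3.6:
        return "social"
    return "public"
```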
Proxemics

  • Personal space is variable, continuous and context-dependent
  • Social Force model [Helbing and Molnar, 1995]
  • Personal space can be asymmetric and modulated by motion
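A minimal sketch of the repulsive term of the Social Force model: a nearby human exerts on the robot a force that decays exponentially with distance and points away from the human. The constants A, B and r below are illustrative, not the paper's calibrated values.

```python
import math

def social_repulsion(p_robot, p_human, A=2.0, B=0.3, r=0.6):
    """Exponential repulsive term of the social force model
    [Helbing and Molnar, 1995] on 2D points (x, y).
    A (strength), B (range) and r (sum of body radii) are
    illustrative constants."""
    dx = p_robot[0] - p_human[0]
    dy = p_robot[1] - p_human[1]
    d = math.hypot(dx, dy)                # current distance
    mag = A * math.exp((r - d) / B)       # scalar force magnitude
    return (mag * dx / d, mag * dy / d)   # directed away from the human
```

Summing such terms over all nearby people (plus goal attraction) yields the total social force that steers a pedestrian, or a robot, through a crowd.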
SLIDE 5

Proxemics

Personal Space

Proxemics

Information Process Space: the space within which pedestrians take obstacles into account [Kitazawa and Fujiyama, 2010]
SLIDE 6

Proxemics

Space related to groups of people:
  • O-space [Kendon, 2010]

Proxemics

Space related to activities:
  • Activity space [Lindner and Eschenbach, 2011]
SLIDE 7

Proxemics

Semantics of space: assigning a meaning to each space of the environment

Proxemics

Semantics of the space around a person
SLIDE 8

Proxemics

Semantics of person-robot distance
Human-Robot Proxemics

  • Empirical framework [Walters et al., 2009]
  • Perceptual models [Mead and Mataric, 2014]
  • Closeness models [Mumm and Mutlu, 2011]
  • Qualitative spatial reasoning [Bhatt and Dylla, 2009]
  • Cognitive maps
SLIDE 9

Human-Robot Proxemics

An empirical framework for human-robot proxemics [Walters et al., 2009]:
  • Approach distance based on several proxemic factors (robot appearance, task, user preferences, etc.)
  • Base distance = 57 cm
  • Adjustment factors from -7 cm to +13 cm
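The additive scheme can be sketched as follows. The factor names and offsets are hypothetical placeholders; only the 57 cm base distance and the -7/+13 cm adjustment range come from the slide.

```python
# Base approach distance reported on the slide (Walters et al., 2009).
BASE_DISTANCE_CM = 57

# Hypothetical proxemic factors with adjustments inside the
# reported -7/+13 cm range (NOT the paper's calibrated values).
ADJUSTMENTS_CM = {
    "tall_robot": +13,
    "user_likes_robots": -7,
}

def approach_distance(active_factors):
    """Approach distance in cm for a set of active proxemic factors."""
    return BASE_DISTANCE_CM + sum(ADJUSTMENTS_CM[f] for f in active_factors)
```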

Human-Robot Proxemics

Perceptual model of human-robot proxemics [Mead and Mataric, 2014]

How does human-robot positioning influence human-robot interaction? A Bayesian network models the relationship between pose, speech and gesture.
SLIDE 10

Human-Robot Proxemics

Speech output over distance [Mead and Mataric, 2014]

Human-Robot Proxemics

Speech/gesture recognition rate over distance [Mead and Mataric, 2014]
SLIDE 11

Closeness models

Human-Robot Proxemics: Physical and Psychological Distancing in Human-Robot Interaction [Mumm and Mutlu, 2011]

Human-Robot Closeness models

Other factors [Mumm and Mutlu, 2011]:
  • cultural background
  • ethnic group
  • gender
  • age
  • physical attractiveness
  • body orientation

Findings:
  • Participants maintain a greater distance from the robot when it maintains eye contact with them than when it avoids their gaze.
  • Participants maintain a greater distance from the disliked robot when it maintains eye contact with them than when it avoids their gaze, while distance from the liked robot is not affected by eye contact.
  • Participants disclose more to the liked robot, independently of other factors (e.g., gaze).

The human-human model is only partially supported in human-robot experiments.

SLIDE 12

Qualitative Spatial Reasoning

Qualitative trajectory calculus (QTC) represents the relative motion of two points in a time interval with respect to the reference line that connects them on a 2D plane.

A QTC state is a tuple (q1, q2, q3, q4):
  • q1: movement of H with respect to R
  • q2: movement of R with respect to H
  • q3: movement of H with respect to the line HR
  • q4: movement of R with respect to the line RH
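One QTC_C state can be computed from consecutive positions of the two agents as sketched below. The -1/0/+1 sign convention in the comments is one common choice; conventions vary across the QTC literature.

```python
import math

def _sign(x, eps=1e-9):
    return 0 if abs(x) < eps else (1 if x > 0 else -1)

def qtc_c(h0, h1, r0, r1):
    """QTC_C state (q1, q2, q3, q4) for one time step, from consecutive
    2D positions of human H (h0 -> h1) and robot R (r0 -> r1).
    Convention used here (one common choice):
      q1/q2: -1 moving towards the other agent, 0 stable, +1 moving away;
      q3/q4: -1 moving to the right of the reference line, 0 along it, +1 left."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    q1 = _sign(dist(h1, r0) - dist(h0, r0))   # H w.r.t. R
    q2 = _sign(dist(r1, h0) - dist(r0, h0))   # R w.r.t. H
    def side(p0, p1, other):
        # cross product of (reference line p0 -> other) and (motion p0 -> p1)
        rx, ry = other[0] - p0[0], other[1] - p0[1]
        mx, my = p1[0] - p0[0], p1[1] - p0[1]
        return _sign(rx * my - ry * mx)
    q3 = side(h0, h1, r0)                     # H w.r.t. line HR
    q4 = side(r0, r1, h0)                     # R w.r.t. line RH
    return (q1, q2, q3, q4)
```

For example, a human walking straight towards a standing robot yields (-1, 0, 0, 0).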
SLIDE 13

Qualitative Spatial Reasoning

Reasoning with QTC: QTC-MM [Bellotto et al., 2013]
SLIDE 14

Human-Aware Navigation

Autonomous safe navigation following social norms:
  • Depending on the task
  • Unfocused vs. focused interaction (person identification)

Human-Aware Navigation

Unfocused interactions:
  • Avoidance [many works]
  • Passing people [Pacchierotti et al., 2006]
  • Staying in line [Nakauchi and Simmons, 2000]
SLIDE 15

Human-Aware Navigation

Focused interactions:
  • Approach person [Carton et al., 2012]
  • Follow person [Gockley et al., 2007]
  • Walking side-by-side [Morales Saiki et al., 2012]
  • Guiding [Martin et al., 2004; Hoeller et al., 2007]

Human-Aware Navigation

Human-robot encounters in a hallway:
  1. Upon entering the social space of the person, move to the right (w.r.t. the robot reference frame) to signal the person that s/he has been detected.
  2. Move to the right of the hallway while passing the person.
  3. Return to normal navigation after the person has fully passed by.
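The three steps can be sketched as a small state machine. The 0.3 m and 0.8 m lateral offsets are illustrative assumptions; the 3.6 m social-space threshold follows Hall's zones from the earlier slides.

```python
SOCIAL_SPACE_M = 3.6   # entering the person's social space (Hall's zones)

def hallway_step(state, person_ahead, distance_m, person_passed):
    """One control tick of the hallway-passing behavior.
    Returns (next_state, lateral_offset_m), where positive offsets
    shift the robot to its right."""
    if state == "NORMAL":
        if person_ahead and distance_m < SOCIAL_SPACE_M:
            return "SIGNAL_RIGHT", 0.3   # step 1: signal detection
        return "NORMAL", 0.0
    if state == "SIGNAL_RIGHT":
        return "PASS_RIGHT", 0.8         # step 2: keep right while passing
    if state == "PASS_RIGHT":
        if person_passed:
            return "NORMAL", 0.0         # step 3: resume normal navigation
        return "PASS_RIGHT", 0.8
    raise ValueError("unknown state: %s" % state)
```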
SLIDE 16


Human-Aware Navigation

Follow person:
  • Person tracking + target following
  • Use verbal feedback to inform the person and keep engagement
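"Person tracking + target following" can be sketched as a proportional controller that keeps the robot at an assumed comfortable distance behind the tracked person. The 1.2 m target (the edge of the personal zone) and the gains are illustrative assumptions.

```python
import math

FOLLOW_DISTANCE_M = 1.2   # assumed comfortable following distance
K_LIN, K_ANG = 0.8, 1.5   # illustrative proportional gains

def follow_cmd(person_x, person_y):
    """Velocity command (v, w) from the tracked person's position
    in the robot frame (x forward, y left)."""
    dist = math.hypot(person_x, person_y)
    bearing = math.atan2(person_y, person_x)
    v = K_LIN * (dist - FOLLOW_DISTANCE_M)   # close the distance gap
    w = K_ANG * bearing                      # turn towards the person
    return v, w
```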
SLIDE 17

Gesture production

[Salem et al., 2012]
  • Gestures improve engagement in interaction (several user studies)
  • Synchronized voice and gestures
  • Synchronized gestures and facial expressions
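On Pepper-class robots, synchronized voice and gesture can be requested through NAOqi's ALAnimatedSpeech service, which accepts text annotated with ^start(...)/^wait(...) animation tags. A minimal helper that builds such a string is sketched below; the animation path is an illustrative example.

```python
def animated_say_string(text, animation):
    """Build an annotated string for ALAnimatedSpeech.say():
    the animation starts in parallel with the speech, and the
    trailing ^wait blocks until the animation finishes."""
    return "^start({a}) {t} ^wait({a})".format(a=animation, t=text)

# e.g. animated_say_string("Hello!", "animations/Stand/Gestures/Hey_1")
```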

Gesture production

Exercises with the Pepper robot (next lecture)
SLIDE 18

Face expression production

Kobian at Waseda University

Face and gesture production

Kobian at Waseda University
SLIDE 19
Physical HRI

  • See the Robotics 2 course by Prof. A. De Luca
  • Main focus is on safety
  • Variable stiffness actuators

References

  • Bellotto, N., Hanheide, M., and Van de Weghe, N. Qualitative design and implementation of human-robot spatial interactions. In Proc. of the Int. Conf. on Social Robotics, 2013.
  • Bhatt, M. and Dylla, F. A qualitative model of dynamic scene analysis and interpretation in ambient intelligence systems. International Journal of Robotics and Automation, 24(3), 2009.
  • Hall, E. T. The Hidden Dimension. Doubleday, New York, 1966.
  • Helbing, D. and Molnar, P. Social force model for pedestrian dynamics. Physical Review E, 51:4282, 1995.
SLIDE 20
References

  • Hoeller, F., Schulz, D., Moors, M., and Schneider, F. E. Accompanying persons with a mobile robot using motion prediction and probabilistic roadmaps. In Proc. of the Int. Conf. on Intelligent Robots and Systems (IROS), 2007.
  • Kendon, A. Spacing and orientation in co-present interaction. In Development of Multimodal Interfaces: Active Listening and Synchrony, LNCS 5967, 2010.
  • Lindner, F. and Eschenbach, C. Towards a formalization of social spaces for socially aware robots. In Proc. of the 10th Int. Conf. on Spatial Information Theory (COSIT), 2011.
References

  • Martin, C., Bohme, H. J., and Gross, H. M. Conception and realization of a multi-sensory interactive mobile office guide. In IEEE Int. Conf. on Systems, Man and Cybernetics, vol. 6, 2004.
  • Mead, R. and Matarić, M. J. Perceptual models of human-robot proxemics. In Proc. of the Int. Symposium on Experimental Robotics (ISER), 2014.
  • Mumm, J. and Mutlu, B. Human-robot proxemics: physical and psychological distancing in human-robot interaction. In Proc. of the 6th Int. Conf. on Human-Robot Interaction (HRI), ACM, 2011.
SLIDE 21
References

  • Salem, M., Kopp, S., Wachsmuth, I., Rohlfing, K., and Joublin, F. Generation and evaluation of communicative robot gesture. International Journal of Social Robotics, 4(2):201-217, 2012.
  • Walters, M. L., Dautenhahn, K., te Boekhorst, R., Koay, K. L., Syrdal, D. S., and Nehaniv, C. L. An empirical framework for human-robot proxemics. In Proc. of New Frontiers in Human-Robot Interaction, 2009.