SLIDE 1

Directing Physically Based (and Physical) Interactions

SLIDE 2

Animating Dexterous Motions

How can we easily animate the starfish’s escape?

  • Appearance of intelligent motion
  • Believable physical interaction with the glass box
  • Dynamic, fun actions
  • Animation tools accessible to anyone
SLIDE 3

Animating Dexterous Motions

Videos created by two novice users using our system.

SLIDE 4

Background: Classical Approaches

  • Motion capture: not available for leaping starfish!
  • Traditional keyframing: keyframing complex dynamic interactions is hard
  • Physically based simulation: great for passive objects, but difficult to create “intelligent” motions
  • Physically based controllers / physically based optimization: how do we build a controller or objective function for this task? No reference motions are available
SLIDE 5

Factors that appear to contribute to human motion selection

Lillian Y. Chang and Nancy S. Pollard, “Pre-Grasp Interaction for Object Acquisition in Difficult Tasks,” forthcoming book chapter

SLIDE 6

Background: Better Alternatives

  • Operational space / task space control
  • Great concept, and we will use it
  • Direct control of physically based systems
  • Goal: a more general animation system, motivated by demonstrations like these!
  • We found that a variety of control modalities are needed and can be incorporated easily

Sentis and Khatib; Laszlo, van de Panne, and Fiume; van de Panne
SLIDE 7

Ski Stunt Simulator

SLIDE 8

What Control Modes are Intuitive?

SLIDE 9

User Interface Example

SLIDE 10

Interface Modes

Manipulate Bones

  • Drag a bone to control its motion (direct control of head position)
  • Constrain a bone to a fixed position / orientation (constrain base to orientation shown)
SLIDE 11

Interface Modes

Manipulate Center of Gravity

  • Drag the CG of the lamp in a tightly controlled manner to keep it balanced
  • Drag the CG of the starfish abruptly to create a jump
  • Drag the CG of the donut in a free-form manner to create the desired animation
SLIDE 12

Interface Modes

Manipulate Character Root Orientation

  • Drag a special rotation widget for 3D rotational motions
SLIDE 13

Interface Modes

Manipulate Joints

  • Keyframe a leaping action for the worm
  • Set and maintain joint limits
  • Run a passive controller for a soft landing
  • How? Set a single desired configuration and low stiffness
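The “single desired configuration and low stiffness” idea above is just a low-gain PD servo at each joint. A minimal sketch, assuming NumPy joint-state vectors; the gain values are illustrative, not from the slides:

```python
import numpy as np

def pd_torques(q, qdot, q_des, kp=2.0, kd=0.5):
    """Low-stiffness PD controller: pull every joint gently toward a
    single desired configuration; the low stiffness kp yields a soft,
    compliant landing instead of a rigid snap to the pose."""
    return kp * (q_des - q) - kd * qdot

# Three joints displaced from the desired (zero) configuration:
tau = pd_torques(np.array([0.4, -0.2, 0.1]), np.zeros(3), np.zeros(3))
```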

SLIDE 14

Interface Modes

Previewing

  • Observe the effect of maintaining the current command for a given period of time
SLIDE 15

Interface Modes

Speed up, slow down, advance, back up the simulation

  • Trial and error to learn the character dynamics and achieve desired result
SLIDE 16

Animating Dexterous Motions

Our observation: different control modes are needed at different times to create animations sophisticated enough to tell a story.

Our solution: put a variety of control modes into the animator’s hands and make them as intuitive as possible.
SLIDE 17

Overview of Our System

Character model: coarse volumetric model -> fast simulation; fine surface detail for appearance, contacts, and collisions.

User interface: real-time, trial and error (e.g., “Jump like this!”).

Results: compute muscle forces for the character to best achieve the user’s goals.

Junggon Kim and Nancy S. Pollard, “Direct Control of Simulated Non-Human Characters,” IEEE CG&A, 2011

SLIDE 18

Interface Modes Under the Hood

The user is placing a variety of constraints on the character’s motion. How do we determine how the character should behave, in a physically realistic manner, to best meet those constraints? Our only “lever” is the accelerations or torques that must be applied at the character’s joints to advance the simulation.

Algebra on the equations of motion? Complex, local-minimum-prone, prioritized optimization?
SLIDE 19

Interface Modes Under the Hood

Most quantities we care to measure or control have a locally linear relationship to joint accelerations and joint torques

Evangelos Kokkevis, Practical Physics for Articulated Characters, Game Developer's Conference 2004.

SLIDE 20

Example: Bone Constraints

Express the bone constraint as a linear function of joint accelerations, by straightforward differentiation of the equations of motion: the bone acceleration is ẍ = J q̈ + J̇ q̇, where J̇ q̇ is the bone acceleration when the joint accelerations are zero. Setting J q̈ + J̇ q̇ equal to the desired bone acceleration gives the constraint.
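In code, a dragged bone contributes the linear constraint J q̈ = ẍ_des − J̇q̇. A minimal sketch; the task-space PD servo for the desired acceleration and its gains are standard choices, assumed here rather than taken from the slides:

```python
import numpy as np

def desired_bone_accel(x, xdot, x_target, kp=100.0, kd=20.0):
    # Task-space PD servo: accelerate the dragged bone toward its target.
    return kp * (x_target - x) - kd * xdot

def bone_constraint(J, Jdot_qdot, xddot_des):
    """Since xddot = J @ qddot + Jdot @ qdot, the constraint
    'achieve xddot_des' is linear in qddot:  J @ qddot = b."""
    return J, xddot_des - Jdot_qdot

# With J = identity and a zero bias term, the constraint simply asks
# the joints to produce the desired bone acceleration directly.
A, b = bone_constraint(np.eye(3), np.zeros(3), np.array([1.0, 2.0, 3.0]))
```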

SLIDE 21

Interface Modes Under the Hood

(1) Express all constraints as linear functions of joint accelerations.
(2) Solve a quadratic program to obtain the joint accelerations.
(3) Use these accelerations to advance the simulation through the next timestep.
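For the special case where every constraint is a hard equality A q̈ = b and the objective is simply the smallest joint accelerations, the QP solution is the minimum-norm least-squares solution. A sketch of step (2) under that assumption; a production system would use a real QP solver with inequality constraints and weights:

```python
import numpy as np

def solve_accels(A, b):
    """Solve  min ||qddot||^2  s.t.  A @ qddot = b  via the
    pseudoinverse: the equality-constrained special case of the
    quadratic program in step (2)."""
    return np.linalg.pinv(A) @ b

# Two constraints on three joints: underdetermined, so the QP picks
# the accelerations of smallest magnitude.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 1.0]])
b = np.array([2.0, 1.0])
qddot = solve_accels(A, b)   # step (3): integrate these for one timestep
```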

SLIDE 22

Final Demos

SLIDE 23

Realistic Physical Behavior?

http://www.youtube.com/watch?v=a-1AiExU3Vk
Huai-Ti Lin, Tufts Biomimetic Devices Laboratory
SLIDE 24

Notes

Constraint priorities: mouse drags are satisfied after everything else.
Contact modeling: “hallucinate” constraints to account for pushoff forces.
Objective functions: minimize joint accelerations, torques, or velocities.
Speed: simulations run in real time or better; users preferred 3X-8X slower.
Ease of use: starfish escape animations created by novices in minutes.
SLIDE 25

What Control Modes are Intuitive?

SLIDE 26

References

Junggon Kim and Nancy S. Pollard, “Direct Control of Simulated Non-Human Characters,” IEEE CG&A, 2011.
Junggon Kim and Nancy S. Pollard, “Fast Simulation of Skeleton-Driven Deformable Body Characters,” ACM ToG, 2011.

http://www.cs.cmu.edu/~junggon/
SLIDE 27

Sticky Finger Manipulation With a Multi-Touch Interface

Ken Toh MS Thesis

SLIDE 28

Motivation

  • User interaction is a key feature in most graphical and robotic applications.

Sticky Finger Manipulation With a Multi-Touch Interface 28

Manipulating virtual cloth
Teleoperating a robot with a multi-fingered hand
SLIDE 29

Motivation

  • Traditional user input devices are effective for many simple high-level interaction tasks.


Common user input devices with simple command spaces

On/off
Up, down, left, right

SLIDE 30

Motivation

  • Dexterous manipulation of simulated or real-world objects with many DOFs can, however, be quite awkward to achieve with these existing input devices.

Realistic cloth tearing requires more than a single cursor to execute


A panel of buttons is not the most intuitive interface for dexterous tele-manipulation

SLIDE 31

Motivation


  • Key question: can we design a user interface that makes manipulating objects by proxy feel natural, almost as though we were interacting with them directly?
SLIDE 32

Cloth Manipulation: Modes


Creation Mode
Sticky-Finger Mode
Cut Mode
SLIDE 33

Sticky Fingers for Cloth Manipulation


Underlying cloth particles within a radius of each active fingertip center are stuck to that finger and move with it.
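That attachment rule is easy to sketch: gather the particles inside the fingertip’s radius on touch-down, then translate them with the finger each frame. The array layout and function names below are illustrative assumptions:

```python
import numpy as np

def stick_particles(positions, finger_center, radius):
    """Indices of cloth particles within `radius` of the fingertip
    center; these become kinematically attached to the finger."""
    dists = np.linalg.norm(positions - finger_center, axis=1)
    return np.where(dists <= radius)[0]

def move_stuck(positions, stuck, finger_delta):
    # Stuck particles ignore the dynamics and simply follow the finger.
    positions[stuck] += finger_delta
    return positions

pts = np.array([[0.0, 0.0], [0.1, 0.0], [1.0, 1.0]])
stuck = stick_particles(pts, np.array([0.0, 0.0]), 0.2)  # particles 0 and 1
```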

SLIDE 34

Sticky-finger Lifting

  • The user activates a toggle that changes the plane of control from the x-z plane to the x-y plane.

SLIDE 35

Pinch-lifting

  • A pinch event is detected automatically when two finger touches are close together.

SLIDE 36

Cloth Simulation Model


A mesh of particles connected by bend, shear and stretch constraints
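One plausible way to enumerate those constraints for a rectangular grid of particles; the row-major indexing is an assumption, since the slides do not give the mesh layout:

```python
def grid_constraints(rows, cols):
    """Index pairs for a rows x cols particle grid, row-major:
    stretch = adjacent neighbors, shear = diagonal neighbors,
    bend = particles two apart along a row or column."""
    idx = lambda r, c: r * cols + c
    stretch, shear, bend = [], [], []
    for r in range(rows):
        for c in range(cols):
            if c + 1 < cols:
                stretch.append((idx(r, c), idx(r, c + 1)))
            if r + 1 < rows:
                stretch.append((idx(r, c), idx(r + 1, c)))
            if r + 1 < rows and c + 1 < cols:
                shear.append((idx(r, c), idx(r + 1, c + 1)))
                shear.append((idx(r, c + 1), idx(r + 1, c)))
            if c + 2 < cols:
                bend.append((idx(r, c), idx(r, c + 2)))
            if r + 2 < rows:
                bend.append((idx(r, c), idx(r + 2, c)))
    return stretch, shear, bend
```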

SLIDE 37

Verlet Integration

  • Key: position-based dynamics is essential because we need to stick particles kinematically to fingers (i.e., modify positions directly).
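A minimal sketch of such a Verlet step with finger-pinned particles; damping and constraint projection are omitted, and the state layout is an assumption:

```python
import numpy as np

def verlet_step(x, x_prev, accel, dt, stuck=()):
    """Position-based Verlet integration: velocity is implicit in the
    difference x - x_prev, so sticking a particle is just a matter of
    overwriting its position."""
    x_new = 2.0 * x - x_prev + accel * dt * dt
    stuck = list(stuck)
    x_new[stuck] = x[stuck]   # pinned particles stay with the finger
    return x_new, x           # new state: (positions, previous positions)

x = np.array([[0.0, 0.0], [1.0, 0.0]])
gravity = np.array([[0.0, -10.0], [0.0, -10.0]])
x_new, x_prev = verlet_step(x, x.copy(), gravity, 0.1, stuck=(0,))
```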

SLIDE 38

Iterative Constraint Satisfaction


  • Must handle cases with stuck fingers

Case (a): x1 and x2 not stuck, so the correction vector is split between them.
Case (b): x1 stuck, x2 not stuck, so x2 takes the full correction.
Case (c): both stuck, so both corrections are zero.
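The three cases can be handled by treating a stuck particle as having infinite mass: its share of the correction is zero. A sketch for a single distance constraint; function and argument names are illustrative:

```python
import numpy as np

def satisfy_distance(x1, x2, rest, stuck1=False, stuck2=False):
    """Project two particles back toward their rest distance.
    Case (a): neither stuck -> split the correction evenly.
    Case (b): one stuck     -> the free particle takes it all.
    Case (c): both stuck    -> both corrections are zero."""
    delta = x2 - x1
    dist = np.linalg.norm(delta)
    if dist == 0.0 or (stuck1 and stuck2):
        return x1, x2
    corr = delta * (dist - rest) / dist
    if stuck1:
        return x1, x2 - corr
    if stuck2:
        return x1 + corr, x2
    return x1 + 0.5 * corr, x2 - 0.5 * corr

# Two free particles at distance 2 with rest length 1 meet in the middle.
a, b = satisfy_distance(np.array([0.0, 0.0]), np.array([2.0, 0.0]), 1.0)
```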

SLIDE 39

Tearing

  • The sticky finger pins down relevant particles and constraints, allowing unconstrained regions to elongate and eventually tear. Finger size matters too.

SLIDE 40

Cutting


Similar to tearing but in a more controlled fashion

SLIDE 41

Direct Cloth Manipulation

SLIDE 42

Robotic Telemanipulation

Goal: intuitive interactive control of dexterous manipulation for a robot arm / hand system

  • Remote dexterous manipulation
  • Scripting new behaviors
  • Learning from demonstration
SLIDE 43

What is available?

http://www.youtube.com/watch?v=x9Bjs99A0k0

Master-slave systems: Origami with the DaVinci surgical robot

SLIDE 44

What is available?

http://www.youtube.com/watch?v=jOnp2M5qibs&feature=player_detailpage
TVO, “Doing the Dirty Work: Robots for Hire,” on the NASA Robonaut

Glove interfaces for dexterous hand control: Cyberglove interface

SLIDE 45

What is available?

http://www.youtube.com/watch?v=_R40j64C7t8
Video from Shadow Robot Company

Glove interfaces for dexterous hand control: Cyberglove interface

SLIDE 46

Robotic Telemanipulation

Our observations: manipulation operations often depend on precise fingertip motions, yet existing interfaces control them only indirectly.

Our solution: an inexpensive interface based on maintaining relative fingertip positions and trajectories.
SLIDE 47

Multi-touch Sticky Finger Teleoperation

Multi-touch for teleoperation of manipulation tasks

  • Portable
  • Intuitive
  • Accessible
  • Affordable
  • Capable of dexterous actions

Yue Peng Toh and Nancy S. Pollard, “Sticky-Finger Teleoperation with A Multi-Touch Interface”, submitting to ICRA 2012.

SLIDE 48

Multi-touch Sticky Finger Teleoperation

SLIDE 49

Pose Control

Jacobian pseudoinverse control with a nullspace constraint to reduce roll of the palm

Primary objective: control fingertip velocities.
Secondary objective: minimize palm roll.
Solution: joint velocities for the hand and arm.
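This objective stack is classic resolved-rate control with a nullspace term: q̇ = J⁺ẋ_des + (I − J⁺J)z, where z is any joint-velocity preference (here, one that reduces palm roll). A sketch; the toy Jacobian and secondary-objective vector are assumptions for illustration:

```python
import numpy as np

def resolved_rate(J, xdot_des, z):
    """Primary task: fingertip velocities (J @ qdot = xdot_des).
    Secondary task: z is projected into the nullspace of J, so it can
    reduce palm roll without disturbing the fingertip motion."""
    J_pinv = np.linalg.pinv(J)
    n = J.shape[1]
    return J_pinv @ xdot_des + (np.eye(n) - J_pinv @ J) @ z

# Toy 2-task, 3-joint example: joint 3 does not affect the fingertips,
# so the secondary objective is free to use it.
J = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
qdot = resolved_rate(J, np.array([1.0, 2.0]), np.array([0.0, 0.0, 3.0]))
```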

SLIDE 50

Interface modes: Horizontal Scrolling

SLIDE 51

Interface modes: Vertical Scrolling

SLIDE 52

Demo! (3X speed)

SLIDE 53

Reference

Yue Peng Toh and Nancy S. Pollard, “Sticky-Finger Teleoperation with A Multi-Touch Interface”, submitting to ICRA 2012.

SLIDE 54

What Control Modes are Intuitive?

SLIDE 55
