Wearable Haptics (CPSC 599.86 / 601.86), Sonny Chan, University of Calgary
SLIDE 1

Wearable Haptics

CPSC 599.86 / 601.86 Sonny Chan University of Calgary

SLIDE 2

“Grounded” Kinaesthetic Haptic Displays

SLIDE 3

Virtual Reality

A new driving force for haptics technology…

SLIDE 4

The future of haptics?

SLIDE 5

HaptX VR Glove

SLIDE 6

How do we render haptics to a device that looks like this?

SLIDE 7

Spectrum of “wearability”

[From C. Pacchierotti et al., IEEE Transactions on Haptics 10(6), 2017.]

SLIDE 8

Wearability

  • Wearability can be increased by moving the grounding of the system closer to the point of application of the stimulus
  • The tradeoff is that we gradually lose the kinaesthetic component of haptic feedback
  • Therefore, we must investigate effective cutaneous rendering

SLIDE 9

Touch Perception

The somatosensory system comprises cutaneous receptors and kinaesthetic receptors.

SLIDE 10

Cutaneous Perception

  • Inputs from different types of mechanoreceptors embedded in the skin
      • vibration and texture perception
      • pressure and skin stretch (grasped object)

SLIDE 11

Receptor types, their sensitivities, and primary functions:

  • Merkel: sustained pressure, very low frequencies (<5 Hz)
      • very low frequency vibration detection
      • coarse texture and pattern perception
  • Meissner: temporal changes in skin deformation (5–40 Hz)
      • low frequency vibration detection
  • Pacinian: temporal changes in skin deformation (40–400 Hz)
      • high frequency vibration detection
      • fine texture perception
  • Ruffini: sustained downward pressure, lateral skin stretch
      • direction of object motion due to skin stretch
      • finger position

[Adapted from S. J. Lederman & R. L. Klatzky, Attention, Perception, & Psychophysics 71(7), 2009.]
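As a rough illustration of how the frequency bands above could inform actuator control, the sketch below maps a stimulus frequency to the receptor channel most sensitive to it. The function name and exact cutoff values are our own illustrative choices, not from the slide.

```python
def dominant_receptor(freq_hz: float) -> str:
    """Return the mechanoreceptor channel most sensitive to a
    stimulus at freq_hz, using the approximate bands from the
    table above (boundary values are illustrative)."""
    if freq_hz < 5.0:
        return "Merkel"      # sustained pressure, <5 Hz
    elif freq_hz <= 40.0:
        return "Meissner"    # skin-deformation changes, 5-40 Hz
    elif freq_hz <= 400.0:
        return "Pacinian"    # high-frequency vibration, 40-400 Hz
    else:
        return "none"        # above typical vibrotactile sensitivity
```

A texture renderer might use such a mapping to decide which actuation band to emphasize for a given material.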

SLIDE 12

Cutaneous Haptic Rendering

  • Four “primitives” of cutaneous sensation:
      • Normal indentation
      • Lateral skin stretch
      • Relative tangential motion
      • Vibration

[diagram from R. S. Johansson & A. B. Vallbo, Trends in Neurosciences 6, 1983.]

SLIDE 13

Wearable Haptic Interfaces

for the finger and for the hand

SLIDE 14

Normal Indentation

  • Displays sensations of:
      • Contact and pressure
      • Curvature
      • Softness/hardness

[From F. Chinello et al., IEEE Transactions on Haptics 11(1), 2018.]
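A minimal sketch of how a fingertip indentation device might be driven: virtual penetration depth maps to a platform displacement command, with softness/hardness displayed by scaling the gain. The function name, gain, and travel limit are illustrative assumptions, not taken from the device above.

```python
def indentation_command(penetration_mm: float,
                        stiffness: float = 0.8,
                        max_travel_mm: float = 4.0) -> float:
    """Map virtual penetration depth to a displacement command for
    a fingertip indentation platform. A lower stiffness gain makes
    the virtual object feel softer; values are illustrative."""
    if penetration_mm <= 0.0:
        return 0.0                      # no contact: retract the platform
    cmd = stiffness * penetration_mm    # proportional indentation display
    return min(cmd, max_travel_mm)      # respect the actuator travel limit
```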

SLIDE 15

Lateral Skin Stretch & Relative Tangential Motion

  • Shear force applied to the skin
  • Displays sensations of:
      • Friction / slip
      • Indentation (e.g. Braille)
      • Caress

[From D. Leonardis et al., IEEE Transactions on Haptics 10(3), 2017.]
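One simple way to drive a skin-stretch tactor is to map the virtual tangential (friction) force to a lateral tactor displacement, saturating at the device's travel. This is a sketch under assumed gains and limits; the function name and constants are hypothetical.

```python
import math

def stretch_command(tangential_force, gain_mm_per_n=1.5, max_stretch_mm=2.0):
    """Map a 2-D virtual tangential force (N) to a lateral tactor
    displacement (mm) for skin-stretch display. Gain and travel
    limit are illustrative placeholders."""
    fx, fy = tangential_force
    dx, dy = gain_mm_per_n * fx, gain_mm_per_n * fy
    mag = math.hypot(dx, dy)
    if mag > max_stretch_mm:            # saturate at the device limit,
        scale = max_stretch_mm / mag    # preserving stretch direction
        dx, dy = dx * scale, dy * scale
    return dx, dy
```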

SLIDE 16

Vibration

textures, materials, caress [Haptuator Planar, Tactile Labs, Montréal, QC]
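A common model for rendering contact and material sensations with a vibrotactile actuator like this is a decaying sinusoid transient whose amplitude scales with impact speed. The sketch below is illustrative; the frequency, decay rate, and amplitude gain are assumed constants, not parameters of the Haptuator.

```python
import math

def contact_transient(t, impact_speed, freq_hz=150.0, decay=60.0):
    """Acceleration sample (at time t seconds after contact) of a
    decaying-sinusoid transient, a common vibrotactile model for
    contact and material rendering. Constants are illustrative."""
    if t < 0.0:
        return 0.0                       # no output before contact
    amplitude = 2.0 * impact_speed       # faster impact -> stronger transient
    return amplitude * math.exp(-decay * t) * math.sin(2.0 * math.pi * freq_hz * t)
```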

SLIDE 17

Kinaesthetic Stimuli

can be fully or partially actuated… [Maestro Hand Exoskeleton, ReNeu Robotics Lab, UT Austin]

SLIDE 18

Kinaesthetic Stimuli

… or completely passive [From I. Choi et al., Proc. IEEE/RSJ IROS, 2016.]

SLIDE 19

Haptic Gloves

with vibrotactile actuation [From Y. Kim et al., Proc. IEEE MultiMedia, 2009.]

SLIDE 20

HaptX VR Glove

SLIDE 21

Haptic Rendering

for wearable displays

SLIDE 22

Consider the Primitives…

  ✓ Normal indentation
  ✓ Lateral skin stretch
  ✓ Vibration

SLIDE 23

What about your avatar?

3-DOF 6-DOF

SLIDE 24

How many degrees of freedom?

SLIDE 25

Does “God-Object” still work?

A Constraint-based God-object Method for Haptic Display

C. B. Zilles and J. K. Salisbury
Department of Mechanical Engineering, Artificial Intelligence Laboratory
Massachusetts Institute of Technology, Cambridge, MA

Abstract

Haptic display is the process of applying forces to a human “observer” giving the sensation of touching and interacting with real physical objects. Touch is unique among the senses because it allows simultaneous exploration and manipulation of an environment.

A haptic display system has three main components. The first is the haptic interface, or display device: generally some type of electro-mechanical system able to exert controllable forces on the user with one or more degrees of freedom. The second is the object model: a mathematical representation of the object containing its shape and other properties related to the way it feels. The third component, the haptic rendering algorithm, joins the first two components to compute, in real time, the model-based forces to give the user the sensation of touching the simulated objects.

This paper focuses on a new haptic rendering algorithm for generating convincing interaction forces for objects modeled as rigid polyhedra (Fig. 1). We create a virtual model of the haptic interface, called the god-object, which conforms to the virtual environment. The haptic interface can then be servoed to this virtual model. This algorithm is extensible to other functional descriptions and lays the groundwork for displaying not only shape information, but surface properties such as friction and compliance.

1 Introduction

The process of feeling objects through a force-generating interface is familiar in the context of using teleoperator master devices to touch and interact with remotely located objects [7]. Recent interest in enabling interaction with virtual objects [1] has led us to investigate devices and algorithms which permit touch and manipulative interaction (collectively, haptic interactions) with these virtual objects. The PHANTOM haptic interface [4] permits users to feel and control the forces arising from point interactions with simulated objects. The point interaction paradigm greatly simplifies both device and algorithm development while permitting bandwidth and force fidelity that enable a surprisingly rich range of interactions. It reduces the problem of computing appropriate interaction forces (haptic rendering) to one of tracing the motion of a point among objects and generating the three force components representing the interaction with these objects. In this paper the term haptic interface point will be used to describe the endpoint location of the physical haptic interface as sensed by the encoders. This work was done with a PHANTOM-style device with a max force of 18 N (4 lbf). We have found this to be enough force to make virtual objects feel reasonably solid without saturating the motors. While exploring virtual environments most users tend to use less than 5 N (1 lbf) of force.

[Figure 1: This polygonal model of a space shuttle is made up of 616 polygons. This is an example of the complexity of objects the god-object algorithm can allow the user to touch.]


[From J. Jacobs et al., Proc. IEEE 3DUI, 2012.]
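The core idea of the god-object method can be sketched in a few lines: keep a proxy point that tracks the haptic interface point (HIP) but never passes through surfaces, and render a spring force pulling the HIP toward the proxy. The sketch below uses sequential projection onto active plane constraints, a simplification of the constrained minimization (via Lagrange multipliers) in the paper; all names and the gain are illustrative.

```python
def god_object_step(hip, planes, k=500.0):
    """One update of a simplified god-object / proxy scheme.
    hip:    measured haptic interface point (x, y, z).
    planes: active constraints as (normal, point_on_plane) pairs,
            with normals pointing out of the surface.
    Returns (proxy, force): the constrained proxy position and the
    spring force pulling the HIP toward it."""
    px, py, pz = hip                       # unconstrained optimum is the HIP itself
    for (nx, ny, nz), (qx, qy, qz) in planes:
        d = (px - qx) * nx + (py - qy) * ny + (pz - qz) * nz
        if d < 0.0:                        # point is behind this plane:
            px -= d * nx                   # project it back onto the surface
            py -= d * ny
            pz -= d * nz
    force = tuple(k * (g - h) for g, h in zip((px, py, pz), hip))
    return (px, py, pz), force
```

In free space the proxy coincides with the HIP and the force is zero; when the HIP penetrates a surface, the proxy stays on it and the spring force pushes the user's hand (or, on a wearable display, drives the cutaneous primitives) outward.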

SLIDE 26

[From M. Prachyabrued & C. W. Borst, IEEE Trans. Visualization & Computer Graphics, 2016.]

SLIDE 27

SLIDE 28

SLIDE 29

Summary

  • Wearable displays may very well be the future of consumer haptics
  • Virtual and augmented reality applications have renewed the desire to touch virtual stuff
  • Many companies are now trying to develop haptic gloves
  • Still mostly vapourware as of today…
  • The rendering techniques you learned in this course can be applied to many of these novel haptic displays

  • We didn’t get into details of any specific algorithms here…
  • It’s still a very active area of modern haptics research!