Steerable Interfaces for Pervasive Computing Spaces - PowerPoint PPT Presentation



SLIDE 1

Steerable Interfaces for Pervasive Computing Spaces

Gopal Pingali, Claudio Pinhanez, Anthony Levas, Rick Kjeldsen, Mark Podlaseck, Han Chen, Noi Sukaviriya

IBM T.J. Watson Research Center, Hawthorne, New York, USA

SLIDE 2

Talk Overview

• Motivation/definition of steerable interfaces
• Overview of key technologies
• Demonstration of prototype smart retail environment
• Conclusions and discussion

SLIDE 3

Motivation

Can a display appear wherever the user needs it, rather than the user being enslaved to display devices (monitors, televisions, etc.)?

SLIDE 4

Steerable Interfaces for Pervasive Computing Environments

Steerable Interface: an interface to computing that can move around a physical environment to appear on ordinary objects or surfaces, or even in empty space.

Characteristics:

• Movable output interface
• Movable input interface
• Adaptation to user context
• Adaptation to environmental context
• Device-free interaction
• Natural interaction

Applications:

• Information access in public spaces -- retail, transportation
• Instruction and information delivery in military training centers, ships, aircraft
• Deviceless interaction in hazardous environments
• Augmented reality in hospitals
• Interactive real-world games

SLIDE 5

Steerable Interfaces -- Key Technologies

[Architecture diagram: a shared WORLD MODEL (3D geometry, user location, display zones, calibration parameters, system state) is accessed through an API by the component technologies -- steerable projection, steerable vision, steerable microphones, steerable audio, environment modeling, calibration, geometric reasoning, and user localization]

SLIDE 6

Steerable Projection: The Everywhere Display

A device that creates an interactive projected display on any surface in an environment.

Prototype: bright LCD projector, rotating mirror, pan/tilt/zoom camera.

SLIDE 7

Oblique Projection Distortion

[Figure: a display image projected obliquely onto a surface appears distorted to the viewer]

SLIDE 8

Correction of Oblique Projection Distortion

[Figure: the display image is pre-warped so the oblique projection appears undistorted]

Goal: the projected image is free of distortion when viewed by an "orthogonal" viewer.
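The correction above can be sketched as estimating a 3x3 homography between the desired rectilinear image and where the oblique projection actually lands on the surface, then pre-warping the source image with its inverse. A minimal sketch, assuming four known corner correspondences (the `dlt_homography` helper and the corner values are illustrative, not taken from the system):

```python
import numpy as np

def dlt_homography(src, dst):
    """Direct Linear Transform: 3x3 homography mapping src points to dst points."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.array(rows, dtype=float))
    return vt[-1].reshape(3, 3)  # null-space vector of the DLT system, as H

def apply_h(H, pt):
    """Apply homography H to a 2D point (with homogeneous normalization)."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]

# Corners of the desired rectilinear display, and where the oblique
# projection actually places them on the surface (illustrative values).
desired = [(0, 0), (1, 0), (1, 1), (0, 1)]
observed = [(0.0, 0.0), (2.0, 0.1), (2.3, 1.4), (-0.2, 1.0)]
H = dlt_homography(desired, observed)
# Resampling the source image through the inverse of H cancels the distortion.
H_inv = np.linalg.inv(H)
```

With four exact, non-degenerate correspondences the DLT system has a one-dimensional null space, so the last right singular vector recovers H up to scale.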

SLIDE 9

Projector/Camera Calibration

• Computation of homography H/Hv given location of corners on a display surface
• Need expected hand size in image and direction of entry for interaction
• Manual specification of focus, pan, tilt, zoom

SLIDE 10

Integration into Office-of-the-Future: BlueSpace at IBM T.J. Watson

[Figure: BlueSpace office components -- ceiling lights, task lights, status lights, OfficeFront display, primary display, InfoPanel display, Everywhere Display, personal air & heating, wireless pressure sensor, environment sensors]

SLIDE 11

Scenarios in BlueSpace

• Team work support
• Personalized wall display
• Email notification on desk
• Video conferencing anywhere

SLIDE 12

Steerable Gesture Recognition for Input

• Camera steered to projected display
• Motion detected in camera image stream
• Motion data matched to fingertip template

Projected light changes appearance of hand!
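The motion-then-match pipeline above can be sketched with frame differencing followed by an SSD template match (the threshold, sizes, and synthetic frames below are illustrative assumptions, not the system's actual tracker):

```python
import numpy as np

def detect_motion(prev_frame, curr_frame, thresh=30):
    """Binary motion mask via absolute frame differencing."""
    diff = np.abs(curr_frame.astype(int) - prev_frame.astype(int))
    return diff > thresh

def match_template(image, template):
    """Exhaustive sum-of-squared-differences match; returns best (row, col)."""
    h, w = template.shape
    best_score, best_pos = np.inf, None
    for r in range(image.shape[0] - h + 1):
        for c in range(image.shape[1] - w + 1):
            ssd = np.sum((image[r:r + h, c:c + w] - template) ** 2)
            if ssd < best_score:
                best_score, best_pos = ssd, (r, c)
    return best_pos

# Synthetic example: a bright "fingertip" blob appears between two frames.
prev = np.zeros((20, 20), dtype=np.uint8)
curr = prev.copy()
curr[10:13, 10:13] = 200
mask = detect_motion(prev, curr)
tip = match_template(curr.astype(float), np.full((3, 3), 200.0))
```

Relying on motion rather than raw appearance is one way to cope with the fact that projected light changes how the hand looks to the camera.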

SLIDE 13

Surface Independent Interface Specification

• Independent specification of what and where
• Dynamic configurations of widgets
• Map configuration to surface at run time

[Diagram: a widget configuration in application space is mapped to image space for surface A or surface B]
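Binding the same abstract layout to different surfaces at run time can be sketched as rescaling normalized widget rectangles into a surface's region of image space (the rectangle conventions and names here are illustrative):

```python
def to_surface(widget_rect, surface_rect):
    """Map a widget rect (x, y, w, h) given in normalized application
    space [0, 1]^2 into pixel coordinates of a surface's image region."""
    nx, ny, nw, nh = widget_rect
    sx, sy, sw, sh = surface_rect
    return (sx + nx * sw, sy + ny * sh, nw * sw, nh * sh)

# One layout, two surfaces: the "what" stays fixed, the "where" is bound late.
layout = {"ok_button": (0.25, 0.5, 0.5, 0.25)}
on_surface_a = {k: to_surface(r, (100, 200, 400, 200)) for k, r in layout.items()}
on_surface_b = {k: to_surface(r, (0, 0, 160, 120)) for k, r in layout.items()}
```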

SLIDE 14

Widget Architecture

[Diagram: component types -- Image Processing (Motion Detection, Change Detection, Fingertip Tracking, Touch Motion Detector, In-Region Detection), Motion Analysis, Instance Data, Surface Transformation (ST), Image Region Definition (IRD), Event Generation (EG), Return Value Transformation -- assembled over system functions operating on the image into widgets such as Touch Button and Tracking Area]

• Good high-level abstraction
• Efficient widget combination
• Flexible widget definition
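One way to read the architecture above is as widgets assembled from chains of small components. A sketch under that assumption (the class names follow the slide's ST/IRD/EG labels, but the composition and the translation-only surface transform are illustrative):

```python
class Widget:
    """A widget is a chain of components; each stage transforms the data,
    and returning None rejects the input (no event is generated)."""
    def __init__(self, components):
        self.components = components

    def process(self, data):
        for component in self.components:
            data = component(data)
            if data is None:
                return None
        return data

def surface_transformation(origin):
    # Map image coordinates into widget-local surface coordinates
    # (a plain translation here; the real system would warp via calibration).
    return lambda pt: (pt[0] - origin[0], pt[1] - origin[1])

def in_region_detection(width, height):
    # Pass the point through only if it falls inside the widget's region.
    return lambda pt: pt if 0 <= pt[0] < width and 0 <= pt[1] < height else None

def event_generation(name):
    # Turn an accepted point into an event record.
    return lambda pt: {"event": name, "at": pt}

# A "touch button" assembled from ST -> IRD -> EG components.
touch_button = Widget([
    surface_transformation((100, 50)),
    in_region_detection(40, 20),
    event_generation("touch"),
])
```

Composing widgets from shared components is what makes the combination efficient: two widgets on the same surface can reuse the same ST and image-processing stages.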

SLIDE 15

First Prototype of Steerable Interfaces (Siggraph, Aug 2001)

[Photos: invitation board at entrance; color selection; interactive cans with M&Ms; picture in progress -- placement, finger painting; completed picture]

SLIDE 16

User Experience at Siggraph 2001

[Video: SIGGRAPH01.avi]

SLIDE 17

Interactive Applications Anywhere

Various interactive applications can be dynamically moved to different surfaces, e.g. MediaPlayer, PowerPoint presentations, Web browser applications, etc.

SLIDE 18

User-following Display

Goal: automatically steer the display to the appropriate surface at the appropriate position, size, and orientation, based on user location.

Enabling Technologies:

• 1. 3D model of the environment
• 2. 3D calibration of projector(s), camera(s)
• 3. Dynamic localization of the user(s)
• 4. Geometric reasoning engine for: virtual projection, virtual imaging, virtual steering, occlusion detection, computation of warping function

SLIDE 19

1. 3D Environment Modeling

VRML model of environment:

• Based on real-world measurements
SLIDE 20

Modeling toolkit

• Basic 3D modeling software supporting simple geometries (planes, cubes) and import of more complex geometries
• Provides annotations for all objects -- semantics can be attached to objects
• Includes "projectors", "cameras", and "PJ surfaces"
• Model stored in XML format (objects are tags and annotations are attributes)
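A model stored that way might look like the fragment below; the element and attribute names are invented for illustration, and only the objects-as-tags, annotations-as-attributes convention comes from the slide:

```python
import xml.etree.ElementTree as ET

# Hypothetical world-model fragment: objects are tags, annotations are attributes.
MODEL_XML = """
<model>
  <surface id="wall_east" width="2.0" height="1.5" projectable="true"/>
  <surface id="desk" width="1.2" height="0.8" projectable="true"/>
  <projector id="ed1" pan="30" tilt="-10"/>
  <camera id="cam1" pan="0" tilt="-20"/>
</model>
"""

root = ET.fromstring(MODEL_XML)
# Collect each surface's annotations, keyed by its id.
surfaces = {el.get("id"): el.attrib for el in root.iter("surface")}
projectable = [sid for sid, attrs in surfaces.items() if attrs["projectable"] == "true"]
```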

SLIDE 21

2. 3D Calibration: The Link Between the Virtual and the Real

3D Camera Calibration

• Use correspondence between points in the 3D model (known points in the environment) and points in the camera image plane
• Error in image-to-world mapping within 25 mm (maximum)

3D Projector Calibration

• Determine effective location and orientation of projector
• Error within 5 mm (coplanar) and 15 mm (non-coplanar) in image-to-world mapping

SLIDE 22

3. User Localization

Real-time tracking of user position:

• Use image/background differencing with morphological filtering to detect motion regions
• Extract bounding contours of resulting regions
• Check for presence of a head by analyzing extrema of curvature on the contour and circularity
• Track detected head
• Triangulate detections from multiple cameras to determine 3D position
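The final triangulation step can be sketched as taking the midpoint of the shortest segment between the two viewing rays (the camera origins and head position below are illustrative; real rays would come from the 3D camera calibration):

```python
import numpy as np

def triangulate(o1, d1, o2, d2):
    """Midpoint of the shortest segment between rays o1 + t*d1 and o2 + s*d2."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    w0 = o1 - o2
    b = d1 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = 1.0 - b * b          # zero only when the rays are parallel
    t = (b * e - d) / denom      # parameter along ray 1
    s = (e - b * d) / denom      # parameter along ray 2
    return (o1 + t * d1 + o2 + s * d2) / 2.0

# Two calibrated cameras both detect the head; intersect the viewing rays.
head = np.array([1.0, 2.0, 3.0])
cam1 = np.array([0.0, 0.0, 0.0])
cam2 = np.array([5.0, 0.0, 0.0])
estimate = triangulate(cam1, head - cam1, cam2, head - cam2)
```

With noisy detections the two rays are skew rather than intersecting, which is why the midpoint of the closest-approach segment, not an exact intersection, is used.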

SLIDE 23

4. Geometric Reasoning

Input: 3D model, user position, display zones, ED calibration parameters
Output: information on every display zone:

• physical size
• user proximity
• orientation of zone w.r.t. user
• percentage of occlusion of a display zone
• occlusion mask (a bitmap indicating occlusion)
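Computing that per-zone report from an occlusion bitmap and the user's position might look like the following sketch (the report fields and the simple planar-zone geometry are illustrative assumptions):

```python
import numpy as np

def zone_report(occlusion_mask, zone_center, zone_normal, zone_size, user_pos):
    """Summarize one display zone for the geometric reasoning output."""
    to_user = user_pos - zone_center
    distance = np.linalg.norm(to_user)
    # Cosine of the angle between the zone's normal and the direction to the
    # user: 1.0 means the zone faces the user head-on.
    facing = float(zone_normal @ (to_user / distance))
    return {
        "physical_size": zone_size,                  # (width, height) in metres
        "user_proximity": float(distance),
        "orientation_cos": facing,
        "occlusion_pct": 100.0 * float(occlusion_mask.mean()),
    }

# Toy example: the top half of a 4x4 occlusion bitmap is blocked, and the
# user stands 2 m straight in front of the zone.
mask = np.zeros((4, 4), dtype=bool)
mask[:2, :] = True
report = zone_report(mask, np.zeros(3), np.array([0.0, 0.0, 1.0]),
                     (2.0, 1.5), np.array([0.0, 0.0, 2.0]))
```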
SLIDE 24

Tying it all together: User-Following Display

• Pre-define a set of display zones in relation to environment model
• Calibrate projector in 3D for each display zone
• Locate user by tracking with pre-calibrated camera
• Map user position to 3D environment
• Select display zone based on user location -- after occlusion check
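The selection step above can be sketched as filtering zones by the occlusion check and then picking the one nearest the user (the zone records and the 20% occlusion cutoff are assumptions for illustration):

```python
import numpy as np

def select_zone(zones, user_pos, max_occlusion_pct=20.0):
    """Pick the nearest display zone that passes the occlusion check."""
    visible = [z for z in zones if z["occlusion_pct"] <= max_occlusion_pct]
    if not visible:
        return None  # no zone is usable; the display stays where it is
    return min(visible, key=lambda z: np.linalg.norm(z["center"] - user_pos))

# Illustrative zone records built from the geometric reasoning output.
zones = [
    {"name": "wall_east", "center": np.array([4.0, 1.5, 1.0]), "occlusion_pct": 5.0},
    {"name": "desk",      "center": np.array([1.0, 0.8, 0.5]), "occlusion_pct": 60.0},
    {"name": "wall_west", "center": np.array([0.0, 1.5, 1.0]), "occlusion_pct": 0.0},
]
chosen = select_zone(zones, user_pos=np.array([1.0, 1.0, 1.0]))
```

Here the desk is closest but mostly occluded, so the west wall wins.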

SLIDE 25

Video: User-Following Display

SLIDE 26

SLIDE 27

Demo: Everywhere Sales Associate

Technical Goal: adaptation of interface to user and environmental context.

• Interactive help anywhere in a store
• Dynamic placement of display close to the user
• Adaptation of display content/interaction mode
• Detection and correction for occlusion of display

Examples:

• Ubiquitous Product Finder
• Adaptive Signage
• Interactive Product Bins

SLIDE 28

Video: Product Finder - Table

SLIDE 29

Video: Product Finder - Wall (1)

SLIDE 30

Video: Product Finder - Wall (2)

SLIDE 31

Video: Interactive Signage

SLIDE 32

Video: Clothing Bins (distance)

SLIDE 33

Video: Clothing Bins (interactions)

SLIDE 34

Conclusions and Future Directions

• Introduced and demonstrated feasibility of a new class of steerable interfaces
• Adds a new dimension to the vision of "computing woven into space" without wiring the user/objects

Future Directions: greater multimodality, adaptation, context awareness

• Steerable audio-visual interface
• Underlying distributed architecture for context awareness and adaptation
• Adaptation to surface characteristics
• Multi-user/collaborative situations
• Advanced calibration/model capture
• "Natural" interaction!