Ubiquitous Computing - CPSC 581, Fall 2015



SLIDE 1

Ubiquitous Computing

CPSC 581 - Fall 2015

SLIDE 2

The Computer for the 21st Century

Mark Weiser, 1991

SLIDE 3

“The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it.”

SLIDE 4
SLIDE 5
SLIDE 6
SLIDE 7

“Specialized elements of hardware and software, connected by wires, radio waves and infrared, will be so ubiquitous that no one will notice their presence.”

SLIDE 8
  • M. Weiser, Scientific American, 1991
SLIDE 9

Two Crucial Issues

Location:

ubiquitous computers must know where they are, so that they can adapt their behaviour

Scale:

ubiquitous computers will come in different sizes, each suited to a particular task

SLIDE 10

Tabs, Pads, Boards

SLIDE 11

“This leads to our goal for initially deploying the hardware of embodied virtuality: hundreds of computers per room.”

SLIDE 12

Tabs

Expand on the usefulness of existing inch-scale computers

  • pocket calculator
  • organizer
  • etc…

SLIDE 13

Pads

Intended to be “scrap computers”, analogous to scrap paper

An “antidote to windows”

SLIDE 14

Boards

Yard-sized displays that serve a number of purposes

  • video screens
  • bulletin boards
  • bookcase
  • etc…

SLIDE 15

“The real power of the concept comes not from any one of these devices — it emerges from the interaction of all of them.”

SLIDE 16

Soooo…. are we there yet?

SLIDE 17

“…even marketing firms could make unpleasant use of the same information that makes invisible computers so convenient.”

SLIDE 18

Information overload?

SLIDE 19

“Machines that fit the human environment instead of forcing humans to enter theirs will make using a computer as refreshing as taking a walk in the woods.”

SLIDE 20

Proxemics

A brief introduction

SLIDE 21

“When you walk up to your computer, does the screen saver stop and the working windows reveal themselves? Does it even know if you are there? How hard would it be to change this?”

– Bill Buxton, 1997

SLIDE 22

Theory of Proxemics

Edward T. Hall, “The Hidden Dimension”, 1966

SLIDE 23

1) Ambient Display
2) Implicit Interaction
3) Subtle Interaction
4) Personal Interaction

Figure 2. Four interaction phases, facilitating transitions from implicit to explicit interaction techniques.

  • D. Vogel & R. Balakrishnan, ACM UIST 2004
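As a sketch, the four phases can be read as a distance-to-phase mapping. The threshold values below are illustrative assumptions, not figures from the paper:

```python
# Map a viewer's distance from a public display to one of the four
# interaction phases of Vogel & Balakrishnan. The metre thresholds
# here are invented for illustration only.
PHASES = [
    (1.0, "Personal Interaction"),    # close enough to touch
    (2.5, "Subtle Interaction"),
    (4.0, "Implicit Interaction"),
    (float("inf"), "Ambient Display"),
]

def phase_for(distance_m: float) -> str:
    """Return the first phase whose distance limit has not been passed."""
    for limit, phase in PHASES:
        if distance_m < limit:
            return phase
    return "Ambient Display"

print(phase_for(0.5))   # Personal Interaction
print(phase_for(10.0))  # Ambient Display
```

A real system would also smooth the distance signal so a person hovering near a boundary does not flicker between phases.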
SLIDE 24

Figure 1: A proxemic ecology, including a mix of people, digital surfaces, portable personal devices, and information appliances.


SLIDE 25

“While most devices are networked, actually interconnecting these devices is painful without extensive knowledge.”

SLIDE 26

“These devices are also blind to the non-computational aspects of the room — the people, other non-digital objects, the room’s semi-fixed and fixed features — all of which may affect their intended use.”

SLIDE 27

Operationalizing Proximity for Ubicomp

Five dimensions of proxemics for ubiquitous computing:

Distance, Orientation, Movement, Identity, Location

Figure 2. The five dimensions of proxemics for ubicomp.

  • S. Greenberg et al., ACM Interactions 2011
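The five dimensions can be captured in a small record type. The field types and names below are our own illustrative choices, not an API from the article:

```python
from dataclasses import dataclass

# A minimal record of the five proxemic dimensions (Greenberg et al.).
# Field names and types are illustrative assumptions.
@dataclass
class ProxemicState:
    distance: float      # metres between two entities
    orientation: float   # angle in degrees between their facing directions
    movement: float      # speed in m/s, derived from samples over time
    identity: str        # exact identity, or just an entity type
    location: str        # the physical context the entity resides in

state = ProxemicState(
    distance=1.2,
    orientation=30.0,
    movement=0.4,
    identity="person:rob",
    location="living-room",
)
```

A sensing system would update such a record continuously for every pair of tracked entities.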
SLIDE 28

Distance

We normally think of distance as a continuous measure, but it can also be discrete

Specific zones come with implications for meaning
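A sketch of discrete distance zones, using the commonly cited boundaries from Hall's theory (which vary with culture and context):

```python
# Hall's interpersonal zones, with commonly cited boundaries in metres.
ZONES = [
    (0.45, "intimate"),
    (1.2, "personal"),
    (3.6, "social"),
    (7.6, "public"),
]

def zone(distance_m: float) -> str:
    """Map a continuous distance to the first zone it falls inside."""
    for limit, name in ZONES:
        if distance_m <= limit:
            return name
    return "out of range"

print(zone(0.3))  # intimate
print(zone(2.0))  # social
```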

SLIDE 29

Orientation

Captures nuances not provided by distance alone

e.g. facing toward, somewhat toward, or away from the other object

Makes sense only if an entity has a “front face”
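A minimal sketch of an orientation test, assuming 2D positions and facing vectors; the 45-degree “facing” threshold is an arbitrary illustrative choice:

```python
import math

def facing_angle(a_pos, a_dir, b_pos):
    """Angle (degrees) between A's facing vector and the direction to B."""
    to_b = (b_pos[0] - a_pos[0], b_pos[1] - a_pos[1])
    dot = a_dir[0] * to_b[0] + a_dir[1] * to_b[1]
    norm = math.hypot(*a_dir) * math.hypot(*to_b)
    cos = max(-1.0, min(1.0, dot / norm))  # clamp for float safety
    return math.degrees(math.acos(cos))

def is_facing(a_pos, a_dir, b_pos, threshold_deg=45.0):
    """True if B lies within A's 'front face' cone."""
    return facing_angle(a_pos, a_dir, b_pos) <= threshold_deg

# A at the origin facing +x; B directly ahead
print(is_facing((0, 0), (1, 0), (2, 0)))   # True
print(is_facing((0, 0), (1, 0), (-2, 0)))  # False
```

Widening or narrowing the cone gives the “somewhat toward” gradations mentioned above.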

SLIDE 30

Identity

Can be a detailed measure

e.g. exact identity and attributes

Or less detailed

e.g. entity’s type

SLIDE 31

Movement

Captures the distance and orientation of an entity over time

e.g. speed, turning, etc.
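A sketch of deriving movement from two timed samples of position and heading; the function name and sample format are our own:

```python
import math

def movement(p0, heading0, t0, p1, heading1, t1):
    """Derive speed (m/s) and turning rate (deg/s) from two timed samples.

    p0, p1 are (x, y) positions in metres; headings are in degrees;
    t0, t1 are timestamps in seconds.
    """
    dt = t1 - t0
    speed = math.hypot(p1[0] - p0[0], p1[1] - p0[1]) / dt
    turn = (heading1 - heading0) / dt
    return speed, turn

# Entity moved 2 m along x and turned 30 degrees over one second
speed, turn = movement((0, 0), 0.0, 0.0, (2, 0), 30.0, 1.0)
print(speed, turn)  # 2.0 30.0
```

Real trackers would filter over many samples rather than differencing just two.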

SLIDE 32

Location

Describes the physical context in which the entity resides

e.g. particular room and its characteristics

Meaning applied to the four other measures depends on contextual location

SLIDE 33

Examples

SLIDE 34

The Social Surface: The Proxemic Face

Information and Controls in Hand: Proxemic Presenter

Figure 5: The Proxemic Face as a social entity. (a) The lonely proxemic face. (b) It sees Rob come in and greets him. (c) It looks at Rob when Rob looks at him (d) but is saddened when Rob looks away. (e) Initially fascinated by the flashlight beam, it is annoyed when Rob pokes it in the eye. (f) Rob is a bit too close for comfort.

Prototyping Proxemic Interactions: The Proximity Toolkit

There are many ways to capture proximity data. Methods include sensors, vision and scene analysis, motion capture via tags, time-of-flight measures, instrumented rooms, depth sensors, and others. No method is yet perfect, as there is a trade-off between important factors such as data accuracy, the type of information returned, equipment costs, difficulty of configuration, and the amount of custom coding required to exploit the returned information effectively. Because we wanted to concentrate on the design of proxemic interactions instead of the underlying plumbing, we built the Proximity Toolkit. Currently based on the expensive Vicon motion capture system, it tracks particular objects (via markers) and their proximity relationships with each other. From that, we generate highly accurate distance, orientation, identity, and movement information as a series of easy-to-program events. Additional information processed from this data is also returned as events, such as the intersection ray of one object facing toward another object, or whether one object has “collided” with another object by crossing a distance threshold.

Programming with these events is straightforward. We found that computer science students, after just an hour of training, could construct simple but quite interesting proximity-aware applications in a very short amount of time (a day or two). Figure 13 illustrates one of the controls in this toolkit, where it is displaying the current state of the living room ecology described in previous systems. The figure shows the fixed and semi-fixed features of the room (the room boundaries, the couch, side table, bookcase, and displays). It also dynamically shows the several moving entities in the room and their orientation (a wand and the person by his hat), and that the person is touching the display. Programmatically, it
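The event style described in the excerpt can be sketched as a tiny threshold-crossing monitor; the class and callback names are illustrative, not the actual Proximity Toolkit API:

```python
# A toy event dispatcher in the spirit of the excerpt: fire an
# "on_collide" callback once when the distance between two tracked
# objects first crosses a threshold. Not the real Proximity Toolkit API.
class ProximityMonitor:
    def __init__(self, threshold_m, on_collide):
        self.threshold = threshold_m
        self.on_collide = on_collide
        self.inside = False  # whether we are currently within the threshold

    def update(self, distance_m):
        """Feed one distance sample; fire the callback only on entry."""
        now_inside = distance_m < self.threshold
        if now_inside and not self.inside:
            self.on_collide(distance_m)
        self.inside = now_inside

events = []
mon = ProximityMonitor(1.0, events.append)
for d in (3.0, 2.0, 0.8, 0.5, 1.5):
    mon.update(d)
print(events)  # [0.8] — one event per threshold crossing, not per sample
```

Tracking the inside/outside state is what turns a stream of distance samples into discrete, easy-to-program events.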



SLIDE 35
SLIDE 36
SLIDE 37

Challenges

Assumes that a set of rules of behaviour exists to dictate what an entity should do

There will always be many cases where applying the rule in a particular instance is the wrong thing to do!

SLIDE 38

Project 3:

Designing with Proxemics!