SLIDE 1

User interface technology

CS 347 Michael Bernstein

SLIDE 2

Announcements

Articulating Contributions in HCI due today
Project Brainstorm due next Friday
Today at the end of class: team mixer
Reminder: methods lecture on Wednesday, and stats session next Wednesday @ 6pm in Littlefield 103.

If you don’t know how to do a t-test, a one-way or two-way ANOVA, or a chi-square test, or how to write up the results and effect size for a paper, join!

SLIDE 3

Course Overview

week 1: Intro to Interaction; Intro to Social Computing
week 2: Intro to Design; Interaction
week 3: Methods; Interaction
week 4: Social Computing
week 5: Design
week 6: AI+HCI; Media
week 7: Accessibility; ICT4D
week 8: Foundations; Cognition
week 9: Collaboration; Programming
week 10: Visualization; Critiques of HCI

INTRO DEPTH BREADTH

SLIDE 4

Recall: ubiquitous computing

“The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it.” — Mark Weiser In contrast to visions of machines everywhere, Weiser advocated a vision of calm computing where computing receded into the background.

SLIDE 5

How to achieve this vision?


Interaction
UI Technology (“UIST”): input, output, and interaction modalities
Ubiquitous computing (“Ubicomp”): integration into life and into the lived environment

SLIDE 6

User interface technology

How can the user interact fluidly with the world around them?

New input modalities: e.g., radar, acoustics
New output modalities: e.g., fabrication, swarm robots
New user vocabulary: e.g., voice, gestures

This research is often driven by, or involves the creation of, new hardware

SLIDE 7

Recall: tangible computing

You read this

[Ishii and Ullmer 1997]

SLIDE 8

What makes an interface tangible?

You read this

Ubicomp: integrated into the environment
Tangible: a stronger claim. Input and output are both in the physical world and are manipulable in the physical world, not purely on a screen.

Activity sensing watch: ubiquitous but not tangible
IoT fridge: neither (likely)

[Follmer et al. 2013]

SLIDE 9

[Ishii, Mazalek, Lee 2001]

SLIDE 10

Input: sensor-driven interaction

SLIDE 11

Goals

How might people provide more fluent and effective input to interactive systems?

Typical approaches:
Come up with new signals
Find new ways to recombine known signals

Always: demonstrate the technique in compelling interaction scenarios

SLIDE 12
Bolt. “Put-that-there”: Voice and gesture at the graphics interface. SIGGRAPH ’80.

SLIDE 13

Put That There

Contribution: combined gesture and voice input

In a closed world
With a toy goal
Using simple manipulation operations
Using a laser attached to the wrist

In many ways, our goal since 1980 has been to relax those assumptions
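The fusion idea at the heart of Put That There, binding a spoken deictic word to wherever the user is pointing at the moment it is uttered, can be sketched in a few lines. The event model and names here are hypothetical illustrations, not Bolt’s implementation:

```python
from dataclasses import dataclass

@dataclass
class Event:
    time: float
    kind: str      # "speech" or "point"
    value: object  # a word, or an (x, y) pointed location

def resolve_deictics(events):
    """Replace each spoken deictic ("that", "there") with the most
    recently pointed location at the time it was uttered."""
    command, last_point = [], None
    for e in sorted(events, key=lambda e: e.time):
        if e.kind == "point":
            last_point = e.value          # remember where the user points
        elif e.value in ("that", "there"):
            command.append(last_point)    # bind the deictic to that location
        else:
            command.append(e.value)       # ordinary words pass through
    return command

events = [
    Event(0.0, "speech", "put"),
    Event(0.4, "point", (120, 80)),    # pointing at the circle
    Event(0.5, "speech", "that"),
    Event(1.1, "point", (300, 200)),   # pointing at the destination
    Event(1.2, "speech", "there"),
]
print(resolve_deictics(events))  # ['put', (120, 80), (300, 200)]
```

Relaxing the slide’s assumptions largely means making this binding work outside a closed world: open vocabularies, noisy recognizers, and unconstrained pointing.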

SLIDE 14

Looks a bit like Harry Potter...

Wellner. Interacting with paper on the DigitalDesk. CACM ‘93.
SLIDE 15

DigitalDesk

Contribution: fluid boundaries between digital and physical objects

In a constrained space
On a small set of tasks
With predefined behaviors

Again, we work to relax these assumptions

SLIDE 16


Dietz and Leigh. DiamondTouch: a multi-user touch technology. UIST ’01.

SLIDE 17

General operating principle


Pipeline (derived from [Saponas et al. 2009]): a k-millisecond sample window from each of N sensors → feature extraction → machine learning model (with optional user-specific fine-tuning) → classification.

Features, e.g.: Root Mean Square (RMS) ratios between channels, frequency band z-scores, derivatives, FFTs, etc. [Laput et al. 2015; Laput et al. 2016; Saponas et al. 2009]
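A minimal sketch of this kind of pipeline, with hypothetical features and a stand-in nearest-centroid classifier rather than any specific paper’s model:

```python
import numpy as np

def extract_features(window):
    """window: (n_sensors, n_samples) array, one k-ms sample window.
    Returns a flat feature vector (hypothetical feature set)."""
    rms = np.sqrt(np.mean(window ** 2, axis=1))            # per-channel RMS
    rms_ratios = rms / (rms.sum() + 1e-9)                  # RMS ratios between channels
    spectrum = np.abs(np.fft.rfft(window, axis=1))         # per-channel FFT magnitudes
    band = spectrum[:, 1:5].mean(axis=1)                   # one frequency band per channel
    band_z = (band - band.mean()) / (band.std() + 1e-9)    # frequency-band z-score
    deriv = np.abs(np.diff(window, axis=1)).mean(axis=1)   # mean absolute derivative
    return np.concatenate([rms_ratios, band_z, deriv])

class NearestCentroid:
    """Stand-in for the ML model: classify by nearest class centroid."""
    def fit(self, X, y):
        self.labels = sorted(set(y))
        self.centroids = {c: X[np.array(y) == c].mean(axis=0) for c in self.labels}
        return self
    def predict(self, x):
        return min(self.labels, key=lambda c: np.linalg.norm(x - self.centroids[c]))

# Train on synthetic windows for two gestures, then classify a new window.
rng = np.random.default_rng(0)
def make_window(gesture):
    base = rng.normal(0, 0.1, (4, 64))                     # 4 sensors, 64 samples
    if gesture == "tap":
        base[0] += np.sin(np.linspace(0, 8 * np.pi, 64))   # strong channel-0 oscillation
    return base

X = np.array([extract_features(make_window(g)) for g in ["tap"] * 10 + ["rest"] * 10])
y = ["tap"] * 10 + ["rest"] * 10
model = NearestCentroid().fit(X, y)
print(model.predict(extract_features(make_window("tap"))))  # tap
```

User-specific fine-tuning in the diagram corresponds to refitting (or adapting) the model on a few labeled windows from the new user.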
slide-18
SLIDE 18


Lien, Jaime, et al. "Soli: Ubiquitous gesture sensing with millimeter wave radar." ACM Transactions on Graphics (TOG) 35.4 (2016): 142.

SLIDE 19

Harrison et al. “Skinput: Appropriating the body as an input surface.” CHI 2010.

SLIDE 20

EM-Sense


Laput, G. et al. 2015. EM-Sense: Touch Recognition of Uninstrumented, Electrical and Electromechanical Objects. UIST ’15.

SLIDE 21

Output: changing the world

No, the other changing the world.

SLIDE 22

How can we make the environment reactive?

We can make pixels dance — how do we make atoms dance too? What is a minimal instrumentation of the environment that we can perform to produce feedback that is as expressive as possible?


SLIDE 23

Recall: inFORM

SLIDE 24

Le Goc et al. “Zooids: Building Blocks for Swarm User Interfaces”. UIST 2016.

You read this

SLIDE 25

Mueller, Kruck and Baudisch. LaserOrigami: laser-cutting 3D objects. UIST 2012.

SLIDE 26

Output: changing the virtual world

SLIDE 27

Virtual experiences with IRL constraints

Physics, and our own perceptual systems, impose constraints on what can be believably conveyed. Research in AR, VR, and mixed reality often seeks to push the boundaries of the realism of that experience.


SLIDE 28

Orts-Escolano et al. “Holoportation: Virtual 3D Teleportation in Real-Time”. UIST 2016.

SLIDE 29

Traxion: perceived forces

[Rekimoto 2013]

Creates a haptic sensation without mechanical links to the ground
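The underlying trick is asymmetric vibration: a brief, strong push in one direction followed by a slow, gentle return that the skin barely registers, which is perceived as a net pull. A drive-signal sketch of that idea follows; the shape and parameter values are illustrative assumptions, not the paper’s actual waveform:

```python
import numpy as np

def asymmetric_pulse(n_samples=200, sharp_fraction=0.2):
    """One period of an asymmetric actuator drive signal: a fast ramp
    up (the perceptually salient push) followed by a slow ramp down
    (below the skin's detection threshold). Zero-mean, so each cycle
    produces no net actuator displacement."""
    n_sharp = int(n_samples * sharp_fraction)
    n_slow = n_samples - n_sharp
    push = np.linspace(0.0, 1.0, n_sharp)   # fast ramp: felt as a pull
    ret = np.linspace(1.0, 0.0, n_slow)     # slow ramp: largely unfelt
    wave = np.concatenate([push, ret])
    return wave - wave.mean()               # remove DC so the cycle nets to zero

wave = asymmetric_pulse()
```

Repeating this cycle at a suitable rate is what yields a continuous perceived force without any mechanical link to the ground.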

SLIDE 30

Haptic Retargeting

[Azmandian et al. 2016]

Use “perceptual hacks” to make a single cube appear multiplied
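One version of the trick, body warping, gradually offsets the rendered hand so that reaching for any of the virtual cubes physically lands on the single real one. A simplified sketch with linear blending follows; the function names and the blending rule are my assumptions, not the paper’s exact formulation:

```python
import numpy as np

def warped_hand(hand, start, physical_target, virtual_target):
    """Blend a warp offset into the rendered hand position. As the
    real hand moves from `start` toward `physical_target`, progress
    goes 0 -> 1 and the rendered hand is shifted so it arrives at
    `virtual_target` exactly when the real hand reaches the prop."""
    total = np.linalg.norm(physical_target - start)
    traveled = np.linalg.norm(hand - start)
    progress = np.clip(traveled / total, 0.0, 1.0)
    offset = virtual_target - physical_target
    return hand + progress * offset

start = np.array([0.0, 0.0, 0.0])
physical = np.array([0.0, 0.0, 1.0])   # the one real cube
virtual = np.array([0.3, 0.0, 1.0])    # the virtual cube currently reached for

print(warped_hand(start, start, physical, virtual))     # no warp at the start
print(warped_hand(physical, start, physical, virtual))  # rendered hand on the virtual cube
```

Because the offset grows gradually over the reach, the mismatch between seen and felt hand position stays below what users notice.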

SLIDE 31

Discussion

Find today’s discussion room at http://hci.st/room