  1. User interface technology CS 347 Michael Bernstein

  2. Announcements. Articulating Contributions in HCI due today. Project Brainstorm due next Friday. Today at the end of class: team mixer. Reminder: methods lecture on Wednesday, and stats session next Wednesday @ 6pm in Littlefield 103. If you don’t know how to do a t-test, one-way and two-way ANOVA, or chi-square, or how to write up the results and effect size for a paper, join!

  3. Course Overview. Week 1: Intro to Interaction; Intro to Social Computing (INTRO). Week 2: Intro to Design; Interaction. Week 3: Methods; Interaction (DEPTH). Week 4: Social Computing. Week 5: Design. Week 6: AI+HCI; Media. Week 7: Accessibility; ICT4D (BREADTH). Week 8: Foundations; Cognition. Week 9: Collaboration; Programming. Week 10: Visualization; Critiques of HCI.

  4. Recall: ubiquitous computing. “The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it.” — Mark Weiser. In contrast to visions of machines everywhere, Weiser advocated a vision of calm computing where computing receded into the background.

  5. How to achieve this vision? Interaction: UI Technology (“UIST”) covers input, output, and interaction modalities; Ubiquitous computing (“Ubicomp”) covers integration into life and into the lived environment.

  6. User interface technology. How can the user interact fluidly with the world around them? New input modalities: e.g., radar, acoustics. New output modalities: e.g., fabrication, swarm robots. New user vocabulary: e.g., voice, gestures. This research is often driven by, or involves the creation of, new hardware.

  7. Recall: tangible computing (YOU READ THIS) [Ishii and Ullmer 1997]

  8. What makes an interface tangible? (YOU READ THIS) Ubicomp: integrated into the environment. Tangible: a stronger claim; input and output are both in the physical world and are manipulable in the physical world, not purely on a screen. Activity sensing watch: ubiquitous but not tangible [Follmer et al. 2013]. IoT fridge: neither (likely).

  9. [Ishii, Mazalek, Lee 2001]

  10. Input: sensor-driven interaction

  11. Goals: How might people provide more fluent and effective input to interactive systems? Typical approaches: come up with new signals; find new ways to recombine known signals. Always: demonstrate the technique in compelling interaction scenarios.

  12. Bolt. “Put-that-there”: voice and gesture at the graphics interface. SIGGRAPH ’80.

  13. Put That There. Contribution: combined gesture and voice input, in a closed world, with a toy goal, using simple manipulation operations, using a laser attached to the wrist. In many ways, our goal since 1980 has been to relax those assumptions.
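
To make the contribution concrete, the gesture-plus-voice fusion can be sketched as binding each deictic word to whatever the pointer is aimed at when that word is spoken. This is a toy reconstruction for illustration, not Bolt’s implementation; the word handling and the pointing lookup are assumptions.

    # Toy sketch of Put-that-there-style fusion (illustration only; not Bolt's
    # implementation -- the word handling and pointing lookup are assumptions).
    def fuse(spoken_words, pointed_at):
        """spoken_words: recognized words, in order; pointed_at: what the wrist
        pointer was aimed at when each word was spoken (parallel lists)."""
        selected, destination = None, None
        for word, target in zip(spoken_words, pointed_at):
            if word == "that":       # deictic word: snapshot the pointed-at object
                selected = target
            elif word == "there":    # deictic word: snapshot the pointed-at location
                destination = target
        if selected is not None and destination is not None:
            return f"move {selected} to {destination}"
        return None

    print(fuse(["put", "that", "there"], [None, "blue circle", "(0.7, 0.2)"]))
    # -> move blue circle to (0.7, 0.2)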

  14. Looks a bit like Harry Potter... Wellner. Interacting with paper on the DigitalDesk. CACM ’93.

  15. DigitalDesk. Contribution: fluid boundaries between digital and physical objects, in a constrained space, on a small set of tasks, with predefined behaviors. Again, we work to relax these assumptions.

  16. Dietz and Leigh. DiamondTouch: a multi-user touch technology. UIST ’01.

  17. General operating principle, derived from [Saponas et al. 2009]. Pipeline: sensors (×N) [Laput et al. 2015; Laput et al. 2016; Saponas et al. 2009] → a k-millisecond sample → features, e.g., root mean square (RMS), ratios between channels, frequency band z-scores, derivatives, FFTs, etc. → machine learning classification model → user-specific fine-tuning (optional).
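
To make the pipeline concrete, here is a minimal sketch in Python. The synthetic data, the exact feature set, and the scikit-learn random forest are assumptions for illustration; they are not the implementations from the cited systems.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def extract_features(window):
        """window: (n_channels, n_samples) array covering one k-millisecond sample."""
        rms = np.sqrt(np.mean(window ** 2, axis=1))                # per-channel RMS
        ratios = rms / (rms.sum() + 1e-9)                          # ratios between channels
        bands = np.abs(np.fft.rfft(window, axis=1))[:, :16]        # low-frequency FFT bands
        bands = (bands - bands.mean()) / (bands.std() + 1e-9)      # z-scored band energies
        derivs = np.mean(np.abs(np.diff(window, axis=1)), axis=1)  # derivative energy
        return np.concatenate([rms, ratios, bands.ravel(), derivs])

    # Stand-in data: 200 windows of a 4-channel, 256-sample signal, 3 gesture classes.
    rng = np.random.default_rng(0)
    windows = rng.normal(size=(200, 4, 256))
    labels = rng.integers(0, 3, size=200)

    X = np.array([extract_features(w) for w in windows])
    clf = RandomForestClassifier(n_estimators=100).fit(X, labels)  # classification model
    print(clf.predict(X[:5]))  # classify new k-millisecond samples the same way

User-specific fine-tuning would then amount to recalibrating or retraining on a few labeled examples from the target user.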

  18. Lien, Jaime, et al. "Soli: Ubiquitous gesture sensing with millimeter wave radar." ACM Transactions on Graphics (TOG) 35.4 (2016): 142.

  19. Harrison et al. “Skinput: appropriating the body as an input surface.” CHI 2010.

  20. EM-Sense. Laput, G. et al. 2015. EM-Sense: Touch Recognition of Uninstrumented, Electrical and Electromechanical Objects. UIST ’15.

  21. Output: changing the world. No, the other changing the world.

  22. How can we make the environment reactive? We can make pixels dance; how do we make atoms dance too? What is a minimal instrumentation of the environment that we can perform to produce feedback that is as expressive as possible?

  23. Recall: inFORM

  24. (YOU READ THIS) Le Goc et al. “Zooids: Building Blocks for Swarm User Interfaces”. UIST 2016.

  25. Mueller, Kruck and Baudisch. LaserOrigami: laser-cutting 3D objects. UIST 2012.

  26. Output: changing the virtual world

  27. Virtual experiences with IRL constraints. Physics, and our own perceptual systems, impose constraints on what can be believably conveyed. Research in AR, VR, and mixed reality often seeks to push the boundaries of the realism of that experience.

  28. Orts-Escolano et al. “Holoportation: Virtual 3D Teleportation in Real-Time”. UIST 2016.

  29. Traxion: perceived forces [Rekimoto 2013]. Creates a haptic sensation without mechanical links to the ground.

  30. Haptic Retargeting [Azmandian et al. 2016]. Uses “perceptual hacks” to make a single cube appear multiplied.
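
One way to realize this kind of perceptual hack is body warping: as a reach progresses, the rendered hand is offset so that several virtual cubes all map onto the one physical cube. Below is a minimal sketch assuming a simple linear blend by reach progress; Azmandian et al. describe several warping strategies, and this is not their exact formulation.

    import numpy as np

    def warped_hand(physical_hand, physical_cube, virtual_cube, reach_start):
        """Offset the rendered hand so that reaching the virtual cube lands the
        real hand on the single physical cube (linear blend is an assumption)."""
        total = np.linalg.norm(physical_cube - reach_start)
        remaining = np.linalg.norm(physical_cube - physical_hand)
        progress = np.clip(1.0 - remaining / max(total, 1e-9), 0.0, 1.0)
        offset = (virtual_cube - physical_cube) * progress  # grows as the reach completes
        return physical_hand + offset                       # hand position shown in the headset

    reach_start   = np.array([0.0, 0.0, 0.0])
    physical_cube = np.array([0.30, 0.00, 0.40])  # the one real cube on the table
    virtual_cube  = np.array([0.45, 0.00, 0.40])  # one of several cubes shown in VR
    print(warped_hand(physical_cube, physical_cube, virtual_cube, reach_start))
    # at the end of the reach, the rendered hand coincides with the virtual cube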

  31. Discussion. Find today’s discussion room at http://hci.st/room
