Seeing, Hearing, and Touching: Putting It All Together
Sensory Integration Module
- Seeing and Hearing Events (Fisher)
- Touching, Seeing, and Hearing (MacLean)
- Integrating Applications: Tight Coupling & Physical Metaphors (MacLean)
- Integrating Applications: Designing for Intimacy (Fels)
The morning talks gave a perspective on how vision science can inform the design of visually complex interfaces such as those used in information visualization systems. The second half of the course looks at intersensory interactions and how they can inform the move to multimodal environments. These environments combine visual, auditory, and haptic displays with a richer set of inputs from users, including speech, gesture, and biopotentials.
Seeing and Hearing Events (Fisher)
Moving to multimodality
- Vision
- Hearing
- Virtual interaction model
- Force & tactile feedback
- Psychophysics of vision, sound, and touch will change when the environment is multimodal
Force display technology works by using mechanical actuators to apply forces to the user. By simulating the physics of the user's virtual world, we can compute these forces in real time and then send them to the actuators so that the user feels them.
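The simulate-then-actuate loop described here can be sketched in a few lines. This is a minimal illustration, not part of the course material: the spring-damper wall model, the constants, and the device API names in the comments are all assumptions chosen for clarity.

```python
# Minimal haptic force-rendering sketch: a virtual wall modeled as a
# spring-damper. All names and constants here are illustrative.

def wall_force(position, velocity, wall_x=0.0, k=500.0, b=2.0):
    """Reaction force (N) when the user's proxy penetrates a wall at wall_x.

    k: spring stiffness (N/m); b: damping (N*s/m). Values are typical
    orders of magnitude for haptic demos, chosen only for illustration.
    """
    penetration = wall_x - position      # how far past the wall we are
    if penetration <= 0.0:               # not in contact: no force
        return 0.0
    # Hooke's-law spring pushes the proxy back out; damping resists
    # motion deeper into the wall, which helps keep the loop stable.
    return k * penetration - b * velocity

# In a real system this runs in a high-rate loop (often ~1 kHz):
#   pos, vel = device.read_state()          # hypothetical device API
#   device.apply_force(wall_force(pos, vel))
```

The high update rate matters: if the simulated force lags the user's motion, the wall feels soft or begins to vibrate, which is one reason haptic loops run much faster than graphics loops.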
Seeing and Hearing Events (Fisher)
Intersensory Interactions
- Intro and metacognitive gap
- Integrating Cognitive Science in design
- Cognitive Architecture
– Modularity and multimodal interaction
- Information hiding: conflict resolution
- Cognitive impenetrability
- Performance differences between modules
- Recalibration
– Spatial indexes in complex environments
- Multimodal cue matching within modules
We begin with a justification for an increased role for theory in the design of these more complex interfaces. I will argue that the combination of a large design space and the structural inability of humans to introspect at the level of sensory and attentional processes makes conventional design techniques inadequate in these situations. This is followed by a brief discussion of the challenges of bringing information from psychology, kinesiology, and other disciplines that fall under the broad banner of cognitive science to bear on the design of interactive applications.
Seeing and Hearing Events (Fisher)
Vision systems to multimodality
- Ron: Vision systems and subsystems
– Pre-attentive vision (gist, layout, events)
– Attention (grab ~5 objects for processing)
– Combine for “virtual representation”
- Extend system concept to modalities
– Some are similar across modalities
– Some are multimodal
– Some are task-dependent
Extending the visual perception studies described by Ron, and applied by Tamara, to multimodal interaction is conceptually straightforward, since vision is itself composed of separate channels that can be thought of as modalities. The move to multimodality is similar to the move to multiple channels, or systems, within vision.