

SLIDE 1

Searching for Gesture and Embodiment in Live Coding

Spencer Salazar
California Institute of the Arts, Valencia, CA
ICLC 2017 | Morelia, Mexico

SLIDE 2

Overview

• Tension between live coding, gesture, embodiment, and electronic music production
• Historical approaches to addressing this tension
• Reshaping of source code media
• A continuation of these ideas: Auraglyph

SLIDE 3

Live coding, Gesture, and Embodiment

• At first glance, live coding and gesture seem at odds
• Levels of indirection
• Continuous control
• Physical and visual dimensions of music performance
• Performer / audience

SLIDE 4

Gesture

“bridge between movement and meaning”


(presumably, that of a musical nature)

Jensenius et al., “Musical Gestures: Concepts and Methods in Research,” in Musical Gestures: Sound, Movement, and Meaning.

SLIDE 5

Gesture

• Most attempts to define gesture admit keyboard activities
• Many definitions admit activities not visible to an audience

SLIDE 6

(Aside)

The notion of gesture in musical performance most likely needs reconsideration in the context of live coding.
(But not right now.)

SLIDE 7

Embodiment

“[I]ndividual sensorimotor capacities […] embedded in a more encompassing biological, psychological and cultural context.”

Varela, Thompson, and Rosch 1991, The Embodied Mind: Cognitive Science and Human Experience.

SLIDE 8

Tensions

Pursuit of “more humanly active live coding”

Collins 2011, “Live Coding of Consequence,” Leonardo

SLIDE 9

Tensions

“A live coder […] is under pressure to do something interesting in the moment”

“[Abstraction and scheduling] can lead to a lack of immediacy in how the performer’s actions relate to the music”

Stowell and McLean 2013, “Live music-making: a rich open task requires a rich open interface,” Music and Human-Computer Interaction.

SLIDE 10

Tensions

Embodiment in live coding surfaces in…
• “the physical interaction of our body […] with the machine”
• “how […] a programming language shapes our thinking about concepts and algorithms” (and the programming language is shaped by available hardware and end goals)

Baalman 2016, “Embodiment of Code,” Proc. ICLC

SLIDE 11

Assumptions/Oversights

Assumption: we care about musical gesture in live coding
Not considering: embodiment of the audience (e.g. dance)

SLIDE 12

Tensions

• Typing gestures are not linked directly to the music
• Rather, they are linked to the production of a set of instructions for producing music
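This indirection can be made concrete with a short sketch (hypothetical Python, not any particular live coding language): the typed line does not make sound, it pushes an instruction onto a schedule, and only a separate engine later renders that schedule.

```python
import heapq

# Hypothetical sketch: typing appends instructions (beat, note) to a
# schedule; the sound is produced later, by a separate rendering step.
schedule = []  # priority queue of (beat, note) events

def play(beat, note):
    """The 'typing gesture': pushes an instruction, produces no sound."""
    heapq.heappush(schedule, (beat, note))

def render(schedule):
    """The engine: walks the schedule in time order (stand-in for audio)."""
    events = []
    while schedule:
        events.append(heapq.heappop(schedule))
    return events

# The performer types these lines...
for i in range(4):
    play(i * 0.5, 60 + i)

# ...and only later does the engine turn them into (pretend) sound.
print(render(schedule))  # [(0.0, 60), (0.5, 61), (1.0, 62), (1.5, 63)]
```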

SLIDE 13

Argument: existing mechanisms for imparting gesture and embodiment in live coding are unsatisfactory.

Proposal: consider extensions to plain text as the universal medium for programming code.

SLIDE 14

Prelude: ChucK on iPad

Salazar and Wang 2014, “miniAudicle for iPad: 
 Touchscreen-based music software programming,” Proc. ICMC

SLIDE 15

Gestural Dynamics of Mobile Touch Devices

SLIDE 16

Gestural Dynamics of Mobile Touch Devices

(Not so good for entering text)
(We’ll get back to that)

SLIDE 17

Algorithm and Gesture

• Algorithmic processes are constrained by what is readily formulated and encoded
• Gestural processes are constrained by what is readily carried out by a body in the physical world
SLIDE 18

Algorithm and Gesture

Intent → Code → Audible Result

SLIDE 19

Algorithm and Gesture

Intent → Code → Audible Result
(Algorithm)


SLIDE 22

Algorithm and Gesture

Intent → Gesture → Audible Result

Embodied Cognition

SLIDE 23

Algorithm and Gesture

(OK, but what if the algorithm is the intent?)

SLIDE 24

Text as Abstraction

Code exists in many forms:
• Programmer’s intent
• High-/low-level languages
• Abstract syntax trees and intermediate representations
• (Virtual) machine code
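Python happens to expose one of these intermediate forms directly: the same line of code exists both as the character string the programmer types and as the abstract syntax tree the language implementation sees (the expression below is just an illustrative pitch formula).

```python
import ast

# One line of code, two levels of abstraction: the text the programmer
# types, and the syntax tree the interpreter actually manipulates.
source = "freq = 440 * 2 ** (n / 12)"  # illustrative example expression

tree = ast.parse(source)
assign = tree.body[0]

print(type(assign).__name__)        # Assign
print(assign.targets[0].id)         # freq
print(type(assign.value).__name__)  # BinOp
```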

SLIDE 25

Alternative Abstractions

• Dataflow programming: Pure Data, Max/MSP, Reaktor, Kyma, TouchDesigner, etc.
• Scratch, TouchDevelop (Resnick et al. 2009, Tillmann et al. 2011)
• Lisping, Field (OpenEnded Group)
• Processing (tweak mode)
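The dataflow abstraction can be sketched in a few lines (a toy pull-based graph, nowhere near what Pure Data or Max/MSP provide): the program is a graph of nodes, and "coding" becomes wiring nodes together rather than editing text.

```python
# Toy pull-based dataflow sketch: each node computes its output by
# pulling values from its upstream inputs. Illustrative only.
class Node:
    def __init__(self, fn, *inputs):
        self.fn = fn          # the node's computation
        self.inputs = inputs  # upstream nodes wired into it

    def output(self):
        return self.fn(*(n.output() for n in self.inputs))

def const(v):
    return Node(lambda: v)

# Patch: (440 + 2) * 0.5, expressed as wiring instead of an expression.
osc_freq = const(440.0)
offset   = const(2.0)
adder    = Node(lambda a, b: a + b, osc_freq, offset)
gain     = Node(lambda x: x * 0.5, adder)

print(gain.output())  # 221.0
```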


SLIDE 32

Sketching Live Code

• Auraglyph is a modular music patching system with a sketching metaphor
• Hand-drawn figures are reified into programming nodes
• Touch and drawn figures are also used for control

Salazar 2017, Sketching Sound: Gestural Interaction in Expressive Music Programming
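Salazar 2017 describes the full system; the reification step can be imagined along the lines of a unistroke template matcher (in the spirit of the $1 recognizer; Auraglyph's actual recognizer may well differ). A drawn stroke is resampled to a fixed point count and compared against labeled templates, and the winning label names the node to create.

```python
import math

# Hypothetical recognition sketch, NOT Auraglyph's actual algorithm:
# resample a stroke, then pick the closest labeled template.

def resample(points, n=16):
    """Resample a stroke to n roughly evenly spaced points along its path."""
    pts = list(points)
    total = sum(math.dist(pts[i - 1], pts[i]) for i in range(1, len(pts)))
    step, acc, out = total / (n - 1), 0.0, [pts[0]]
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if d > 0 and acc + d >= step:
            t = (step - acc) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            out.append(q)
            pts.insert(i, q)  # continue measuring from the inserted point
            acc = 0.0
        else:
            acc += d
        i += 1
    while len(out) < n:  # guard against floating-point shortfall
        out.append(pts[-1])
    return out[:n]

def classify(stroke, templates):
    """Return the label of the template closest to the drawn stroke."""
    s = resample(stroke)
    def cost(label):
        return sum(math.dist(p, q) for p, q in zip(s, resample(templates[label])))
    return min(templates, key=cost)

templates = {
    "line": [(0, 0), (10, 0)],           # might reify e.g. a connection
    "vee":  [(0, 0), (5, 10), (10, 0)],  # might reify e.g. an oscillator
}
print(classify([(0, 1), (9, 1)], templates))           # line
print(classify([(0, 0), (5, 9), (10, 1)], templates))  # vee
```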

SLIDE 33

Auraglyph

Goal: a “physical” connection to sound processes

SLIDE 34

Gestural Support

• Scrubbing parameters
• Writing parameters directly
• Organizing/patching
• Step sequencer, waveform editor, orientation sensor, pen x/y/pressure
• (secondary to coding activities)

PULSE (2016)
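As an illustration of parameter scrubbing, pen pressure might be mapped to a synthesis parameter along an exponential curve; the function name and ranges below are hypothetical, not Auraglyph's actual API.

```python
# Hypothetical mapping sketch: normalized pen pressure (0..1) scrubs a
# filter cutoff on an exponential curve, which tracks pitch perception
# better than a linear one. Names and ranges are illustrative only.
def pressure_to_cutoff(pressure, lo=80.0, hi=8000.0):
    p = min(max(pressure, 0.0), 1.0)  # clamp to the sensor's valid range
    return lo * (hi / lo) ** p        # exponential interpolation lo..hi

print(pressure_to_cutoff(0.0))  # 80.0
print(pressure_to_cutoff(0.5))  # 800.0
print(pressure_to_cutoff(1.0))  # 8000.0
```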

SLIDE 35

Gestural Support

• Multiple simultaneous gestures (multiple performers)
• Physical dexterity
• Dynamic, real-time feedback

PULSE (2016)

SLIDE 36

Gestural Support

• Sense of 3D space
• Performer’s activities are fairly transparent to the audience (if projected)
• Visual interest for the audience

PULSE (2016)

SLIDE 37

Auraglyph in Practice

How has this played out?

SLIDE 38

PULSE (2016)

SLIDE 39

Electronic Arts Ensemble (2017)

SLIDE 40

A sonic ritual for Auraglyph (2017)

SLIDE 41

Looking Forward

• Live coding beyond the laptop
• Merging layers of abstraction: text, graphical, dataflow, timeline, ?

SLIDE 42

Conclusion

• Tension between live coding, gesture, and embodiment
• Many paths for addressing this tension
• Auraglyph: sketch-based dataflow live coding
• In the future?

SLIDE 43

Thanks!

Spencer Salazar
https://spencersalazar.com/
ssalazar@calarts.edu

Questions?