  1. Searching for Gesture and Embodiment in Live Coding
     Spencer Salazar
     California Institute of the Arts, Valencia, CA
     ICLC 2017 | Morelia, Mexico

  2. Overview
     Tension between live coding, gesture, embodiment, and electronic music production
     Historical approaches to addressing this tension
     Reshaping of source code media
     A continuation of these ideas: Auraglyph

  3. Live Coding, Gesture, and Embodiment
     At first glance, live coding and gesture seem at odds:
     Levels of indirection
     Continuous control
     Physical and visual dimensions of music performance
     Performer / audience

  4. Gesture
     “bridge between movement and meaning” (presumably, that of a musical nature)
     Jensenius et al., “Musical Gestures: Concepts and Methods in Research,” in Musical Gestures: Sound, Movement, and Meaning.

  5. Gesture
     Most attempts to define gesture admit keyboard activities
     Many definitions admit activities not visible to an audience

  6. (Aside)
     The notion of gesture in musical performance most likely needs reconsideration in the context of live coding
     (But not right now)

  7. Embodiment
     “[I]ndividual sensorimotor capacities […] embedded in a more encompassing biological, psychological and cultural context.”
     Varela, Thompson, and Rosch 1991, The Embodied Mind: Cognitive Science and Human Experience.

  8. Tensions
     Pursuit of “more humanly active live coding”
     Collins 2011, “Live Coding of Consequence,” Leonardo

  9. Tensions
     “A live coder […] is under pressure to do something interesting in the moment”
     “[Abstraction and scheduling] can lead to a lack of immediacy in how the performer’s actions relate to the music”
     Stowell and McLean 2013, “Live music-making: a rich open task requires a rich open interface,” in Music and Human-Computer Interaction.

  10. Tensions
     Embodiment in live coding surfaces in…
     “the physical interaction of our body […] with the machine”
     “how […] a programming language shapes our thinking about concepts and algorithms”
     (and the programming language is shaped by available hardware and end goals)
     Baalman 2016, “Embodiment of Code,” Proc. ICLC

  11. Assumptions / Oversights
     Assumption: we care about musical gesture in live coding
     Not considering: embodiment of the audience (e.g. dance)

  12. Tensions
     Typing gestures not linked directly to music
     Rather, linked to production of a set of instructions for producing music

  13. Argument
     Existing mechanisms for imparting gesture and embodiment in live coding are unsatisfactory
     Proposal
     Consider extensions to plain text as the universal medium for programming code

  14. Prelude: ChucK on iPad
     Salazar and Wang 2014, “miniAudicle for iPad: Touchscreen-based music software programming,” Proc. ICMC
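     For concreteness, a minimal ChucK sketch of the sort one might edit in miniAudicle for iPad; the specific pattern and values are illustrative assumptions, not code from the talk. Typing the text makes no sound by itself: only the scheduled loop, once running, does, which is exactly the indirection slide 12 points to.

         // one sine voice stepping through a pentatonic pattern (illustrative only)
         SinOsc s => dac;                   // patch: oscillator into the output
         0.3 => s.gain;
         [0, 2, 4, 7, 9] @=> int scale[];   // pentatonic scale degrees
         while (true)
         {
             // pick a scale degree, convert MIDI note number to frequency
             60 + scale[Math.random2(0, scale.size() - 1)] => int note;
             Std.mtof(note) => s.freq;
             250::ms => now;                // advancing time is what makes sound happen
         }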

  15. Gestural Dynamics of Mobile Touch Devices

  16. Gestural Dynamics of Mobile Touch Devices
     (Not so good for entering text)
     (We’ll get back to that)

  17. Algorithm and Gesture
     Algorithmic processes constrained by what is readily formulated and encoded
     Gestural processes constrained by what is readily carried out by a body in the physical world

  18. Algorithm and Gesture
     (diagram: Audible Intent, Code, Result)

  19. Algorithm and Gesture
     (diagram: Algorithm, Audible Intent, Code, Result)

  20. Algorithm and Gesture
     (diagram: Algorithm, Audible Intent, Code, Result; same labels as slide 19)

  21. Algorithm and Gesture
     (diagram: Audible Intent, Gesture, Result)

  22. Algorithm and Gesture
     (diagram: Embodied Cognition, Audible Intent, Gesture, Result)

  23. Algorithm and Gesture (Ok but what if algorithm is the intent?)

  24. Text as Abstraction
     Code exists in many forms:
     Programmer’s intent
     High-/low-level languages
     Abstract syntax trees and intermediate representations
     (Virtual) machine code

  25. Alternative Abstractions
     Dataflow programming: PureData, Max/MSP, Reaktor, Kyma, TouchDesigner, etc. (a text analogue follows below)
     Scratch, TouchDevelop (Resnick et al. 2009; Tillmann et al. 2011)
     Lisping
     Field (OpenEnded Group)
     Processing (tweak mode)
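     For comparison, the oscillator-to-filter-to-output graph that a dataflow patcher such as Pd or Max would draw as boxes and patch cords, written here as an equivalent ChucK text patch; the particular unit generators and settings are assumptions for illustration.

         // a signal graph a patcher would draw graphically: saw -> lowpass filter -> output
         SawOsc saw => LPF filt => dac;
         0.2 => saw.gain;
         110 => saw.freq;
         800 => filt.freq;      // cutoff
         2 => filt.Q;           // resonance
         10::second => now;     // keep the graph alive for ten seconds

     Both forms describe the same graph; the difference is which manipulations the medium makes gestural: typing and re-evaluating text versus dragging boxes and patch cords.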

  32. Sketching Live Code
     Auraglyph is a modular music patching system with a sketching metaphor
     Hand-drawn figures are reified into programming nodes
     Touch and drawn figures are also used for control
     Salazar 2017, Sketching Sound: Gestural Interaction in Expressive Music Programming
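     As a rough text analogue only, here is the kind of small patch an Auraglyph performer assembles by drawing nodes and connections on the tablet, approximated in ChucK; the node choices are assumptions, and the slow control oscillator merely stands in for pen-scrubbing a parameter, which Auraglyph handles as a continuous touch gesture rather than as code.

         // drawn oscillator node -> drawn filter node -> output node
         SawOsc voice => LPF filt => dac;
         SinOsc scrub => blackhole;   // control-rate stand-in for a scrubbing gesture
         0.25 => scrub.freq;
         0.2 => voice.gain;
         110 => voice.freq;
         2 => filt.Q;
         while (true)
         {
             // map the "gesture" (-1..1) onto a cutoff range, like dragging a parameter handle
             750 + 650 * scrub.last() => filt.freq;
             10::ms => now;
         }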

  33. Auraglyph
     Goal: a “physical” connection to sound processes

  34. Gestural Support
     Scrubbing parameters
     Writing parameters directly
     Organizing / patching
     Step sequencer, waveform editor, orientation sensor, pen x/y/pressure (secondary to coding activities)
     PULSE (2016)

  35. Gestural Support
     Multiple simultaneous gestures (multiple performers)
     Physical dexterity
     Dynamic, real-time feedback
     PULSE (2016)

  36. Gestural Support
     Sense of 3D space
     Performer’s activities are fairly transparent to the audience (if projected)
     Visual interest for the audience
     PULSE (2016)

  37. Auraglyph in Practice
     How has this played out?

  38. PULSE (2016)

  39. Electronic Arts Ensemble (2017)

  40. A sonic ritual for Auraglyph (2017)

  41. Looking Forward
     Live coding beyond the laptop
     Merging layers of abstraction: text, graphical, dataflow, timeline, ?

  42. Conclusion
     Tension between live coding, gesture, and embodiment
     Many paths for addressing this tension
     Auraglyph: sketch-based dataflow live coding
     In the future?

  43. Thanks!
     Spencer Salazar
     https://spencersalazar.com/
     ssalazar@calarts.edu
     Questions?
