

  1. RTSMCS REAL-TIME SPATIAL MUSIC COMPOSITION SYSTEM

  2. Agenda
     • Introduction – what is the RTSMCS?
     • Introduction – built for collaboration
     • Introduction – how does it work?
     • How to operate the system
     • System analysis and design
     • Structured Analysis view – Data Flow Diagram
     • Object Oriented view – Class Diagram
     • Data flow between classes

  3. Introduction – what is the RTSMCS?
  The RTSMCS (Real-Time Spatial Music Composition System) is a system for real-time music composition that uses hand movement/position data, together with a database of categorized short musical segments and a predefined algorithm that can combine and restructure those segments, to produce novel musical creations. In short, the system does the following:
     • Captures the operator's hand positions and orientation in space.
     • Uses a subset of the position data to control how a semi-autonomous music composition algorithm, which builds its music from pre-written short segments drawn from a given database, creates its music at any given moment.
     • Applies various musical transformations to the music generated by the composition algorithm, based on the rest of the hand position data.
     • Sends the real-time generated music to a MIDI (Musical Instrument Digital Interface) compatible instrument for playback, and stores it to a MIDI file at the end of the session.
  Furthermore, the system provides a reusable framework which can be used to build various real-time music creation and manipulation systems.
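  These four steps form a single processing loop. The following Python sketch is illustrative only: every name in it (tracker, composer, transformer, midi_out, midi_log) is a hypothetical stand-in for the corresponding RTSMCS component, not the actual implementation.

    # Hypothetical sketch of the RTSMCS processing loop; all names are
    # illustrative, not the actual implementation.
    def run_session(tracker, composer, transformer, midi_out, midi_log):
        for frame in tracker.frames():                # hand position + orientation
            params = frame.to_system_params()         # spatial data -> internal params
            segments = composer.next_messages(params.category_hint)
            adjusted = transformer.apply(segments, params)
            for msg in adjusted:
                midi_out.send(msg)                    # real-time playback
                midi_log.append(msg)                  # kept for the session file
        midi_log.save()                               # MIDI file written at session end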

  4. Introduction – built for collaboration
  The modular design of the system allows collaborators from fields such as Computer Science and Music to build unique music creation systems by supplementing the current components of the system with their own. It also allows each collaborator to contribute their part without necessarily having to know in detail how the other parts of the system work:
     • The music composer can compose the musical segments and export them from their favorite scoring software to MIDI files. Then, they just share the files – no programming knowledge is needed.
     • The operator of the system doesn't necessarily need any musical or programming knowledge (although musical knowledge can be helpful), just some basic understanding of the system controls and some basic familiarity with the music segments database they're going to work with.
     • The only part which requires knowledge in both music and programming is creating a new segment composition algorithm (see the sketch below). Although some basic segment composition algorithms are already provided with the system, I wish to encourage musicians and programmers to create new and interesting algorithms for musical segment composition, since it's a great opportunity for direct collaboration.
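  Since a new segment composition algorithm is the main extension point named above, here is a minimal Python sketch of what such a plug-in contract could look like. The class and method names are assumptions for illustration; the actual RTSMCS interface may differ.

    from abc import ABC, abstractmethod
    import random

    class SegmentCompositionAlgorithm(ABC):
        """Hypothetical plug-in contract for a composition algorithm."""

        @abstractmethod
        def compose(self, segments, category_hint):
            """segments: mapping of category name -> list of MIDI segments.
            category_hint: category currently requested by the operator.
            Yields MIDI messages to be played."""

    class ShuffledConcatenation(SegmentCompositionAlgorithm):
        """Toy example: play the hinted category's segments in random order."""
        def compose(self, segments, category_hint):
            pool = list(segments.get(category_hint, []))
            random.shuffle(pool)
            for segment in pool:
                yield from segment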

  5. Introduction – how does it work?
  To obtain the body position/orientation data, the system uses OptiTrack: Motive – a system which tracks reflective markers, and groups of such markers, using infrared cameras, image processing and computational geometry. The markers can be attached to various objects, including the human body. Since we're interested in hand positions, the markers are attached to gloves worn on the hands. The hand position data is transformed by RTSMCS into internal system parameters, which in turn control various musical aspects of the music being generated in real time by the system.
  The categorized short musical segments database consists of a directory structured in the following way: the directory has subdirectories, each representing the musical category of the segments inside it (e.g. Ascending Melodies, Descending Melodies). The musical segments inside each subdirectory are contained in MIDI files.
  The system allows the operator to give it a hint as to how the music being generated should sound at a given moment, based on the operator's hand movements. Those hints are given to the system by the rotation of the operator's hand around its axis – the angle of the hand dictates from which category in the database the composing algorithm should take the segments it will process and combine to create new music.
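  The directory-per-category layout described above is straightforward to load. Here is a minimal sketch, assuming the mido library for MIDI parsing (the slides do not name the library actually used):

    from pathlib import Path
    import mido  # assumed MIDI library; the slides do not specify one

    def load_segment_database(root):
        """Build {category name: [MIDI segments]} from a directory whose
        subdirectories are categories (e.g. 'Ascending Melodies') and
        whose files are the MIDI segments."""
        database = {}
        for category_dir in sorted(Path(root).iterdir()):
            if category_dir.is_dir():
                database[category_dir.name] = [
                    mido.MidiFile(str(f)) for f in sorted(category_dir.glob("*.mid"))
                ]
        return database

  Sorting the category names also gives the lexicographic left-to-right ordering that the hand-roll selection on the next slide relies on.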

  6. How to operate the system
     • Rotating the hand around its axis (roll) – Controls the segment category selection: the angle of the hand at the current instant determines the category from which the system will choose the musical segments it composes together. Categories are arranged lexicographically by their names, from left to right.
     • Moving the hand on the X axis – Controls pitch shifting (transposition). Moving the hand rightwards shifts the pitch of the currently composed music up; moving it leftwards shifts the pitch down.
     • Moving the hand on the Y axis – Adjusts the volume (dynamics) of the music currently being composed. Upwards increases volume, downwards decreases it.
     • Moving the hand on the Z axis – Adjusts the tempo of the music currently being composed. Forward speeds up the tempo, backward slows it down.
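  These mappings can be made concrete with a small sketch. The angle range and scaling factors below are assumptions for illustration; the slides do not specify the actual numbers used by RTSMCS.

    def category_from_roll(roll_deg, category_names):
        """Map hand roll to a category, lexicographically left to right.
        Assumes roll is usable over -90..+90 degrees (illustrative)."""
        names = sorted(category_names)
        clamped = min(max(roll_deg, -90.0), 90.0)
        index = int((clamped + 90.0) / 180.0 * len(names))
        return names[min(index, len(names) - 1)]

    def params_from_position(x, y, z):
        """Illustrative linear mappings; the scaling factors are assumptions."""
        return {
            "transpose_semitones": round(x * 12),           # right = up, left = down
            "velocity_scale": max(0.0, min(2.0, 1.0 + y)),  # up = louder
            "tempo_scale": 2.0 ** z,                        # forward = faster
        }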

  7. RTSMCS operation schematics
  [Schematic diagram: angle – select MIDI segment category; x axis – adjust pitch; y axis – adjust dynamics; z axis – adjust tempo]

  8. System analysis and design
  Although the initial system requirements analysis didn't explicitly call for it, the eventual design of the system turned out to be one that allows almost every system component to be replaced, as long as the replacing component implements all of the required interfaces. For instance, although the RTSMCS was originally intended to get its user parameters from the OptiTrack: Motive motion tracking system, it can in theory work with various human-machine interfaces (e.g. an on-screen GUI, mouse, joystick, or webcam), given that the appropriate adapter class is implemented. Similarly, the music composition logic is not bound to manipulating pre-written segments from a database, but can be extended to any logic that complies with the given interface.
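  To illustrate the adapter-class idea, here is a sketch of what such an interface could look like. Everything in it, including the method names and the mouse wrapper, is hypothetical; the slides only state that an appropriate adapter class must be implemented.

    from abc import ABC, abstractmethod

    class HandTrackingSource(ABC):
        """Hypothetical adapter contract: any device reducible to
        x, y, z and roll can drive the system through it."""

        @abstractmethod
        def read_frame(self):
            """Return (x, y, z, roll) for the current instant."""

    class MouseAdapter(HandTrackingSource):
        """Example replacement device: a 2D mouse maps to x/y,
        with z and roll held constant."""
        def __init__(self, mouse):
            self.mouse = mouse              # hypothetical mouse wrapper
        def read_frame(self):
            mx, my = self.mouse.position()  # hypothetical call
            return mx, my, 0.0, 0.0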

  9. Structured Analysis view – Data Flow Diagram
  [Data flow diagram] The OptiTrack system supplies raw tracking data to an "Extract hands positions" process, which passes the x-y-z positions and angle of the hands to a "Translate spatial positions to internal system params" process. That process emits a numerical hint (the user's category requests) to the "Compose segments" process, and pitch, tempo, dynamics and sustain adjustment parameters to the "Adjust pitch, tempo, dynamics & sustain notes" process. "Compose segments" issues MIDI part requests to the categorized MIDI segments DB, receives MIDI segments back, and feeds a MIDI message stream into the adjustment process; the adjusted MIDI message stream goes to the virtual instrument and to the MIDI file.
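  The "adjust pitch, tempo, dynamics & sustain" process can be pictured as a filter over a MIDI message stream. The sketch below assumes mido-style message objects and simplifies tempo to per-message delay scaling; neither detail is confirmed by the slides.

    def adjust_stream(messages, transpose=0, velocity_scale=1.0, tempo_scale=1.0):
        """Sketch of the DFD's adjustment process over mido-style messages."""
        for msg in messages:
            if msg.type in ("note_on", "note_off"):
                msg = msg.copy(
                    note=max(0, min(127, msg.note + transpose)),
                    velocity=max(0, min(127, int(msg.velocity * velocity_scale))),
                )
            # faster tempo = shorter inter-message delays (simplification)
            yield msg.copy(time=msg.time / tempo_scale)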

  10. Object Oriented view - Class Diagram

  11. Data flow between classes
  [Diagram] Raw input data flows from the Human Interface Device class and is translated into numerical hints (the user's category requests) and into pitch, tempo, dynamics and sustain adjustment parameters. MIDI messages produced from the composed segments are adjusted using those parameters, and the final MIDI messages are delivered to the Virtual Instrument and written to the MIDI file.
