In this week’s lecture (Max & MIDI III) we continued improving the structure of our performance patch. We learned why it makes sense to separate the score from the processing part and the user interface. There are parts of our program we do not need to look at during a performance, namely the “computer score” and the “processing” part. The user interface serves two main functions: to give a visual representation of the state of the program for monitoring purposes (VIEW), and to provide ways of interacting with, i.e. controlling, the program (CONTROL).
We also looked at ways to dynamically control parameters using the [clef.line] abstraction. This is done by sending it a list of value/time pairs (see the explanation at the end of this post).
In terms of structuring the “computer score” for our patch, we introduced the concepts of “Events” and “Cues”:
- An “Event” is a container for a sequence of instructions (e.g. a message box preset to recall a certain state), temporal processes (e.g. dynamic parameter changes), routing of processing modules, and possibly algorithmic processes (e.g. using Max programming). An Event is enclosed in a subpatcher and assigned a descriptive name.
- A “Cue” is just an index (e.g. a number) pointing to a certain event, or possibly to multiple events. Cues are typically ordered sequentially over time, so that during a piece you can step through them in a specific order, e.g. using a MIDI sustain pedal.
We have seen that this separation of Cues and Events allows for greater flexibility in re-ordering, duplicating, and editing the score data of a live electronics patch.
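In the patch, Events live in named subpatchers and the Cue list steps through them. Outside of Max, the same idea can be sketched in a few lines of Python; the event names and the print statements below are purely illustrative stand-ins for presets and routing changes, not anything from the actual patch:

```python
# Rough sketch of the Cue/Event separation (illustrative names only).
# Each Event is a named bundle of instructions; a Cue is just an index
# into an ordered list of event names, stepped through one at a time.

def intro_drone():
    print("recall preset 1, open reverb routing")

def build_up():
    print("ramp filter cutoff, enable delay module")

def climax():
    print("recall preset 7, route input to granulator")

# Events: descriptive name -> instructions (like a named subpatcher)
events = {
    "intro_drone": intro_drone,
    "build_up": build_up,
    "climax": climax,
}

# Cues: an ordered list of event names. Re-ordering or duplicating a cue
# never touches the event definitions themselves.
cue_list = ["intro_drone", "build_up", "climax", "build_up"]

cue_index = 0

def next_cue():
    """Advance to the next cue, e.g. on each press of the sustain pedal."""
    global cue_index
    if cue_index < len(cue_list):
        name = cue_list[cue_index]
        print(f"Cue {cue_index + 1}: {name}")
        events[name]()  # trigger the event's instructions
        cue_index += 1

# Simulate stepping through the piece with four pedal presses
for _ in range(4):
    next_cue()
```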
Please find the weekly patches in the Downloads section. I’ve also included a MIDI sequencer module which is based on [borax] and [coll].
For the Improvisational Midi Study next week (described on myCourses), you can make use of any of the patches we have been using so far. Of course, you can also program new modules if you wish and use them in your piece.
In addition to the text by J. Chadabe from last week, I would like you to read J.-C. Risset’s “Composing in Real-Time” and K. H. Essl’s description of the “Lexikon Sonate”, and write a short summary (about 1 page, 12 pt, single-spaced) comparing the viewpoints of the authors. Please conclude with one paragraph of your personal reflections on real-time/interactive composition.
Quick explanation of the [clef.line] abstraction:
[clef.line] takes a list of value/time pairs to specify changes over time (in effect, describing the line segments of a breakpoint function) in the format “x y x y x y x y …”
In human pseudo-code this tells [clef.line] to do the following:
• Go to x in y milliseconds
• Next, go to (next) x in (next) y milliseconds
• Next, go to (next) x in (next) y milliseconds
…
The total time of the breakpoint function is the sum of the durations of all line segments, i.e. the sum of the y values.
A more concrete example:
If the message to [clef.line] is:
“3 1000 10 500 1 2000”
this would mean:
• Go to 3 in 1000 milliseconds
• Next, go to 10 in 500 milliseconds
• Next, go to 1 in 2000 milliseconds
Total time: 1000 + 500 + 2000 = 3500 ms (3.5 seconds).
If you want [clef.line] to output decimal (floating-point) numbers rather than integers, you need to write your destination values (the x values) as decimal numbers, i.e.:
“3.0 1000 10.0 500 1.0 2000”.
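As a side note, here is a small Python sketch (not Max code) of how such a message breaks down into segments and how the total time adds up; the parsing function is purely illustrative and not part of [clef.line] itself:

```python
# Minimal sketch of how a [clef.line]-style list is read:
# pairs of (target value x, time y in ms) describe consecutive line segments.

def parse_segments(message):
    """Turn "3.0 1000 10.0 500 1.0 2000" into [(3.0, 1000.0), (10.0, 500.0), (1.0, 2000.0)]."""
    numbers = [float(n) for n in message.split()]
    if len(numbers) % 2 != 0:
        raise ValueError("expected an even number of values: x y x y ...")
    return list(zip(numbers[0::2], numbers[1::2]))

segments = parse_segments("3.0 1000 10.0 500 1.0 2000")

for target, duration in segments:
    print(f"go to {target} in {duration:.0f} ms")

# Total time of the breakpoint function = sum of all y values
total = sum(duration for _, duration in segments)
print(f"total time: {total:.0f} ms")  # 1000 + 500 + 2000 = 3500 ms
```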