This week’s lecture focused on programming style and strategies for structuring a patch for real-time performance situations.
After reviewing different solutions to Class 3’s programming assignment, we discussed strategies for detecting and removing programming mistakes (debugging) to make your programs behave as intended.
We then studied different approaches to realizing more complex data flows and introduced the [router] object, which allows sending multiple sources of Max data to multiple destinations in an n×m matrix. We used the [matrixctrl] object to control the [router] object and the [preset] object to store presets for the [matrixctrl]. We also discussed ways of structuring a performance patch by identifying and separating the components responsible for Signal Routing, Processing, and User-Interaction. Finally, we briefly mentioned the concept of Context Sensitivity, i.e. showing/hiding parts of an interface depending on the task at hand (the context), using the [matrixctrl] example – which presents all possibilities to the user at all times (similar to a physical routing matrix). We’ll talk about this in more detail in upcoming classes.
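To see the routing idea outside of Max, here is a minimal sketch in plain Python (not Max code) of the n×m crosspoint logic behind [router] and [matrixctrl]: a grid of on/off cells decides which of n sources reaches which of m destinations. The class and method names are my own for illustration.

```python
# Sketch of the n*m routing idea behind [router]/[matrixctrl] (plain Python).
class RoutingMatrix:
    def __init__(self, n_sources, n_dests):
        self.n_dests = n_dests
        # connections[src][dst] is True when source src feeds destination dst
        self.connections = [[False] * n_dests for _ in range(n_sources)]

    def connect(self, src, dst, state=True):
        """Like clicking a cell in [matrixctrl]: set one crosspoint on/off."""
        self.connections[src][dst] = state

    def route(self, src, value):
        """Send a value from one source; return {dest: value} for open routes."""
        return {dst: value for dst in range(self.n_dests)
                if self.connections[src][dst]}

matrix = RoutingMatrix(2, 3)
matrix.connect(0, 1)
matrix.connect(0, 2)
print(matrix.route(0, 60))  # value 60 reaches destinations 1 and 2
```

A [preset]-style snapshot would simply store and recall the `connections` grid.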
We discussed the three components of an interactive music system:
- COGNITION (analysis, detection)
- MEMORY (data, knowledge representation)
- GENERATION (synthesis)
Taking the Regola dell’Ottava harmonization rule we looked at before as an example: the cognition (or analysis) part would consist of determining the direction in which the scale degrees are traversed (upwards or downwards), the memory part would be the storage of the required harmonies/intervals (using the [coll] object), and the generation part would be adding those intervals to the bass note. Please find this example in the weekly patches.
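The cognition/memory/generation split can be sketched in plain Python (not Max). Note that the interval tables below are placeholders for illustration only – the actual harmonies of the Regola dell’Ottava live in the [coll] file of the weekly patch.

```python
# MEMORY: scale degree -> intervals above the bass, one table per direction.
# Stand-in for the [coll] object; intervals in semitones, illustrative only.
HARMONIES = {
    "up":   {1: [4, 7], 2: [3, 8], 3: [3, 6]},
    "down": {1: [4, 7], 2: [4, 6], 3: [3, 8]},
}

def harmonize(prev_degree, degree, bass_note):
    # COGNITION: compare successive scale degrees to find the direction
    direction = "up" if degree >= prev_degree else "down"
    # GENERATION: add the stored intervals to the bass note
    intervals = HARMONIES[direction].get(degree, [])
    return [bass_note] + [bass_note + i for i in intervals]

print(harmonize(1, 2, 50))  # scale degrees moving upwards from 1 to 2
```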
We then looked at different possibilities for analyzing an incoming stream of MIDI note data, such as:
- conditional statements using logical operators and gates/switches
- conditional statements using the [if] object
- use of the [match] object to detect certain sequences of numbers (e.g. for detecting melodic fragments)
- statistics on an incoming MIDI-stream using the [zl] objects (mean, median, maximum, minimum, etc.)
- the use of the [thresh]/[quickthresh] objects to detect chords vs. individual notes
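As an illustration of the last point, here is a plain-Python sketch (not Max) of the idea behind [thresh]/[quickthresh]: notes whose onsets fall within a short time window are grouped into a chord, and a note arriving later starts a new group. The 50 ms window is an assumption – tune it to taste.

```python
THRESHOLD_MS = 50  # grouping window; an assumption, tune to taste

def group_into_chords(events):
    """events: list of (time_ms, pitch) sorted by time -> list of chords."""
    chords = []
    for time_ms, pitch in events:
        if chords and time_ms - chords[-1]["start"] <= THRESHOLD_MS:
            chords[-1]["pitches"].append(pitch)  # within the window: same chord
        else:
            chords.append({"start": time_ms, "pitches": [pitch]})  # new group
    return [c["pitches"] for c in chords]

events = [(0, 60), (10, 64), (30, 67), (500, 72)]
print(group_into_chords(events))  # first three notes group; the last stands alone
```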
We then implemented a simple example of a MIDI key-split algorithm using if statements (included in the weekly patches).
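The key-split logic can be summarized in a few lines of plain Python (not Max): pitches below a split point are routed to one voice/channel, pitches at or above it to another. The split point and channel numbers here are assumptions for illustration.

```python
SPLIT_POINT = 60  # middle C; an assumption for illustration

def key_split(pitch, velocity):
    """Route a MIDI note to channel 1 (below the split) or channel 2 (at/above it)."""
    channel = 1 if pitch < SPLIT_POINT else 2
    return (channel, pitch, velocity)

print(key_split(48, 100))  # below the split -> channel 1
print(key_split(72, 100))  # at/above the split -> channel 2
```

In the patch, the same comparison is expressed with [if] objects feeding two separate note-out paths.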
Todd Winkler’s book (available in the Bibliography section) has a chapter dedicated to the analysis of incoming MIDI-streams, titled “The Computer as Listener: Analyzing and Storing Performance Data”. This is a great resource if you wish to learn more about interactive MIDI-based music on your own. Examples similar to the ones we discussed in class can be found on pages 162-172.
We also introduced the [bpatcher] object, which can be thought of as an abstraction ‘you can look inside’. This object is particularly useful for designing user interfaces, and we programmed a simple example of a bpatcher that controls a parameter of choice specified via its first argument.
The patches from last class are available in the Download section. I have also included patches from Todd Winkler’s book which are relevant for our current topics.
For next week I’d like you to ‘enhance’ the MIDI-performance patch from last week. You can/should use one of the weekly patches as a starting point; you can find the description on myCourses. In particular, I’d like you to analyze an incoming MIDI-stream for a parameter of your choice (e.g. pitches/melody/harmony) and use it to control one (or more) parameters of your processing. I have included an example that detects the inter-onset time (i.e. the time that passes between the starts of two successive notes) and uses it to control the delay time of your delay module.
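The inter-onset-time idea can be sketched in plain Python (not Max): remember the time of the previous note-on, subtract it from the current one, and clip the result to a usable delay-time range. The clip range below is an assumption.

```python
# Sketch of inter-onset-time detection driving a delay time (plain Python).
class IOITracker:
    def __init__(self, min_ms=10, max_ms=2000):  # clip range: an assumption
        self.last_onset = None
        self.min_ms = min_ms
        self.max_ms = max_ms

    def note_on(self, time_ms):
        """Return the inter-onset time in ms (clipped), or None for the first note."""
        ioi = None
        if self.last_onset is not None:
            ioi = min(max(time_ms - self.last_onset, self.min_ms), self.max_ms)
        self.last_onset = time_ms
        return ioi

tracker = IOITracker()
tracker.note_on(0)           # first note: no IOI yet
print(tracker.note_on(250))  # 250 ms between onsets -> delay time of 250 ms
```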
Please also read the text “Interactive Composing: An Overview” by J. Chadabe (you can find it in the Reading List). This text will become part of your first Résumé assignment.