From: http://www.cirmmt.org/activities/newsletter/past/september12
A VIBROTACTILE SYNTHESIS FRAMEWORK FOR HAPTIC FEEDBACK IN LIVE-ELECTRONIC MUSIC PERFORMANCE
The goal of this project is the development of a vibrotactile synthesis framework that provides performers with information about the internal state of a live-electronics system. We propose to exploit the haptic modality as an alternative channel for information display. The use of live electronics, i.e. the live processing of sound during a performance, is common practice in contemporary music; artists make use of novel input devices and digital signal processing (DSP) techniques to conceive new forms of expression. An important issue in these contexts is the synchronization between the instrumental performance and the live electronics. A number of approaches have been developed to facilitate their interaction: apart from score-following techniques (which are often avoided for reasons of reliability and complexity), a common technique is the use of simple on-stage input devices, such as MIDI foot pedals, which allow the instrumental performer to synchronize (discretely) with the real-time processing by sending triggers to the system.
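To make this trigger-based synchronization concrete, the following is a minimal sketch (not part of CLEF itself) of a cue list advanced by a foot-pedal trigger. It assumes Python with the third-party mido MIDI library and an installed MIDI backend; the cue names and the fire_cue stand-in are purely hypothetical.

    # Illustrative sketch: a foot-pedal trigger steps through a pre-composed
    # list of cues, i.e. the discrete synchronization scheme described above.
    import mido

    cues = ["start granulator", "open delay line", "freeze reverb", "fade out"]
    cue_index = 0

    def fire_cue(index):
        """Stand-in for telling the live-electronics engine to run a cue."""
        print(f"cue {index + 1}/{len(cues)}: {cues[index]}")

    with mido.open_input() as port:          # default MIDI input device
        for msg in port:
            # Treat a sustain-pedal-style controller (CC 64) going high as the trigger.
            if msg.type == "control_change" and msg.control == 64 and msg.value >= 64:
                if cue_index < len(cues):
                    fire_cue(cue_index)
                    cue_index += 1

In practice the "engine" side would be the live-electronics environment itself; the point of the sketch is simply that the performer's only channel back into the system is a stream of discrete triggers.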
The CIRMMT Live Electronics Framework (CLEF) is a modular system for the composition and performance of live electronic music; it has been used for the realization of electroacoustic works at McGill as well as in a number of CIRMMT projects since 2009. Control parameters for DSP processes in CLEF are defined during composition and are typically triggered by the performer on stage using a generic input device. A common difficulty with this approach, however, is the lack of feedback to the performer about the state of the live-electronics system. This often results in a sort of “limbo” in which, for a certain amount of time, the performer has no feedback regarding the results of their interaction. To improve this situation, different approaches have been taken, such as providing on-stage visual or auditory feedback (requiring the performer to watch a screen or listen to a click track), or having an off-stage assistant in charge of controlling the electronics. These solutions can be problematic, though: delivering additional information via the visual or auditory channels often distracts the performer from the actual musical expression. The presence of an external assistant, on the other hand, may render the performer’s interactions almost obsolete.
Our approach, then, is to investigate vibrotactile stimuli, using appropriate combinations of haptic actuators and signals, in order to provide a feedback system that can become transparent to the user, both in terms of physical obtrusiveness and cognitive load. Significantly, the proposed system will be specific to live-electronic performance practice, in that it will convey information about internal variables within CLEF, such as automation curves, algorithmic processes, and analysis parameters. Leveraging existing work on tactile actuators and the synthesis of tactile events, we plan to investigate different mapping strategies. We will validate the system qualitatively and quantitatively through surveys and experiments with performance students, and integrate it into CLEF with the aim of using it for the public performance of works composed in McGill’s electroacoustic composition class in March 2013.
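As an illustration of what one such mapping strategy might look like (a minimal sketch under our own assumptions, not the project's actual design), the Python fragment below maps a normalized internal parameter, for example the current value of an automation curve, to the amplitude and pulse rate of a 250 Hz carrier, a frequency region where the skin is especially sensitive to vibration. The function and constant names are hypothetical.

    # Illustrative sketch: map a 0-1 parameter to a vibrotactile drive signal
    # for a voice-coil actuator.
    import numpy as np

    SAMPLE_RATE = 48000      # audio-rate output assumed for the actuator
    CARRIER_HZ = 250         # near the skin's peak vibrotactile sensitivity

    def vibrotactile_frame(param, duration=0.05, phase=0.0):
        """Map a normalized parameter to amplitude and pulse rate of a 250 Hz carrier."""
        t = np.arange(int(SAMPLE_RATE * duration)) / SAMPLE_RATE
        amplitude = 0.2 + 0.8 * param        # stronger vibration as the parameter rises
        pulse_hz = 1.0 + 7.0 * param         # faster pulsing as the parameter rises
        envelope = 0.5 * (1.0 + np.sin(2 * np.pi * pulse_hz * t))
        carrier = np.sin(2 * np.pi * CARRIER_HZ * t + phase)
        return amplitude * envelope * carrier  # buffer to be sent to the actuator

    # Example: render one frame while an automation curve sits at 60% of its range.
    frame = vibrotactile_frame(0.6)

Whether such a mapping is actually transparent to the performer, in terms of both obtrusiveness and cognitive load, is precisely what the planned surveys and experiments are meant to evaluate.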
Marcello Giordano is a Ph.D. student in the Input Devices and Music Interaction Laboratory (IDMIL) under the supervision of Prof. Marcelo M. Wanderley. His Ph.D. research is devoted to the study of tactile feedback and stimulation in music performance: how tactile information can be used efficiently to convey musical content, and how this extra sensory channel can extend musical practice with traditional and digital musical instruments, are two of the main questions he addresses in his Ph.D. thesis.
Marlon Schumacher is an electroacoustic music composer and doctoral researcher in music technology. His research focuses on spatial sound synthesis and computer-aided composition. As a research assistant for CIRMMT’s research axis on expanded musical practice, he has been developing computer music tools and interfaces for composers, such as the prototype graphical user interface for Integra and the CIRMMT Live-Electronics Framework (CLEF), a software environment for the composition and performance of live-electronic music. His current research investigates dictionary-based methods for the compositional control of spatial sound.