Gesture cutting through textual complexity:
A model and a tool for the embodied navigation of complex piano notation
The proposed paper introduces a model of embodied interaction with complex piano notation and a prototype interactive system for the gestural processing and control of musical scores.
In the first part, we present the post-Cartesian foundations of an embodied navigation model, applied to symbolic notation as complex as that of Iannis Xenakis and Brian Ferneyhough: learning and performance are conceptualized as embodied navigation in a non-linear notational space of affordances. The performer moves through several dimensions of the score-space and manipulates the elements of notation through performative gestures as if they were physical objects. This manipulation forms an indispensable part of the cognitive processes involved in learning and performing, and it actively transforms the notation. In this sense, gesture acts as an interface for notation processing, and notation forms part of a dynamic system rather than the output of the composer’s “brain in a vat”. Concepts from Gibson’s ecological psychology, Rowlands’s externalism, Lakoff’s metaphor theory, dynamic systems theory and, last but not least, Leman’s mediation theory are mapped onto Xenakis’s and Ferneyhough’s ideas on notation and performance, offering an embodied and extended alternative to traditional models of interpretation.
The second part proposes a technological application of this model: a recently developed prototype interactive system for the real-time processing and control of complex piano notation through the pianist’s gesture. The system, named GesTCom, draws on recent developments in computer music representation (augmented and interactive musical scores via Fober’s INScore) and gesture modeling (the motionfollower by Bevilacqua and the ISMM Team at IRCAM). Gestural, video, audio and MIDI data are captured, qualitatively correlated with the musical score and mapped back into it, turning the score into a personalized, dynamic, multimodal tablature. This tablature can be used for performance analysis and documentation and for learning through augmented feedback, and it can inform the design of interactive multimodal systems, including score-following ones.
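The abstract does not specify how captured data are mapped back into the score, but INScore is driven by OSC messages over UDP, so the mapping step can be illustrated with a minimal sketch. The following Python fragment hand-encodes an OSC message (stdlib only, no OSC library) and sends it to a local INScore viewer; the scene object name "score", the Guido Music Notation fragment, and the port number are illustrative assumptions, not details from the paper.

```python
import socket

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad bytes to a multiple of 4, per the OSC 1.0 spec."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args: str) -> bytes:
    """Encode an OSC message whose arguments are all strings."""
    msg = osc_pad(address.encode())                      # address pattern
    msg += osc_pad(("," + "s" * len(args)).encode())     # type-tag string
    for a in args:
        msg += osc_pad(a.encode())                       # string arguments
    return msg

# Hypothetical update: ask INScore to display a notation fragment in an
# object of the default scene. The /ITL/scene address space and the
# 'set gmn' message follow INScore's OSC API; the object name and the
# Guido fragment are placeholders for data derived from performance capture.
packet = osc_message("/ITL/scene/score", "set", "gmn", "[ c d e ]")

# Send to a local INScore viewer (assumed listening on its default UDP port).
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 7000))
```

In a GesTCom-like pipeline, the arguments of such messages would be generated from the correlated gesture, audio and MIDI streams rather than written by hand, so the score display updates as the performance data change.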
In conclusion, we wish to present a performer’s perspective on the osmosis between contemporary performance practice, embodied cognition and computer music interaction, by way of a theoretical model of the embodied navigation of complex notation and an interactive system dedicated to it. The presentation affirms the centrality of gesture as an interface between physical action and symbolic representation, and hopes to contribute to the discussion concerning the ontological status of gesture and notation in a digitally mediated world.