This workshop, organised by Pavlos Antoniadis in the context of the Journée d'étude « L'interaction homme - machine en musique », introduces GREAM's new studio for gesture capture and consists of two parts.
In the first part, the participants will be presented with the latest technologies and concepts in the field of musical interactive systems by the invited leading researchers Frédéric Bevilacqua (head of the Sound Music Movement Interaction team at IRCAM), Baptiste Caramiaux (Marie Sklodowska-Curie Research Fellow between McGill University and IRCAM) and Andrew McPherson (Associate Professor in the Centre for Digital Music at Queen Mary University of London), and by the pianist-researcher Pavlos Antoniadis (doctoral student at GREAM and IRCAM).

Bevilacqua will present an overview of research on gesture capture and analysis, including: augmented instruments developed in collaboration with composers and performers; tangible interfaces for interaction with digital sound environments (MO Modular Musical Objects); and applications related to sensorimotor learning and embodied music cognition.

Caramiaux will present tools for data analysis based on advanced computational techniques such as machine learning. He will introduce the motivations behind such an approach, the practical use of these techniques through existing tools, and what they can offer at both the analysis and the interaction level.

McPherson will present TouchKeys, an augmented keyboard technology which turns the surface of every key into a multi-touch control surface and allows the position of the player's fingers to be captured. Its applications include the study of the gestural language of piano performance and the addition of new expressive control dimensions to each note, such as vibrato, pitch bends and changes in volume and timbre.

Antoniadis will present applications of gesture capture in his performance and research, in the form of a real-time simulation of learning Brian Ferneyhough's work Lemma-Icon-Epigram. His approach features the concept of "embodied navigation of complex notation" and the prototype system GesTCom for the gestural control of scores, developed with Bevilacqua at IRCAM.
In the second part, the participants will have the opportunity to gain hands-on experience with the systems presented in the first part. At the same time, and on the occasion of the first public presentation of GREAM's new studio for gesture capture, GREAM's members are warmly invited to discover its possibilities for the documentation of the musical act, including a Disklavier equipped with McPherson's TouchKeys, a motion capture system, a Kinect and inertial sensors, alongside basic audio recording facilities.
Biographies:
Frédéric Bevilacqua is the head of the Sound Music Movement Interaction team at IRCAM in Paris. His research concerns the modelling and the design of interaction between movement and sound, and the development of gesture-based interactive systems.
Baptiste Caramiaux is a Marie Sklodowska-Curie Research Fellow between McGill University (Montreal, Canada) and IRCAM (Paris, France). His current research focuses on understanding the cognitive processes of motor learning in musical performance and on the computational modelling of these processes. Previously, he worked on gesture expressivity and the design of musical interactive systems through machine learning. He conducted academic research at Goldsmiths, University of London, and applied part of his academic work to industrial products at Mogees Ltd. Baptiste holds a PhD in computer science from University Pierre et Marie Curie in Paris and IRCAM Centre Pompidou.
Andrew McPherson is a Senior Lecturer (Associate Professor) in the Centre for Digital Music at Queen Mary University of London. A composer and electrical engineer by training, he studied at MIT (M.Eng. 2005) and the University of Pennsylvania (Ph.D. 2009) and spent a two-year postdoctoral fellowship at Drexel University. His research focuses on augmented instruments, embedded hardware systems and the study of performer-instrument interaction. He is the creator of the magnetic resonator piano, an augmented acoustic piano which has been used by more than 20 composers worldwide, and of the TouchKeys multi-touch keyboard, which has shipped to musicians worldwide through a 2013 Kickstarter campaign and a 2015 production run. In 2016, his lab launched Bela, an ultra-low-latency embedded platform for creating musical instruments and interactive audio systems.
Pavlos Antoniadis is a Berlin-based pianist and doctoral researcher at IRCAM and LabEx GREAM. He has performed in Europe, the Americas and Asia and has recorded for the Mode and Wergo labels. He was a Musical Research Residency fellow at IRCAM in 2014 and has been invited to give lecture-performances at leading European institutions. Pavlos holds degrees in piano performance (MA, UC San Diego) and musicology (Athens National University). He has studied with scholarships from LabEx GREAM, Fulbright, UC San Diego, the Nakas Conservatory, IEMA Frankfurt and the Impuls Academy Graz.