PERFORMATIVE EXPERIMENTS IN XR ENVIRONMENTS


In my PhD work at IPEM, Ghent University (2015-2018), I focused on the integration of virtual and augmented technology to develop a method for tracking pianists' physicality and 'choreography' in relation to score interpretation (Caruso 2018). At that time, motion capture (MoCap) technology was used in my performance practice to support reflection on my artistic practice (Schön 1984) and the analysis of my gestures in relation to the score annotation. The MoCap system consisted of reflective markers covering the performer's body and a series of infrared cameras that tracked their movement. The data acquired during the motion tracking recordings comprised biofeedback measures, such as displacement, acceleration, velocity, and quantity of motion, which were then processed by dedicated motion-analysis software (Goebl et al. 2014). The system turned the captured gestures into a digital 3D stick figure, i.e., a virtual agent, that duplicated the body of the performer in virtual reality.
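To make this concrete, the sketch below shows how kinematic features like those named above can be derived from raw marker trajectories. It is a minimal Python illustration, not the software used at IPEM: the capture rate, the placeholder data, and the summed-speed proxy for quantity of motion are all assumptions.

```python
import numpy as np

FPS = 120                 # assumed capture rate (frames per second)
dt = 1.0 / FPS

# markers: (n_frames, n_markers, 3) x/y/z positions in metres;
# random placeholder data stands in for a real recording
markers = np.random.default_rng(0).random((600, 20, 3))

# Frame-to-frame displacement per marker
displacement = np.diff(markers, axis=0)          # (n_frames-1, n_markers, 3)

# Velocity and acceleration via finite differences
velocity = displacement / dt                     # m/s
speed = np.linalg.norm(velocity, axis=2)         # scalar speed per marker
acceleration = np.diff(velocity, axis=0) / dt    # m/s^2

# Quantity of motion, approximated here as summed marker speed per frame
qom = speed.sum(axis=1)

print(f"mean speed: {speed.mean():.2f} m/s, peak QoM: {qom.max():.1f}")
```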

Figs. 1 and 2. Experimentation with motion capture and body resolution in the virtual space at the ASIL Lab, IPEM, Ghent University

Compared to the 2D images of conventional video cameras (frontal, lateral, or any other fixed perspective), the 3D viewpoint allowed the recorded movements to be rotated and visualised from different perspectives in a virtual space. In this way, MoCap recordings provided both measurements of the movements and precise 3D reconstructions and visualisations of the performer's bodily activity. A mixed-method approach was employed, integrating qualitative data (descriptions from my self-observation and movement evaluations) with quantitative data (measurements of my gestures in relation to the resulting sound). Virtual and augmented technology was thus used to develop a system for improving my own performance, because it enhanced the visualisation of gestural involvement and body movements in performance practice and provided an alternative, non-verbal means of documenting, analysing and disseminating music performance practice as research (Caruso 2018).
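The advantage over a fixed camera can be made concrete: because the capture yields 3D coordinates rather than flat pixels, any new viewpoint is simply a rotation of the recorded data. The fragment below is a minimal sketch of this idea, with placeholder marker data and an assumed vertical y-axis.

```python
import numpy as np

def rotation_y(theta):
    """Rotation matrix about the vertical (y) axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[  c, 0.0,   s],
                     [0.0, 1.0, 0.0],
                     [ -s, 0.0,   c]])

# One captured frame: (n_markers, 3) positions (placeholder data)
frame = np.random.default_rng(1).random((20, 3))

# The same recorded gesture seen from the side instead of the front
side_view = frame @ rotation_y(np.pi / 2).T

# Orthographic 'screenshots' of each viewpoint: keep x and y, drop depth
front_2d = frame[:, :2]
side_2d = side_view[:, :2]
```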

This initial project, entitled My Avatar and Me (2016), was then co-developed into a public performance, entitled Avatar Piano Project (2019), in collaboration with and with the support of LWT3 Srl, a Milan-based company specialising in data analysis and visualisation. This project showed, on stage, the motion data produced by the performer in real time and their transformation into an avatar.
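Staging this requires a real-time bridge between the tracking system and the avatar renderer. The sketch below shows one common pattern, streaming joint positions as OSC messages with the python-osc library; the address scheme, joint list, port, and frame rate are hypothetical and do not document the LWT3 setup.

```python
import time
import numpy as np
from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc

client = SimpleUDPClient("127.0.0.1", 9000)        # renderer assumed to listen here
JOINTS = ["head", "l_wrist", "r_wrist", "pelvis"]  # illustrative subset

def stream_frame(joint_positions):
    """Send one frame of tracked joint positions, one OSC message per joint."""
    for name, (x, y, z) in zip(JOINTS, joint_positions):
        client.send_message(f"/avatar/{name}", [float(x), float(y), float(z)])

# Placeholder loop standing in for the live MoCap feed
rng = np.random.default_rng()
for _ in range(5):
    stream_frame(rng.random((len(JOINTS), 3)))
    time.sleep(1 / 60)                             # roughly 60 frames per second
```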

Fig. 3. The LWT3 Srl (Milan) team (Paolo Belluco, Samuele Polistina and Andrea Randone) and the author

From the analytical perspective provided by the use of MoCap technology as a biofeedback tool for studying musical gestures, my interest moved to the performative perspective: exploring the creative process of immersive and extended music performances. The next step was thus to investigate the potential of augmented and virtual technology in the creation of a piano performance in which my movements were dematerialised (Lippard & Chandler 1968), captured as pure data, and rematerialised in the form of visual and sound effects.
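The principle behind such rematerialisation can be sketched as a mapping from motion features to synthesis parameters. The example below is hypothetical, not the mapping used in the project: wrist speed is assumed to drive amplitude and wrist height a filter cutoff.

```python
import numpy as np

def map_motion_to_sound(wrist_speed, wrist_height,
                        max_speed=2.0, min_height=0.0, max_height=2.0):
    """Scale raw kinematics into normalised synthesis parameters."""
    amplitude = float(np.clip(wrist_speed / max_speed, 0.0, 1.0))
    cutoff_hz = float(np.interp(wrist_height,
                                [min_height, max_height],
                                [200.0, 8000.0]))
    return amplitude, cutoff_hz

amp, cutoff = map_motion_to_sound(wrist_speed=0.8, wrist_height=1.4)
print(f"amplitude={amp:.2f}, filter cutoff={cutoff:.0f} Hz")
```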

A pioneering experiment, the creation of an XR piano duet in the metaverse, was conducted during my postdoctoral artistic research project T*ActiLE at the Royal Conservatoire Antwerp, commencing in 2021, in collaboration with the LWT3 team, including engineer Paolo Belluco (PhD), designer Samuele Polistina, and VR developer Andrea Randone. The goals were to develop a method merging artistic, scientific and technological knowledge in the creation of music performances in a phygital environment, and to explore embodied interaction in extended performance spaces.

Preliminary Investigation

This experimentation has its conceptual roots in my collaboration on empirical research into the bodily interaction, flow and presence of two music performers in virtual reality, conducted at the ASIL Lab (IPEM, Ghent) in 2019. For this preliminary investigation, a performance of the minimalist work Piano Phase (1967) for two pianos by the American composer Steve Reich (1936-) was chosen, because it provides an excellent musical case of interaction and counterpoint between two performers. In this piece, Reich applied his phase-shifting technique: the two pianists begin a twelve-note melodic pattern in unison, after which the second pianist gradually accelerates, shifting out of phase with the first, who keeps a constant tempo. Over the course of the performance, these intermittent tempo changes produce a dynamic variety of interlocking counterpoints and harmonies until both pianists return to unison.

Fig. 4. Experimental study on the interaction between two pianists at the ASIL Lab, IPEM, Ghent University

Reich himself explains the compositional and performative instructions:

The first pianist starts at 1 and the second joins him in unison at 2. The second pianist increases his tempo very slightly and begins to move ahead of the first until (say 30-60s) he is one sixteenth ahead, as shown at 3. The dotted lines indicate this gradual movement of the second pianist and the consequent shift of phase relation between himself and the first pianist. This process is continued with the second pianist gradually becoming an eighth (4), a dotted eighth (5), a quarter (6), etc., ahead of the first until he finally passes through all 12 relations and comes back into unison at 14 again. (Reich 2002: 1)
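Viewed algorithmically, this phase-shifting process amounts to pairing the pattern with increasingly rotated copies of itself. The sketch below enumerates the twelve phase relations as an illustrative reduction, assuming the commonly cited twelve-note pattern; it is not a rendering of the score.

```python
# Twelve-note pattern of Piano Phase (as commonly cited)
PATTERN = ["E4", "F#4", "B4", "C#5", "D5", "F#4",
           "E4", "C#5", "B4", "F#4", "D5", "C#5"]

def shifted(pattern, offset):
    """Return the pattern displaced by `offset` sixteenth notes."""
    return pattern[offset:] + pattern[:offset]

# Offset 0 is unison; offsets 1-11 are the interlocking relations;
# offset 12 wraps around to unison again, as in the score.
for offset in range(len(PATTERN) + 1):
    pair = zip(PATTERN, shifted(PATTERN, offset % len(PATTERN)))
    print(offset, [f"{a}/{b}" for a, b in pair][:4], "...")
```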

The goal of this first empirical experimentation with a piano performance in VR was to establish 'a methodological framework for assessing social presence in music interactions in virtual reality' (Van Kerrebroeck et al. 2021). The protocol consisted of a comparative analysis of the performance of Piano Phase by two pianists across three performance contexts: a real, a mixed, and a virtual environment.

Fig. 5. The first page of the score of Piano Phase by Steve Reich

VR technology emerges as a distinctive empirical research strategy offering significant advantages over traditional experimental approaches and techniques: it allows for precise control of multimodal, dynamic, and context-rich stimuli (Kothgassner & Felnhofer 2020). The simulation capacity of VR also addresses an inherent tension in conventional empirical methods: to gain valid insights and results, researchers need to study phenomena in their natural state, without interference, yet they must also retain experimental control. VR makes such control possible while maintaining the level of authenticity necessary for reliable responses.

The purpose of the VR headset in this research was to explore how the boundaries between real and virtual space can be broken down, in music performance as elsewhere. Since the focus is on musical interaction between bodies, the digital process involves creating a sensorimotor environment suitable for such interaction. Capturing the whole body is therefore an essential prerequisite for using VR as a research tool: it creates the illusory sensation of genuinely being together and allows a meaningful sensory connection with human-embodied avatars or computer-controlled agents in VR. The environmental characteristics, the actual musical behaviour, and the physical performance of the embodied avatar, among other variables, are crucial to recreating this sense of presence in a virtual space-frame.

The results of this first empirical experimentation indicated that a feeling of social presence was experienced during the real-time interaction between the two pianists in the real environment as well as in the mixed environment, where the performers were embodied as avatars. A relatively successful performance output was also achieved in the real-time interaction between the human pianist and a computer-controlled agent; in that condition, however, both embodied co-regulation and subjective experience proved insufficiently consistent (Van Kerrebroeck et al. 2021).