This accessible page is a derivative of https://www.researchcatalogue.net/view/2521851/2525158 which it is meant to support and not replace.

HOW TO PREPARE AN XR PIANO PERFORMANCE

Video description: Video 2. A short video shows the different stages of preparation for the XR Music Performance at the LWT3 Performing Lab in Milan. The author is seen rehearsing the piano piece, both with and without the VR headset, being fitted for and wired up in the MoCap suit, and playing in front of a projection of the real-time rendering of her movements.

Click on https://www.researchcatalogue.net/view/2521851/2525158#tool-2526608 to watch the video.

Metaphase, in biology, is the stage of cell division in which the duplicated chromosomes align at the cell's equator before being pulled apart.

Corporeal Awareness concerns the sense of our own body, built from information that comes from both inside and outside the body.

Virtual Embodiment is the psychological sense of presence provoked by the illusory feeling of actually being in, and interacting within, XR environments.

From a Real to a Virtual Piano Performance Practice

The framework of the previous empirical experiment provides the foundation for a new perspective on the role of VR technology as a research tool to assess the level of social presence in virtual musical interactions. This experience led me to formulate additional, more artistically oriented research questions, ultimately guiding the development of the method and theory for preparing an XR music performance. This new perspective was also inspired by the biological stage of MetaPhase, symbolizing the duplication of the self as an avatar in the metaverse. To address these questions, I conceived the idea of investigating myself performing in the phygital environment of the metaverse, by playing Steve Reich's Piano Phase in a real-time duet with my embodied avatar. The first attempt of the MetaPhase project took place on a hybrid stage, with the pianist performing while wearing the VR headset in a virtual and immersive environment, and spectators attending from an augmented perspective. This setup entails projecting onto a screen the three-dimensional space in which two avatars play on the virtual stage, while the pianist plays in real time on the physical stage.

Steve Reich's early exploration of phase-shifting compositional techniques inspired the investigation's purpose, the musical interaction in hybrid spaces, and also influenced the methodological setup and concept. Reich developed his phasing ideas starting with the duplication of voices using tape loops, in compositions such as It's Gonna Rain (1965) and Come Out (1966). From these experiments, he gradually adapted the method for live performance with the hybrid piece Reed Phase (1966), in which he blended a soprano saxophone with magnetic tape. One year later, Reich began experimenting with two pianos in Piano Phase (1967). Lacking access to two pianos, he recorded the first piano part on tape and attempted to play in near synchronisation with the recording, introducing slight shifts, or phases, and occasionally realigning with the repeating twelve-note pattern. Reich found this process gratifying, demonstrating that a musician can achieve phasing with focused effort (Potter 2000).
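The phasing procedure described above can be sketched abstractly in code. The following minimal illustration is not a transcription of Reich's score: it uses the index numbers 0–11 as stand-ins for the actual twelve pitches, and simply shows how the second voice, advancing one step at a time, cycles through every offset of the shared pattern before returning to unison.

```python
# Abstract sketch of the phase-shifting process: two voices loop the same
# twelve-step pattern; voice 2 repeatedly advances by one step until,
# after twelve shifts, the two voices are back in unison.

PATTERN = list(range(12))  # stand-in indices for the twelve-note cell


def shifted(pattern, offset):
    """Voice 2's pattern after advancing `offset` steps relative to voice 1."""
    offset %= len(pattern)
    return pattern[offset:] + pattern[:offset]


if __name__ == "__main__":
    for offset in range(13):  # offsets 0..12; 0 and 12 are unison
        voice2 = shifted(PATTERN, offset)
        marker = "unison" if voice2 == PATTERN else ""
        print(f"offset {offset:2d}: voice 2 starts on step {voice2[0]:2d} {marker}")
```

In performance the offsets are reached gradually, through the second player's accelerando, rather than as discrete jumps; the sketch only captures the cyclic structure of the alignment.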

Stimulated by Reich's creative process, I likewise began studying Piano Phase with a tape loop, reproducing an audio recording of my own first part. However, my first attempts to synchronise with the recording failed, because the gestural expression that helps the second musician correctly match the counterpart was absent. This practice made me experience first-hand the importance of the visual and embodied aspects of musicians' interaction, especially within the context of a virtual environment. Many studies have indeed shown that corporeal engagement between performers playing in ensemble in real spaces triggers musical interaction (Navickaitė-Martinelli 2019), synchronisation and entrainment (Leman 2016). In Reich's Piano Phase, for example, the first pianist takes the role of the leader, playing in constant tempo and guiding, through corporeal strategies such as head nods and eye contact, the accelerando made by the second pianist, which produces the shifting of the phases.

From the challenges I faced in my practice, I imagined how productive it could be to practice with a virtual body playing the counterpart. However, the results of the first experiment I conducted in 2021, at the ASIL Lab in Ghent, showed how difficult it is for a musician to interact with a programmed virtual agent. For this reason, I conceived the idea of interacting with an embodied avatar by duplicating myself in virtual reality, mimicking what occurs in the biological 'metaphase' stage of cell division. I planned to pre-record my movements and animate the embodied avatar playing as the first pianist, with whom I could then interact in real time while playing as the second pianist.

My experience as a pianist-researcher involved in the empirical experimentation, conducted at IPEM with Bavo Van Kerrebrouck and Pieter-Jan Maes, led me to define a method and procedure for preparing an extended piano performance in the metaverse that entails achieving three tasks:

  1. establishing the corporeal awareness of playing in a real dimension, in relation to the score interpretation
  2. defining the gestural expressions from the perspective of virtual interaction
  3. developing the state of virtual embodiment by playing in the metaverse

Image description: Fig. 6. An image, captured from Video 2 above, shows the author's hands playing the piano; her face and the sky behind her are reflected in the piano's high-gloss surface. The words 'Metaphase From Real' appear on the image. The caption reads: Imagining how to interact and play in duet with yourself as a virtual performer.

Click on https://www.researchcatalogue.net/view/2521851/2525158#tool-2526551 to see the image.

First Task: Corporeal Awareness

In this specific experiment, I played both parts of Reich's Piano Phase: the first as my pre-recorded virtual embodied avatar and the second as a real-time animated avatar. This task required meticulous preparation of the movements used to animate the avatar, which simulated in virtual reality the corresponding corporeal expressions of a first pianist. In my previous experience with an XR performance, I had discovered that the virtual environment deprived me of distinct sensory stimuli, notably real eye contact and facial expressions with the virtual agent. Consequently, it was essential to focus on whole-body movements, as the corporeal engagement of playing an instrument in a virtual environment needs to be well established. During my solo practice, I decided to learn both parts by heart while observing my movements playing on an upright piano, whose surface reflected my body. This real-time 'mirroring' practice yielded twofold insights: firstly, how to move in real time while playing both parts of the piece and, secondly, how to create the interaction with a second counterpart in virtual reality. In my practice, I focused specifically on the gestures corresponding to the shifting phase and the synchronisation as they occur in a real performance, and I annotated all this information on the score. This procedure allowed me to increase my corporeal awareness from the perspective of playing both parts of Reich's score in both conditions: real and virtual spaces and environments.

Second Task: Gestural Expressions

Another challenge was to practice with my counterpart to rehearse the interlocking of each twelve-note pattern, as suggested by Reich himself in his introduction to the score. To prepare for playing in a virtual performance, I decided to avoid the common practice of rehearsing with a real pianist. As explained in the previous section, I opted for a pre-recorded audio track of me playing the entire first part of the piece. Rehearsing with an audio counterpart is one method in the learning process of the phase-shifting practice and of interaction with other performers (real or virtual). The procedure proved very useful for this project, as it enabled me to prepare and record the repetitive first part of the piece, which was later synchronised with the embodied virtual pianist for the video rendering in VR.

However, the corporeal visualisation, which is the crucial reference point in the phase-shifting practice, was completely missing, increasing the risk of getting lost or misplaying the interlocking of the patterns. To monitor and embody the correct posture, position and movements of myself playing, from the perspective of both real and virtual conditions, I also video-recorded my rehearsals. This method allowed me to collect video data, which I used for the stimulated-recall procedure of my self-evaluation and self-observation (Bloom 1953). The application of my 'mirroring' method enhanced by technology (Caruso et al. 2016; 2021) allowed me to deepen the analysis of my 'body image' and 'body schema' (Gallagher 1986)—the conscious representation of the body and how it interacts with the environment—to define my sensorimotor control and establish my gestural expressions in my virtual performance.

I collected qualitative data consisting of self-reflections stimulated by the video and audio recordings. The content of this data is reported in the form of written systematic self-reports and score annotations, and was used to create my performance model (Caruso 2018: 79). This performance model was remodulated for an XR music performance, which led to a new level of embodiment in the virtual space (Hornecker 2011).

Image description: Fig. 7. A diagram shows the steps of the 'performance model' as a self-report.

Click on https://www.researchcatalogue.net/view/2521851/2525158#tool-2527108 to see the diagram.

My Performance Model

Musical structure relates to how the piece is written

Technical features relate to how the piece is performed, in terms of bodily expressions

Interpretative cues relate to how the piece is performed, in terms of sound results

Third Task: Virtual Embodiment

From my solo performance practice, I moved to the virtual environment of the metaverse to experiment with the interaction between myself and my embodied avatar. The protocol and method for preparing an XR performance were co-developed with the LWT3 team at their lab in Milan. This entailed:

  1. video, audio and motion tracking recordings of me playing the first part of Reich's piece to build the movements of the first virtual performer who embodies my corporeal expressions
  2. synchronisation of the audio, video and motion tracks to create the performance of the first virtual pianist
  3. implementation of this data into a virtual environment by using Unity, a platform for the creation of video games, animations, and immersive experiences in virtual reality (VR) and augmented reality (AR)
  4. visualisation of the two virtual pianists in the virtual scene; one animated by my pre-recorded corporeal expression and one animated in real-time by myself playing with the motion capture suit and VR headset
  5. practice in XR as a second pianist by interacting in real-time with the embodied avatar, as a first pianist
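The synchronisation step (point 2) can be illustrated schematically. The sketch below is my own illustration, not the actual LWT3 pipeline; it assumes each separately recorded track (audio, video, motion capture) contains a shared sync cue, such as a clap, and computes how much to trim from the start of each track so that the cues coincide on a common timeline.

```python
# Hypothetical illustration of aligning separately recorded tracks:
# given the time (in seconds) at which a shared sync cue appears in each
# track, compute the trim needed so the cues coincide.

def align_trims(cue_times):
    """Return seconds to trim from the start of each track so that the
    sync cue lands at the same point on a common timeline."""
    earliest = min(cue_times.values())
    return {name: t - earliest for name, t in cue_times.items()}


if __name__ == "__main__":
    # Example cue times; the track names and values are illustrative.
    cues = {"audio": 2.0, "video": 3.5, "mocap": 2.5}
    print(align_trims(cues))  # trims that place every cue at t = 2.0 s
```

In practice, alignment tools also have to account for differing sample and frame rates, but the principle of referencing every stream to one shared cue is the same.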

Image description: Fig. 8. An image, captured from Video 2. above, shows the author, wearing a MoCap suit, playing the piano in front of a projection of the virtual rendering. The words 'To Virtual' appear on the image. The caption reads: Playing and seeing your body schemata, embodied as a virtual agent.

Click on https://www.researchcatalogue.net/view/2521851/2525158#tool-2525505 to see the image.

The reproduction of both avatars was built full-body to enhance the visualisation of the movements needed for the real-time interaction between myself and the virtual pianist. This also ensured both authenticity and a close mirroring of a real performance in the virtual environment. At the LWT3 lab, I made a series of recordings to evaluate the gestural expressions of both avatars: the pre-recorded pianist and the second avatar animated in real time through my body tracking. This part of the experiment allowed me to evaluate my corporeal engagement in playing both parts of the piece in VR, drawing on my previous solo practice. After testing in the virtual environment the different corporeal strategies previously developed during my solo practice, I established the 'choreography' of my embodied first avatar, adapting my gestures to the virtual environment. I speak of a 'choreography' because I needed to define the sequence of my movements while animating the avatars in relation to the score. Reich's piece is divided into two parts and a final coda, each requiring different modalities of corporeal interaction between the two performers. The embodied first virtual pianist had to guide my real-time interaction as the counterpart, determining when to play in phase and when to shift with the accelerando. The choreography established in my solo practice and subsequently assessed for the virtual performance entailed two actions: a horizontal head nod to indicate the shifting moments of the first part of the score, and a vertical head nod to indicate the shifting moments of the second part and the final coda.

Regarding the audio, the pre-recorded track was fed directly into my VR headset without any predetermined spatialisation. The interaction between the pre-recorded audio track and my real-time performance was reinforced by the visualisation of the embodied avatar's virtual gestural animation.

The cinematography of the virtual environment projected on the screen was limited to three camera perspectives: an overhead view and the two pianists' perspectives, which provided the audience attending in the real environment with an overview of the virtual space. Furthermore, the virtual first-person perspective was shown to provide an image of the VR world as seen through the eyes of the performer. In this configuration, the XR environment projected onto the screen was zoomed in and out from different angles to show the audience the interaction between the bodies of the two avatars and the real body of the performer in the physical space. This allowed the audience to engage in the process of virtual embodiment in real time. This initial XR experimentation did not use the ambisonic format, which involves multichannel spatial projection. This decision was influenced by the locations and venues, which were not fully equipped for such a setup, and by the goal of keeping the (already quite complex) project sustainable, low-cost, and transportable to various locations.

Wearing the VR headset transported me into an immersive virtual space, a new experience for me in performance. Immersion is a multifaceted experience in which players feel a strong sense of presence within the virtual world. In gaming, for instance, this experience is characterised by a reduction in the perceived distance from virtual objects and a sensation of actually inhabiting the space depicted on the screen. Immersion is achieved through various elements, such as engaging narratives, interactive gameplay and realistic graphics, which together create a compelling and absorbing experience for the player (Calleja 2007). Other studies observe that the boundaries between the digital and physical worlds are dissolving as more physical ways of interacting with computing are developed. The importance of physicality in interaction design, a key aspect of phygital experiences, demonstrates how physical objects can be used to interact with digital systems in intuitive and engaging ways (Hornecker 2011).

In the case of my XR performance, I felt absorbed and engaged in achieving a successful performance of Piano Phase by focusing on the sound and the visualisation of the corporeal movement of the second avatar. Due to the tactile contact of my fingers with the keyboard and the awareness of a real audience in the hall, the mediated content did not distract me from contact with the physical space (Filippelli 2025: 300).