WHY CREATE AN XR PERFORMANCE?


A New Opportunity for Artistic Experimentation

In the emerging scenario and ontology of our contemporary practice and space-frame, musicians, composers, performers, artists, and researchers are searching for renewed modes of expression. The possibility of performing on hybrid stages, at once virtual and real, is a triggering experience and an opportunity that has inspired new actions in (re)constructed performance spaces and their associated social frame.

The outcome of this auto-ethnography, from the perspectives of my investigation and extended performance, brought about novel applications of augmented and virtual technologies in artistic research in music as:

  1. research tools to document, analyse and disseminate music performance practice
  2. performative tools to create XR music performances in hybrid spaces, where humans and avatars harmonise in a unique phygital experience

Both applications enable artists and researchers to create and disseminate the artistic and scientific findings of their research. The use of digital technology in my artistic practice and research served as a research tool to collect data on my performance practice, thereby stimulating my self-reflection and self-evaluation, as outlined in the paradigm of the 'technology-enhanced mirror' (Caruso et al. 2021). Using an avatar as a body image, performers can look at their performance from an outsider's perspective and re-formulate their bodily approach, discovering different potentialities and adaptations of gestural choreography and expressions in interactive contexts and extended performance spaces.

The paradigm of the technology-enhanced mirror applied in an XR space opens up new reflections and aesthetics regarding the novel concept of 'virtual embodiment' (Maes et al. 2024). This concept entails the active engagement of real bodies, embodied avatars and audiences in hybrid musical environments, and can be applied in music creation, research on music interaction and music education, for example, to analyse students' movement or to manage distance learning (Waddell & Williamon 2019).

Motion data, transformed into visuals, not only supports practising but also stimulates the creation of new XR performances, an opportunity and a challenge that opens up innovative ways of playing, attending and reproducing a live concert.
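To make this transformation concrete, the following minimal sketch shows one simple way motion-capture data could drive a responsive visual. It is an illustration only, not the project's actual pipeline: the function name, the smoothing scheme, and the `gain` parameter are hypothetical choices for the example.

```python
# Illustrative sketch (not the project's actual pipeline): frame-to-frame
# displacement of one tracked joint is smoothed into an energy envelope and
# scaled to drive a visual parameter such as brightness.
import math

def motion_to_brightness(positions, smoothing=0.8, gain=5.0):
    """Map a sequence of 3D joint positions to brightness values in [0, 1].

    positions: list of (x, y, z) tuples, one per captured frame.
    smoothing: exponential-smoothing factor for the speed envelope.
    gain: hypothetical scaling from movement speed to brightness.
    """
    brightness = []
    envelope = 0.0
    for prev, curr in zip(positions, positions[1:]):
        speed = math.dist(prev, curr)                      # displacement between frames
        envelope = smoothing * envelope + (1 - smoothing) * speed
        brightness.append(min(1.0, gain * envelope))       # clamp to [0, 1]
    return brightness
```

In a real system the output would be streamed per frame to the rendering engine rather than computed over a recorded list, but the mapping principle, from gestural energy to a visual parameter, is the same.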

Video 4. Excerpts of the XR performance MetaPhase at the Cremona International Music Exhibition 2022

A New Opportunity for Public Engagement

The impact of advancements in digital technologies is changing and enhancing, on the one hand, the modalities of musicians' creation and, on the other, spectators' reception, i.e., the way they experience and interact with the performance space. The ontology of the performance space is characterised by conventions and cultural practices that define a social frame and determine how music performance is perceived and lived by spectators (Goffman 1974). This social frame encompasses the symbolic status of a performance and underpins an implicit agreement between performers and spectators, within codified cultural roles (Bourriaud 1998). Our contemporary societal context, affected by globalisation and, recently, by the experience of pandemic restrictions and remote art consumption, sees performers and spectators in a remodelled space-frame (Onderdijk et al. 2023). Augmented, virtual, and mixed reality place spectators in a reconfigured position within the performance space: they are often called upon to act in the first person and to interact with objects in the real and virtual space-frame (e.g., interactive installations, online performances, and so on). This new way of perceiving and participating in the performance space brings about a re-modulation of spectators' imaginary (regarding the perceived objects) and of their sense of presence, flow and time, which differs from that of a traditional performance space (Slater & Wilbur 1997; Pitozzi 2014).

The first iteration of the MetaPhase project took place on a hybrid stage, with the pianist performing in a virtual, immersive environment while wearing the VR headset and spectators attending from an augmented perspective. This setup entailed projecting onto a screen the three-dimensional space of the two avatars playing on the virtual stage while the pianist played in real time on the physical stage.

The cinematography of the virtual environment projected on the screen was limited to three camera perspectives: a view from above and the two pianists' perspectives, which provided the audience attending in the real environment with an overview of the virtual space. Furthermore, the virtual visual perspective was shown to provide an image of the VR world as seen through the eyes of the performer. In this configuration, the XR environment projected onto the screen was zoomed in and out from different angles to show the audience the real interaction between the bodies of the two avatars and the real body of the performer in the physical space, allowing the audience to engage in the process of virtual embodiment in real time. This initial XR experimentation did not include the ambisonic format, which involves multichannel spatial projection. This decision was influenced by the locations and venues, which were not fully equipped for such a setup, and by the goal of keeping the (already quite complex) project sustainable, low-cost, and transportable to various locations.
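The alternation among the three perspectives described above can be sketched as a simple view-cycling routine. The camera names and the function below are hypothetical, chosen only to illustrate the logic, not taken from the production software:

```python
# Illustrative sketch (hypothetical names): cycling the projected view among
# the three camera perspectives used in the performance -- an overhead view
# and the two pianists' viewpoints.
from itertools import cycle

CAMERAS = ["overhead", "pianist_physical", "pianist_avatar"]

def make_camera_switcher(cameras=CAMERAS):
    """Return a zero-argument function that yields the next perspective per call."""
    views = cycle(cameras)                 # endless rotation over the view list
    return lambda: next(views)
```

In practice such switching would be triggered manually or cued to the score rather than cycled mechanically, but the small, fixed set of views is what made the projection legible to the audience in the hall.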

With the support of the LWT3 team and technology, spectators could also attend the 'live' performance in VR in post-production, not only after the real-time concert but also on other occasions, such as seminars, exhibitions, installations, etc. This new virtual modality of reproducing and attending a live performance in an XR environment attracted audiences of different ages, from younger to older adults. In the post-production modality, the audience experienced the immersive elements of a live XR performance, engaging in real time with the extended space by visualising in detail the dynamic environment responding to the musical input. The VR headset served to immerse first the performer and then the audience in the virtual environment, allowing them to experience the performance from diverse angles and thereby enhancing the sense of presence and engagement with the digital elements.

In a VR setting, the audience's experience transcends the limitations of a traditional concert hall and is transformed into an immersive journey within the performance space. The VR headset allows for exploring various perspectives, distances, and unconventional settings where the visualisation is normally fixed. This virtual attendance modality enhances spectators' freedom and engagement, allowing them to move around the virtual stage and approach the pianist and her avatar, thereby observing the details of their interactions and movements. The audience feels part of the unfolding narrative, surrounded by the music and visuals, with an impact on their sensory perception. This new mode of presence in the virtual space creates a connection to the performance that a traditional live concert, with its lower level of interactivity and immersion, cannot offer: conventionally, the audience is confined to their seats, and their perspective is limited to the view from their location. VR opens up a new realm of possibilities, allowing the audience to experience the performance in a way that is both innovative and deeply engaging.

Fig. 12. MetaPhase in post-production. After the real-time performance on stage, spectators were invited to put on the VR headset to enter the metaverse and watch the recording of the piano duet from the perspective of the immersive virtual scenario.

Video 5. The audience exploring the virtual environment

Future Challenges

Building a bridge between the arts, science, and digital technology is the challenge for future horizons of expression and creation through multidisciplinary languages (Lapointe 2016; Stévance & Lacasse 2019; Lesaffre & Leman 2020). This exposition aims to present a perspective on the aesthetic impact and challenge of XR technology in artistic research in music. By referring to recent studies on embodied music cognition, mediation technology, and innovative performative strategies, the exposition, proposed through my artistic practice and research (Schwab & Borgdorff 2014), encourages collaborative and co-creative methods between disciplines and researchers.

As explained, the MetaPhase XR performance is a development of the empirical study on social presence in virtual environments by Van Kerrebroeck et al. (2021) at IPEM. The aesthetic choice of repertoire followed from that previous study, which focused on Piano Phase by Steve Reich to investigate the level of synchronisation and interaction between bodies in the Metaverse. Because the piece uses a limited range of the keyboard (the repetition of just twelve notes), this minimalist repertoire makes it feasible for pianists to wear the VR headset while immersed in the virtual space. In this setting, a virtual reproduction of the keyboard is provided in VR to maintain contact with both the real and the virtual instrument. Future XR performances can exploit AR/VR headsets further, enabling a phygital scenario depending on the perspective of the visualisation; this will allow the performance of more complex repertoires and other genres for the piano. The novelty of the MetaPhase XR performance lies in the integration of a phygital experience that combines different layers: the physical presence of the pianist with the digital presence of her avatar. In addition, it offers a unique form of interpretation and interaction that traditional performances do not provide: the physical presence of an audience attending a virtual performance. Further research can integrate more interactive avatars animated by Artificial Intelligence (AI), capable of unpredictable responses, as well as more immersive 3D audio spatialisation that can further engage the audience in perceiving the sound patterns and approaching the performance space.
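The structural principle that makes Piano Phase tractable for such a study can be sketched briefly. Both pianists repeat the same twelve-note module; one gradually accelerates until the parts lock one note apart, which can be modelled as a rotation of the pattern. The sketch below is an illustration of that phasing logic only; the pitch spellings are the commonly cited ones, given here for demonstration rather than as a performance source.

```python
# Illustrative sketch: the phasing principle of Steve Reich's Piano Phase.
# Piano 1 repeats a fixed twelve-note module; piano 2's shift of n notes,
# once the phasing locks in, is modelled as a rotation of the same module.
PATTERN = ["E4", "F#4", "B4", "C#5", "D5", "F#4",
           "E4", "C#5", "B4", "F#4", "D5", "C#5"]

def phase_shift(pattern, steps):
    """Rotate the repeating module by `steps` notes."""
    steps %= len(pattern)
    return pattern[steps:] + pattern[:steps]

def aligned_pairs(pattern, steps):
    """Simultaneous notes: piano 1 (fixed) against piano 2 (shifted)."""
    return list(zip(pattern, phase_shift(pattern, steps)))
```

After twelve shifts the parts realign in unison, which is why the piece's interlocking patterns offer such a controlled frame for measuring synchronisation between two bodies, real or virtual.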

An XR performance has the advantage of creating new virtual environments, locations, theatres or venues that can be conceived following the aesthetics required by the different projects. The audience can be more engaged in immersive spaces and perceive the performance from various perspectives, unlike traditional stage settings that create a certain distance between performers and spectators. Future XR projects can bring together performers from different locations and create interaction using holograms.

The disadvantages of an XR performance, compared with traditional concert settings, include the complex technical equipment and the requirements of specific venues. The next step may be the development of more user-friendly methodologies and easy-to-use applications for musicians and artists to enhance their creative process and the dissemination of their findings (Turchet 2023). From this perspective, I am already working on future developments in the research on XR technologies in artistic practice, focusing on a more inclusive and active participation of spectators in virtual environments, whether in presence or in remote conditions.

Interactive technologies are revolutionising music performances, creating a synergistic 'phygital aesthetic' that bridges art, science, and technology. This transformation is paving the way for more inclusive and accessible experiences, empowering artists and audiences alike to explore new horizons of creativity and expression.