What is an XR music performance? How and why are interactive technologies and phygital (digital-physical) interactions implemented in artistic practice and research in music? How do musicians and the audience engage in this new hybrid (virtual-real) format?
This exposition explores methodologies in the creation and audience reception of an XR music performance in which a pianist plays in duet with her virtual counterpart. It is the artistic output of research on the phygital interaction of performers in the metaverse.
XR stands for 'Extended Reality', an umbrella term in which 'X' acts as a variable for any technology used to enhance or replace our view of the world.
Avatar comes from the Sanskrit word Avatāra, which in Hinduism refers to the material appearance or incarnation of a powerful deity or spirit on Earth. The term can also describe the act of a revered human being making their appearance.
Embodied Avatar describes a virtual agent animated by the corporeal expressions of a human.
The Metaverse is a post-reality universe where the physical and digital converge.
Phygital refers to the connection between the physical and digital worlds.
Technological mediation offers a stimulating resource for enhancing creativity and performance practice, especially after the pandemic (Breese et al. 2020; Khalid 2020; Onderdijk et al. 2023). Interactive technologies for augmented reality (AR), mixed reality (MR) and virtual reality (VR) can modify the perception of the physical world, allowing interaction between humans and digital entities in immersive environments.
The space where the physical and the virtual meet is now defined as extended reality, abbreviated 'XR'. Within extended reality, different kinds of experience can be created, such as gaming, social interaction, or the replication of real life, in the context of the metaverse.
Despite the overuse of the term 'metaverse', commonly adopted in the context of social networks to indicate the interconnections between people on the internet, the concept had already appeared in the dystopian novels Snow Crash (1992) by Neal Stephenson and Ready Player One (2011) by Ernest Cline. Its precise meaning implies computer-generated and networked XR spaces in which augmented and/or virtual technologies enable physical people to interact with digital objects.
The term 'phygital' for this blended approach was originally coined by Chris Weil (Del Vecchio et al. 2023), who was inspired by the modern idea of consumers as connected protagonists seeking to fulfil their needs for entertainment, information and connection in both physical and digital worlds. Consequently, phygital has become a relevant concept in marketing and communications, generally referring to strategies for building real and virtual relationships with customers. The concept has since been extended to various technological applications that bridge the digital and physical worlds, providing unique interactive experiences for users.
The phygital experience is also relevant in research on movement and music (Sutil 2015; Bleeker 2017; Miller 2019). New methodologies and applications concerning human-machine interaction with 'embodied avatars' (Beaufils and Berland 2022) have been developed, inspiring the creation of multisensory artworks. The goal is to transcend the boundaries of real space into virtual and extended spaces, allowing people to explore diverse environments and experiment with innovative, more immersive and interactive approaches to art-making. These aesthetic directions have inspired contemporary performances designed to extend performative possibilities from the real to the virtual world.
Wiki Piano Net is an artistic experiment by pianist Zubin Kanga and composer Alexander Schubert (Kanga 2020) involving the active participation of an online audience. The pianist performed the piece by reading the audience's contributions displayed in real time on the internet page, which continuously influenced and modified the score. This experiment placed both audience and performer in the virtual environment of the metaverse, generating a co-creative artwork that is unique in each execution. All elements were either spoken or performed by the player, except for small comments that explained editing choices in detail. As a result, each performance reflected the dynamic interplay between the audience and the evolving composition.
Empty Mind by Kristof Timmerman and Ine Vanoeveren (2022) is an example of a performance realised entirely in the metaverse during the lockdown. Initially composed by Wim Henderickx as a piece for oboe and electronics in 2014, and then adapted for flute and electronics in 2019, Empty Mind is divided into six movements, which include written passages and open, free passages created by the performer in real time. The performance was inspired by the works and ideas of American artist Agnes Martin, realised at the Academy of Fine Arts in Antwerp, and designed to be experienced online with the active participation of spectators. The audience could determine the order of the piece by choosing symbols while navigating the virtual environment via a User Interface (UI) layer over a livestream.
Delirious Departures by CREW is another immersive and interactive performance, conceived during the lockdown by Eric Joris and Isjtar Vandebroeck (2022). At that time, travelling abroad was impossible, and people began to imagine new ways of moving through and experiencing different places. Inspired by this need, the performance took place after the lockdown at the Royal Museums of Fine Arts of Belgium in Brussels, transforming the space into a virtual train departure hall where visitors adopted the role of travellers. They experienced encounters and interactions with other travellers appearing as immaterial avatars: virtual agents animated by the machine, and embodied avatars animated in real time by dancers.
Dwelling Xenakis: An Augmented Reality project on Evryali for piano solo was an augmented reality piano concert held at the Université Paris 8 by pianist and researcher Pavlos Antoniadis and his team from IRCAM in 2022 (Antoniadis et al. 2023). The concert explored the tensions between textual interpretation and embodied performance, offering a vision for new generations of Xenakis performers. The study introduced a novel paradigm for pianists' interaction with Xenakis' notation, termed 'embodied navigation', inspired by studies on embodied cognition (Antoniadis and Chemero 2020). The integration of live full-body motion capture and augmented reality applications created a hybrid space that blends symbolic and physical elements, generating an interactive scenographic 'palimpsest'.
Building on this foundation, my research addresses the gap concerning embodiment in human-machine interaction by leveraging the expressive potential of digital technologies in artistic research in music and in the creation of hybrid physical-digital performances. The aim is to enhance the theoretical discourse on music embodiment and to offer practical artistic insights into the new levels of agency available to musicians and audiences involved in XR performances.