Instrumental Interfaces for the Live Performance of Musicalized Visual Media
The analogy between sight and sound has long been an object of fascination within the Western European intellectual tradition, motivating both speculative theorizing and experimental instrument-building (Hankins and Silverman 1999; Brougher et al. 2005). Instruments designed for the performance of musicalized visual media have been colloquially referred to as color organs, but as Whyte (2019) observes, the term imperfectly describes twentieth- and twenty-first-century “post-keyboardal” iterations of this idea. Contemporary visual media instruments are not necessarily modeled on the organ, nor is the audiovisual analogy limited to a mapping between color and pitch. Instead, these instruments exhibit an astonishing variety of cross-domain correspondences that engage a broader range of visual and aural parameters. Close analysis of both keyboardal and post-keyboardal interfaces reveals novel approaches to mediating translational gaps between domains.
Drawing upon these observations, I discuss my collaborative, iterative solutions to the challenge of designing instrumental interfaces suitable for the live concert performance of musicalized media. Visual media is “musicalized” when it imitates the properties, structures, and processes of music instead of serving a purely dramatic or illustrative function. Visual cues must be instantaneous to facilitate rhythmic articulation and nuances of tempo and rubato, and instruments must be responsive to a performer’s intuition regarding intensity, color, saturation, texture, and effect. I discuss two specific interfaces: a keyboard layout that cued a lighting installation capable of layering twelve colors and seven special effects, and a post-keyboardal palette of buttons and sliders that manipulated the rate of movement, intensity, and accent in a small corpus of video assets. In my practice, these instruments are ephemeral, built to support a specific creative work and dismantled soon after. Therefore, interfaces must feature audio-visual mappings that feel intuitive to the performer, with operational layouts that can be mastered quickly, facilitating a friction-free transfer of a performer’s musicality to the visual domain. I conclude that dynamic musicalized visual media can be produced by controlling just a few visual parameters, and that a smaller number of controls that can be layered or combined is preferable to a large number of independent controls.
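To make the layering claim concrete, the sketch below is a hypothetical Python illustration, not the implementation used in performance; the key names and toggle behavior are assumptions built only on the counts mentioned above (twelve colors, seven special effects). It shows how a small set of layerable controls reaches far more composite states than the same number of independent, one-state-per-control mappings.

from dataclasses import dataclass, field

# Hypothetical control surface modeled on the interface described above:
# twelve color keys and seven special-effect keys, each acting as a
# toggleable layer rather than an exclusive selector.
COLORS = {"red", "orange", "yellow", "green", "cyan", "blue",
          "violet", "magenta", "rose", "amber", "white", "uv"}
EFFECTS = {"strobe", "pulse", "shimmer", "fade", "sweep", "flicker", "bloom"}

@dataclass
class LightingState:
    """The currently active layers of the installation."""
    colors: set = field(default_factory=set)
    effects: set = field(default_factory=set)

    def toggle(self, key: str) -> None:
        """Pressing a key adds its layer; pressing it again removes it."""
        if key in COLORS:
            self.colors.symmetric_difference_update({key})
        elif key in EFFECTS:
            self.effects.symmetric_difference_update({key})
        else:
            raise ValueError(f"unknown key: {key}")

# Nineteen physical controls, but because layers combine freely the
# performer can reach 2**12 * 2**7 = 524,288 composite states; a
# one-state-per-control layout would offer only nineteen.
state = LightingState()
for key in ("blue", "amber", "strobe"):
    state.toggle(key)
print(sorted(state.colors), sorted(state.effects))
# -> ['amber', 'blue'] ['strobe']

The design point is that expressive range comes from combination rather than control count: a performer can quickly master nineteen toggles, whereas hundreds of dedicated controls could not be internalized within the lifespan of an ephemeral instrument.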
Anna Gawboy
Anna Gawboy’s work explores cultural history and multimedia, with a special focus on visualized music. She is internationally known for her collaboration with lighting designer Justin Townsend on their reconstruction of Alexander Scriabin’s color symphony Prometheus, Poem of Fire, most recently performed with the Royal Concertgebouw Orchestra in Amsterdam and the Boston Symphony Orchestra. Gawboy has also worked with Alex Oliszewski of the Department of Theatre and the Advanced Computing Center for the Arts and Design (ACCAD) to create a live multimedia accompaniment to a performance of Carl Orff’s Carmina Burana at The Ohio State University, and she consulted on Scriabin in the Himalayas, a multimedia tribute to the composer in Ladakh, India.
Gawboy has written about the English concertina as an instrument of acoustic science, synaesthesia and audiovisual media, esotericism and musical modernism, public music theory, and music theory pedagogy. She is the author of nearly 200 video tutorials in music theory, including those accompanying Concise Introduction to Tonal Harmony by L. Poundie Burstein and Joseph N. Straus. She was a co-founding associate editor of SMT-V, the first videocast journal in music, as well as co-founding editor of Engaging Students: Essays in Music Pedagogy. She has served as president of Music Theory Midwest and on the Executive Board of the Society for Music Theory. She is currently working on a book, Vibrational Metaphysics and the Making of Musical Modernism.