A crucial moment in my work with electroacoustic music came around the year 2000, when I started writing software for transforming sound files on a desktop computer. Sound files are input and output, and there is “nothing” in-between, inasmuch as the software, baptised FScape, runs offline: pure production or rendering, with no sound reproduction. This was, of course, the predominant way computer music software worked historically, when machines were too slow to render at real-time speed; well-known examples include Csound and SoundHack, the latter a direct inspiration for FScape, which began its existence as a processor for SoundHack’s phase vocoder files. While real-time operation is often seen as superior, permitting instantaneous interaction and a short circuit between listening and adjusting, or indeed performing a piece, I argue that both the successive relaying of the work from human to computer and back and the manifest storage of sound in files enable a different kind of experience of, and reflection on, the compositional process. I have often called FScape experimental software, for two reasons. First, the digital signal processing (DSP) modules it implements are often unconventional and embody tools necessitated by particular compositions rather than realising “known” or general transformations and effects. Second, the workflow tends to show patterns analogous to conducting experiments, in the way one observes and adjusts, leaves and comes back, accumulates artefacts, and so on.

In this talk, I will retrace the trajectory that FScape has undergone from its inception to the present day. This trajectory can also be read as one describing the drift in compositional thought. For example, as I move away from electroacoustic composition to focus more on live improvisation and sound installation, one question that interests me is how the concept of sound file rendering can be incorporated into a generative setting that presents materials while it runs. What is the particularity of the interface afforded by this software, and what happens when it is coupled to other systems? Also, the radical atomicity of the number streams in signal processing permits their use with other modalities such as still or moving images. This is made possible in particular through an ongoing rewrite of the software project, in which the fixed macroscopic modules of the original standalone application are exchanged for a unit generator architecture and a domain-specific language for constructing DSP programs.
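
To make the idea of a unit-generator architecture and offline rendering concrete, here is a minimal, self-contained sketch in Scala. It is emphatically not FScape’s actual API: every name in it (UGen, SinOsc, Line, Mul, render, the output path) is a hypothetical assumption, kept only to suggest how a DSP graph might be described as lazy streams of numbers and written to a file rather than played back.

```scala
// A hypothetical sketch of a unit-generator style DSL for offline rendering.
// Signals are plain lazy streams of numbers, so in principle the same graph
// could feed audio, still image, or any other modality.

import java.io.{BufferedOutputStream, DataOutputStream, FileOutputStream}

object OfflineSketch {
  // a unit generator is simply a (re-startable) stream of sample values
  type UGen = () => Iterator[Double]

  // a sine oscillator at a normalized frequency (cycles per sample)
  def SinOsc(freq: Double): UGen =
    () => Iterator.from(0).map(i => math.sin(2 * math.Pi * freq * i))

  // a linear envelope over `len` samples
  def Line(from: Double, to: Double, len: Int): UGen =
    () => Iterator.tabulate(len)(i => from + (to - from) * i / (len - 1))

  // point-wise multiplication of two signals
  def Mul(a: UGen, b: UGen): UGen =
    () => a().zip(b()).map { case (x, y) => x * y }

  // offline rendering: pull a finite number of frames and store them as raw
  // 64-bit floats -- pure production, no sound reproduction
  def render(g: UGen, path: String, numFrames: Int): Unit = {
    val out = new DataOutputStream(new BufferedOutputStream(new FileOutputStream(path)))
    try g().take(numFrames).foreach(out.writeDouble)
    finally out.close()
  }

  def main(args: Array[String]): Unit = {
    val n   = 44100                            // one "second" at 44.1 kHz
    val sig = Mul(SinOsc(440.0 / 44100), Line(1.0, 0.0, n))
    render(sig, "out.raw", n)                  // hypothetical output path
  }
}
```

The deliberate point of the sketch is that rendering runs at the machine’s intrinsic speed: the graph is pulled as fast as the processor allows, and the result only exists as a stored artefact that one returns to and listens to afterwards.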

Hanns Holger Rutz

Computing at Intrinsic Speed within a Culture of Real-Time

{kind: title}

---

meta: true

event: Simulation

author: Hanns Holger Rutz

---