To go from here to in|fibrillae

in|fibrillae

in|fibrillae is an attempt to reconfigure the physical-material installation in|filtration into the online browser space. It unravels the previous piece into hundreds and thousands of small fibres, in order to transpose the original materiality to a personal space at somebody’s home. Just as the viewer / listener lets the piece into their personal and private space, in|fibrillae is also becoming a personal piece of ours, restarting with the sound composition taken from Hanns Holger’s part in the original piece, and with new visual and textual work by Nayarí.

The piece is currently a work in progress that will evolve as we try out what works and what does not work in the browser. The page you are viewing documents the thought and research process behind the piece. For now, it is embedded in the overall documentation of in|filtration (which you can access by clicking “Return”, or via the exposition’s pop-up menu in the top left corner).

Developments in browser technology, in particular the introduction of the Web Audio API and the WebAssembly virtual machine, make it possible to run ambitious real-time generative sound pieces in the browser. At the end of 2020, we ported the SuperCollider sound server (scsynth) to this new technology, allowing SuperCollider-based sound pieces to run in the audience’s browser without the need to stream audio from a web server. In a second step, Hanns Holger’s computer music framework SoundProcesses was ported to the browser as well, allowing a translation of the original installation code base to the new situation.

While an obvious approach for transporting sound pieces would be to set up the software on a server, understanding the server as a kind of remote space that is statically present, like a virtual gallery one can visit at any time, in|fibrillae makes a deliberate decision to run purely on the front-end side of the browser, exploring the volatile “space” that is created ad hoc as a person opens the project’s URL in a tab. Using local storage, the piece’s state is not only individualised for everyone who visits the project; one may also return to the piece at a later point in time and find it in a similar state as one left it. in|fibrillae plays with this volatile permanence.

{function: introduction, author: hhr, date: 210224}

Narrated notes on in|fibrillae, a longer version of what was eventually used for the short xCoAx presentation.

{group: hhr210512}

After a brainstorming session with Naya today, the web “space” structure seems much more tangible to me. Understanding the piece as a move towards more personal things, the original trunk drawings will become part of the visible piece, more precisely the six variants that served as the foundation of the sound programming process. They act as “layers”, arranged in a carousel-like (circular) manner. We are using the original ellipsoid drawings rather than the polar/cartesian translation used for the videos in Inner Space. But they are never fully visible; instead, a fixed zoom scale is used, so that, depending on the browser window’s size, the view is more or less restricted to a part of the image (one may move the viewport around). While for now only the dry sound of two channels is heard, the idea is that the “absent” layers accumulate sounds in a shaded or reverberated way, becoming a background to the foreground sound and giving depth to the space. Perhaps it will be possible to store / recall those memories from the website storage? Naya will compose six texts, or rather collections of words that belong together. They will appear in permutations inside the trunk drawings, using a simple typography. Select words function as connecting points between the layers—when they appear, one moves between the layers. Before the piece begins, perhaps the visitor can choose whether they want to open their microphone to the sound interaction or not.

{group: sketch210224, author: hhr, date: 210224}

Sketch of carousel concept.

{group: hhr210414}

{hhr, 210124}

In order to create a first working version for the web, I have to remember how the piece worked in esc.

The main logic and algorithmic procedure of in|filtration was implemented directly in SoundProcesses, while the corpora of sound synthesis structures are stored in auxiliary workspaces Trunk${trunkId}graph.mllt, where trunkId is one of 11, 12, 13, 14, 15, 18, the six trunk scans (and videos) selected as most varied within themselves. The last node is the one connected to a microphone. The auxiliary workspaces contain a Grapheme that is actually just an indexed sequence of the sound structures (for efficient random access), with the ordering obtained through Lin-Kernighan on the total graph of sound similarities. While there is a Timeline API for the Ex/Control language, an analogous API for Grapheme is still missing. It should be easy to add (at least look-up by nearest neighbour or at a given index). I wonder how big all those workspaces would be if represented by the new “Blob” workspace that was introduced for the browser? The blob workspace is like an InMemory workspace, but actually using Durable with an in-memory “database” that is simply a key-value map with serialised (binary represented) objects.
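As a note to self, the two look-ups could be sketched in plain Scala roughly as follows, assuming the entries are kept sorted by position; Entry, at and nearest are hypothetical names, not the actual API:

final case class Entry[A](pos: Long, value: A)

def at[A](g: Vector[Entry[A]], idx: Int): A = g(idx).value   // look-up at a given index

def nearest[A](g: Vector[Entry[A]], pos: Long): A = {        // nearest-neighbour look-up
  val j = g.indexWhere(_.pos >= pos)
  val i =
    if      (j  < 0) g.size - 1                              // beyond the last entry
    else if (j == 0) 0
    else if (g(j).pos - pos < pos - g(j - 1).pos) j else j - 1
  g(i).value
}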

{group: hhr210124}

{hhr, 210111}

in|fi …

(finite ; fibration)

---

meta: true

artwork: infiltration

keywords: [web installation]

---

The Proc instances stored in the grapheme expand to four channels, each ending in ScanOut("out", <sig>). Up to five parameters are stored in the attribute map with keys p1, p2, … and corresponding ParamSpec objects under p1-spec, p2-spec, …
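For illustration, a stored structure might have the following shape; this is only a sketch, the actual synthesis graphs are much more involved, and the oscillator and frequency mapping are mere placeholders:

val p1   = "p1".ar(0.5)   // parameter value, its range described by "p1-spec"
val freq = p1.linExp(0.0, 1.0, 100.0, 1000.0)
val sig  = SinOsc.ar(Seq[GE](freq, freq * 1.5, freq * 2, freq * 3))  // four channels
ScanOut("out", sig)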

The main patch then goes on to configure Wolkenpumpe; while the visual interface is only used to observe what is happening, Wolkenpumpe also has some nice API for managing the connections between processes, so unless we compile that separately for Scala.js, this functionality must be implemented again with Ex/Control. We use six output channels, the last two going to the neighbouring nodes in esc.

The processes defined for Wolkenpumpe: a filter adapt sits between the input (coming from neighbouring nodes or the microphone) and the rest of the signal chain; it has switches to rectify the signal (take the absolute value), a gain, a clip, and a target signal range. Two filters s+hF and s+hT insert a sample-and-hold element into the parameter controls. The input process in has a globally allocated control bus associated with it that can be used to set the balance between the “left” and “right” neighbouring node. The output process O-inf performs a coarse loudness or “ringiness” measurement through weighted band-pass filters.
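The adapt chain amounts to something like the following sketch; the parameter names pRectify, pGain, pClip, pLo, pHi are hypothetical stand-ins, not the original identifiers:

val rect = in.abs                                          // rectified variant
val sig0 = LinXFade2.ar(in, rect, pRectify.mulAdd(2, -1))  // switch dry / rectified
val sig1 = (sig0 * pGain).clip2(pClip)                     // gain, then symmetric clip
val sig2 = sig1.linLin(-pClip, pClip, pLo, pHi)            // map into the target range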

{group: hhr210124}

{hhr, 210224}

Several people are exploring the web space at the moment. For server-based pieces, there was the project “Klangraum” launched last year. Looking at client-side pieces: already a while ago, Luc Döbereiner made a feedback-based web piece, “Compression”.

And just today, I was made aware of the piece “ь?” by Yuri Bulka. It is not only a well-made piece, it also touches on several interesting questions for this type of work. How do you design a “front page” for such a piece? I was lucky enough to press the “Start” button quite early, before reading all about the piece. Although I think it is good and important to provide background information on a piece, I always prefer—when I go to a gallery or to a concert—to read relatively little about a piece ahead of experiencing it. Explanations can easily take away your ability to explore a piece; they can put too much weight on what can be read and what cannot, on what the attention should be focused on and what not. Not knowing how the piece unfolds, not knowing what the supposed interaction with the piece is, allows for a much richer experience, in many cases. At least if, as is the case with “ь?”, the movement through the piece’s space is intuitive.

Here we have a dark screen, and a small rectangle fades in, illuminated by a green glow. I wait a little, I think it is probably a button element, I click on it; yes, it is a checkbox element. Sounds begin to appear. Then a second box starts to appear, and so on. For a while, I keep following the appearance of the boxes; I check them all. Only after a few minutes do I see what happens when I uncheck a box again, and I wait to see if something happens when I stop interacting with the interface. I like this interface, because it is very minimal; it does not take away attention from the sound composition. But also, thinking of in|fibrillae, I would like to compose something that can rest in a small window away from the user’s visual focus, away from the need for them to act and click.

The procedure is defined in the Algorithm class. There are constant Proc instances for input, output, and adaptation, a variable but always present instance for generator, and optional instances for filter and filter-fade. The filter sits between generator and output.

As is common practice in Wolkenpumpe, copies of processes are made whenever they are inserted (added to a Folder). The function changeNegatum picks the initial generator. It distinguishes between a random walk among neighbours in the Lin-Kernighan path and abrupt jumps, depending on sensor data and elapsed time. A weighted random choice is used to determine one of the generator’s parameters to be patched into the adaptation process. The other parameters either remain at their preset values, are subject to random changes, or “stick” to previously used parameter values (of other generators).
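The weighted choice itself is simple; in plain Scala it could look like this, with illustrative keys and weights:

import scala.util.Random

def weightedChoice[A](xs: Seq[(A, Double)]): A = {
  var rem = Random.nextDouble() * xs.map(_._2).sum
  xs.find { case (_, w) => rem -= w; rem <= 0 }.fold(xs.last._1)(_._1)
}

val key = weightedChoice(Seq("p1" -> 0.5, "p2" -> 0.3, "p3" -> 0.2))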

At random intervals between 10 and 60 seconds, a next action is performed, which in one of five cases is changeNegatum, and in four of five cases toggleFilter, which either calls removeFilter if a filter is currently present, or otherwise calls insertFilter. The removal works by patching a linear ramp from one down to zero into the mix parameter of the filter process. Upon completion, the now inaudible filter is removed. The insertion does the opposite: it adds the filter with mix zero (inaudible), and then fades in the filter over time. The filter is picked randomly among five characteristics: L-hpf (two high-passes cross-fading across the spectrum in steps coupled to mix), L-lpf (the same with low-pass), notch (frequency and Q independent of mix), reso (resonant band-pass), filt (low- or high-pass).
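Stripped of the actual process handling, the decision logic reduces to the following plain-Scala sketch; the three actions are stubs for the real SoundProcesses code:

import scala.util.Random

var filterPresent = false                       // stand-in for the real process state
def changeNegatum(): Unit = ()                  // stubs for the actual actions
def insertFilter (): Unit = filterPresent = true
def removeFilter (): Unit = filterPresent = false

def nextAction(): Unit =
  if (Random.nextInt(5) == 0) changeNegatum()   // 1 in 5: move in the trunk space
  else if (filterPresent)     removeFilter()    // 4 in 5: toggle the filter
  else                        insertFilter()

// nextAction is then re-scheduled after Random.between(10.0, 60.0) seconds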

{group: hhr210124}

{hhr, 210504}

The “loading” and “saving” of the blur-space is far from trivial. Assume there are two asynchronous processes, BufferPrepare and BufferWrite; due to lack of libsndfile support, they will be very slow in the browser, as only small chunks of samples can be transferred at a time, on top of which IndexedDB is (currently) slow. That could mean we have to “switch spaces” immediately and simply add the neighbouring blur-space when “it’s ready”; likewise, keeping the old buffer writing even after “switching spaces”, and therefore not beginning to record the new space before the old space has finished writing.
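The sequencing could be expressed with futures, as in this sketch, where writeSpace and prepareSpace are hypothetical stand-ins for the BufferWrite and BufferPrepare processes:

import scala.concurrent.{ExecutionContext, Future}

type Space = String   // placeholder for the actual blur-space handle

def writeSpace  (s: Space)(implicit ec: ExecutionContext): Future[Unit] = Future(())
def prepareSpace(s: Space)(implicit ec: ExecutionContext): Future[Unit] = Future(())

// switch visually at once, but only begin recording the new space
// after the old one has finished writing
def switchSpace(old: Space, nu: Space)(implicit ec: ExecutionContext): Future[Unit] =
  for {
    _ <- writeSpace(old)
    _ <- prepareSpace(nu)
  } yield ()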

{hhr, 210505}

The “loading” and “saving” of a monophonic blur-space of c. 90 seconds would take a few seconds now; so they could be initiated upon the visual transition, while still making sure that each asynchronous step completes before the next.

Parallel to this, an analysis process looks at the sound signal sent out. It measures peak amplitude and an f0 estimate based on a median-smoothed zero-crossing count, and reports every four seconds the amplitude and whether a stable “bad” pitch was detected (reasonable amplitude, and running minimum and maximum f0 above a threshold and within a given ratio). These values are observed in the function analysisUpdate. For example, when the signal is deemed (too) silent, after a while random adjustments to the generator’s parameters are made. They either succeed in bringing up the signal’s energy, or the algorithm gives up and induces a move in the trunk space, calling changeNegatum. Similar actions are taken if bad pitch is present for too long.
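In ScalaCollider terms, the analysis signal might be sketched like this, where sig is the signal sent out; the smoothing length and reply name are illustrative, not the original values:

val tr    = Impulse.kr(0.25)              // report every four seconds
val amp   = A2K.kr(Peak.ar(sig, tr))      // running peak amplitude, reset on report
val f0    = A2K.kr(ZeroCrossing.ar(sig))  // pitch estimate from zero-crossings
val f0Med = Median.kr(31, f0)             // median smoothing
SendReply.kr(tr, Seq[GE](amp, f0Med), "/analysis")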

{group: hhr210124}

{hhr, 210509}

An interesting sense of working on a “game” is emerging for me. I would never call this piece a game; it is “unfunctional”, an aesthetic exploration. But the way the interface programming proceeds inspires this thought. There are stages in which things happen, moments of interface events and passivity; there is a virtual landscape through which one moves; there is state that is threaded through the piece. All in all, a more “dramatic” form than a room installation.

What happens when transitioning from one trunk space to another.

{group: switchspace}

The infrared sensor data comes in via OSC and is passed to the function sensorUpdate. There are four values for each side of the node’s axis, and they are compared against a noise threshold. The balance bus is set based on the tilt to either side. Then, for each of the eight sensors, a triggered / non-triggered state is calculated based on given thresholds and a minimum hold duration. The trigger states are used when “spreading” generator parameters across the four speaker channels. Trigger time stamps are remembered, and the number of recently occurring triggers also determines whether changeNegatum moves to adjacent or farther-away sound structures. Whenever new triggers occur, we look at pairwise triggers from both sides of the sheets within the last five minutes, and assign an 8-bit “amount” to the temporal difference between the two sides. The combined encoded amounts are passed to the function updateFlip, which alters the adapt process.
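The per-sensor trigger calculation is essentially the following, sketched in plain Scala; the threshold and hold duration are placeholder values:

final case class TrigState(triggered: Boolean, lastAbove: Double)

def updateTrig(st: TrigState, value: Double, now: Double,
               threshold: Double = 0.5, holdDur: Double = 1.0): TrigState =
  if      (value > threshold)            TrigState(triggered = true, lastAbove = now)
  else if (now - st.lastAbove < holdDur) st                     // hold the trigger
  else                                   st.copy(triggered = false)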

Pair-triggers occurring more recently on the “left” side issue the “flipping” and adjustment of the low/high range of adaptation; pair-triggers more recently on the “right” side issue the “flipping” of mode (clipping vs. absolute value). Furthermore, a forgetFlip function is scheduled: unless other triggers occur within two minutes, parameters are slowly moved back to ranges synchronised across channels, and the absolute mode is eventually cleared. Forgetting is repeated over time, until the ranges are the same across channels (or new excursions are produced by the sensor data).
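The forgetting schedule, again as a plain-Scala sketch with hypothetical helper names:

var rangesSynced = false                          // stubs for the actual state and actions
def relaxRangesTowardsSync(): Unit = ()
def scheduleForgetFlip(delay: Double): Unit = ()

// called two minutes after the last pair-trigger
def forgetFlip(now: Double, lastTrigTime: Double): Unit =
  if (now - lastTrigTime >= 120.0) {
    relaxRangesTowardsSync()                      // one step back towards uniform ranges
    if (!rangesSynced) scheduleForgetFlip(120.0)  // repeat until ranges coincide
  }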

{group: hhr210124}

Alignment of bridging-word to “counter-bridging” word.

{group: switchspace}

{hhr, 210128}

After adding the minimum set of grapheme operations, here is the first simple workspace that allows one to numerically skim through a trunk space in the browser. What would be the visual layer? Perhaps it would be possible to make new photographs of the fabric, different close-ups that could overlay each other, just as I imagine more than one row sounding at a time, with the remote ones dimmed.

 

in|fibrillae

{group: hhr210128}

{hhr, 210712}

After seeing how beautiful the physical window installation for Kontakt… became, and how well it works for passersby at Reagenz, I wanted to see if I could make “again” a physical form for in|fibrillae as well. I returned to the only larger TFT screen I have, a 19" one in 4:3 ratio, and the square frames we had originally made for Inner Space. This would go with two transducers “stereophonically” positioned at separate window panes. One Pi 4 can “just” handle the load; the image movement is just fluid enough. I’m just worried that the sound may cause issues with neighbours, as it can get quite rough. So my thought is to add a set of the original infrared sensors and mute/unmute the piece depending on the movement of people in front of the window.

{hhr, 210717}

A simple light-dependent resistor is now used as a sensor to activate the sound, which dies out after two minutes of inactivity.
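The activation logic is essentially the following plain-Scala sketch; the sensor handling and the clock are simplified, and the threshold is a placeholder:

def now(): Double = System.nanoTime() * 1e-9

var lastActive = Double.NegativeInfinity

def sensorUpdate(value: Double, threshold: Double = 0.5): Unit =
  if (value > threshold) lastActive = now()             // light change counts as activity

def soundActive: Boolean = now() - lastActive < 120.0   // dies out after two minutes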

{hhr, 210202}

After the addition of the missing operations to SoundProcesses’ Control language, the curious translation begins. There is a pattern for moving from the plain code to Act actions that describe the steps of the different reactions and scheduled behaviour. All probabilities and time constants will eventually have to be adjusted for the new web space.

{group: hhr210202}

{hhr, 210203}

Putting the filters in the special If-Then block and tuning the fading behaviour.

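// assumed context: `in` is the dry input signal, `tpe` selects the filter type,
// `pMix`, `freqN` and `qN` are the parameter signals, and `mkBlend` cross-fades
// between dry and wet signals according to the mix amount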
val sig =
  If (tpe sig_== 0) Then { // 0: off
    in
  } ElseIf (tpe sig_== 1) Then {  // 1: L-hpf
    val freq  = (pMix * freqN).linExp(0, 1, 22.05 * 2, 20000)
    val wet   = HPF.ar(HPF.ar(in, freq), freq)
    mkBlend(in, wet, pMix)
  } ElseIf (tpe sig_== 2) Then { // 2: L-lpf
    val freq  = (pMix * freqN).linExp(1, 0, 22.05 * 2, 20000)
    val wet   = LPF.ar(in, freq)
    mkBlend(in, wet, pMix)
  } ElseIf (tpe sig_== 3) Then { // 3: filt
    val normFreq  = freqN.mulAdd(2, -1)
    val lowFreqN  = normFreq.min(0.0)
    val highFreqN = normFreq.max(0.0)
    val lowFreq   = LinExp.kr(lowFreqN, -1, 0, 30, 20000)
    val highFreq  = LinExp.kr(highFreqN, 0, 1, 30, 20000)
    val lowMix    = Clip.kr(lowFreqN  * -10.0, 0, 1)
    val highMix   = Clip.kr(highFreqN * +10.0, 0, 1)
    val dryMix    = 1 - (lowMix + highMix)
    val lpf       = LPF.ar(in, lowFreq ) * lowMix
    val hpf       = HPF.ar(in, highFreq) * highMix
    val dry       = in * dryMix
    val flt       = dry + lpf + hpf
    mkBlend(in, flt, pMix)
  } ElseIf (tpe sig_== 4) Then { // 4: notch
    val freq  = freqN.linExp(0, 1, 30.0, 16000.0)
    val rQN   = 1 - qN
    val rq    = (rQN * pMix).linExp(0, 1, 1.0/25, 1.0/0.5)
    val wet   = BRF.ar(in, freq, rq)
    mkBlend(in, wet, pMix.cubed)
  } Else /* If (tpe sig_== 5) Then */ { // 5: reso
    val freq  = freqN.linExp(0, 1, 30.0, 13000.0)
    val mixC  = pMix.cubed
    val q     = (qN * pMix).linExp(0, 1, 0.25, 25)
    val rq    = q.reciprocal
    val makeUp= q.sqrt * 2
    val wet   = Resonz.ar(in, freq, rq) * makeUp
    mkBlend(in, wet, mixC)
  }

{group: hhr210203}

{hhr, 210205}

A riddle… a function to add random spread to an expression. My first thought was to return an action together with the target expression.

def spreadVecLin(in: Ex[Double], lo: Double = 0.0,
                 hi: Double = 1.0): (Act, Ex[Seq[Double]]) = {
  val r   = (hi - lo) * 0.05      // spread amount: five percent of the value range
  val rr1 = rng.range(0.0, 2 * r)
  val rr2 = rng.range(2 * r, 0.0)
  val rr3 = rng.range(-r, r)      // symmetric spread
  val sq  = Vector.tabulate(NumGenChannels) { ch =>
    val ta    = trigStates(ch)        // trigger states of the two sheet sides
    val tb    = trigStates(ch + 4)
    val rand  = If (ta ^ tb) Then {   // exactly one side triggered: directed offset
      If (ta) Then rr1 Else rr2
    } Else {
      rr3                             // neither or both sides: symmetric offset
    }
    (rand, (in + rand).clip(lo, hi))  // keep the spread value within bounds
  }
  val (actSq, vecRes) = sq.unzip
  (Act(actSq.map(_.update): _*), vecRes)  // the action draws new random numbers
}

{group: hhr210203}