In addition to the more direct methods of sonification I’ve used in this project, I’ve found it necessary to describe the processes of VLBI in a more associative manner. I’ve been trying to figure out how to make a sound that convincingly embeds an essence of the correlation process. The real correlation process of VLBI takes two essentially random signals and slides them over each other to find where there is a tiny bit of a match. By essentially random I mean that if I listen to those two signals, at any playback speed, there will be no perceptual cues to distinguish one from the other. The real correlation process uses thousands of iterations, over which a running sum (the residue) is accumulated, signaling when a match is found. If I were to apply this process directly to sound, to signals of a frequency that we can perceive as pitch or rhythm, it would take impractically long, and the subtle change when correlation was attained would not make a perceptible difference.
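The slide-and-sum idea can be illustrated with a small sketch (the function name, the toy delay, and the noise level are my own choices for illustration, not part of the installation): two noise signals are slid over each other, and the lag with the largest running sum of products marks the hidden match.

```python
import random

def best_lag(a, b, max_lag):
    """Slide b over a and return the lag with the largest running sum
    of products (the peak of the cross-correlation)."""
    best, best_sum = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        s = 0.0
        for i, x in enumerate(a):
            j = i + lag
            if 0 <= j < len(b):
                s += x * b[j]
        if s > best_sum:
            best, best_sum = lag, s
    return best

# Two "essentially random" signals: b is a delayed copy of a plus noise.
# Neither sounds any different from the other, yet correlation finds the match.
random.seed(1)
a = [random.gauss(0, 1) for _ in range(2000)]
delay = 37
b = [0.0] * delay + [x + random.gauss(0, 0.5) for x in a]
print(best_lag(a, b, 100))
```

The printed lag recovers the hidden delay, even though the match is buried in noise; real VLBI correlators do the same over vastly longer streams.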
Still, I have felt it essential to capture this part of the process in some way in the sonic work.
Rhythmic correlation, or rhythmic morphing
I tried taking two random rhythms, evaluating the difference between them, and making an algorithm that would let them slowly converge towards each other. This could prove a useful musical composition tool sometime, but I was not able to solve some inherent problems with the approach. It is easy enough to interpolate between two patterns, letting each value in one pattern slowly approach the corresponding value in the other. However, how would you decide which value in one pattern to “couple” to which value in the other pattern? The rhythm patterns may have different lengths (a reasonable assumption, I think), or perhaps a different number of attacks/events. Then there is no one-to-one relationship between the members of the two patterns. It gets a bit complicated, and so far I have not found a solution that I could happily call perceptually relevant.
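The “easy enough” equal-length case can be sketched as a plain linear interpolation of corresponding durations (the function name and example patterns are mine, for illustration); the unequal-length case is exactly where the sketch raises an error, because the coupling question has no obvious answer.

```python
def morph(pattern_a, pattern_b, t):
    """Linearly interpolate two equal-length duration patterns.
    t = 0.0 gives pattern_a, t = 1.0 gives pattern_b."""
    if len(pattern_a) != len(pattern_b):
        # Which value couples to which? This is the unsolved part.
        raise ValueError("no one-to-one coupling between unequal patterns")
    return [(1 - t) * a + t * b for a, b in zip(pattern_a, pattern_b)]

a = [0.5, 0.25, 0.25, 1.0]   # durations in beats
b = [0.25, 0.5, 0.75, 0.5]
print(morph(a, b, 0.5))       # halfway between the two rhythms
```

Halfway (t = 0.5) this yields [0.375, 0.375, 0.5, 0.75]; sweeping t from 0 to 1 morphs one rhythm into the other.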
So, my next attempt at this rhythmic correlation process was to simplify the rhythms as much as possible, down to a stream of single events of equal duration. Like a metronome. Starting several instances of this rhythm generator at different tempi, I let them gradually come into synchrony. I implemented this in such a manner that each rhythm clock has a built-in feature of listening for the beats of other clocks and readjusting its own pace to approach theirs. This is commonly called soft synchronization, because the process is allowed to unfold over some time, as opposed to hard synchronization where we immediately “snap” to the next beat. Soft synchronization is attained by first adjusting the tempo, then the phase (phase in this context meaning “where in the period between two beats are we now?”). Without phase synchronization, we would have rhythms of the same tempo that still do not play exactly together. The time it takes for synchronization to be attained can be adjusted (“sync gravity”); in this installation I typically let the process take a minute or a bit less. To prolong the musical situation, I continuously fade in new unsynchronized events and fade them out when synchronization has happened. I tend to think of this as a process akin to Jean-Claude Risset’s endless glissando. Using different pitches for each rhythmic voice, it also sounded like a sort of chorale to me, so for my own use I termed this theme a “Risset Chorale”:
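The tempo-then-phase soft synchronization can be sketched as a set of coupled clocks (a minimal model of my own; the class name, the gravity value, and the simulation loop are illustrative assumptions, not the installation's actual implementation). Each clock drifts toward the others' mean tempo, then nudges its phase the shortest way around the beat cycle.

```python
class SoftClock:
    """A metronome that gradually pulls its tempo and phase toward
    the other clocks it hears; "gravity" sets how fast it yields."""
    def __init__(self, tempo_bpm, phase=0.0, gravity=0.05):
        self.tempo = tempo_bpm    # beats per minute
        self.phase = phase        # position within the current beat, 0..1
        self.gravity = gravity

    def tick(self, others, dt):
        # Advance the phase by the elapsed time dt (seconds).
        self.phase = (self.phase + dt * self.tempo / 60.0) % 1.0
        # First adjust the tempo toward the group's mean tempo...
        mean_tempo = sum(o.tempo for o in others) / len(others)
        self.tempo += self.gravity * (mean_tempo - self.tempo)
        # ...then nudge the phase toward each other clock, taking the
        # shortest way around the cycle (diff lies in -0.5..0.5).
        for o in others:
            diff = (o.phase - self.phase + 0.5) % 1.0 - 0.5
            self.phase = (self.phase + self.gravity * diff / len(others)) % 1.0

# Three clocks at different tempi and phases, ticking every 10 ms.
clocks = [SoftClock(96), SoftClock(120, phase=0.4), SoftClock(132, phase=0.7)]
for _ in range(2000):   # 20 simulated seconds
    for c in clocks:
        c.tick([o for o in clocks if o is not c], dt=0.01)
print([round(c.tempo, 3) for c in clocks])
```

After the simulated run all tempi have converged to a common value near the group mean; a larger gravity shortens the convergence time, which is how the "about a minute" pacing would be tuned.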