Landscape with figures I



Flutes and strings morphing

 

This category consists of chords with orchestral attack envelopes. Pitches and dynamics are based on tam-tam analyses. Several transformation steps change the pitch content, in this order (a code sketch of a few of these steps follows the list):

 

  • The first 750 milliseconds are chosen,[1] to focus on the attack phase of the sound.[2]
  • Only partials longer than 50 milliseconds are kept,[3] to retain the most stable components of the sound.
  • The spectrum is used with original or inverted intervals (subharmonic spectrum).[4]
  • Intervals are multiplied[5] by a factor between 0.25 and 5.5.
  • The whole sound is transposed somewhere between three octaves down and one octave up.
  • 15-50 percent of the notes are selected randomly.[6]
  • A bandpass filter[7] removes notes outside the practical range of the sampled sounds.
  • Only notes softer than MIDI velocity 40 are kept.[8] This removes some of the loudest partials and makes the chord more balanced when performed by string and flute samples.
  • A random chord note is transposed down and used as a virtual fundamental for an overtone series spanning the whole register,[9] which serves as a new pitch grid for the now strongly distorted tam-tam spectrum.[10] The floating-point interval multiplications already applied can make the spectrum harsher, more artificial and less resonant; a harmonic series as intonation can bring resonant qualities back to the resulting sound.
  • The notes of the spectrum are sorted by pitch and sculpted by an orchestral attack envelope.[11]
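
As an illustration of how a few of these steps could work, here is a minimal Python sketch of the velocity filter, interval multiplication, snapping to an overtone grid and random selection. The data representation, function names and numerical values are invented for the example; the actual processing is done by the Ruben-OM functions cited in the footnotes.

```python
import math
import random

# A partial as (pitch in midicents, MIDI velocity); this representation and
# all numbers below are invented for the sketch, not the Ruben-OM data format.
chord = [(4800, 72), (5230, 38), (5875, 25), (6420, 55), (7110, 18)]

def interval_multiply(pitches, factor, reference=None):
    """Stretch or compress all intervals around a reference pitch (midicents)."""
    reference = min(pitches) if reference is None else reference
    return [reference + (p - reference) * factor for p in pitches]

def overtone_grid(fundamental_mc, top_mc=10800):
    """Midicent pitches of a harmonic series above a virtual fundamental."""
    series, n = [], 1
    while fundamental_mc + 1200.0 * math.log2(n) <= top_mc:
        series.append(fundamental_mc + 1200.0 * math.log2(n))
        n += 1
    return series

def snap_to_grid(pitches, grid):
    """Replace every pitch by the nearest pitch of the grid."""
    return [min(grid, key=lambda g: abs(g - p)) for p in pitches]

soft = [p for p, vel in chord if vel <= 40]             # keep the softer partials
stretched = interval_multiply(soft, 1.5)                # interval multiplication
retuned = snap_to_grid(stretched, overtone_grid(2400))  # harmonic series on C1 as pitch grid
subset = random.sample(retuned, k=max(1, round(0.3 * len(retuned))))  # random selection
```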

 

Synthesis of the versions

Settings were adjusted by trial and error to find more than 200 unique chords for the installation. The chords were exported as a Csound score for synthesis with samples. The samples were flutes (normal, breathy or fluttered sound) and strings (sul ponticello, sul tasto, and high harmonics). To preserve as much as possible of the original sound character, samples were automatically selected for a minimal transposition interval.
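
The principle of picking the sample that needs the least transposition could be sketched like this; the sample list, field names and pitches are hypothetical, not the actual sample database.

```python
def closest_sample(target_midi, samples):
    """Pick the sample whose recorded pitch is nearest to the target pitch,
    so that the transposition interval, and the timbral change it causes,
    stays as small as possible."""
    return min(samples, key=lambda s: abs(s["midi"] - target_midi))

# Hypothetical flute sample pool with recorded pitches as MIDI note numbers.
flutes = [{"file": "flute_breathy_60.wav", "midi": 60},
          {"file": "flute_breathy_67.wav", "midi": 67},
          {"file": "flute_breathy_74.wav", "midi": 74}]

best = closest_sample(64.5, flutes)                    # the sample recorded at MIDI 67
playback_ratio = 2 ** ((64.5 - best["midi"]) / 12.0)   # transposition as a playback-speed factor
```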

 

This is where the morphing happens: for each note, a string sample and a flute sample are transposed to the same pitch, and a curve controls the morphing trajectory between the two timbral sources, using morphing techniques available in Csound.[12]
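
The morphing itself is done by Csound opcodes (see footnote 12), but the control curve that moves a note from one timbre to the other can be sketched as a time-varying interpolation weight. The curve shape and duration below are assumptions for illustration only.

```python
def morph_curve(duration, n_points=100, shape=2.0):
    """(time, weight) pairs: the weight runs from 0 (pure string) to 1 (pure
    flute) along a curved trajectory rather than a straight crossfade."""
    return [(duration * i / (n_points - 1), (i / (n_points - 1)) ** shape)
            for i in range(n_points)]

# In the Csound score, such a curve could drive the interpolation
# amounts of an opcode like pvsmorph, frame by frame.
curve = morph_curve(duration=4.0)
```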

 

Spatialization

Each note goes through a detailed spatialization. The Csound opcode space[13] distributes the sound between four channels. Sounds at the simulated center of the room are strongly amplified. To create a realistic virtual orchestra with balanced chords, the notes are randomly distributed along the sides of the room, not too close to the virtual listener. The notes do not change positions, even though more surreal scenarios could have been attempted, such as musicians flying around the hall. Every note crossfades, in various shapes, between two different types of reverb, as if several spaces were coexisting.
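
As a rough illustration of the placement idea (this is not the space opcode itself, only a simplified model of the behaviour described above): the level falls off with the distance from the room centre, where the virtual listener sits, and the signal is panned between the four corner channels according to its direction.

```python
import math

CORNERS = [math.radians(a) for a in (45, 135, 225, 315)]  # assumed speaker directions

def four_channel_placement(x, y, min_dist=0.2):
    """Simplified sketch of 4-channel placement: overall level falls off with
    the distance from the room centre, so central positions come out loud,
    and the signal is panned towards the corners nearest to its direction."""
    d = max(min_dist, math.hypot(x, y))
    level = 1.0 / d                       # central positions are strongly amplified
    angle = math.atan2(y, x)
    gains = []
    for c in CORNERS:
        diff = abs(math.atan2(math.sin(angle - c), math.cos(angle - c)))
        gains.append(level * math.cos(diff) if diff < math.pi / 2 else 0.0)
    return gains

side_note = four_channel_placement(1.0, 0.25)   # along the right wall, away from the listener
```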

 

Reverbs were tested in advance[14] with sounds of strings, tam-tam and woodblock. The woodblock was especially revealing. I set the early reflections short and discreet enough to create continuous reverbs, not echoes that would disturb rhythmic patterns. It was possible to define the dimensions of virtual spaces, from a tiny bathroom, an elevator shaft and halls of different sizes to huge valleys with endless resonance. Hanging 500 meters above a valley, with no nearby walls, sounds just disappear into virtually endless resonances, while nearby walls leave impressions of confined spaces. Not just the sizes, but also the qualities and colours of the heard spaces can contribute to the compositions.
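
The connection between virtual room size and early reflections can be illustrated with a first-order image-source estimate; the room dimensions and positions below are invented for the example and have nothing to do with the actual reverb settings.

```python
import math

SPEED_OF_SOUND = 343.0  # metres per second

def first_reflection_delays(room, source, listener):
    """Delays (in ms) of the six first-order wall reflections relative to the
    direct sound in a rectangular room, using mirror-image sources."""
    direct = math.dist(source, listener)
    delays = []
    for axis, size in enumerate(room):
        for wall in (0.0, size):
            mirrored = list(source)
            mirrored[axis] = 2.0 * wall - source[axis]
            extra = math.dist(mirrored, listener) - direct
            delays.append(1000.0 * extra / SPEED_OF_SOUND)
    return sorted(delays)

# A small room keeps all first reflections within a few milliseconds, which
# fuses them into a continuous reverb rather than separate audible echoes.
bathroom = first_reflection_delays(room=(2.0, 2.5, 2.3),
                                   source=(0.5, 1.0, 1.2),
                                   listener=(1.5, 1.5, 1.2))
```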

 

"...the reverberation effect , while perfectly described by acoustics, can also be applied to the domains of social communication or to rituals or mythologies."[15]

 

I gave up on sampled impulse responses for this project; instead, the spatial richness of an orchestra surrounding an audience in variable spaces was simulated.

 

Sound morphing may be the most surreal aspect of this projection, but it can be perceived as not too different from crossfading within a conventional orchestration. I deliberately chose the morphing techniques that gave the least strange results.

 

When projected through the 16-channel installation, these 4-channel spaces move through what Trevor Wishart calls 'frame motion'[16], including rotation, contraction and expansion. The distribution into the larger setup was done with 'distance based amplitude panning' (DBAP), an algorithm developed by Trond Lossius and implemented in Ircam Spat.[17]
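
The core of the DBAP idea can be sketched as follows: every speaker receives the signal, with a gain that falls off with its distance to the virtual source, and the gains are normalised to constant total power, with no assumption about where the listener sits. The 16-speaker ring is a hypothetical layout, and this sketch is not the Ircam Spat implementation.

```python
import math

def dbap_gains(source, speakers, rolloff_db=6.0, blur=0.2):
    """Sketch of distance-based amplitude panning: gains fall off with the
    distance from the virtual source to each speaker and are normalised so
    the total power stays constant. No sweet spot is assumed."""
    a = rolloff_db / (20.0 * math.log10(2.0))   # exponent from the rolloff per doubling of distance
    dists = [math.sqrt((source[0] - x) ** 2 + (source[1] - y) ** 2 + blur ** 2)
             for x, y in speakers]
    raw = [d ** (-a / 2.0) for d in dists]
    norm = math.sqrt(sum(g * g for g in raw))
    return [g / norm for g in raw]

# Hypothetical 16-speaker ring, 5 m radius, around the installation space.
ring = [(5 * math.cos(2 * math.pi * i / 16), 5 * math.sin(2 * math.pi * i / 16))
        for i in range(16)]
gains = dbap_gains((1.0, -2.0), ring)
```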

 

Through experiments with sound diffusion over larger speaker setups, I found DBAP the most suitable for the idea of 'frame motion', as it neither relies on a centrally placed listener nor strongly amplifies a 'sweet spot' in the middle of the speaker setup. It gave the most neutral shifting of the 4-channel spaces. It is, however, not entirely realistic to move reverbs around.
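
Frame motion then amounts to transforming the four virtual positions of a quadraphonic frame as one rigid group before handing them to the panner. The rotation and scaling below are an illustration of the idea, not the actual Spat processing.

```python
import math

def move_frame(frame, angle=0.0, scale=1.0, offset=(0.0, 0.0)):
    """Rotate, contract or expand, and shift a whole 4-channel frame at once,
    keeping the internal layout of the four virtual channels intact."""
    moved = []
    for x, y in frame:
        rx = x * math.cos(angle) - y * math.sin(angle)
        ry = x * math.sin(angle) + y * math.cos(angle)
        moved.append((rx * scale + offset[0], ry * scale + offset[1]))
    return moved

# The 4-channel mix sits on a (hypothetical) 2 x 2 m square; each corner is
# then panned into the 16-speaker setup, e.g. with a DBAP gain function.
frame = [(-1.0, 1.0), (1.0, 1.0), (1.0, -1.0), (-1.0, -1.0)]
rotated_and_expanded = move_frame(frame, angle=math.radians(30), scale=1.5)
```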

Visualization from Ircam Spat. The movie demonstrates movement patterns.

 

Some examples

There is no published or edited score for the sound installation. Most of the situations exist within Open Music patches, so I will present a few notated excerpts with 1/8-tone approximation.
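
For readers unfamiliar with the convention: the 1/8-tone approximation amounts to rounding every pitch to the nearest 25 cents (in Open Music's midicent representation), as in this small sketch.

```python
def approx_eighth_tone(midicents):
    """Round a pitch in midicents to the nearest 1/8 tone (25 cents)."""
    return 25 * round(midicents / 25)

approx_eighth_tone(6437)   # -> 6425: E4 raised by one eighth tone
```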

 

Flute-morph-strings 14:

 

Flute-morph-strings 18:

 

Flute-morph-strings 70:

 

The timbral richness of the sampled sounds can make the chords sound larger than they seem in notation.

 

Envelopes of attack and decay sculpt these spectral orchestral masses. I have done this on paper in earlier pieces. This is a sketch for a detail of the piece Circles (2006), showing visual shapes with an approximation to rhythms.

 

Through work with Open Music, I implemented methods I had so far used on paper, and found new ones, benefiting from the rapid calculations that can be made from curve shapes. The Open Music envelope functions do the same thing as the paper sketch: envelopes for attack and decay are drawn.
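
The principle of reading a value for every note out of a drawn curve can be sketched as sampling a break-point function; the break points below are invented, and the real work is done by the Open Music envelope functions mentioned above.

```python
def sample_bpf(points, x):
    """Linearly interpolate a drawn break-point curve at position x."""
    (x0, y0) = points[0]
    for x1, y1 in points[1:]:
        if x <= x1:
            t = 0.0 if x1 == x0 else (x - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
        x0, y0 = x1, y1
    return points[-1][1]

# Invented attack curve: entry times (seconds) across the chord, lowest note first.
attack_curve = [(0.0, 0.0), (0.4, 0.5), (1.0, 3.0)]
notes = 12
entries = [sample_bpf(attack_curve, i / (notes - 1)) for i in range(notes)]
```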

 

The shapes are approximated to rhythms.
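
The approximation itself can be sketched as snapping the curve-derived entry times to the nearest subdivision of the beat; the sixteenth-note grid here is an assumed example.

```python
def quantize(times, beat=1.0, subdivisions=4):
    """Snap entry times (in beats) to the nearest grid point, e.g. sixteenth
    notes when subdivisions=4 within a quarter-note beat."""
    step = beat / subdivisions
    return [round(t / step) * step for t in times]

quantize([0.0, 0.11, 0.27, 0.52, 0.9])   # -> [0.0, 0.0, 0.25, 0.5, 1.0]
```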

 

The inspiration for these envelope shapes was the individual attack and decay lines of partials within a timbre.

 



[1] Ruben-OM; select-multiseq-range.

[2] Even though some of the analyzed sounds were more continuous scraping sounds.

[3] Ruben-OM; select-longest-multiseq.

[4] Ruben-OM; invert-spectre-multiseq.

[5] Ruben-OM; interval-multiply-multiseq.

[6] Ruben-OM; random-selection-multiseq.

[7] Ruben-OM; bandpass-multiseq.

[8] Ruben-OM; select-softest-multiseq.

[9] Ruben-OM; spectralize-random-chordnote.

[10] Ruben-OM; approx2mode-multiseq.

[11] Ruben-OM; multiseq-env-poly.

[12] The Csound opcodes pvcross, pvsmorph and pvsmix were chosen in this case.

[14] I used code collected by Josep Comajuncosas as a starting point for my own adjustments: http://www.csounds.com/jmc/

[15] [Multiple authors], 2005, Sonic Experience: A Guide to Everyday Sounds, p. 14.

[16] Wishart, 1996, On Sonic Art, p. 228.