Appendix: The Journal


This is the raw audiovisual journal I kept for this project. It contains video experiments, documents, presentations, and other material from the two master's years that resulted in this exposition. It does not contain everything; I also have sketches on paper, in sound files, and so on. It can be understood as an overall wireframe of the process that shaped this project. Below is where I took off (but not where I ended up) ↓

Master's project 2020/2022

Hi, my name is Julius and I compose electronic music. I am very much into the idea of space and spatialization as musical parameters. This artistic research project is me venturing deeper into these subjects; this is me searching for spatial identity.

Spatial identity

As described in the Oxford Dictionary of Human Geography: “Spatial identity refers to the identity or perceived image of a place, as opposed to the identities of individuals who live there. Each place has characteristics that make it unique and which help shape a sense of place and how it is viewed and perceived. This spatial identity changes over time, evolving with material changes to the environment, and discursive portrayals of the place in the media and through place marketing campaigns.” 1 

I use this term widely and loosely to help describe what I am searching for in this project. Replacing the word place with the word music in the description above sums up what I am aiming for quite accurately, I suppose. On one hand, I am looking to elaborate on the spatial characteristics within my compositions using spatialization and surround sound techniques. On the other, I am looking for apt ways to perform and release these ideas, and to find space for them to evolve.


Audiovisual journal

I have decided to keep a public audiovisual journal of this project as a sort of middle ground in all the spaciness. I picture it as an internal and external tool for conversation: a way to share my ideas before I fully know what they are, a way to experiment with what a release and a composition can actually be to me, and hopefully a way to share knowledge.


I have a background in video production (filming, editing, making visuals for clubs, etc.). Apart from a few music videos and short experiments, my own music and visuals have never really met. Listening to and learning from teachers and fellow students with different backgrounds this first semester has opened me up to using the experience I have rather than keeping parts of it partially hidden (seems kind of obvious, but for some reason it has not been). I think visually about music, but I am just not sure how exactly, and I hope this journal can help me find out. I hope the visual content can help emphasize the spatial identity of my music, reinforce ideas, and help create forward movement in my experiments.


The plan

From January 2021 I will start to share my experiments writing new pieces, exploring techniques, and learning new tools on a regular basis (every other week). The audiovisual journal will be shared on YouTube alongside information about how and why each entry was made. By the end of the project, my goal is to have found a spatial identity and to have presented it both in a live performance setting and as some kind of release.


Questions guiding my work (as of Dec 16, 2020):

  • How can I utilize spatialization to elaborate the spatial characteristics in my pieces?
  • In what ways can my compositional process be reinforced by the making of visual content?
  • What does sharing works and thoughts in progress bring to my practice?
  • What is a piece of music, a performance and a release to me? How can they be realized without compromising the spatial identity?

I really like Laurie Spiegel and her album The Expanding Universe, especially the track Patchwork. To me it has a clear spatial identity. The hard-panned, rhythmic yet melodic patterns sort of tell the story twice, in a beautiful way. The technical limitations of that time help colorize the piece in spatial regards but do not impede the music; rather the opposite. I will use techniques inspired by the Bell Labs synthesizer she used on that album as a takeoff for my experiments.

On the audiovisual side I feel inspired by NONOTAK and their way of making large-scale performances out of tightly synced, simple graphical elements. I also enjoy the dark monochromatic textures they craft. The setting of both acts is something close to what I intend to do further down the line in my project.

Tools (that I use and/or want to learn more about)
Modular synths - Sequencers - Ableton - Touchdesigner - Ambisonics/Other surround sound techniques 

First semester presentation (video)

How to space my place and where to place my space?

Rings (A/V Journal #1)
A stereo idea using two instances of the same patch in Ableton, one panned left, the other right with a dotted-eighth delay. I was trying to obscure the melodic development by having the delayed channel evolve first. I sort of like the idea of a background leader. I triggered the arpeggio clips using Push and recorded a take; the MIDI was sent to TouchDesigner, triggering the rings in the video. The bass and drum layers happened as a test while I listened to/watched the recording of the first layer.
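As a side note, the dotted-eighth delay time used on the right channel follows directly from the tempo. A small sketch (my own calculation, not part of the Ableton patch):

```python
def dotted_eighth_ms(bpm: float) -> float:
    """Delay time in milliseconds for a dotted eighth note:
    a quarter note lasts 60000 / bpm ms, an eighth is half of
    that, and the dot adds half again (x 1.5)."""
    return 60000.0 / bpm * 0.5 * 1.5

# e.g. at 120 BPM the dotted-eighth delay is 375 ms
```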

Patch notes:

1. Arp - Ableton Sampler, a sample of me saying "me", FM-modulated with a square wave (left)

2. Arp - The same + Echo device, 100% wet, dotted 1/8 (right)

3. Bass - Moog Sub Phatty. Two square waves an octave apart, synced LFO to filter cutoff.

4. Drums - Moog Mother-32. Two layers of a pinged highpass filter on full resonance going into a Taakab VCA that is clipping. I'm tweaking the envelope release time by hand to


MIDI from the first two tracks in Ableton is sent to TouchDesigner using TDAbleton. The rings' size responds to the velocity, and their Y placement to the note. A feedback loop adds scaling movement. I followed a great tutorial by Bileam Tschepe for the basic setup:

Phases (A/V Journal #2)
Another take on the dotted-eighth panning offset, but I started with the video this time, as a test to make some choices before turning to audio. The tempo and a basis for the rhythm were decided in the making of the visuals.

Patch notes:

Two instances of Wavetable in Ableton panned left and right, a simple four-bar loop offset a dotted eighth to the right. Using the Max for Live device Les Concepts in Olafur mode to add sequenced notes based on the chords (generative lite). Mapped a MIDI encoder to control filter cutoff and amp envelope decay and just had some fun tweaking. I used the Echo device as a ping-pong delay and reverb.


Webcam into TouchDesigner, a feedback network based on Blur and Edge TOPs, retriggering the feedback at the start of every bar using a Beat CHOP. I am trying to learn the basics of TD and how I can use the software. Recorded two takes that I placed side by side (with an offset) in Final Cut.


Turing Machine Jam (A/V Journal #3)

This is my first test using the Music Thing Modular Turing Machine MkII with expanders and a quantizer to create pitched sequences. I'm slowly opening up to randomness and noise entering my compositional process; this is a very controlled/locked/safe takeoff.

Patch notes:

- A locked Turing Machine sequence, quantized by a 2HP Tune, multiplied three ways by a 4ms Buff Mult. Three oscillators controlled by the sequence (Mother-32 + 2x Doepfer).

- Controlling the bias parameter on the quantizer (shifting the starting note in the scale) with the Volts expander by hand.

- Opening the VCA controlling the third oscillator with the 1+2 output of the Pulses expander and an envelope from Maths.

- Using Clouds in looping delay mode
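For readers unfamiliar with the module, the Turing Machine is essentially a looping shift register whose recirculating bit can flip by chance. A rough Python sketch of that idea (my own simplification, not the module's actual circuit or firmware):

```python
import random

class TuringMachine:
    """Simplified model of a Turing Machine style sequencer:
    a 16-bit shift register; the bit cycling back in may flip
    with probability `flip_prob`. At probability 0 the loop is
    locked and repeats exactly, as in this patch."""

    def __init__(self, length=16, seed=0):
        self.rng = random.Random(seed)
        self.bits = [self.rng.randint(0, 1) for _ in range(length)]

    def step(self, flip_prob=0.0):
        bit = self.bits.pop(0)
        if self.rng.random() < flip_prob:
            bit ^= 1  # chance of flipping as the bit re-enters
        self.bits.append(bit)
        # read the first 8 bits as a value, like the module's DAC -> CV
        return sum(b << i for i, b in enumerate(self.bits[:8]))

tm = TuringMachine(seed=1)
locked = [tm.step(flip_prob=0.0) for _ in range(32)]
# with the loop locked, the 16-step pattern repeats exactly
```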

Keep scrolling to reach my summary of this semester

Noise (A/V Journal #4)

My very first test using photogrammetry to generate visual content. I feel I need to learn more about how to control 3D objects in Touchdesigner before this can become fruitful.

Patch notes:
A beat from an Elektron Analog Four going through Mutable Instruments Clouds in resonator mode. The Analog Four is also sending the pitch information via CV.

I did a bad 3D scan of myself using the app Capture for iOS and imported it into TouchDesigner. Played around with an LFO to control camera movement and added a feedback loop. TouchDesigner receives MIDI from the Analog Four; the same CC knob that controls the reverb send adds brightness to the feedback loop. The video is a screen recording.

Ambiguous Ambient (A/V Journal #5)

I'm continuing to explore the Turing Machine, trying to find different ways of interacting with the generated content. This is a test letting the machine decide what note should be played on top of a note I've chosen. Sort of: 50% me letting go. I tried a similar approach on the visual side, finding a soothing movement and letting it flow freely. I feel as if the perspective shifted a bit with this one; the listening and viewing are a bit more the centerpiece of the experience, in a good way.

Patch notes:
- The low notes are a predetermined sequence on my Mother-32; the harmony notes are generated randomly with a Music Thing Modular Turing Machine quantized to a scale by a 2HP Tune. Mutable Instruments Clouds adds delay and reverb.
- This pattern is recorded twice, panned left and right.
- A third iteration of the same semi-random sequence is added towards the end, one octave lower.
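The quantizing step can be thought of as snapping a 1 V/oct control voltage to the nearest note of a scale. A hedged approximation of what a module like the 2HP Tune does (not its actual code):

```python
def quantize(volts, scale=(0, 2, 4, 5, 7, 9, 11)):
    """Snap a 1 V/oct control voltage to the nearest note in
    `scale` (semitone offsets within an octave; default major)."""
    semis = volts * 12.0
    octave = int(semis // 12)
    # candidates in neighbouring octaves so wrap-around works
    candidates = [s + 12 * o for o in (octave - 1, octave, octave + 1)
                  for s in scale]
    return min(candidates, key=lambda c: abs(c - semis)) / 12.0
```

Shifting every entry in `scale` by a constant is roughly what the bias control does when I transpose by hand.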


Two free-moving noise TOPs interacting with each other in TouchDesigner, using the expression absTime.seconds*0.2 to add motion to the noise. Noise and color correction done after recording in Final Cut Pro X.

Ambiguous Ambient II (A/V Journal #6)

The visual movement was a bit too fast in the last one, so I made a new patch on my Eurorack system and made the visuals a bit audio-reactive; this needs some fine-tuning. More and more I'm trying to see my compositions as rooms or small houses, where I set up the framework and maybe the walls, and then let the machines and the process itself arrange the furniture and so on. Basically I'm having a go at making generative music on my modular synth and seeing what I can learn and make out of it.

Patch notes:

- A continuously random sequence from Turing Machine with a limited range being quantized by 2HP Tune. Two voices an octave apart, the lower voice is triggered less frequently using a gate output from the Pulses expander and an envelope from Maths going into a VCA.

- Delay and distortion from a SOMA Lyra8-FX module.

- Two takes recorded separately, panned left and right in Ableton.

- Some shimmer a fifth above and reverb added using Output Portal.


- Two moving noise TOPs interacting with each other in TouchDesigner. Using the RMS levels from the left and right channels to add motion to the noise this time; it's not perfectly in sync, but the pace is more aligned with the movement in the music, I think.

- Film grain and color correction done after recording in Final Cut Pro X.
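The RMS measurement driving the motion can be sketched as a generic per-block level, assuming audio samples in the -1..1 range (TouchDesigner's Analyze CHOP reports something comparable):

```python
import math

def rms(block):
    """Root-mean-square level of one block of audio samples."""
    return math.sqrt(sum(s * s for s in block) / len(block))

# left/right block levels could then offset the noise transform, e.g.
# noise_x = base_x + rms(left_block) * depth
```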

Zoom (A/V Journal #7)

Today I did an improvised live set for my class over Zoom; this is the "soundcheck" I did just before. I'm testing out some new gear and experimenting with ways of controlling my Eurorack modular synth. Here I'm using the app Polyphase on my iPad to generate pitched and rhythmic material. I like the way I can manually draw and manipulate the random patterns, and the probabilistic aspects of the app, but I'm not best friends with the touchscreen workflow, yet. I have to rehearse this setup way more. I'm challenging my subtractive ways of using filters to control timbre; instead I used an additive approach, tweaking the wavefolding and harmonic structure (from even to odd) by hand.

Patch notes:

- Two channels of stepped and quantized random patterns from Polyphase on iPad being converted to CV using Shuttle Control.

- Two sine wave oscillators from the Furthrrr Generator modulating each other (ring mod, FM, and some wavefolding).

- One oscillator is also being hard-synced to the other. Since they don't share the same sequence, the effect comes and goes in a nice way (new technique learned!).

- Envelopes from Maths.

- Distorted, pitch-modulating delay and distortion from the Lyra8-FX.

- Long decaying delay and reverb from Clouds

- Some noise bursts from the Mother-32.


- The visuals are low-key/slow-moving, meant to be shown in large format in the future.

- Two noise TOPs in TouchDesigner interacting with each other. The audio is making the movement happen and subtly controlling the brightness.

- I’m interested in making the visuals and the music breathe the same air, be connected but not over-synced. I want to create space (mellanrum, Swedish for the space in between) for emotions etc.

My experimental phases (RGB)

Questions and aspects regarding the construction of an interactive generative system. Overlapping phases - a chaotic process. Keywords:

R.   Audio

  • How to generate sound, what tools and techniques?
  • Control. Over what aspects, when/how?
  • Timbre > Notes
  • Minimal, Drone or Beat?
  • Patterns. A pulse-based ambiance
  • Digital and analog components in conversation
  • Software/Hardware
  • Randomness as a foundation
  • Making a performable patch = a composition

What goes into space, how


G.   Video

  • Why visual content?
  • Ebb/flow with the music, share the same breath
  • How should they influence each other?
  • Audio, MIDI, OSC or other messages?
  • Abstract vs. concrete
  • Noise/randomness as the main building block
  • 2D or 3D?
  • Textures/patterns
  • Add film/photo?
  • Driven by the same ideas under shared control
  • The pixel sequencer

Turing Machine Jam II (A/V Journal #8)

A 16-minute practice run on my Eurorack setup, no external processing, simply using Ableton to record the stereo output. This is me slowly moving towards being comfortable using this instrument to improvise in a live setting (in surround sound). To me, the fun starts somewhere after 05:00. I'm trying to see what I can get out of two voices and effects. I feel like I need to learn how to steer this setup in more depth before I add more voices and elements. But I'd like to add a sequential switch to the setup, just to be able to move between divisions (without pulling out cables) and get more interplay between the different sequences. My recent decision to use FM synthesis as the foundation in my patches has led me to write/play in higher registers than I'm used to. I'm not sure why; maybe it's the ease of dialing in FM and the precise harmonic control my dual oscillator provides that gives me a new perspective. I'm liking this development; it's keeping me motivated to go on with these experiments.

Patch notes:

- A clock, and a random sequence from Shuttle Control modulating the amount of FM on the oscillator.

- Highest voice: Furthrrr Generator sequenced by a Music Thing Modular Turing Machine going through a 2HP Tune quantizer. The Volts expander (a 5-bit sequence from the Turing Machine) is what I'm using to transpose steps in the sequence by hand. Envelope from Maths.

- Low voice: a filtered square wave on the Moog Mother-32, driven by the same sequence and a simple tonic/fifth pattern on the internal sequencer. The internal envelope is also used to control the fall time of the envelope on the high voice.

- A SOMA Lyra8-FX is used for a bit of distortion and a hard-panned delay.

- Clouds is adding a small delay in another division, and reverb.

B.   Spatio

  • What rooms? (digital/physical)
  • Should only I interact with the system? Audience?
  • Quad or more?
  • Crafting a void (mellanrum)
  • A recording = a film?
  • To seek empowerment in the ephemeral (Katt Hernandez)
  • To harness chaotic procedures
  • +++


Exploring spaces with the system


  • These keywords (noise/randomness, patterns, control) can mean a lot of different things depending on context. What do they mean to you in this moment?
  • Aspects you think could be fruitful for my last phase.
  • Readings/other tips about noise/randomness/chaos.

Displaced lines (A/V Journal #9)

Expanding my visual palette, trying out a more glitchy look than what I usually gravitate towards.

Patch notes:
- Turing Machine sequencing the Furthrrr Generator and sending a pulse to a Maths envelope controlling the VCA.
- SOMA Lyra8-FX for delay, distortion, and hard panning of the delayed signal.
- Reverb from Clouds running the Parasite firmware.

- The audio spectrum is visualized as a grid of rectangles using instancing in TouchDesigner. The left and right columns correspond to the stereo image of the audio.
- The image is displaced with a low-resolution noise TOP driven by the audio spectrum.
- A simple feedback network with a Difference composite adds the image over itself.

VCV Fragments (A/V Journal #10)

I held a workshop around my Eurorack system recently and made a VCV Rack version for my classmates to interact with. Here's a link if you'd like to give it a go:

The link contains a guide on what you need to install and a vcv project file. As a start you will need a copy of the latest version of VCV Rack (Free download from

This is the end of the second semester

Click the image below to read my summary and find out how I will move forward etc.


Presentation September 7 2021

Hi everyone! In my project I am piecing together a generative audiovisual system. It is based on a modular synth setup in combination with the software TouchDesigner. The backbone of the system is clocked and controlled noise/randomness (both for visuals and audio). I use it to generate patterns that can be shaped, looped, and altered in different ways, or be left to flow as they arise.


I am exploring ways to make the system interactive, so let us talk interactivity, visuals in venues, placement, projections, and so on. How would you like to experience something along the lines of the clip? How do you invite interaction without anything feeling forced?

Keyword: Temporal

New audiovisual inspiration:

Ko Hui

T. Gowdy


Vincent Houze

Squares (A/V Journal #11)

First step towards interactive visuals in my generative system. In this experiment the audio was recorded prior to the visuals. Interconnected examples are coming.

Visuals, a brief explanation: simple motion detection using a webcam as input and a cached, delayed signal blended together using Difference as the blend mode. The signal is patched through a low-resolution noise TOP, making the first squares. The audio makes the noise move. The squared noise is displaced by the webcam. A copy of the square noise TOP in monochrome makes the bigger layer of squares by displacing the first two signals in time using a Texture 3D and a Time Machine TOP.
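The motion-detection step can be approximated as a thresholded difference between the current frame and a delayed copy. A plain-Python stand-in for the cache/Difference TOP combination (greyscale frames as nested lists, values 0..1):

```python
def motion_mask(prev, curr, threshold=0.1):
    """Per-pixel absolute difference between two greyscale frames,
    thresholded to a binary motion mask: 1.0 where a pixel changed
    more than `threshold` between the delayed and current frame."""
    return [[1.0 if abs(c - p) > threshold else 0.0
             for p, c in zip(prow, crow)]
            for prow, crow in zip(prev, curr)]
```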

Patch notes (as much as I remember):
- Two square wave oscillators being fed different sequences. One is also transposing the other.
- The deep bass is one of the oscillators going through a clock divider, switching between the different output divisions using an Intellijel Mutamix acting as a switch/sequencer.
- Mid-side processing, EQ, and compression in Ableton.

Translating motion to control signals: embrace a flow


Eno oscilloscope - 100% generative

My ambition going forward:
Concerts turned installations turned concerts...

A temporal conversation, some building blocks:

The music makes the visuals move. Motion in the visuals opens up visual and auditive trajectories, branches out, lives on.

A bunch of words:
Real-time composition/improvisation, temporal displacement, delay, granular processing. Cache.
Frame rates, fragments, recent past, less recent past.

Time Machine TOP, noise, chance, logic, comparators, rhythm. Optical flow.


An exercise in agency

"Despite their obvious differences, what connects and equals humans and objects is the simple fact that they both influence others; they have agency. Agency is the capacity of an actant to change the environment. Through agency, humans and objects are able to interact." - Latour


The exercise

With limited prior knowledge, improvise with two instances of the Max for Live device Feedback Network. Play the feedback patch as an instrument inside a network of actants. The idea was to create a head-on, literal model of a musical improvisation seen as an actor-network.

The device: "Feedback Network consists of five Feedback Units, each of which has an independent bandpass filter and delay line. The output from each Feedback Unit can be routed to any of the other five, at an independent volume." - Cycling74
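The routing described there amounts to a small gain matrix. A minimal sketch of one mixing step (my own abstraction, not Cycling74's code; the bandpass filters and delay lines are omitted):

```python
def route(outputs, matrix):
    """One mixing step of a feedback network:
    matrix[i][j] is the gain from unit j's output
    into unit i's input."""
    n = len(outputs)
    return [sum(matrix[i][j] * outputs[j] for j in range(n))
            for i in range(n)]

# two units cross-feeding each other at unity gain swap their signals:
# route([1.0, 0.0], [[0.0, 1.0], [1.0, 0.0]]) gives [0.0, 1.0]
```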

Actants in the network

H1 - Human Julius
H2 - Human Joakim
C1 - MIDI controller 1 (modwheel, pitchbend, and slider on an M-Audio keyboard)
C2 - MIDI controller 2 (three encoders on an Ableton Push 1)

F1 - Feedback network 1

F2 - Feedback network 2

EF1 - Envelope follower 1 + amp
EF2 - Envelope follower 2 + amp

HP1 - Headphones 1

HP2 - Headphones 2

"Anything that does modify a state of affairs by making a difference is an actant... each actant is at the same time also a network." - Latour

Feedback network - The human designer of the device, the digital nature, the auto-randomization, the interface etc.

Human Julius (?)

Submilieus adapted to improvisation


The external milieu
Comprises the musicians' surroundings, the venue or studio, the acoustics of the room, the equipment, etc. Furthermore, it includes their socio-cultural and concrete musical backgrounds.

The interior milieu
Refers to what actually characterizes these musicians: musical experiences, playing techniques, personal idioms, and, more generally, their ways of being.

The intermediate milieu
Regulates the transformations and exchanges between the inside and the outside world. Through new information, aural and visual stimulation, and the sensations these induce, musicians relate, to a greater or lesser degree, to their external environment.

The annexed milieu
A segment of the external milieu with which the interior one establishes connections and energy exchange. Much of the operating power in improvisations is based on a more or less direct connection with external milieus, on an energy exchange between musicians, but also between musician and instrument, musician and acoustics, musician and cultural background...

- (Cobussen via Deleuze and Guattari in Costa)

The Field of Musical Improvisation

Marcel Cobussen

Field, as in: 
A field of possibilities/a field of potential, “A complete dynamism of structure”.


"The FMI is a field of relations, a field of actions, actions of agents involved in the production of musical improvisations. The FMI forms and is formed by an uncertain interplay that concerns these relationships. These complex relations and interactions do not constitute a simple, stable system; it is all about dynamics." - Cobussen


"The dynamics of systems and networks operate at levels 'above' or 'below' that of the human subject. When networks structure and organize the world, humans become nodes, subject positions for a host of active structural network formations that act and interact outside or beyond their direct control." - Cobussen

Audio (may be a bit harsh)

1. Feedback improvisation

2. Feedback improvisation without auto-randomization

The assemblage

The in-between

Perpetual motion -- Feedback -- The complexity lies in the organization


"A healthy brain is not static... As every complex system thus constantly creates new opportunities, it can never be in equilibrium; it is always unfolding, always in transition, always dynamic." - Waldrop


"A system can exhibit complex, lifelike behavior only if it has the right balance of stability and fluidity." - Waldrop

"Complexity theories look at interacting elements and ask how they form patterns and how these patterns unfold, instead of being interested in classifying and dissecting individual elements." - Deleuze and Guattari 

Joakim's thoughts

- Trying to find stability

- "Ah, now I get it; now I don't" - Frustrating but exciting

- Regulate, not only white noise

- The exploration is addictive

- Not more control but more insights