Keywords - note 1

resonance - room resonance

long term evolution / where are cues coming from?

no fixed form / dependent on outer events 

modular system / segmentations + bridges

exchanging values between modules (and with others, but not thinking about it for now) / organic, related to the modular system of the segmentations

spatial - connection between internal/external sound space - shared values/ shared points


[29/01] Brainstorming - note 2:

Two (or more) different paths -> creating different zones (8 ch, probably 2 (or more) zones crossfading).

 

1. Resonance 

2. Noise

 

+events -> triggers

 

+between two paths

-shared values / shared points 

-common modules / different modules

==>common modules: expanding (normalizing)

 

+still flexible tempo and timing, depending probably on the external sound events

 

+Where is the resonance of the room going? Where is the noise source going?

 

+Time differences -> from bottom to top of the stairs, so that there is a sense of time flow (rough sketch below). 
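
A minimal sketch of those bottom-to-top time offsets, assuming the eight channels run up the staircase in order starting at out 0; the test source and the delay step are placeholders, not the project code.

(
SynthDef(\stairFlow, { |out = 0, step = 0.15|
    var src, delayed;
    // sparse pink-noise bursts as a stand-in test source
    src = PinkNoise.ar(0.05) * Decay2.ar(Dust.ar(0.5), 0.01, 0.3);
    // each channel further up the stairs receives the sound a bit later
    delayed = 8.collect { |i| DelayN.ar(src, 2, i * step) };
    Out.ar(out, delayed);
}).add;
)
x = Synth(\stairFlow);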

 

{author: JYK, date: 200129, function: brainstorming, keywords: [resonance, noise, event, sound, room, staircase, flow]}

JYK: notes 2

[30/01]

From the counting idea - I'd like to include a specific rhythm pattern that could be drawn from the external sound events made by the audience. (yeah, how?)

Depending on the type of sound, a different rhythm can be introduced. Rhythm here isn't necessarily associated with 'beat', but somehow it can be reminiscent of 'stepping on the stairs', and at the same time it gives a certain time domain that varies the entire system's duration.

How can I do that? Think harder.


-Brainstorming for this

+make a module (synth) that can generate a pattern whose tempo varies with certain characteristics of the incoming sound, especially of the different external sound events (rough sketch after this list).

+apply that to a different sound module. Which module? Let's find out.
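
A minimal sketch of what such a pattern module could look like, assuming amplitude as the incoming-sound characteristic that drives the tempo; the synth name, input channel, mapping ranges and pitch sequence are placeholders, not the project code.

(
SynthDef(\stepPattern, { |out = 0, inCh = 0|
    var input, amp, rate, trig, freq, sig;
    input = SoundIn.ar(inCh);                        // external sound from the room
    amp   = Amplitude.kr(input, 0.05, 1.0);          // slow envelope follower
    rate  = amp.linlin(0.0, 0.2, 1, 8).clip(1, 8);   // louder room -> faster steps
    trig  = Impulse.kr(rate);
    freq  = Demand.kr(trig, 0, Dseq([220, 277, 330, 415], inf));
    sig   = SinOsc.ar(freq) * Decay2.kr(trig, 0.005, 0.2);
    Out.ar(out, (sig ! 2) * 0.2);
}).add;
)
x = Synth(\stepPattern);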


{author: JYK, date: 200130, function: brainstorming, keywords: [counting, rhythm, steps, footsteps, staircase, pattern, tempo, module]}

{hhr, 200211}


I like this, and it seems similar to what Daniele was doing with resonances (though I haven't looked at the code). In my case, there will probably be no closed feedback loops, as offline processing is always involved, so I will have to use other ways to extract resonances and work with them.


{author: HHR, date: 200211, function: comment, keywords: [resonance, feedback]}

[30/01]

Resonance - harmonizing (in a way) (code)

-Once the resonance frequency is heard, it becomes a chord, which can vary due to the sound being expanded by unstable bins that constantly distract it (or sometimes not so constantly), and due to the external sound events.

I can't test this with the recording as there is no feedback, but I could do it in my studio; it might be quite different in the actual location.
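
A rough sketch of how the resonance-to-chord idea could be set up, assuming a single measured room resonance as the base frequency; the chord ratios and the slow detune drift merely stand in for the unstable bins, and this is not the actual project code.

(
SynthDef(\resChord, { |out = 0, inCh = 0, baseFreq = 173|
    var input, ratios, drift, freqs, sig;
    input  = SoundIn.ar(inCh);                     // external sound events excite the filters
    ratios = [1, 5/4, 3/2, 15/8];                  // chord built on the room resonance
    drift  = ratios.collect { LFNoise2.kr(0.1).range(0.99, 1.01) };  // slow detuning fluctuation
    freqs  = baseFreq * ratios * drift;
    sig    = Ringz.ar(input, freqs, 4).sum * 0.05;
    Out.ar(out, sig ! 2);
}).add;
)
x = Synth(\resChord);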

 

{author: JYK, date: 200130, kind: caption, keywords: [resonance, harmonizer, chord, sound, prototype, feedback]}

 

[31/01]

Just did some basic tests on sending OSC to nodes, as I still couldn't find a meaningful process. I'll have to narrow things down from all that was said. Maybe I need some intuitive pictures of the sound. Maybe much simpler. But the resonance is still the basic source material. I don't want it to be too drony. Where should this go?

Keep thinking about rhythm.

 

{author: JYK, date: 200130, function: brainstorming, keywords: [resonance, rhythm]}

{poz, 200211}


sounds cool! I especially like the richness of the small fluctuations between chords; at times it sounds like microtonal detuning.. btw you probably know already, but you can do feedback through the impulse responses from the staircase if you close the loop by connecting the output of jconvolver to the input in SC. This way you create a 'virtual microphone', and you can get a sense of how your feedback will be affected by the actual resonances in the space.


{author: POZ, date: 200211, function: comment, keywords: [resonance, feedback]}
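
A minimal sketch of the loop poz describes, assuming jconvolver runs as a separate JACK client loaded with the staircase impulse responses and its ports are patched to and from scsynth in a JACK patchbay; the synth name, channel indices and gains are placeholders, not the project code.

(
SynthDef(\virtualMicLoop, { |fb = 0.3|
    var fromRoom, sig;
    // hardware channel 2 is a placeholder: patch it to/from jconvolver in JACK
    fromRoom = SoundIn.ar(2);                    // return: jconvolver output = simulated room response
    sig = (fromRoom * fb) + (Dust.ar(1) * 0.05); // feedback plus sparse excitation
    Out.ar(2, Limiter.ar(sig, 0.9));             // send: routed on to jconvolver's input
}).add;
)
x = Synth(\virtualMicLoop);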

{hhr, 200211}


I also don't want to forget to think about rhythm, even if to me it's still unclear how to "respond" to it; I can see, for example, the event detection in Daniele's experiments as useful for an analysis stage.


{author: HHR, date: 200211, function: comment, keywords: [rhythm, rms]}

 


[jyk 2002121]

I didn't know about jconvolver. Just tried it out and quite nice! Thanks for the tip!!

[03/02]

I've tried to change the flow of the signals (mainly re-organizing the synths, to see whether it is possible in some cases), especially aiming to create some percussive effects that could vary the resonance frequency and eventually become a source for further development. The external sounds, especially the ones like a door slam, will be the best trigger and input for this, but I am still not fully happy with the result and am still working on it. (Code)
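
A rough sketch of that percussive-trigger idea, using onset detection as a stand-in for catching door slams and similar events; the synth name, threshold and frequency range are guesses rather than the project code.

(
SynthDef(\slamPerc, { |out = 0, inCh = 0, thresh = 0.4|
    var input, chain, onset, freq, exc, sig;
    input = SoundIn.ar(inCh);
    chain = FFT(LocalBuf(512), input);
    onset = Onsets.kr(chain, thresh);                        // detect door slams etc.
    freq  = Demand.kr(onset, 0, Dwhite(90, 400)).lag(0.01);  // new resonance per event
    exc   = WhiteNoise.ar * Decay2.kr(onset, 0.001, 0.05);   // short percussive excitation
    sig   = Ringz.ar(exc, freq, 2.5) * 0.1;
    Out.ar(out, sig ! 2);
}).add;
)
x = Synth(\slamPerc);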

 

{author: JYK, date: 200130, kind: caption, keywords: [flow, resonance, trigger, experiment, prototype]}

 [09/03] Test rendering excerpt

---
meta: true
author: JYK
artwork: ThroughSegments
project: AlgorithmicSegments

---