Computer-assisted composition has been part of my compositional practice since 2000, ranging from simple algorithms suggesting pitch patterns to fully developed systems for audio processing. It makes it possible to transform materials in ways that would be nearly impossible on paper, and to use curve shapes to control the development of multiple parameters over time.

I have transcribed visual shapes and sound sculptures by hand. 1 My programming focused on implementing and expanding compositional techniques I was already using.

Through sonification, any form of data can be remapped as musical information. My first experience in this area was assisting Tulle Ruth in her project "Speaking mountains". Map data was used to shape dynamics, pitch and transitions between vowels in speech synthesis. It was never purely data and structure speaking; there were always artistic decisions about how the information was remapped. The seven mountain sounds are musical interpretations coming from her description of the personality of each of these seven mountains.
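As a loose illustration of the principle (my own Python sketch with invented numbers, not the actual mapping used in "Speaking mountains"): data values such as elevations can be linearly rescaled into pitch and dynamics ranges.

```python
# Minimal sonification sketch (hypothetical data and ranges, for illustration only).

def rescale(value, in_min, in_max, out_min, out_max):
    """Linearly remap value from [in_min, in_max] to [out_min, out_max]."""
    return out_min + (value - in_min) * (out_max - out_min) / (in_max - in_min)

elevations = [120, 340, 560, 410, 890, 730, 300]  # invented map data (metres)
lo, hi = min(elevations), max(elevations)

# Remap the same data series onto two musical parameters.
pitches = [round(rescale(e, lo, hi, 48, 84)) for e in elevations]    # MIDI notes
dynamics = [round(rescale(e, lo, hi, 30, 110)) for e in elevations]  # MIDI velocities
```

The artistic decisions live in the choices this sketch makes explicit: which parameter the data drives, and what output range it is squeezed into.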

Analyzed recorded sounds and their spectral information have been used in instrumental composition since the early spectral works of Gérard Grisey and Tristan Murail. Information can be musical or non-musical, acoustic or non-acoustic. Anything can be reused as musical material, passing through the same set of treatments.
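A toy version of the spectral idea, assuming a synthetic two-partial signal rather than a real recording: analyze the spectrum, pick the strongest partials, and convert their frequencies to pitches.

```python
# Sketch of spectral analysis feeding pitch material (synthetic signal, stdlib only).
import cmath
import math

def dft_magnitudes(signal):
    """Magnitude spectrum of the first half of a plain DFT (O(n^2), fine for a demo)."""
    n = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) for k in range(n // 2)]

def hz_to_midi(freq):
    """Convert frequency in Hz to MIDI note number (A4 = 440 Hz = 69)."""
    return 69 + 12 * math.log2(freq / 440)

sr = 1000  # sample rate; one second of signal, so each bin is 1 Hz wide
sig = [math.sin(2 * math.pi * 110 * t / sr) + 0.5 * math.sin(2 * math.pi * 220 * t / sr)
       for t in range(sr)]

mags = dft_magnitudes(sig)
peaks = sorted(range(1, len(mags)), key=lambda k: mags[k], reverse=True)[:2]
freqs = sorted(k * sr / len(sig) for k in peaks)      # bin index -> Hz
midi = [round(hz_to_midi(f)) for f in freqs]          # -> [45, 57]: A2 and A3
```

A real spectral workflow would of course use an FFT over windowed audio, but the chain (analysis, peak selection, pitch conversion) is the same.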

Musical sketches can suggest a diverse flow of information from multiple sources. The next stage was developing tools for handling this information flow. I continued until I had enough tools to sculpt materials in time (re-ordering, stretching, compressing) and in pitch (stretching, compressing, filtering). I would spend time as a computer programmer, then return to try the tools out on musical materials, critically evaluating and selecting results.
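The kinds of treatments listed above can be sketched as follows. This is my own Python illustration, not code from the library discussed below; notes are represented as (onset, duration, pitch) tuples.

```python
# Hedged sketch of sculpting material in time and pitch (illustrative only).

def time_stretch(notes, factor):
    """Stretch (factor > 1) or compress (factor < 1) onsets and durations."""
    return [(onset * factor, dur * factor, pitch) for onset, dur, pitch in notes]

def pitch_scale(notes, factor, center=60):
    """Expand or compress intervals around a center pitch (MIDI middle C here)."""
    return [(onset, dur, center + (pitch - center) * factor)
            for onset, dur, pitch in notes]

def pitch_filter(notes, low, high):
    """Keep only notes whose pitch lies inside [low, high]."""
    return [n for n in notes if low <= n[2] <= high]

melody = [(0, 1, 60), (1, 1, 64), (2, 1, 67)]
slower = time_stretch(melody, 2)    # -> [(0, 2, 60), (2, 2, 64), (4, 2, 67)]
narrow = pitch_scale(melody, 0.5)   # -> [(0, 1, 60.0), (1, 1, 62.0), (2, 1, 63.5)]
```

Note that pitch compression produces fractional MIDI values such as 63.5, i.e. quarter tones, which is one reason microtonal playback becomes necessary.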

I had already developed a library for Ircam's Open Music, made available as open-source software. Its compositional techniques were demonstrated in my research project "Between instrument and everyday sound", and I will not go into full detail about all of those again.

By 2015, it was clear that graphical programming 2 was not an ideal solution, as the functions would often not work with the next version of Open Music. Following recommendations from Ircam, I started learning Lisp programming during the first half-year of the "Wheels within wheels" project. All functions from the old patch library were ported to Lisp, given new names in the process to avoid naming conflicts. For example, 'time-scaler' is now called 'r-time-scaler'.

Ircam's Open Music can be downloaded here:
http://repmus.ircam.fr/openmusic/home

These are the download links for the OM-Ruben library:
https://sourceforge.net/projects/omruben/
http://rubensverregjertsen.com/researchsoftware.html

OM-Ruben has also been linked from Ircam as a third party library:
https://openmusic-project.github.io/libraries.html

The download contains all the necessary documentation for those interested in learning to use this library. The included folder of demo patches contains examples of using OM-Ruben for score following, 3 many kinds of filtering, Csound synthesis, compositional techniques of selected composers from the past century, sculpting and transformation of materials, gesture composition, orchestral envelopes, rhythm quantification, curve shapes, random distributions (built on the OMAlea library), ornamentation and conversions.

Every function is documented: find it in the OM-Ruben library menu, place it in a patch, select it and type 'd'. You will get a description of its inputs and outputs, its background and what the function does. You can also read the Lisp code if you like (type 'e').

The 'r-udp-player' was made for microtonal playback from Open Music:
https://sourceforge.net/projects/r-udp-player/

Lisp coding proved to be far more efficient than the old generic patch functions: a test patch that had taken four hours to process some orchestral material took just a few minutes with the new Lisp functions. So far I have had no problems using this patch library with new versions of Open Music.

The library was now ready to be expanded with challenges from the "Wheels within wheels" project: principles of ornamentation based on Historically Informed Performance, score following, and control of audio processing. It was central in creating variants of sound processing and raw materials for the composed pieces.
