To develop a scientifically usable and artistically appropriate system that makes the key processes behind plant sounds identifiable and tangible, our research project demanded that everyone involved engage deeply with the mindset of the other disciplines.

Aesthetic questions therefore already shaped the design of the experiment – for example, where and how the cameras should be installed on the tree, or how the ecophysiological sensors should be mounted to enable an adequate and comprehensive presentation of the life processes and environmental conditions in an immersive environment. On the artistic side, the scientific and technological constraints had to be considered; beyond this, the core artistic task was to develop adequate sonic representations of the non-auditory ecophysiological processes. Reconstructing and staging the life processes and environmental conditions of a tree in an artistic-technical environment led to new forms of observation and design: correlations between measured values and patterns in natural processes become aesthetic effects – abstract measurement data are reflected in images and sounds.

The image of nature created with digital technologies demanded an artistic nuancing of the acoustic and visual presentations so that, for example, the large number of sounds present in the system did not mask or crowd one another. Data had to be interpolated and filtered to make individual processes perceptible.
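As a minimal sketch of this kind of preprocessing – not the project's actual code, and with hypothetical sap-flow readings as input – irregularly sampled sensor data can be resampled onto a uniform time grid by linear interpolation and then smoothed with a moving average before being mapped to sound parameters:

```python
import numpy as np

def preprocess(timestamps, values, step=1.0, window=5):
    """Resample irregular sensor readings onto a uniform time grid
    (linear interpolation) and smooth them with a moving average."""
    grid = np.arange(timestamps[0], timestamps[-1], step)
    resampled = np.interp(grid, timestamps, values)   # linear interpolation
    kernel = np.ones(window) / window                 # simple low-pass filter
    smoothed = np.convolve(resampled, kernel, mode="same")
    return grid, smoothed

# Hypothetical sap-flow readings taken at irregular intervals
t = np.array([0.0, 1.5, 2.0, 4.5, 6.0, 9.0])
v = np.array([0.2, 0.4, 0.5, 0.3, 0.6, 0.4])
grid, smooth = preprocess(t, v, step=0.5, window=3)
```

The window length trades responsiveness against smoothness: a wider moving average suppresses measurement noise but also blurs the short-term fluctuations one may want to hear.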

The actual core of our project became ever clearer during the research work: trees is concerned with producing a new form of holistic knowledge, mediated not only through the verbal accounts of a research report but also in a directly tangible auditory and visual (medial) form. By implementing an artistic-scientific observation system in a media art installation, we intended to create a comprehensive experience from very different, multidimensional data sets and thus to sketch a holistic picture of the life processes and environmental conditions of a tree under the pressure of changing climatic conditions. Striking a balance between the knowledge and practices involved was essential: the artistic imagination of scientific objects had to receive the same attention as the scientific foundation of the aesthetic objects.

Our image of nature, in particular our perception of the plant kingdom, is still dominated by a perspective that treats natural-life processes as if they were the mechanistic workings of inanimate objects. However, the animated object often reveals itself only with a change in perspective, a reduction in distance, and the suspension of difference. Current media technologies put us in a position to experience and interpret anew both nature and natural objects and processes in an immersive situation.

Four versions of our installation currently exist: the stereo/IP-cam version for two speakers and/or three headphones and three TFT monitors (figure 1); the larger spatial audio installation (figure 2); the adaptation for ICST’s Immersive Lab (figure 3); and the adaptation for ICST’s FlowSpace (figure 4). The same sonification algorithms are implemented in all versions but are mapped differently onto the speakers/headphones, and each version presents different video footage. In the stereo version, one sees footage from two IP cams mounted on the stem of our measurement tree, each focusing on one of the plant’s branches.

The spatial audio system (figure 2) consists of an octagon carrying thirty-six self-built omnidirectional speakers. It is designed to be an accessible three-dimensional speaker array, where virtual sound sources are moved and placed within a defined space and listeners can walk around inside the system. The speaker matrix that we have developed is a hybrid sound system: an Ambisonics sound field is mapped onto the tube matrix, but some of the speakers are driven discretely. A 24-cm touch screen at the centre makes the installation an explorative, self-explanatory artistic system: the visitor is able to switch sound sources on and off to identify individual phenomena and their sonifications (see ‘Data sonification’ above). A time-lapse video of the tree and its surroundings informs the visitor visually about the local climatic conditions (weather, time of day, and light intensity). These images were taken by a so-called tree canopy camera – that is, a fish-eye camera system that biologists use to measure, for instance, the light intensity coming through a canopy to investigate a tree’s state of health or the light exposure of plants growing on the forest floor. Furthermore, visitors may observe a graphic representation of the local climatic measurements and the correlations between the individual phenomena as well as a graphic display of the peak frequencies of the measured acoustic emissions at different locations along the plant’s trunk and branches.
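The principle behind such an Ambisonics mapping can be illustrated with a minimal first-order sketch – the installation's actual hybrid decoder for the tube matrix is more involved, and the channel weights below follow the traditional first-order B-format convention. A mono signal is encoded into B-format from an azimuth angle and then decoded to a horizontal ring of equally spaced speakers with a basic sampling decoder:

```python
import numpy as np

def encode_b_format(signal, azimuth):
    """Encode a mono signal into horizontal first-order B-format (W, X, Y)."""
    w = signal / np.sqrt(2)          # omnidirectional component
    x = signal * np.cos(azimuth)     # front-back figure-of-eight
    y = signal * np.sin(azimuth)     # left-right figure-of-eight
    return np.stack([w, x, y])

def decode_to_ring(b_format, n_speakers=8):
    """Basic sampling decoder for n equally spaced speakers on a circle."""
    angles = 2 * np.pi * np.arange(n_speakers) / n_speakers
    w, x, y = b_format
    # Each speaker feed re-weights the B-format channels by its direction.
    feeds = np.array([np.sqrt(2) * w + np.cos(a) * x + np.sin(a) * y
                      for a in angles]) / n_speakers
    return feeds

# A source encoded at 90 degrees (left) is loudest at the left speaker.
sig = np.ones(4)
feeds = decode_to_ring(encode_b_format(sig, np.pi / 2))
```

Because the virtual source position enters only at the encoding stage, moving a sonified phenomenon through the space reduces to changing one azimuth parameter, while the decoder stays fixed to the speaker geometry.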

The adaptation for ICST’s Immersive Lab (figure 3) marks an important shift regarding immersion. The projection surfaces of the system are touch-sensitive: this enables the visitor to touch single branches of the tree, immediately hear the resulting acoustic emissions, and see a graphical representation of the local measurement data.

Figure 1: Stereo/headphone version of trees: Pinus sylvestris.

Figure 2: Spatial audio version of trees: Pinus sylvestris.

Figure 3: Adaptation of trees: Pinus sylvestris for ICST’s Immersive Lab.

Figure 4: Adaptation of trees: Pinus sylvestris for ICST’s FlowSpace.

Exhibitions:

Swiss Federal Institute for Forest, Snow and Landscape Research WSL, Birmensdorf
September–December 2014

ICMC SMC 2014 Conference, Athens
National Museum of Contemporary Art, Athens
September 2014

Creative City
Zurich University of the Arts ZHdK
25 October 2014

SoundReasons Festival, New Delhi, India
Outset India
31 October–10 November 2014

Clarke House, Bombay, India
3–7 June 2015

ICAD 2015, International Conference on Auditory Display
ESC Medien Kunst Labor, Graz, Austria
7–11 July 2015

IDEAS – Calit2 Performative Computing Lab, Atkinson Hall, UC San Diego, USA
9 October–10 November 2015

Immersive Lab at Gray Area, San Francisco
13–30 November 2015

UN Framework Convention on Climate Change COP21, Paris
Le Bourget, Hall 3
30 November–11 December 2015