New projects like the Data Sonification Archive 1 attempt to bring together the rapidly growing number of artworks being created in the field of sonification. In this archive, one can find works ranging from podcasts like "Loud Numbers" (2021) 2, the first sonification podcast in the world, to pieces like "A Galaxy of Suns" (2016) 3, a performative interpretation of the positions of the stars.
Generally speaking, sonification projects are inspired by topics that deeply concern humanity, and their goal is to offer another way of understanding them. But, as Dr. PerMagnus Lindborg asks, “can we gain a deeper understanding [of science] by listening to a sonification of the data that scientists use?” 4.
While the idea of understanding science through sound is broad and hard to pin down, what sonifications in fact aim for is accessibility. Along this line, Dr. Palle Dahlstedt, in his chapter Evolution in Creative Sound Design 5, stresses the need for research to focus on accessibility and a more intuitive synthesis process. Such research could be applied to how sonifications interact with their users, creating interactive human-machine systems that reinforce access to information. Lindborg's research paper Leçons 6 on interactive systems serves as a case study: in it, Lindborg creates and analyzes an interactive performance system in which the sound produced by a performer is analyzed by a machine learning algorithm that generates a live musical response. The result is a co-creation between a human and a machine.
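The details of Lindborg's system are beyond the scope of this text, but the core loop of such interactive systems — extract features from incoming sound, classify them, and trigger a musical response — can be sketched in a few lines. Everything below (the descriptors, the thresholds, the response labels) is a hypothetical illustration of the pattern, not Lindborg's implementation:

```python
import math

def extract_features(frame):
    """Two simple descriptors of one audio frame (a list of samples):
    RMS energy and zero-crossing rate."""
    rms = math.sqrt(sum(s * s for s in frame) / len(frame))
    crossings = sum(1 for a, b in zip(frame, frame[1:]) if a * b < 0)
    zcr = crossings / (len(frame) - 1)
    return rms, zcr

def classify(rms, zcr, silence_thresh=0.01, noisy_thresh=0.3):
    """A tiny rule-based 'listener' (thresholds are illustrative)."""
    if rms < silence_thresh:
        return "silence"
    return "percussive" if zcr > noisy_thresh else "tonal"

def respond(label):
    """Map what the machine hears to a musical response (hypothetical)."""
    return {
        "silence": "hold current drone",
        "tonal": "harmonize with detected pitch",
        "percussive": "answer with a rhythmic pattern",
    }[label]

# A synthetic input frame: a low-frequency sine wave (3 cycles, 256 samples).
frame = [math.sin(2 * math.pi * 3 * n / 256) for n in range(256)]
label = classify(*extract_features(frame))
print(label, "->", respond(label))
```

In a real performance system the classifier would be a trained model rather than fixed thresholds, and the response would drive a synthesizer; the shape of the loop, however, stays the same.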
This idea of co-creation becomes a strong feature when combined with data from our environment: sonifications have the potential to become responsive and interactive when paired with AI techniques and sensors. So how do sounds and other materialities come together in a multimodal representation to convey the meaning inherent in the data?
The sonification artist and researcher Sara Lenzi 7 states: “we see the use of sound together with other sensory modalities as a step forward toward the engagement of the public with technoscientific knowledge such as that surrounding climate change”, a vision in which sensory devices are used to reinforce a scientific message. In addition, Moritz Stefaner 8 has done notable work in data visualization, exploring the boundary between data visualization and data art, especially in his recent investigations using the AI image generator Midjourney.
There is still great potential in exploring interactive and intelligent sound design in relation to data sonification, and that is the path Sounding Numbers will follow after this period at RMC.
These investigations could focus on the embodied perception of data, built around an interactive or physical sounding experience centered on participation and engagement through the emotional qualities of positive and critical aesthetic experiences.
A state-of-the-art resource for carrying out this future project is FluCoMa 9, an open-source project that makes a toolset of machine learning algorithms available for sound processing. At the moment, its SuperCollider integration covers algorithms ranging from clustering and classification to neural networks such as the multilayer perceptron.
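FluCoMa's own interface is SuperCollider (and Max/Pd) code, but the kind of algorithm the toolset wraps can be illustrated in a language-agnostic way. The sketch below is a minimal pure-Python k-means, clustering two-dimensional sound descriptors (say, brightness versus loudness of audio slices) into groups; the data and the naive initialization are illustrative assumptions, not FluCoMa's implementation:

```python
def kmeans(points, k, iterations=20):
    """Minimal k-means over 2-D points: repeatedly assign each point to
    its nearest center, then move each center to its group's mean."""
    # Naive init: spread starting centers evenly across the input order.
    centers = [points[i * len(points) // k] for i in range(k)]
    for _ in range(iterations):
        groups = [[] for _ in range(k)]
        for p in points:
            nearest = min(
                range(k),
                key=lambda i: (p[0] - centers[i][0]) ** 2
                            + (p[1] - centers[i][1]) ** 2,
            )
            groups[nearest].append(p)
        for i, g in enumerate(groups):
            if g:  # keep a center in place if its group went empty
                centers[i] = (
                    sum(p[0] for p in g) / len(g),
                    sum(p[1] for p in g) / len(g),
                )
    return centers

# Two obvious blobs of hypothetical descriptor values (quiet/dark vs. loud/bright).
low = [(0.10 + 0.01 * i, 0.2) for i in range(5)]
high = [(0.90 - 0.01 * i, 0.8) for i in range(5)]
centers = sorted(kmeans(low + high, k=2))
print(centers)  # one center per blob
```

In a FluCoMa workflow the equivalent step would run inside SuperCollider on descriptors extracted from sliced audio, with the resulting clusters used to organize or trigger sounds.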