ARTIFICIAL LIFE IN INTEGRATED INTERACTIVE SONIFICATION AND VISUALISATION: INITIAL EXPERIMENTS WITH A PYTHON-BASED WORKFLOW
International Community for Auditory Display (ICAD)
Presented at the 29th International Conference on Auditory Display (ICAD 2024)

Multimodal displays that combine interaction, sonification, visualisation, and perhaps other modalities are seeing increased interest from researchers seeking to take advantage of cross-modal perception by increasing display bandwidth and expanding affordances. To support researchers and designers, many new tools are being proposed that aim to consolidate these broad feature sets into Python libraries, owing to Python's extensive ecosystem, which in particular encompasses the domain of artificial intelligence (AI). Artificial life (ALife) is a domain of AI that is seeing renewed interest, and in this work we share initial experiments exploring its potential in interactive sonification through the combination of two new Python libraries: Tölvera and SignalFlow. Tölvera is a library for composing self-organising systems, with integrated open sound control, interactive machine learning, and computer vision; SignalFlow is a sound synthesis framework that enables realtime interaction with an audio signal processing graph via standard Python syntax and data types. We demonstrate how these two tools integrate, and the first author reports on their usage in creative coding and artistic performance. So far we have found it useful to consider ALife as affording synthetic behaviour as a display modality, making use of human perception of complex, collective, and emergent dynamics. We also think ALife implies a broader perspective on interaction in multimodal display, blurring the lines between data, agent, and observer. Based on our experiences, we offer possible future research directions for tool designers and researchers.
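
To make the general pattern described in the abstract concrete, the following is a minimal sketch of an agent-based simulation whose collective state drives a realtime SignalFlow synthesis graph from ordinary Python. The flocking update is a deliberately simplified stand-in for a Tölvera simulation (Tölvera's actual API is not reproduced here), and the mapping of flock dispersion to oscillator frequency is purely illustrative rather than the authors' mapping; the SignalFlow calls (AudioGraph, SineOscillator, play, set_input) follow that library's documented usage as far as we are aware, but the whole listing should be read as an assumption-laden sketch, not the paper's implementation.

import time
import numpy as np
from signalflow import AudioGraph, SineOscillator


def flock_step(positions, velocities, cohesion=0.01, noise=0.05, dt=0.1):
    # Toy "flocking" update: agents drift towards the flock centre, plus noise.
    centre = positions.mean(axis=0)
    velocities = velocities + cohesion * (centre - positions) \
        + noise * np.random.randn(*positions.shape)
    positions = positions + dt * velocities
    return positions, velocities


def main(n_agents=32, n_steps=500):
    positions = np.random.uniform(-1.0, 1.0, size=(n_agents, 2))
    velocities = np.zeros((n_agents, 2))

    graph = AudioGraph()        # realtime audio signal processing graph
    sine = SineOscillator(440)  # frequency will follow the flock's dispersion
    output = sine * 0.2         # nodes compose via ordinary Python operators
    output.play()

    for _ in range(n_steps):
        positions, velocities = flock_step(positions, velocities)
        spread = float(positions.std())  # crude measure of collective dispersion
        # Map simulation state onto a synthesis parameter (illustrative mapping).
        sine.set_input("frequency", 220.0 + 880.0 * min(spread, 1.0))
        time.sleep(0.02)  # audio continues on SignalFlow's background thread

    output.stop()


if __name__ == "__main__":
    main()

Because SignalFlow renders audio on a background thread, the simulation loop can simply update node inputs at its own rate, which is the kind of interaction via standard Python syntax and data types that the abstract highlights.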