Information Retrieval in Embedded Systems for Audiovisual Artistic Processes

Status: active
Start date: 2022-10-01
End date: 2026-09-30

The IRESAP project explores new methods for musical interaction between simultaneous participants, with the goal of demonstrating that a dislocated musical performance can create a sense of immersion in merged-reality performances. To achieve this, methods for sharing and accessing data in handheld devices and embedded electronic musical instruments need to be improved, as does our understanding of the structures of artistic processes. Knowledge about communication in this kind of collaborative music making is useful for other applications involving audiovisual artistic processes and performances in electronic music through new musical instruments, and the future sixth generation of mobile network technology (6G) is an excellent testbed for time-critical communication systems.

Well-designed information retrieval that is integrated with the creative flow is essential for the artistic process. We explore what designs enable artists to use and reuse artistic material from vast repositories, what designs support reflection on artistic decisions taken throughout the artistic process, and how this process can be reinforced by artificial intelligence. The artistic process may consist of ensemble improvisation, where players are distributed in space and connected through novel applications of network technology. The preparation phase of a performance is based on what the artist can play: they adapt to what is possible and expand the boundaries. Thus, the introduction of new possibilities inevitably changes the artistic practice. Beyond this, other sensory parameters are involved in creating a sense of presence despite physical dislocation, where the spatial properties of the audio are crucial. A performance traditionally includes a place and an audience. We are interested in how the design of instruments supports site-specific experiential qualities, and how these performances are mediated to an actively participating distributed audience, creating an immersive experience. In the documentation phase of a performance, we focus on supporting information retrieval concerning which artistic decisions actually matter, and on better understanding the artistic choices made so as to apply this knowledge along with the automatic classification of the artistic material.

A case that will be studied is a music performance in the depths of wildwoods, broadcast via a prototypical 6G mobile network to audiences and other musicians. This study provides valuable information about which material in this kind of performance is time-critical, which in turn allows for controlled and dynamic latency. This essentially allows artists and audiences alike to reconsider the sender-receiver logic of traditional artistic events and enables a more interactively involved multimodal experience. Artists will have the freedom to be anywhere, to have access to vast repositories of material, and to play with others remotely, for a remote audience.

Publications

The Unfinder (Nov 2023)
Rikard Lindell, Henrik Frisk
International Symposium on Computer Music Multidisciplinary Research (CMMR)

Inner City in the Listener’s Auditory Bubble (Aug 2023)
Hedda Lindström, Tanja Jörgensen, Rikard Lindell
AM '23: Proceedings of the 18th International Audio Mostly Conference (AM'23)

Partners
Royal College of Music in Stockholm (Academic)
Ericsson AB (Industrial)
Spotify AB (Industrial)
teenage engineering AB (Industrial)

Rikard Lindell, Associate Professor

Email: rikard.lindell@mdu.se
Room: U1-090
Phone: +46-21-151759