The Intelligent Instruments Lab has several contributions at the Audio Mostly 2024 conference, taking place September 18-20 in Milan, Italy.
Research labs are peculiar phenomena consisting of a defined research programme, technical infrastructure and social context — all crucial for generating, preserving, and disseminating new knowledge. A lab functions as an ecosystem in which we think and develop ideas, but this requires a productive technical and social platform. This paper presents the Organium, a system for improvisatory design thinking in the domain of musical instruments and the experimental humanities in general. The Organium is a dynamic library of technical elements, spatially arranged for rapid prototyping and systemic experimentation, serving as a central hub of our Intelligent Instruments Lab. Furthermore, the paper discusses the lab culture and research methodology for which the Organium was designed, framing the lab as an experimental ecosystem where technological assemblages act as boundary objects for transdisciplinary collaboration and discourse. We outline how the Organium supports our research methodology and present selected projects that demonstrate its functionality and impact.
The no-input mixer is an appealing instrument for practitioners seeking a constant negotiation of their musical desires with the instrument’s complex agency and affordances. Offering a simple setup and a re-invention of a traditional device, it resignifies a utilitarian control space into a more obscure and surprising field of exploration, a fertile ground for artistic practices and languages to emerge. In the context of musicking with feedback instruments, i.e. their design, performance and experimentation, this project proposes a new hybrid device: an analog no-input mixer embedding a digitally controllable interface. This introduces previously impossible interaction and mapping techniques (from storing, recalling, interpolating and sequencing control points, to semi-automatic audio-driven mapping devices and agents), allowing us to evaluate their affordances for musical practices with the instrument and to cultivate the opportunity for novel languages and practices to emerge from them.
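The storing, recalling and interpolating of control points mentioned above can be illustrated with a minimal sketch. This is not the project's actual implementation; the `PresetInterpolator` class and its linear-blend strategy are assumptions for illustration only:

```python
import numpy as np

class PresetInterpolator:
    """Store named control-point snapshots and blend between them.

    A hypothetical sketch of preset storage/recall/interpolation;
    each preset is a vector of normalised control values in [0, 1].
    """

    def __init__(self):
        self.presets = {}

    def store(self, name, values):
        # Save a snapshot of the current control points under a name.
        self.presets[name] = np.asarray(values, dtype=float)

    def recall(self, name):
        # Return a copy so callers can't mutate the stored preset.
        return self.presets[name].copy()

    def interpolate(self, a, b, t):
        # Linear blend between two stored presets, with t in [0, 1].
        return (1 - t) * self.presets[a] + t * self.presets[b]

interp = PresetInterpolator()
interp.store("dark", [0.1, 0.8, 0.2, 0.0])
interp.store("bright", [0.9, 0.2, 0.7, 1.0])
print(interp.interpolate("dark", "bright", 0.5))  # halfway between the two presets
```

Sequencing control points would then amount to stepping `t` (or a sequence of preset names) over time, which is one way digital control can open up interactions impossible on a purely analog mixer.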
Musical instruments are typically played by one person only. There are practical and ergonomic reasons for this in traditional instruments; however, the development of digital musical instruments destabilises this relationship. As well as creating opportunities to inter-connect performers’ systems, technology has opened a field of entangled instruments, where the players are enmeshed enough to be considered as forming a single, mutually played instrument. A developed discourse around post-human thought and theories of entanglement has emerged within HCI; however, there remains little exploration of its practical application to digital instrument design, and even less research into applying entanglement to multi-player instruments. This paper presents elements of ongoing practice research into creating such entangled instruments. Comparing instruments and participants’ responses to them is arguably a key element in an instrument’s development; however, absolute descriptors can be problematic when the relative nature of experience and situated knowledge are taken into account. The paper proposes that considering collaborative instruments as part of a spectrum of wider possibilities is helpful, and suggests using several lenses concurrently to do so. Two examples are suggested: Interpretive Mapping and Intra-action. Finally, the paper presents The Perceptron, a novel entangled instrument that seeks to enshrine and test these perspectives through a double-layered machine learning approach.
Stacco is a novel Digital Musical Instrument (DMI) with embedded magnets and sensors. It attracts and repels magnetic spheres and detects changes in its magnetic fields upon interaction. It is designed to perform with neural audio synthesis models such as RAVE, in which sound features are represented and distributed within entangled multidimensional sonic spaces. Stacco allows scores to be drawn and embedded into the interface itself; by bridging gesture and notation, it overcomes some of the inherent limitations of traditional notational methods as applied to neural synthesis. Our demo will provide the opportunity to try the instrument first-hand. Moreover, we will invite participants to compose musical sketches for neural synthesis models by drawing, embedding and layering tracing paper sheets on top of Stacco’s surface.