Notochord

Using the notochord homunculus terminal app with fluidsynth.

Notochord is a MIDI model for live performance.
By Victor Shepardson

Python Package Documentation

AIMC 2022 Paper

GitHub Repository

Notochord is a machine learning model for MIDI data: once trained on a corpus of MIDI files, it can generate music. What makes Notochord interesting compared to similar MIDI models (like Magenta’s PerformanceRNN or the Metacreation Lab’s MMM) is that it can generate one event at a time with very little delay, while allowing precise interventions into the pitch, timing, velocity, and instrumentation of each event. This makes Notochord a kind of software backbone for building intelligent MIDI instruments that you can touch with your hands. An application can query Notochord to provide accompaniment, harmonization, AI improvisation, or weirder forms of MIDI mangling – see our video above for examples.
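Concretely, an application drives Notochord in a feed/query loop: it feeds the model each incoming MIDI event and queries it for a prediction of the next one, which it can then play, modify, or discard. The sketch below illustrates that loop with the Python package. It assumes the `Notochord.from_checkpoint`, `reset`, `feed`, and `query` calls described in the package documentation; the checkpoint filename is illustrative, and exact signatures may differ between versions, so check the current docs.

```python
# Minimal feed/query loop sketch. Assumes the notochord Python package's
# documented API (Notochord.from_checkpoint, reset, feed, query); exact
# argument names and the checkpoint path may differ between versions.
from notochord import Notochord

noto = Notochord.from_checkpoint('notochord-latest.ckpt')  # illustrative path
noto.reset()  # clear the model's event history

# Feed an event you just played: General MIDI instrument 1 (piano),
# middle C, velocity 64, zero seconds since the previous event.
noto.feed(inst=1, pitch=60, time=0.0, vel=64)

# Ask the model to sample a next event; it returns the predicted
# instrument, pitch, inter-event time in seconds, and velocity.
event = noto.query()
print(event['inst'], event['pitch'], event['time'], event['vel'])

# Feeding the sampled event back in continues the generation loop,
# so generated and human-played events share one running context.
noto.feed(inst=event['inst'], pitch=event['pitch'],
          time=event['time'], vel=event['vel'])
```

In a real instrument, the application would schedule the queried event to sound after the predicted inter-event time. The paper also describes constraining each attribute of a queried event, which is the mechanism behind behaviors like harmonization and accompaniment.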

Architecture of the Notochord model (from the paper).

Using notochord with a MIDI controller and SuperCollider.

AIMC 2024 Paper with Magnetic Resonator Piano and Arca