Notochord

Interacting with the Notochord model using a MIDI controller.

Notochord is a MIDI model for live performance.
By Victor Shepardson

Notochord is a machine learning model for MIDI data: once trained on a corpus of MIDI files, it can generate music. What makes Notochord interesting compared to similar MIDI models (like Magenta’s PerformanceRNN or the Metacreation Lab’s MMM) is that it generates one event at a time with very little delay, while allowing precise interventions into the note, timing, velocity, and instrumentation of each event. This makes Notochord a kind of software backbone for building intelligent MIDI instruments that you can touch with your hands. An application can query Notochord to provide accompaniment, harmonization, AI improvisation, or weirder forms of MIDI mangling – see our video above for examples.
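The feed-and-query loop described above might look something like the following sketch. The class and method names here (`DummyModel`, `feed`, `query`, `fix`) are illustrative assumptions, not Notochord's actual API; a real application would call the trained model instead of this toy stand-in:

```python
# Hypothetical sketch of the interaction pattern: an app feeds the model
# events as they happen, then queries for the next event while pinning
# chosen fields. DummyModel is a toy stand-in, NOT Notochord's API.
import random

class DummyModel:
    """Toy stand-in for an autoregressive MIDI event model."""
    def __init__(self, seed=0):
        self.rng = random.Random(seed)
        self.history = []

    def feed(self, inst, pitch, time, vel):
        # Tell the model about an event that actually happened,
        # e.g. a note played on a MIDI controller.
        self.history.append((inst, pitch, time, vel))

    def query(self, fix=None):
        # Sample the next event; `fix` pins any subset of the fields
        # (instrument, pitch, inter-event time, velocity) while the
        # rest are sampled by the model.
        fix = fix or {}
        return {
            "inst": fix.get("inst", self.rng.randint(1, 16)),
            "pitch": fix.get("pitch", self.rng.randint(36, 84)),
            "time": fix.get("time", round(self.rng.uniform(0.0, 0.5), 3)),
            "vel": fix.get("vel", self.rng.randint(1, 127)),
        }

model = DummyModel()
model.feed(inst=1, pitch=60, time=0.0, vel=100)  # the player's note
event = model.query(fix={"pitch": 64})           # harmonize: pin the pitch
print(event["pitch"])  # 64 -- the intervention held; other fields were sampled
```

Because each query returns a single event with low latency, this loop can run inside a live performance system, with the application deciding event by event which fields to pin and which to leave to the model.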

You can also read the AIMC paper describing Notochord, or grab the code and model checkpoints from our GitHub repository.

A block diagram describing the Notochord model architecture.

Architecture of the Notochord model (from the paper).