Deep Improv
Real-Time MIDI and Spectrogram Pattern Analysis for Augmented Improvisation
Overview
This is the main project I have been working on for a while. Its aim is to capture and analyse audio and MIDI data in real time. The data includes anything that can be found in the MIDI data exchange protocol, as well as the usual audio features (things like spectral centroid, spectral bandwidth, spectral roll-off, spectral flux, zero-crossing rate, RMS energy, MFCCs, chroma features, pitch, etc.).
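As a rough illustration, a few of those features can be computed from a single audio frame with nothing but NumPy. This is only a sketch of the kind of per-frame analysis involved; the function name and frame size are mine, and a real-time system would more likely lean on a dedicated library such as librosa:

```python
import numpy as np

def frame_features(frame: np.ndarray, sr: int = 44100) -> dict:
    """Compute spectral centroid, RMS energy, and zero-crossing
    rate for one audio frame (illustrative only)."""
    # Magnitude spectrum of the Hann-windowed frame
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sr)

    # Spectral centroid: magnitude-weighted mean frequency
    centroid = float(np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12))

    # RMS energy of the raw (unwindowed) frame
    rms = float(np.sqrt(np.mean(frame ** 2)))

    # Zero-crossing rate: fraction of adjacent samples changing sign
    zcr = float(np.mean(np.abs(np.diff(np.signbit(frame).astype(int)))))

    return {"centroid": centroid, "rms": rms, "zcr": zcr}

# Example: a 440 Hz sine frame should yield a centroid near 440 Hz
sr = 44100
t = np.arange(2048) / sr
feats = frame_features(np.sin(2 * np.pi * 440 * t), sr)
```

Each frame of features like these, produced many times per second, becomes one row of the timestamped data stream described below.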
All data gets recorded, timestamped, and analysed using real-time deep learning models. These models then generate new MIDI and audio data which can be fed back to performers as musical stimulus during improvisation.
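The recording-and-timestamping step can be sketched with Python's built-in sqlite3 module. The schema and field names here are purely illustrative assumptions, not the project's actual database layout; the point is just that MIDI events and audio-feature frames land in one timestamped stream:

```python
import json
import sqlite3
import time

# Minimal sketch: one table mixing MIDI events and audio-feature frames,
# ordered by capture time. Schema is hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE events (
        id      INTEGER PRIMARY KEY,
        ts      REAL NOT NULL,   -- capture time (seconds since epoch)
        kind    TEXT NOT NULL,   -- 'midi' or 'audio'
        payload TEXT NOT NULL    -- JSON-encoded event data
    )
""")

def record_event(kind, data, ts=None):
    """Insert one timestamped event; defaults to current wall-clock time."""
    conn.execute(
        "INSERT INTO events (ts, kind, payload) VALUES (?, ?, ?)",
        (ts if ts is not None else time.time(), kind, json.dumps(data)),
    )

# A note-on message and an audio-feature frame, as they might arrive live
record_event("midi", {"type": "note_on", "note": 60, "velocity": 96})
record_event("audio", {"centroid": 1234.5, "rms": 0.12})

rows = conn.execute("SELECT kind, payload FROM events ORDER BY ts").fetchall()
```

Downstream models can then read this stream back in time order, which is what makes the generated MIDI and audio responsive to what the performers just played.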
The idea here is not to automate performance or simulate some kind of computer-facilitated improvisation. Instead, what we are trying to do is to enhance the improvisational environment by allowing performers to interact with latent musical structures as they emerge. This ends up being a real-time feedback loop between musician, data, and sound.
This is a big multidisciplinary project that has come about after playing music with people for many years and reflecting on the experience of improvisation, and on what it is like when it is going really well. It combines improvisation practice, live sound design, computational modelling, and symbolic music representation. Along the way, the system becomes a responsive participant in the musical dialogue.
Finally, the name Deep Improv is a nod to Deep Listening, the lifelong project of Pauline Oliveros. I have always liked that idea of pushing the experience of hearing and practising music to its limits.
Background Reading
Ongoing background reading and related literature can be found here. It is just the tip of the iceberg really, but we found it handy when we started exploring these ideas.
Related Code
At the moment, everything is kept in a single repo. You will find Python scripts for MIDI and spectrogram capture, database storage, and configuration of remote performance environments (e.g. Jamulus). Link to related code
Getting Involved
To get involved in this project, contact me at jamie@jamiegabriel.org.