Score Following Visualization

As part of my work through the Trusted AI Grant from NSA Crane, I have developed systems for visualizing what score following systems are doing "under the hood". Such visualizations promote performers' trust by giving them insight into how a machine-based musical collaborator is working. Furthermore, if something goes wrong during score following, they can help musicians "debug" the issue and take corrective action.

The top video sheds light on the inner workings of a score follower in which each score position is associated with two types of states: "found" states and "lost" states. "Found" states, shown in green, indicate that the score follower is certain of the player's position. "Lost" states indicate that the score follower is unsure of the player's current position, with the last known position shown in red. This type of system could give performers real-time feedback on whether a score-following application is working properly or needs tuning.
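The idea can be sketched as a probabilistic belief over score positions whose confidence determines the display state. This is a minimal illustrative sketch, not the actual follower: the transition probabilities, likelihood model, and threshold are all assumptions.

```python
# Hypothetical "found"/"lost" display logic: the follower's confidence in
# its best position hypothesis determines what the visualization shows.
def classify(position_confidence, threshold=0.8):
    """Map confidence at the best position to a display state."""
    if position_confidence >= threshold:
        return "found"   # shown in green: position is trusted
    return "lost"        # shown in red: show last known position instead

# Toy forward update over position beliefs after one audio frame.
def update_belief(belief, likelihoods, stay=0.7, advance=0.3):
    """belief: dict position -> probability; likelihoods: per-position
    observation likelihoods for the current frame (assumed given)."""
    new = {}
    for pos, p in belief.items():
        new[pos] = new.get(pos, 0.0) + p * stay * likelihoods.get(pos, 1e-9)
        new[pos + 1] = (new.get(pos + 1, 0.0)
                        + p * advance * likelihoods.get(pos + 1, 1e-9))
    total = sum(new.values()) or 1.0
    return {pos: p / total for pos, p in new.items()}
```

After each frame, the visualization would color the best position green or red according to `classify`.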

The bottom video shows the behavior of a score follower that allows for jumps and skips (online dynamic programming). This paradigm could be useful in applications that track performers during a practice session, continuing to ascertain their location even when the performer jumps to a different part of the score. Paths in the visualization show where the algorithm is certain of the performer's past position and where it is still unsure.
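One way to picture an online DP step that permits jumps: in addition to staying or advancing, the follower may jump from anywhere in the score at a fixed penalty. This is an illustrative sketch under assumed costs, not the published algorithm.

```python
# Online dynamic programming step with jumps (illustrative; costs and
# the jump-penalty scheme are assumptions, not the actual system's).
def dp_step(prev_cost, frame_cost, jump_penalty=5.0):
    """prev_cost[i]: best cost of ending at score position i so far.
    frame_cost[i]: mismatch of the current audio frame at position i."""
    best_anywhere = min(prev_cost)  # cheapest source position for a jump
    new_cost = []
    for i, c in enumerate(frame_cost):
        stay = prev_cost[i]
        advance = prev_cost[i - 1] if i > 0 else float("inf")
        jump = best_anywhere + jump_penalty
        new_cost.append(min(stay, advance, jump) + c)
    return new_cost
```

After each frame, the position with the lowest accumulated cost is the current best hypothesis; a small jump penalty lets the follower recover when the player skips to another part of the score.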

Video Accompaniment

We developed a “video accompaniment” system capable of closely aligning a pre-made music video to a live, tempo-varying musical performance. Traditionally, “tight” video-to-audio synchrony is only achievable by restricting the musician with a click track or by making a human operator responsible for the timing of the visual content. Our system automatically aligns the video to a live music performance. It uses the Informatics Philharmonic automatic accompaniment software to 1) follow a musician’s score position in real time and 2) predict when the next score position will occur. A Max/MSP patch uses these position predictions to stretch the video so that animated gestures align with their musical counterparts.
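The stretching step can be sketched as a playback-rate calculation: given the video time that should coincide with the next score event and the predicted real time of that event, choose a rate that lands on time. This is a hypothetical sketch, not the actual Max/MSP patch; the clamping bounds are assumptions.

```python
# Illustrative playback-rate calculation for video accompaniment
# (assumed logic; the real patch's behavior may differ).
def playback_rate(current_video_time, target_video_time,
                  now, predicted_event_time, min_rate=0.25, max_rate=4.0):
    """Return the video playback rate that makes target_video_time
    arrive at predicted_event_time, clamped to a safe range."""
    remaining_video = target_video_time - current_video_time
    remaining_real = predicted_event_time - now
    if remaining_real <= 0:
        return max_rate  # event is (nearly) here: catch up as fast as allowed
    rate = remaining_video / remaining_real
    return max(min_rate, min(max_rate, rate))
```

Recomputing the rate whenever the score follower issues a new prediction keeps the animated gestures converging on their musical counterparts as the performer's tempo drifts.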
We worked with clarinetist and animator Nikki Pet to adapt two musical works by composer Joan Tower to this new medium. These works were performed at multiple venues with our video accompaniment system; two of these performances can be seen in the embedded videos.
Work published at the Sound and Music Computing Conference 2023, linked here.

Generic Accompaniment System

The technology used for video accompaniment can also be applied to closely synchronized live audio effects in Max/MSP. See the embedded video for a demonstration of how pre-recorded sound effects and live audio effects can be triggered in close synchrony with pre-determined positions in the musical score.
The embedded video and the Video Accompaniment project (above) both use a "networked" version of Informatics Philharmonic for audio/multimedia scheduling. We are currently developing a Max/MSP external of the Informatics Philharmonic application. This external will allow users to input a single-line music score; then, when a user plays the piece, the external will output real-time predictions of when the musician will place future notes. This will enable a range of applications, including real-time generation of music videos, automatic stage lighting control, and improved score-based synchronization of new music compositions.
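A consuming application might turn those predictions into scheduled cues like this. The function name and cue format are illustrative assumptions, not the external's actual interface:

```python
# Hypothetical cue scheduler driven by predicted note-onset times
# (illustrative; the real external's API is not shown here).
def due_cues(cues, now, lookahead=0.05):
    """cues: list of (predicted_time, action) pairs sorted by time.
    Returns actions to trigger within `lookahead` seconds, plus the
    still-pending cues."""
    fire, pending = [], []
    for t, action in cues:
        (fire if t <= now + lookahead else pending).append((t, action))
    return [a for _, a in fire], pending
```

Calling this on every prediction update would let a lighting rig or video engine fire cues slightly ahead of the predicted onsets, compensating for actuation latency.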

Virtual Ensemble Assembly

Remotely assembled music today depends on a pre-determined "reference track". While reference tracks ensure all parts sound "in sync" when combined, they restrict a musician's ability to pick their own tempo or perform other timing-related musical gestures. Working with expert ensembles, we present experiments that address this problem. We first created a remote assembly pipeline that allows artists temporal freedom during the recording process. We then developed three algorithms that preserve players' original timing intentions while ensuring relative synchrony between parts: 1) direct modeling of chamber music expertise, 2) optimization of desirable performance qualities, and 3) performance simulation based on competing goals. Though the resulting music we assembled did not capture the full expressive range of an in-person performance, our work both increases the quality and ease of making remote recordings and introduces exciting new applications for aiding synchronous rehearsal and performance.
Work published at Web Audio Conference WECML Workshop 2022, linked here.

Network Music Visualization

"NCBot", a network-based music visualization tool developed with D3 on Observable, has been used for interactive visualizations and music videos. See the demo on the Observable server.
The videos on the right, produced in conjunction with Nikki Pet, use this type of network animation, in this case generated to match pre-recorded music. I first used a score follower to find note onset times in the recordings, then mapped those onset times to network animations.
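The onset-to-animation mapping can be sketched as follows; the cyclic node assignment is an illustrative assumption, not necessarily how NCBot assigns animation events.

```python
# Illustrative mapping from detected note onsets to network-animation
# events: each onset activates a node at the corresponding video frame.
def onset_animation(onset_times_sec, node_ids, fps=30):
    """Pair each onset (in seconds) with a node to activate, cycling
    through the available nodes; returns (frame_index, node_id) pairs."""
    frames = [round(t * fps) for t in onset_times_sec]
    return [(f, node_ids[i % len(node_ids)]) for i, f in enumerate(frames)]
```

Because the music is pre-recorded here, the score follower runs offline and the resulting event list can be rendered frame-accurately.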

"Flexible Media" Adaptations

I worked with composer JacobTV and saxophonist Connie Frigo to adapt three of his pieces, Billie, Garden of Love, and Farewell Feathered Friends, for flexible media using the Informatics Philharmonic software. I coined the term "flexible media" to describe a fixed media piece outfitted with score prediction software so that it can speed up and slow down to match the tempo of a live player. Through this process, we explored the benefits and drawbacks of this new medium of performance.
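The core of the flexible media idea can be sketched as a smoothed playback-rate update: the media speeds up or slows down according to the ratio between its nominal timing and the live player's predicted timing. This is a minimal sketch under assumed details (the smoothing scheme is an assumption), not the Informatics Philharmonic implementation.

```python
# Hypothetical flexible-media rate update: match the fixed media's
# inter-onset interval (IOI) to the live player's predicted IOI.
def media_rate(nominal_ioi, predicted_ioi, smoothing=0.5, prev_rate=1.0):
    """nominal_ioi: interval between two events in the fixed media (sec);
    predicted_ioi: predicted interval in the live performance (sec).
    Smoothing keeps the audible rate change gradual."""
    raw = nominal_ioi / predicted_ioi
    return smoothing * prev_rate + (1 - smoothing) * raw
```

A player taking twice as long between notes yields a rate near 0.5, slowing the fixed media to stay with them.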
The top video shows Jamey Guzeman performing JacobTV's Farewell Feathered Friends with our flexible media system. The bottom video is a recording of a presentation at the North American Saxophone Alliance 2019 conference, where saxophonist Connie Frigo presented and performed a flexible version of Billie.
Work published at the Sound and Music Computing Conference 2021, linked here.