Composition by navigation

Anyone who wants to create music faces two questions:

What kind of music?

This question is usually answered by referring to existing pieces of music. For example, something sort of like this piece, but a little more like that piece, and so on.

Most listeners who actively think about music could imagine answers to this question. Pilot makes this the only question that the user needs to answer.

How to compose it?

This question concerns the construction of note patterns that capture the desired characteristics, and answering it requires specialized skills in the analysis and composition of musical structure.

Pilot itself answers this question by applying novel music theory and algorithms that analyze existing note patterns and synthesize new ones. The user thereby generates and controls music in real-time by navigating within a spectrum of musical influences.


Pilot


Pilot is a generative music application with the unique ability to steer its output toward specific riffs and melodies. The user navigates a musical landscape that forms a spectrum between selected music inputs.

image

The user selects note patterns from existing compositions and morphs between them in real time to produce new variations (here, a note pattern means a MIDI Clip in a Live set).

Pilot analyzes the selected patterns and regenerates their most basic, generic building blocks. Some building blocks are audible on the musical surface; others exist only as deeper scaffolding for those surface patterns. Combinations of these building blocks have different probabilities for each composition. Pilot allows the user to control which combinations of building blocks will prevail in the musical output.

Pilot is implemented as custom Max-for-Live devices and Max/MSP patches/externals, networked to a macOS Objective-C application via OSC.
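
As a rough illustration of this architecture, the sketch below shows how pointer positions and generated notes might travel between the Max-for-Live side and the external application over OSC. It is written in Python with the python-osc library for brevity; the addresses, ports, and message layouts are hypothetical placeholders rather than Pilot's actual protocol.

```python
# Hypothetical sketch of the OSC link between Max for Live and an external app.
# Addresses, ports, and argument layouts are illustrative only.
from pythonosc.udp_client import SimpleUDPClient
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

# Stand-in for the Max-for-Live side: send the pointer position on the
# 2D landscape to the external application...
client = SimpleUDPClient("127.0.0.1", 9000)          # assumed port of the external app
client.send_message("/pilot/pointer", [0.42, 0.77])  # hypothetical normalized x, y

# ...and listen for the notes it generates, to hand back to Live.
def on_note(address, pitch, velocity, duration_beats):
    print(f"{address}: pitch={pitch} vel={velocity} dur={duration_beats}")

dispatcher = Dispatcher()
dispatcher.map("/pilot/note", on_note)               # hypothetical note message
server = BlockingOSCUDPServer(("127.0.0.1", 9001), dispatcher)
server.serve_forever()
```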

Shapes on a plane

The user selects MIDI Clips from a Live set and positions them on a plane that represents a 2D musical landscape. Pilot analyzes each note pattern placed on that landscape.

image

During playback the user creates new music by click/dragging the pointer on the landscape. Each location generates a stream of notes based on the relative distance to each music input.

The input note patterns can themselves be moved around to warp the musical landscape in real time. The distance between the pointer and each icon determines how much influence that note pattern will have on each generated melody. Dragging the pointer from one clip to another gradually morphs the generated notes between the note patterns in those clips.
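
One plausible way to turn those distances into influence weights, sketched below in Python, is inverse-distance weighting: the closer the pointer sits to a clip's icon, the larger that clip's normalized weight. Pilot's actual weighting function is not specified here, so the formula and parameter names are assumptions for illustration.

```python
import math

def influence_weights(pointer, icons, falloff=2.0, eps=1e-6):
    """Map pointer-to-icon distances to normalized influence weights.

    Closer icons receive larger weights; 'falloff' controls how sharply
    influence drops with distance. (Illustrative only; Pilot's actual
    weighting function may differ.)
    """
    px, py = pointer
    raw = [1.0 / (math.hypot(px - ix, py - iy) + eps) ** falloff
           for ix, iy in icons]
    total = sum(raw)
    return [w / total for w in raw]

# Pointer near the first of three clip icons on the plane:
print(influence_weights((0.1, 0.1), [(0.0, 0.0), (0.5, 0.5), (1.0, 0.0)]))
```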

image

Pilot generates and navigates a continuum of virtual musical patterns between the selected MIDI clips during playback. By contrast, Live itself provides only either-or switching between MIDI clips, without any musical middle ground between those clips.

Navigate musical patterns

Original music is created by morphing between riffs that have been analyzed by Pilot. As Live plays, Pilot generates note patterns that are shaped by dragging the pointer among MIDI clips positioned on the plane.

image

As an example, the user can select three bass MIDI clips on one Live track and three lead MIDI clips on another track. Each clip is selected by clicking on the clip in Live and then hitting one of the assignment buttons on the Pilot device. Three bass icons and three lead icons then appear on the landscape, each displaying the note analysis performed by Pilot.

During playback Pilot morphs between the lead clips on one track and between the bass clips on the other, calculating two note streams in parallel within a common harmony and tempo determined by the Live scene.

Steerable generative algorithm

image

Single notes branch into pairs of notes, and continue branching and pruning at different metrical levels to produce musical rhythms. The leaves of this branching form the notes that are actually heard, and there can be different generative pathways to the same result.
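
A toy version of that branching-and-pruning process is sketched below: a bar-length duration is recursively either kept whole or split in half, and the leaves of the resulting tree become the rhythm that is heard. This is only a schematic illustration of metrical branching, not Pilot's actual theory.

```python
import random

def branch_rhythm(duration_beats, depth, split_prob=0.6, rng=random):
    """Recursively split a duration in two, or keep it as a leaf note.

    Returns the list of leaf durations, i.e. the rhythm that is actually heard.
    (A schematic stand-in for metrical branching and pruning.)
    """
    if depth == 0 or rng.random() > split_prob:
        return [duration_beats]                    # prune: this node sounds as one note
    half = duration_beats / 2.0
    return (branch_rhythm(half, depth - 1, split_prob, rng) +
            branch_rhythm(half, depth - 1, split_prob, rng))

# Different runs take different branching pathways through a 4-beat bar:
print(branch_rhythm(4.0, depth=3))
```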

image

Pilot uses a novel theory that collapses all possible sequences of branching and pruning into a manageable set, so it can immediately determine every generative pathway to a particular note pattern.

image

Pilot calculates a probability transition table for these competing generative pathways. A Markov chain generating attacks and pitches out of this table reproduces the original note pattern. Adjusting the probabilities produces related note patterns that preserve structural characteristics of the original.
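
The toy sketch below shows the general shape of such a table for pitches alone: count how often each note follows each other note, normalize the counts into probabilities, and walk a Markov chain over them. Pilot's tables operate over its generative building blocks (attacks as well as pitches, across competing pathways), so this is a deliberately simplified stand-in.

```python
import random
from collections import defaultdict

def transition_table(pitches):
    """Count next-note transitions in a pitch sequence and normalize to probabilities."""
    counts = defaultdict(lambda: defaultdict(float))
    for a, b in zip(pitches, pitches[1:]):
        counts[a][b] += 1.0
    return {a: {b: n / sum(nexts.values()) for b, n in nexts.items()}
            for a, nexts in counts.items()}

def generate(table, start, length, rng=random):
    """Walk the Markov chain; with an unmodified table the output tends to
    reproduce the local behaviour of the original pattern."""
    out = [start]
    for _ in range(length - 1):
        nexts = table.get(out[-1])
        if not nexts:
            break
        pitches, probs = zip(*nexts.items())
        out.append(rng.choices(pitches, weights=probs)[0])
    return out

riff = [60, 62, 64, 62, 60, 67, 64, 62]   # a toy MIDI-pitch riff
table = transition_table(riff)
print(generate(table, start=60, length=8))
```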

image

Transition tables calculated for completely different melodies and riffs are added together to produce probabilities for new note patterns that are hybrids of the originals, controlled by the weight assigned to each table. This is how the musical landscape described above is constructed: each table's weight is determined by its distance from the pointer on the plane.
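
Under a straightforward weighted-sum reading of that description, blending tables might look like the sketch below: scale each table's probabilities by its weight, add them, and renormalize each row. The exact mixing rule Pilot uses is not documented here, so treat this as an assumption.

```python
from collections import defaultdict

def mix_tables(tables, weights):
    """Weighted sum of several next-note transition tables, renormalized per row.

    The weights would come from the pointer's position on the landscape
    (see the influence_weights sketch above). Illustrative only.
    """
    mixed = defaultdict(lambda: defaultdict(float))
    for table, w in zip(tables, weights):
        for a, nexts in table.items():
            for b, p in nexts.items():
                mixed[a][b] += w * p
    return {a: {b: p / sum(nexts.values()) for b, p in nexts.items()}
            for a, nexts in mixed.items()}

# Two toy tables in the same {note: {next_note: probability}} format, blended 70/30:
table_a = {60: {62: 1.0}, 62: {64: 0.5, 60: 0.5}, 64: {62: 1.0}}
table_b = {60: {58: 1.0}, 58: {55: 0.5, 60: 0.5}, 55: {58: 1.0}}
print(mix_tables([table_a, table_b], [0.7, 0.3]))
```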

image

Since the transition tables are only ever concerned with the next note (never with the overall pattern at once), it is computationally feasible to generate and control the musical output in real time.

Integration into Live scenes

Pilot dissects and regenerates the music in a strictly organic fashion, without patching together prefab musical figures, transitions, rules, etc. The only musical building blocks are those that result purely from the generative process.

image

It is, however, designed to coexist with static MIDI clips on other tracks in a Live scene, and to respect the overall harmony and tempo of that scene. In fact, the generated note streams automatically adjust key and mode in order to match the currently playing scene.

Pilot post-processes the generative output, applying grammatical rules to the resulting note patterns in order to maximize musical coherence within the target harmony. The user controls the degree to which these grammatical rules are imposed on the final output.  
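
As a rough sketch of what key- and mode-matching of a generated stream could look like, the function below snaps each pitch toward the nearest pitch of the scene's scale, with a strength parameter standing in for the user's control over how firmly the rules are imposed. Pilot's grammatical rules are more elaborate than this; the names and the 0-to-1 strength control are assumptions.

```python
import random

MAJOR = [0, 2, 4, 5, 7, 9, 11]   # scale degrees as semitone offsets from the tonic

def snap_to_scale(pitches, tonic, scale=MAJOR, strength=1.0, rng=random):
    """Snap each MIDI pitch toward the nearest pitch of the target scale.

    'strength' stands in for the user control over how strongly harmonic
    rules are imposed: 1.0 always corrects, 0.0 leaves notes untouched.
    (An illustrative stand-in for Pilot's grammatical post-processing.)
    """
    allowed = {(tonic + step) % 12 for step in scale}
    out = []
    for p in pitches:
        if p % 12 in allowed or rng.random() > strength:
            out.append(p)
        else:
            for offset in (1, -1, 2, -2):          # search outward for an in-scale pitch
                if (p + offset) % 12 in allowed:
                    out.append(p + offset)
                    break
            else:
                out.append(p)                      # no nearby scale tone; leave unchanged
    return out

# Generated stream snapped into C major:
print(snap_to_scale([60, 61, 63, 66, 68, 70], tonic=0))
```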

Motivations and alternate implementations

The goal is to interbreed musical snippets from previous compositions, so that their note structures warp each other and generate new material. Newly generated compositions are then fed back in as inputs for further music generation, evolving a library of dynamic music patterns and pieces that exist within an actual musical genealogy.

image

image

Alternate UIs have also been implemented to control these algorithms; one interface uses RGB values in images and videos. Jitter output is shown here controlling musical input weights based on the changing RGB values under the cursor.
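
A minimal sketch of that mapping, assuming one color channel per music input, might look like this (the channel-to-input assignment is hypothetical):

```python
def weights_from_rgb(r, g, b):
    """Map the RGB value under the cursor (0-255 per channel) to three
    normalized input weights, one per music input. The channel-to-input
    assignment is a hypothetical example, not Pilot's actual mapping."""
    total = r + g + b
    if total == 0:
        return [1 / 3, 1 / 3, 1 / 3]   # neutral blend on pure black
    return [r / total, g / total, b / total]

# A reddish pixel biases the blend toward the first music input:
print(weights_from_rgb(200, 40, 20))
```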

Multi-user domains provide settings where collective musical influences and actions come into play, such as games and installations. The images below show an OpenCobalt (Squeak/Croquet) environment that uses OSC to communicate with the Pilot Max-for-Live devices.

Please see here for additional future directions.

image

image

External application

This project is a further stage of the work previously described here as the Replayer project. In order to provide data visualization, note editing, and faster performance, the C-based functionality has been moved from a Max external to a separate macOS application, which uses OSC to communicate with Max for Live.

image

Plans and availability

Current plans are to share this project with potential collaborators/partners in the areas of composition, theory, software, distribution, and/or productization (please contact directly for further information: info@tone23.org). The Mac App Store will be the likely means of any general release.