Complete Tutorial

Live-Code
Your Music.

POMSKI is a Python-based live-coding environment for real-time MIDI composition. Write and rewrite patterns while the music plays — no stopping, no rendering, no waiting.
Windows users: Get the POMSKI setup installer here: Itch.IO
Mac users: You can build from source by cloning the package from here.

Why live-code music?

Most music software asks you to build first, then listen. You place notes on a grid, render an audio file, press play, realise something is wrong, go back to the grid. The feedback loop is slow, and the act of composition is disconnected from the act of listening.

POMSKI inverts this. You write a few lines of Python, press Shift+Enter, and the change takes effect on the very next bar — while everything else keeps playing. The music is always running. You are always inside it.

The live-coding philosophy

Live coding is a practice borrowed from computer science performance art (the Algorave scene, SuperCollider, TidalCycles). Its central insight is that code is notation — a precise, expressive language for describing musical pattern. Unlike a piano roll, code can describe not just a specific sequence of notes but the rules that generate a sequence: mathematical systems, probability, chaos, feedback loops.

The key shift

In a DAW you compose a performance. In POMSKI you perform a composition. The score and the concert are the same event.

What POMSKI adds to this

Who is this for?

You do not need to be a programmer

You need to understand about 8 Python concepts. All of them are explained in the Python Primer section. If you can write a grocery list, you can write a POMSKI pattern.

Setup & first run

Prerequisites

LoopBe internal MIDI (Windows)

If using LoopBe as your virtual MIDI port, its feedback protection will silently mute output if it detects a loop. Check the LoopBe tray icon if MIDI goes silent unexpectedly.

Running the template

Terminal
# From the project directory:
python examples/pomski_template.py

You will see startup messages. Then open your browser to http://localhost:8080.

The web UI layout

Quick test

Copy this into the editor and press Shift+Enter. You should hear a metronome-like kick on MIDI channel 1.

@composition.pattern(channel=0, length=4)
def ch1(p):
    p.note(60, beat=0)

Python for musicians

You need exactly eight Python concepts to use POMSKI. Nothing else is required.

Variables

Store a value and reuse it by name.

root = 60           # middle C
speed = 0.25       # quarter note
scale = "dorian"

Functions (def)

A named block of code that runs when called.

def greet():
    print("hello")

greet()  # runs it

The @ Decorator

Wraps a function with extra behaviour. This is how POMSKI registers patterns.

@composition.pattern(channel=0)
def ch1(p):
    ...  # your pattern here

For Loops

Repeat an action a set number of times.

for i in range(4):
    # runs 4 times: i=0,1,2,3
    p.note(60, beat=i)

Lists

An ordered collection of values. Access items by index (starting at 0).

notes = [60, 63, 67, 70]
notes[0]  # → 60 (first)
notes[2]  # → 67 (third)

Modulo %

Returns the remainder after division. Essential for cycling through lists.

7 % 4  # → 3
8 % 4  # → 0  (wraps back)
# Cycle through a list:
notes[p.cycle % len(notes)]

Conditionals (if)

Run code only when a condition is true.

if p.cycle % 4 == 0:
    # runs every 4 loops
    p.note(72, beat=0)

Enumerate

Loop through a list and get both the index and the value.

pitches = [60, 64, 67]
for i, pitch in enumerate(pitches):
    p.note(pitch, beat=i)

The p object

Inside every pattern function, p is your pattern builder. It is passed in automatically — you never create it yourself. Every method you call on it (p.note(), p.euclidean(), etc.) adds events to the current loop cycle of that pattern.

p has a few useful read-only attributes:

p.cycle (int)
    How many times this pattern has looped since it was defined. Starts at 0. Use it for evolving patterns over time.
p.bar (int)
    The global bar number at the time this pattern fired.
p.rng (Random)
    A seeded random number generator. Use p.rng.random(), p.rng.choice(list), p.rng.randint(a, b). Reproducible — same seed every time the pattern restarts.
p.section (SectionInfo)
    Current form section (name, bar within section, total bars). None if no form is defined.
p.data (dict)
    Direct reference to composition.data. Use it to read Live values, signals, or cross-pattern state.

p.cycle is an integer, not a function

p.cycle is a number. Do not write p.cycle([60, 63]) — that will throw a TypeError. To cycle through a list, write my_list[p.cycle % len(my_list)].
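The wrapping-index idiom in plain Python, so you can see exactly what the modulo is doing:

```python
notes = [60, 63, 67]

# cycle % len(notes) wraps the index back to 0 at the end of the list
picked = [notes[cycle % len(notes)] for cycle in range(5)]
print(picked)  # → [60, 63, 67, 60, 63]
```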

MIDI pitch numbers

POMSKI uses raw MIDI pitch numbers (0–127). Middle C is 60. Each semitone is +1 or −1. An octave is 12 semitones. Common reference points:

Note            MIDI#    Note            MIDI#    Note    MIDI#
C3              48       C4 (middle C)   60       C5      72
D3              50       D4              62       G4      67
E3              52       E4              64       A4      69
F3              53       F4              65       Bb4     70
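If you prefer computing pitch numbers to memorising the chart, a small helper does it. This is plain Python for reference, not part of POMSKI, and it uses the same C4 = 60 convention as the table:

```python
NOTE_OFFSETS = {"C": 0, "C#": 1, "Db": 1, "D": 2, "Eb": 3, "E": 4, "F": 5,
                "F#": 6, "G": 7, "Ab": 8, "A": 9, "Bb": 10, "B": 11}

def note_to_midi(name, octave):
    """MIDI number for a note name plus octave, with C4 = 60 (middle C)."""
    return 12 * (octave + 1) + NOTE_OFFSETS[name]

print(note_to_midi("C", 4))   # → 60 (middle C)
print(note_to_midi("Bb", 4))  # → 70
```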

How POMSKI works

The composition object

Everything lives inside a composition. It holds the clock, the key, the harmony, and all the running patterns. You start it at the bottom of your script with composition.play(), which blocks forever and runs the event loop.

composition = subsequence.Composition(key="C", bpm=120)
composition.harmony(style="functional_major", cycle_beats=4)
composition.web_ui()    # start the browser dashboard
composition.live()      # start the REPL server on port 5555
composition.play()      # always last — starts the clock

Pattern slots

The template defines 16 slots — ch1 through ch16 — one per MIDI channel. Each slot is a silent placeholder. To make a slot play, you redefine it by name from the browser editor. The scheduler detects the same function name and swaps the new pattern in at the next bar boundary without interrupting playback.

Channels

POMSKI channels are 0-indexed. MIDI channel 1 = channel=0, channel 10 (drums) = channel=9. This is a common source of confusion — the template assigns slots in order, so ch1 is channel=0, ch10 is channel=9.

Auto-assign

If you use a new function name (one not already in the 16 slots), POMSKI automatically steals the first empty slot and puts your pattern there. You can use any name you like.

Pattern length

length is in beats. length=4 is one 4/4 bar. length=8 is two bars. length=0.5 is a half-bar (2 beats). Patterns of different lengths run simultaneously and loop independently — a 3-beat and a 4-beat pattern will phase against each other like a polyrhythm.
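You can work out when two phasing patterns realign with a least-common-multiple calculation. This is plain Python arithmetic, not a POMSKI API:

```python
import math

# A 3-beat pattern looping against a 4-beat pattern:
# their downbeats coincide again every lcm(3, 4) beats
a, b = 3, 4
realign = math.lcm(a, b)
print(realign)  # → 12 beats, i.e. 3 bars of 4/4
```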

The hot-swap mechanism

When you send a redefined pattern from the browser, the live server executes the decorator. If a pattern with that name is already running, POMSKI replaces its builder function on the existing slot at the next bar boundary. The channel, MIDI routing, and sync all remain intact. Nothing restarts. Nothing skips.

Your first pattern

A single note

The simplest possible pattern. One note on beat 0, looping every 4 beats.

@composition.pattern(channel=0, length=4)
def ch1(p):
    p.note(60, beat=0)

pat — shorter decorator

pat is pre-loaded as an alias for composition.pattern. The channel is the first positional argument and length defaults to 4, so @pat(0) is equivalent to @composition.pattern(channel=0, length=4).

@pat(0)          # channel 0, length 4
def ch1(p):
    p.note(60, beat=0)

@pat(0, 8)       # channel 0, length 8 beats
def ch1(p):
    p.note(60, beat=0)

Adding notes

Beat positions are in beats from the start of the pattern. Beat 0 = the downbeat. Beat 0.5 = the "and" of beat 1. Beat 1 = beat 2 of the bar.

@composition.pattern(channel=0, length=4)
def ch1(p):
    p.note(60, beat=0,   velocity=100, duration=0.4)
    p.note(64, beat=1,   velocity=80,  duration=0.4)
    p.note(67, beat=2,   velocity=80,  duration=0.4)
    p.note(60, beat=3,   velocity=60,  duration=0.4)

A drum pattern

Drums go on channel=9 (MIDI channel 10). Pass drum_note_map=gm_drums.GM_DRUM_MAP to refer to drums by name instead of number.

@composition.pattern(channel=9, length=4, drum_note_map=gm_drums.GM_DRUM_MAP)
def ch10(p):
    # hit_steps places hits at 16th-note grid positions (0–15)
    p.hit_steps("kick_1",       [0, 3, 8, 12], velocity=110)
    p.hit_steps("snare_1",      [4, 12],       velocity=100)
    p.hit_steps("hi_hat_closed", range(16),     velocity=70)
    p.velocity_shape(low=55, high=90)  # humanise velocity

The seq() shorthand

Write a melody in Sonic Pi style — space-separated pitch numbers with _ for rests.

@composition.pattern(channel=1, length=4)
def ch2(p):
    p.seq("60 _ 63 _ 65 67 _ 63", velocity=80)
    # plays: C4, rest, Eb4, rest, F4, G4, rest, Eb4
    # each token is one eighth note (length/8 = 0.5 beats)

Clearing a pattern

To silence a slot without stopping playback, click the × button on that slot in the Patterns tab, or send a blank definition:

@composition.pattern(channel=0, length=4)
def ch1(p):
    pass  # empty body = silence

The p. method reference

Placing notes

p.note(pitch, beat, velocity, duration)
    Place a single note. pitch: MIDI 0–127. beat: position in beats from start of pattern. velocity: 0–127 (default 100). duration: length in beats (default 0.5).
p.hit_steps(pitch, steps, velocity)
    Place hits at 16th-note grid positions. steps is a list of indices 0–15. With drum_note_map, pitch can be a drum name string like "kick_1".
p.sequence(steps, pitches, velocities, durations)
    Pair a list of grid positions with a list of pitches. Positions are 16th-note indices; the pitch list cycles if shorter than the steps list. Parameters are velocities and durations (plural) — pass a single int/float or a list.
p.seq(notation, velocity)
    Sonic Pi style: space-separated numbers, _ for rests. All tokens are equal duration (pattern length / token count). Example: "60 _ 62 64". Note: p.seq() has no duration parameter — note length is set automatically.

Rhythmic tools

p.euclidean(pitch, pulses, velocity, duration)
    Distribute pulses hits as evenly as possible across the pattern's grid (Björklund algorithm). The grid size is derived automatically from the pattern's length. Classic for polyrhythmic layering.
p.bresenham(pitch, pulses, velocity, duration)
    Alternate even distribution using the Bresenham line algorithm — creates slightly different accent patterns than Euclidean. Grid size is derived automatically from pattern length.
p.hit_steps() with range()
    For quick straight 8ths or 16ths: range(0, 16, 2) = every other step (8th notes). range(16) = all 16 steps.

Modifiers — call these after placing notes

p.randomize(timing, velocity)
    Add human-feel micro-timing and velocity variation. timing=0.02 means ±2% of a beat. velocity=0.05 means ±5% velocity.
p.dropout(probability)
    Randomly remove notes. p.dropout(0.2) = 20% chance each note is silenced on this cycle. Adds natural variation without changing the underlying pattern.
p.quantize(key, mode)
    Snap all notes to the nearest pitch in a scale. Key is a note name ("C", "F#", "Bb"). Mode is a scale name ("ionian", "dorian", "minor", "harmonic_minor", etc.).
p.quantize_m21(key, scale_name)
    Like p.quantize() but unlocks Music21's full scale library. Use for ragas, octatonic, whole-tone, and other exotic scales. Requires pip install music21.
p.transpose(semitones)
    Shift all notes up or down by a fixed number of semitones. Combine with p.cycle for automatic transposition.
p.shift(steps)
    Move all notes forward or backward in time by steps 16th-note positions. Useful for polyrhythmic offset against other patterns.
p.reverse()
    Flip the pattern backwards in time.
p.velocity_shape(low, high)
    Re-scale all velocities to fit within a low–high range, preserving relative dynamics. Great for humanising step-sequenced drum patterns.
p.thin(pitch, strategy, amount)
    Remove notes for a specific pitch or drum name based on rhythmic position. amount=0.5 removes roughly half; amount=1.0 removes all qualifying notes. Strategies: "strength" (default — weakest positions drop first), "sixteenths", "offbeat", "e_and_a", "downbeat", "upbeat", "uniform". Unlike p.dropout(), targets one instrument and respects rhythmic position.
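To make the "preserving relative dynamics" idea concrete, here is a plain-Python sketch of what a rescaling modifier like p.velocity_shape() does conceptually. It is not POMSKI's actual implementation, just the arithmetic:

```python
def velocity_shape(velocities, low, high):
    """Rescale a list of velocities into [low, high], preserving relative dynamics."""
    lo, hi = min(velocities), max(velocities)
    if hi == lo:                    # flat input: park everything at the midpoint
        return [(low + high) // 2] * len(velocities)
    return [int(low + (v - lo) / (hi - lo) * (high - low)) for v in velocities]

# The loudest hit maps to `high`, the softest to `low`, the rest in between
print(velocity_shape([40, 70, 100], low=55, high=90))  # → [55, 72, 90]
```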

Loading from files

p.from_midi(filepath, track, channel, pitch_offset, velocity, duration)
    Read notes from a .mid file and place them in the pattern. Timing is normalised to beat 0 and wrapped to pattern length. track is the zero-based track index. pitch_offset transposes all loaded notes.

Building a track from scratch

Here is a complete step-by-step process for building a live track in POMSKI, starting from silence. Each step is sent as a separate block from the editor — the previous patterns keep playing while you add new ones.

Step 1: Lay the drums

Start with rhythm. Keep it simple — kick and hi-hat first. Add the snare in a moment.

@composition.pattern(channel=9, length=4, drum_note_map=gm_drums.GM_DRUM_MAP)
def ch10(p):
    p.hit_steps("kick_1",       [0, 8],       velocity=110)
    p.hit_steps("hi_hat_closed", range(16),   velocity=65)
    p.velocity_shape(low=50, high=80)

Step 2: Add a bass line

A repeating root note with a rhythmic pattern. Low octave, short duration for punch.

@composition.pattern(channel=1, length=4)
def ch2(p):
    p.hit_steps(36, [0, 3, 8, 11, 14], velocity=100)  # C2

Step 3: Layer a melody

A simple ascending figure using p.seq(). The _ rests give it breathing room.

@composition.pattern(channel=0, length=4)
def ch1(p):
    p.seq("60 _ 63 65 _ 67 _ 65", velocity=80)
    p.randomize(timing=0.01, velocity=0.04)

Step 4: Evolve the drums

Redefine ch10 with more elements. The existing kick and hi-hat keep playing until the bar ends, then this replaces them.

@composition.pattern(channel=9, length=4, drum_note_map=gm_drums.GM_DRUM_MAP)
def ch10(p):
    p.hit_steps("kick_1",       [0, 3, 8, 12], velocity=110)
    p.hit_steps("snare_1",      [4, 12],       velocity=100)
    p.hit_steps("hi_hat_closed", range(16),   velocity=70)
    p.velocity_shape(low=55, high=90)

Step 5: Add variation with p.cycle

Make the melody change every 4 loops — play one phrase for 4 bars, then another.

@composition.pattern(channel=0, length=4)
def ch1(p):
    phrases = [
        "60 _ 63 65 _ 67 _ 65",
        "67 _ 65 63 _ 60 _ 63",
        "65 67 _ 70 _ 67 65 _",
    ]
    phrase = phrases[(p.cycle // 4) % len(phrases)]
    p.seq(phrase, velocity=80)
    p.randomize(timing=0.01)

Step 6: Mute and unmute for arrangement

Use the Patterns tab mute buttons, or type these into the Quick Command box at the bottom of the log panel:

composition.mute("ch10")    # silence drums
composition.unmute("ch10")  # bring them back

Making patterns evolve over time

Using p.cycle for structure

p.cycle increments every time the pattern loops. Use integer division (//) to create larger structural boundaries — every N loops, something changes.

@composition.pattern(channel=0, length=4)
def ch1(p):
    # Changes key every 8 loops (every 8 bars)
    keys = ["C", "F", "G", "A"]  # p.quantize expects note names like "C", "F#", "Bb"
    key = keys[(p.cycle // 8) % len(keys)]

    # Every other loop, add an extra note
    if p.cycle % 2 == 0:
        p.note(72, beat=3.5)

    p.seq("60 _ 63 65", velocity=80)
    p.quantize(key, "dorian")

Randomness with p.rng

p.rng is a seeded random number generator local to your pattern. Use it instead of Python's random module — it ensures reproducible behaviour if you restart a pattern with the same name.

@composition.pattern(channel=0, length=4)
def ch1(p):
    scale = [60, 62, 63, 65, 67, 70, 72]
    for i in range(8):
        if p.rng.random() > 0.3:  # 70% chance of playing
            pitch = p.rng.choice(scale)
            p.note(pitch, beat=i * 0.5)

Dropout for sparse texture

Apply p.dropout() after placing notes for probabilistic thinning — the underlying pattern stays intact but notes randomly vanish each cycle, keeping things interesting.

@composition.pattern(channel=0, length=4)
def ch1(p):
    p.seq("60 62 63 65 67 65 63 62", velocity=80)
    p.dropout(0.25)  # 25% of notes vanish each cycle

Signals and modulators

POMSKI has a signal system for continuously varying values — LFOs, envelopes, Perlin noise curves. Signals appear as scrolling graphs in the Signals tab of the UI and can drive any numeric parameter inside a pattern.

LFOs via the conductor

Create a named LFO with composition.conductor.lfo(), then read its current value inside any pattern with p.signal(name). The value (0.0–1.0 by default) can drive velocity, pitch offset, or any other numeric parameter.

# Create an LFO (send this once from the editor or add it to the template)
# cycle_beats = how many beats for one full sweep (16 = 4 bars at 4/4)
composition.conductor.lfo("vol_lfo", cycle_beats=16, min_val=0.2, max_val=1.0)
# Read it inside a pattern with p.signal()
@composition.pattern(channel=0, length=4)
def ch1(p):
    vol = p.signal("vol_lfo")          # returns 0.2–1.0, sweeping over 16 beats
    for i in range(8):
        p.note(60, beat=i * 0.5, velocity=int(vol * 100))

Perlin noise for organic modulation

perlin_1d generates smooth, organic noise — much more musical than pure random. Import it and write values to composition.data to visualise them.

from subsequence.sequence_utils import perlin_1d

@composition.pattern(channel=0, length=4)
def ch1(p):
    # perlin_1d(x, seed) → float 0.0–1.0
    # Multiplying p.cycle by a small number controls how fast it moves
    noise = perlin_1d(p.cycle * 0.07, seed=42)

    # Write to composition.data — shows up in Signals tab
    composition.data["pitch_wander"] = noise

    # Map 0–1 noise to a pitch range
    base_pitch = 48 + int(noise * 24)  # C3 to C5
    for i in range(8):
        p.note(base_pitch + i, beat=i * 0.5)
    p.quantize("C", "dorian")

Cross-pattern communication with p.data

p.data is the same dictionary as composition.data. One pattern can write a value, another can read it — enabling the kind of interdependence that makes a generative system feel alive.

from subsequence.sequence_utils import perlin_1d

# Pattern 1 writes its density to data
@composition.pattern(channel=0, length=4)
def ch1(p):
    density = perlin_1d(p.cycle * 0.05, seed=1)
    p.data["density"] = density
    p.euclidean(60, pulses=int(2 + density * 10))

# Pattern 2 reads that density to affect its own behaviour
@composition.pattern(channel=1, length=4)
def ch2(p):
    density = p.data.get("density", 0.5)
    p.seq("48 _ 51 _ 53 _ 55 _", velocity=80)
    p.dropout(1.0 - density)  # sparser when ch1 is dense

Clearing signals

Click the ✕ button next to any signal in the Signals tab to remove it from the display and the data dictionary.

Algorithmic composition

These methods generate entire note sequences from mathematical systems. They are all available directly as p. methods — no imports needed.

Euclidean rhythms

The Euclidean algorithm distributes N hits across M steps as evenly as possible. This produces the rhythmic patterns found in West African drumming, Middle Eastern music, and countless other traditions — because maximum evenness is also maximum musicality.

# 5 evenly-spaced hits across the pattern — the "cinquillo" rhythm
# (steps are derived automatically from pattern length)
p.euclidean(60, pulses=5)

# Stack multiple voices for polyrhythm
p.euclidean(36, pulses=3)  # kick
p.euclidean(40, pulses=5)  # snare
p.euclidean(42, pulses=7)  # hi-hat

# Use p.shift() to offset a voice's start point for variation
p.euclidean(60, pulses=5)
p.shift(3)  # shift all notes forward by 3 grid steps
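One simple way to compute an even distribution in plain Python is the modular trick below. It produces Bresenham-style maximally even patterns; POMSKI's internal Björklund implementation may place the same number of hits at slightly different offsets:

```python
def even_hits(pulses, steps=16):
    """Spread `pulses` hits across `steps` grid positions as evenly as possible."""
    # A hit falls wherever the running product (i * pulses) crosses a multiple of steps
    return [i for i in range(steps) if (i * pulses) % steps < pulses]

print(even_hits(5))  # → [0, 4, 7, 10, 13]
print(even_hits(3))  # → [0, 6, 11]
```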

Chaos: Lorenz attractor

The Lorenz attractor is a set of differential equations that produces chaotic (never exactly repeating) but aesthetically coherent trajectories. POMSKI maps its x-axis values to pitch.

@composition.pattern(channel=0, length=4)
def ch1(p):
    p.lorenz(
        steps=16,
        pitch_range=(48, 72),  # C3 to C5
        velocity=80,
        s=10.0, r=28.0, b=2.667  # classic parameters
    )
    p.quantize("C", "minor")

Brownian motion (random walk)

Each note is a small random step away from the previous — like a melody that wanders without leaping wildly. Highly musical because it respects the statistical shape of real melodies.

@composition.pattern(channel=0, length=4)
def ch1(p):
    p.brownian(
        start=60,          # starting pitch
        steps=16,
        step_size=2,       # max semitones per step
        pitch_range=(48, 72)
    )
    p.quantize("D", "dorian")
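The underlying random walk is easy to sketch in plain Python. This mirrors the parameters above and uses a seeded generator, as p.rng does, so it is reproducible; it is an illustration of the technique, not POMSKI's source:

```python
import random

def brownian_pitches(start=60, steps=16, step_size=2, pitch_range=(48, 72), seed=42):
    """A clamped random walk: each pitch is a small step from the previous one."""
    rng = random.Random(seed)
    lo, hi = pitch_range
    pitch, out = start, []
    for _ in range(steps):
        pitch += rng.randint(-step_size, step_size)  # small random step
        pitch = max(lo, min(hi, pitch))              # clamp to the allowed range
        out.append(pitch)
    return out

print(brownian_pitches())
```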

Conway's Game of Life

Conway's Game of Life is a 2D cellular automaton — cells live or die by neighbour rules. POMSKI runs N generations and reads one row: live cells trigger note hits. The result is rhythm that feels like it has internal logic — because it does.

@composition.pattern(channel=0, length=4)
def ch1(p):
    p.game_of_life(
        pitch=60,
        cols=16, rows=4,
        generations=p.cycle % 32 + 1,  # advance each loop
        row=0
    )
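The automaton itself is only a few lines of plain Python. Here is one generation step, a sketch of the standard rules rather than POMSKI's implementation:

```python
def life_step(cells, cols, rows):
    """One Game of Life generation. `cells` is a set of live (col, row) coordinates."""
    def neighbours(c, r):
        return sum((c + dc, r + dr) in cells
                   for dc in (-1, 0, 1) for dr in (-1, 0, 1)
                   if (dc, dr) != (0, 0))
    new = set()
    for c in range(cols):
        for r in range(rows):
            n = neighbours(c, r)
            # survival with 2 or 3 neighbours, birth with exactly 3
            if ((c, r) in cells and n in (2, 3)) or ((c, r) not in cells and n == 3):
                new.add((c, r))
    return new

# A "blinker" oscillates between a vertical and a horizontal bar
blinker = {(1, 0), (1, 1), (1, 2)}
print(life_step(blinker, cols=3, rows=3))  # → {(0, 1), (1, 1), (2, 1)}
```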

Logistic map (chaos theory)

The logistic map (x → r·x·(1−x)) is one of the simplest equations that produces chaos. At r > 3.57 it becomes chaotic — pitch and velocity diverge unpredictably from tiny differences in starting conditions.

p.logistic(steps=16, r=3.9, pitch_range=(48, 72))
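You can see the sensitivity directly in plain Python: two trajectories that start almost identically drift apart until they are no longer related. The iteration below is the same x → r·x·(1−x) map:

```python
def logistic_series(x0, r=3.9, steps=40):
    """Iterate the logistic map from x0, returning the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_series(0.500000)
b = logistic_series(0.500001)  # a one-in-a-million difference at the start
print(abs(a[5] - b[5]))    # still microscopic
print(abs(a[40] - b[40]))  # no longer predictable from the starting gap
```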

Gray-Scott reaction-diffusion

The Gray-Scott equations model chemical reactions creating Turing-like patterns — spots and stripes that emerge from uniform initial conditions. POMSKI maps the intensity field to note velocities, creating complex dynamic textures on a fixed pitch.

p.gray_scott(pitch=60, n=16, f=0.055, k=0.062)

Golden ratio

Distribute notes at irrational intervals based on φ (1.618…). Because φ is the "most irrational" number, the resulting rhythm never exactly repeats — but it fills the bar with uncanny balance.

p.golden_ratio(pitch=67, count=8, velocity=75)
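A plain-Python sketch of the idea: successive multiples of φ, taken modulo 1, scatter across the bar without ever landing on the same spot twice. POMSKI's exact placement rule may differ, but the arithmetic is this:

```python
PHI = (1 + 5 ** 0.5) / 2  # ≈ 1.618

def golden_beats(count=8, length=4.0):
    """Beat positions at successive multiples of phi, wrapped into the pattern length."""
    return sorted(round((i * PHI) % 1.0 * length, 3) for i in range(count))

print(golden_beats())
```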

Markov chains

Define weighted transitions between named states — each state "decides" what comes next based on relative weights. You provide two things: a transitions dict mapping state names to a list of (next_state, weight) pairs, and a pitch_map dict mapping state names to MIDI note numbers. Produces melodies that feel stylistically consistent without being repetitive.

@composition.pattern(channel=0, length=4)
def ch1(p):
    # transitions: each state maps to a list of (next_state, weight) pairs
    # pitch_map: maps each state name to a MIDI note number
    p.markov(
        transitions={
            "root": [("3rd", 4), ("5th", 2), ("root", 1)],
            "3rd":  [("5th", 3), ("root", 2), ("7th", 1)],
            "5th":  [("root", 3), ("3rd", 2)],
            "7th":  [("root", 4), ("5th", 2)],
        },
        pitch_map={"root": 60, "3rd": 63, "5th": 67, "7th": 70},
        velocity=80,
        step=0.25,  # 16th note spacing
    )
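Under the stated structure (a transitions dict of weighted next-state pairs, a pitch_map of state names to notes), the walk itself is a few lines of plain Python. This sketch shows the mechanism, not POMSKI's internals:

```python
import random

def markov_walk(transitions, pitch_map, start="root", steps=16, seed=1):
    """Follow weighted transitions from `start`, emitting one pitch per step."""
    rng = random.Random(seed)
    state, pitches = start, []
    for _ in range(steps):
        states  = [s for s, _ in transitions[state]]
        weights = [w for _, w in transitions[state]]
        state = rng.choices(states, weights=weights)[0]  # weighted random pick
        pitches.append(pitch_map[state])
    return pitches

transitions = {
    "root": [("3rd", 4), ("5th", 2), ("root", 1)],
    "3rd":  [("5th", 3), ("root", 2), ("7th", 1)],
    "5th":  [("root", 3), ("3rd", 2)],
    "7th":  [("root", 4), ("5th", 2)],
}
pitch_map = {"root": 60, "3rd": 63, "5th": 67, "7th": 70}
print(markov_walk(transitions, pitch_map))
```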

Exotic scales with Music21

p.quantize_m21() gives access to hundreds of scale types. The scale snapping logic is identical to p.quantize() — the only difference is the source of pitch classes. Requires pip install music21.

# Indian raga
p.brownian(start=60, steps=16)
p.quantize_m21("D", "RagAsawari")

# Octatonic (diminished) scale
p.lorenz(steps=16, pitch_range=(48, 72))
p.quantize_m21("C", "OctatonicScale")

# Whole-tone
p.brownian(start=60, steps=16)
p.quantize_m21("C", "WholeToneScale")

Some scale names to try

HarmonicMinorScale
MelodicMinorScale
OctatonicScale
WholeToneScale
RagAsawari
RagMarwa
PhrygianScale
LydianScale
MixolydianScale
LocrianScale
DorianScale
AugmentedScale

Connecting to Ableton Live

POMSKI includes a bridge to Ableton Live via the AbletonOSC remote script. Once connected, you can fire clips, read track volumes, and automate parameters — all from patterns and the REPL.

Setup

  1. Install AbletonOSC as a Control Surface in Live Preferences → Link/Tempo/MIDI → Control Surface
  2. Ensure Live is running before starting the POMSKI template
  3. The bridge connects automatically at startup. Check the log panel — it will show AbletonOSC connected

Commands (use in REPL or patterns)

live.clip_play(track, clip)
    Fire a clip. Both indices are 0-based.
live.scene_play(scene)
    Fire a scene (launches all clips in that row).
live.track_volume(track, value)
    Set track volume. value is 0.0–1.0.
live.track_mute(track, True/False)
    Mute or unmute a track.
live.device_param(track, device, param, value)
    Set a device parameter (0.0–1.0).
live.set_tempo(bpm)
    Change Live's tempo.
live.watch("track/0/volume")
    Subscribe to a Live property. Its value is pushed to composition.data["live_track_0_volume"] continuously.
live.tracks
    List of track names in the current Live set.
live.connected
    True if the Live bridge is active.

Reacting to Live state inside patterns

# Watch a Live parameter (run once from REPL)
live.watch("track/0/volume")

# Now use it inside any pattern
@composition.pattern(channel=0, length=4)
def ch1(p):
    vol = p.data.get("live_track_0_volume", 0.8)
    vel = int(vol * 127)
    p.seq("60 _ 63 65 _ 67", velocity=vel)

Performance tips

Mental model for a set

Think of a live POMSKI set like a DJ set, but instead of swapping tracks you are rewriting the rules of the music in real time. A practical workflow:

  1. Establish rhythm first — drums and bass anchor the audience while you build
  2. Layer melody on top — add harmonic content once the groove is locked
  3. Mutate, don't replace — small edits to running patterns feel more live than total rewrites
  4. Use mute/unmute for energy dynamics instead of always rewriting
  5. Let generative patterns run — euclidean and lorenz patterns evolve on their own; step back and let them breathe

Keyboard shortcuts in the editor

Shift+Enter
    Send the current code block (detected by double newlines above the cursor)
Ctrl+↑ / Ctrl+↓
    Navigate command history (last 200 sent blocks)

Building in variation without thinking

# Drop out notes probabilistically — density controlled by Perlin noise
@composition.pattern(channel=0, length=4)
def ch1(p):
    from subsequence.sequence_utils import perlin_1d
    sparsity = perlin_1d(p.cycle * 0.04, seed=7)
    p.data["sparsity"] = sparsity

    p.seq("60 63 65 67 65 63 67 70", velocity=80)
    p.dropout(sparsity * 0.6)   # max 60% dropout
    p.quantize("C", "dorian")

Transposing by section

# Modulate key every 16 bars using p.bar
@composition.pattern(channel=0, length=4)
def ch1(p):
    modulation = (p.bar // 16) % 3  # 0, 1, 2 cycling
    offset = [0, 5, 7][modulation]   # root, fourth, fifth
    p.seq("60 63 65 67", velocity=85)
    p.transpose(offset)

Long-form patterns

A length=128 pattern plays for 32 bars before repeating. Inside it, p.cycle increments once every 32 bars, so each increment of p.cycle marks a new 32-bar section. Use very long patterns for slow, geological-timescale evolution.

@composition.pattern(channel=0, length=128)
def ch1(p):
    # Play 32 notes spread over 32 bars (one note per bar)
    for i in range(32):
        pitch = 48 + i % 12
        p.note(pitch, beat=i * 4, duration=3.5)
    p.quantize("C", "minor")

Buffer tabs — keeping drafts ready

The editor supports multiple named buffer tabs. Add a tab with the + button, double-click a tab label to rename it, and click × to remove one. All buffer contents and names persist across browser reloads and restarts — your code is always there when you reopen the UI.

A practical use: keep a different song section in each buffer — intro in tab 1, chorus in tab 2, bridge in tab 3. Switch between them with a click and send any block with Shift+Enter.

Error messages in the log

If a pattern raises an exception — a typo, a missing import, a bad argument — the full traceback appears as a red error message in the log panel. The pattern continues cycling silently (playing nothing) until you fix and re-send it. You do not need to watch the terminal.

Performance mindset

The best live coding performances have silences, mistakes, and recovery. An error in the log panel is not embarrassing — it is evidence that something real is happening. Leave the log visible on screen if you are performing for an audience.

Pulling live data into patterns

POMSKI includes a built-in feeds object that lets you poll any HTTP API at runtime and pipe the results directly into composition.data, where patterns can read them on every cycle. No template editing required — just type into the command box and the feed starts immediately.

How it works

Each feed is an asyncio task running on POMSKI's event loop. It fetches a URL every N seconds, applies your extract function to the JSON response, and writes the result to composition.data["feed_<key>"]. Patterns are rebuilt every cycle, so they always see the latest value with no extra wiring.

Starting a feed

Call feeds.add() from the command box, the main editor, or a REPL client. The feed starts polling immediately and persists until you stop it or restart POMSKI.

# Poll the ISS position API every 5 seconds
feeds.add(
    "iss",
    "http://api.open-notify.org/iss-now.json",
    interval=5,
    extract=lambda r: float(r["iss_position"]["latitude"])
)

The value lands at composition.data["feed_iss"] after the first fetch. Until then, .get("feed_iss", default) returns your default — always provide one so patterns play something sensible before data arrives.

Reading a feed inside a pattern

@composition.pattern(channel=0, length=4)
def ch1(p):
    # Latitude −90 to +90 → MIDI pitch 48–84
    lat = composition.data.get("feed_iss", 0)
    pitch = int(48 + (float(lat) + 90) / 180 * 36)
    p.note(pitch, beat=0)

Feeds API

feeds.add(key, url, interval, extract, headers, method, body)
    Start (or restart) a named polling feed. extract is a callable that receives the parsed JSON and returns the value to store. Omit it to store the full response dict.
feeds.stop("key")
    Cancel and remove a feed by name.
feeds.stop_all()
    Cancel every running feed.
feeds
    Print active feeds and their current values (type this bare in the REPL).
composition.data.get("feed_key", default)
    Read the latest value inside a pattern. Always supply a default.

Mapping data to music

Raw API numbers rarely map neatly to MIDI ranges. A simple linear rescale covers most cases:

def scale(val, in_lo, in_hi, out_lo, out_hi):
    return out_lo + (val - in_lo) / (in_hi - in_lo) * (out_hi - out_lo)

@composition.pattern(channel=0, length=4)
def ch1(p):
    # A stock price (say 100–300) mapped to pitch and velocity
    price = composition.data.get("feed_stock", 200)
    pitch    = int(max(36, min(84, scale(price, 100, 300, 36, 84))))
    velocity = int(max(40, min(120, scale(price, 100, 300, 40, 120))))
    p.seq(f"{pitch} _ {pitch} _", velocity=velocity)

Feeds with authentication and POST

# GET with an Authorization header
feeds.add(
    "gh_stars",
    "https://api.github.com/repos/python/cpython",
    interval=60,
    headers={"Accept": "application/vnd.github+json"},
    extract=lambda r: r["stargazers_count"]
)

# POST with a JSON body
feeds.add(
    "sensor",
    "https://my-api.example.com/query",
    interval=2,
    method="POST",
    body={"sensor_id": "A1", "field": "temperature"},
    extract=lambda r: r["value"]
)

Switching data sources on-the-fly

Calling feeds.add() with the same key replaces the existing feed — the old polling task is cancelled and a new one starts immediately. This lets you swap data sources mid-performance without touching any pattern code:

# First set: ISS latitude
feeds.add("signal", "http://api.open-notify.org/iss-now.json",
         extract=lambda r: float(r["iss_position"]["latitude"]))

# Later in the set: swap to a different source, same key
feeds.add("signal", "https://api.open-meteo.com/v1/forecast?latitude=51.5&longitude=-0.1&current=temperature_2m",
         extract=lambda r: r["current"]["temperature_2m"])

# Pattern code never changes — it always reads "feed_signal"

No extra dependencies

DataFeeds uses Python's built-in urllib — no pip install required. Any public JSON API reachable from your machine will work.

music21 integration

POMSKI bundles music21 — MIT's computational music library — and exposes it through a single pattern method: p.quantize_m21(key, scale_name). Call it after placing notes to snap every pitch to the nearest degree of any music21 scale class.

quantize vs quantize_m21

The built-in p.quantize(key, mode) covers the nine Western church modes and nothing else. p.quantize_m21 extends this to dozens of scales — ragas, whole-tone, octatonic, Xenakis sieve, and more — using music21's complete pitch theory engine under the hood.

Basic usage

@composition.pattern(channel=0, length=4)
def ch1(p):
    # Place notes first, then quantise
    for i in range(16):
        p.note(60 + i, beat=i * 0.25)
    p.quantize_m21("A", "MelodicMinorScale")

quantize_m21 is a transformation — like transpose() or dropout(), it modifies whatever notes are already in the pattern. Place your notes first, then call it at the end of the builder function.
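Nearest-degree snapping itself is simple arithmetic. The sketch below shows the idea in pure Python, with no music21 dependency: the pitch classes of A melodic minor (ascending form) are hardcoded for illustration, and `snap` is a hypothetical helper, not a POMSKI method.

```python
# Pitch classes of A melodic minor, ascending: A B C D E F# G#
MEL_MINOR_A = {9, 11, 0, 2, 4, 6, 8}

def snap(midi_note, allowed=MEL_MINOR_A):
    # Walk outward from the note, preferring the lower candidate on ties,
    # until we land on a pitch class that belongs to the scale.
    for delta in range(12):
        for candidate in (midi_note - delta, midi_note + delta):
            if candidate % 12 in allowed:
                return candidate
    return midi_note

snap(60)   # C4 is already in the scale -> 60
snap(61)   # C#4 is not; it snaps down to C4 -> 60
```

music21's scale classes supply the `allowed` set for any scale or raga; the snapping step is the same regardless of which scale you name.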

Available scale classes (scale_name)

Scale class          Character
MajorScale           Standard major (Ionian)
MinorScale           Natural minor (Aeolian)
HarmonicMinorScale   Minor with raised 7th — exotic cadences
MelodicMinorScale    Ascending form — raised 6th & 7th; jazz minor
DorianScale          Minor with raised 6th — modal jazz foundation
PhrygianScale        Minor with flattened 2nd — Spanish / flamenco feel
LydianScale          Major with raised 4th — dreamy, floating quality
MixolydianScale      Major with flattened 7th — rock & blues staple
LocrianScale         Diminished feel — flattened 2nd & 5th
OctatonicScale       8-note diminished scale — alternating whole/half steps
WholeToneScale       6-note scale — all whole steps; Debussy-like ambiguity
RagAsawari           North Indian raga (Asavari) — melancholic
RagMarwa             North Indian raga (Marwa) — tense, unsettled
SieveScale           Xenakis-style interval sieve — microtonal approximation

Scala archive (scala_name)

Pass scala_name="filename.scl" instead of scale_name to access music21's bundled Scala archive — 3,935 tuning files covering mbira, gamelan, Pythagorean, just intonation, historical European temperaments, and hundreds of other systems. scala_name takes precedence over scale_name when both are given.

@pat(0)
def ch1(p):
    p.brownian(start=60, steps=16)
    p.quantize_m21("C", scala_name="mbira_banda.scl")

@pat(1)
def ch2(p):
    p.arpeggio([60, 62, 64, 65, 67], step=0.25)
    p.quantize_m21("C", scala_name="pyth_12.scl")   # Pythagorean 12-tone

Some useful Scala files from the archive:

File                              Tuning system
mbira_banda.scl                   Mbira dza vadzimu — traditional Shona tuning
mbira_gondo.scl, mbira_zimb.scl   Other mbira regional variants
gamelan_udan.scl                  Gamelan pélog/sléndro approximation
pyth_12.scl                       Pythagorean 12-tone tuning
harm8_1.scl                       8th harmonic partial series

Combining with algorithmic generators

quantize_m21 works with any note-placement method — euclidean rhythms, random walks, Lorenz attractors, Markov chains. Generate raw pitches algorithmically, then use the scale to impose musical structure.

# Euclidean rhythm + octatonic scale
@composition.pattern(channel=1, length=4)
def ch2(p):
    p.euclidean(60, pulses=5)
    p.quantize_m21("C", "OctatonicScale")
    p.randomize(timing=0.02)

# Random walk snapped to Phrygian — raw chromatic walk becomes coherent
@composition.pattern(channel=0, length=4)
def ch1(p):
    import random
    note = 60
    for i in range(16):
        p.note(max(36, min(84, note)), beat=i * 0.25)
        note += random.choice([-2, -1, 0, 1, 2])
    p.quantize_m21("E", "PhrygianScale")

# Raga arpeggio — North Indian flavour
@composition.pattern(channel=2, length=4)
def ch3(p):
    p.arpeggio([60, 62, 63, 65, 67, 68, 71], step=0.25)
    p.quantize_m21("C", "RagAsawari")

Microtuning — pitch bend for true cent accuracy

quantize_m21 snaps pitches to the nearest MIDI semitone. For tuning systems that sit between semitones — like most mbira and gamelan scales — the pitches are approximated. p.microtuning() goes further: it snaps the MIDI note and injects a MIDI pitch bend event at each note onset to correct the remaining cent deviation to exact intonation.

@pat(0)
def ch1(p):
    p.brownian(start=60, steps=16)
    p.microtuning("C", "mbira_banda.scl")   # true mbira intonation

The bend_range parameter (default 2.0 semitones) must match the pitch bend range configured on the receiving synth; if the two disagree, every corrected note lands sharp or flat of its target. Most synths default to ±2 semitones. If yours is configured wider (e.g. ±12), pass the matching value so the deviation is scaled correctly:

# Synth set to ±12 semitone bend range:
@pat(1)
def ch2(p):
    p.euclidean(60, pulses=5)
    p.microtuning("C", "gamelan_udan.scl", bend_range=12)

Pitch bend is per-channel

All notes on the same MIDI channel share one pitch wheel. microtuning works best on monophonic lines. For chords, use a separate channel per voice, or use quantize_m21 with scala_name for semitone-level approximation without bend.
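The cent-to-bend arithmetic is worth seeing once. In MIDI, the pitch wheel is a 14-bit value where 8192 means no bend and the full ±8192 swing covers ±bend_range semitones on the synth. `bend_value` below is a hypothetical helper showing that mapping, not a POMSKI function:

```python
def bend_value(cents, bend_range=2.0):
    # 14-bit MIDI pitch bend word for a deviation given in cents.
    # 8192 is centre (no bend); the full +/-8192 swing spans
    # +/-bend_range semitones on the receiving synth.
    raw = 8192 + round(cents / 100.0 / bend_range * 8192)
    return max(0, min(16383, raw))          # clamp to the legal 14-bit range

bend_value(0)        # centred -> 8192
bend_value(50)       # +50 cents at a +/-2 semitone range -> 10240
bend_value(50, 12)   # same deviation encoded against a +/-12 range -> 8533
```

This also shows why the ranges must match: the same deviation produces a very different wheel value depending on bend_range, and a mismatch detunes every corrected note.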

Switching scale mid-performance

Because the scale is evaluated fresh every pattern cycle, you can modulate it from the command box by redefining the pattern — no restart needed.

# Switch from dorian to harmonic minor on the fly
@composition.pattern(channel=0, length=4)
def ch1(p):
    for i in range(16):
        p.note(60 + i, beat=i * 0.25)
    p.quantize_m21("D", "HarmonicMinorScale")  # ← change this line and re-run

music21 is required

music21 is bundled in the POMSKI Windows installer. If you are running from source, install it with pip install music21. If music21 is missing, calling quantize_m21 raises an ImportError that appears as a red error message in the POMSKI log panel.
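If you want to check for music21 before a performance rather than discover the ImportError mid-set, a standard availability probe works. `music21_available` is an illustrative helper you would add to your own script, not part of POMSKI:

```python
import importlib.util

def music21_available():
    # True when music21 can be imported, i.e. p.quantize_m21 will work.
    return importlib.util.find_spec("music21") is not None
```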