Superset
Neural MIDI Motion Engine
Train your own AI sound style. Regenerate expressive multi-channel MIDI motion in seconds.
See Superset in Action
Controlled Chaos. Musical Results.
Superset blends neural motion modeling with practical MIDI constraints. Train on your own validated presets, then regenerate expressive multi-channel CC values and motion — fast, repeatable, and musical.
Features
Turn raw performance data into intentional, repeatable movement.
Train Your Style
Learn from your validated presets to generate consistent multi-channel CC values and motion.
Regenerate Entire Sections
Refresh multiple channels at once while keeping the same CC mapping and musical intent.
Add Intelligent Motion
Apply ramps and looped triangle-LFO motion, with note-on reset for tight, musical phrasing.
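For a concrete picture of that motion, here is a minimal sketch of a looped triangle LFO whose phase resets on every note-on, so each phrase starts from the same point in the curve. The class and parameter names are hypothetical and only illustrate the idea, not Superset's internals.

```typescript
// Illustrative only: a triangle LFO on the 0..127 CC range with note-on phase reset.
// TriangleLFO, rateHz, and depth are hypothetical names, not Superset's API.
class TriangleLFO {
  private phase = 0; // 0..1, wraps each cycle

  constructor(private rateHz: number, private depth = 127) {}

  // Call on every MIDI note-on so the motion re-aligns with the phrase.
  resetOnNoteOn(): void {
    this.phase = 0;
  }

  // Advance by elapsed seconds and return the next CC value (0..127).
  next(deltaSeconds: number): number {
    this.phase = (this.phase + this.rateHz * deltaSeconds) % 1;
    // Triangle wave: rises over the first half cycle, falls back over the second.
    const tri = this.phase < 0.5 ? this.phase * 2 : 2 - this.phase * 2;
    return Math.round(tri * this.depth);
  }
}

// Usage: tick at a steady rate and send the value as a CC message from your MIDI layer.
const lfo = new TriangleLFO(0.5); // one full cycle every 2 seconds
lfo.resetOnNoteOn();
const cc = lfo.next(0.01); // next CC value after 10 ms
```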
Founder Bundles
Curated bundles compatible with the included Ableton demo set.
Techno Motion
Driving, clock-tight motion templates built for club-ready performance grids.
Evolving Pads
Slow, expressive modulation systems tuned for cinematic and ambient arrangements.
Experimental Motion Lab
Unconventional curve stacks designed to provoke happy accidents and new rhythms.
How It Works
- Set up MIDI routing (hardware or DAW via virtual ports like loopMIDI / IAC).
- Randomize → tweak → validate a small batch (typically 10 or more presets) to build your dataset.
- Train a lightweight neural model locally with Brain.js (sketched below).
- Click Superset to regenerate multi-channel CC values + motion — keep what works, iterate instantly.
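As a rough illustration of steps two through four, the sketch below trains a small Brain.js network on normalized CC vectors and then generates a new preset-like vector from a random seed. The data shape, the autoencoder-style training pairs, and the seed scheme are assumptions made for the example, not Superset's actual pipeline.

```typescript
// A rough sketch, not Superset's pipeline. Data shapes here are assumptions.
import * as brain from 'brain.js';

// Each validated preset is a vector of CC values across channels,
// normalized from MIDI's 0..127 range to the 0..1 range Brain.js expects.
const validatedPresets: number[][] = [
  [64, 32, 100, 12],
  [70, 28, 96, 20],
  // ...roughly 10 or more presets from the randomize -> tweak -> validate loop
].map(preset => preset.map(v => v / 127));

// Train a small feed-forward network to reproduce preset vectors (autoencoder-style),
// so its outputs stay close to the validated "style" even for unseen inputs.
const net = new brain.NeuralNetwork({ hiddenLayers: [8] });
net.train(
  validatedPresets.map(p => ({ input: p, output: p })),
  { iterations: 2000 }
);

// "Regenerate": run a random seed through the trained net, scale back to 0..127,
// and send the result as CC values, one per mapped channel.
const seed = [0, 0, 0, 0].map(() => Math.random());
const generated = Array.from(net.run(seed) as number[]).map(v =>
  Math.round(Math.max(0, Math.min(1, v)) * 127)
);
console.log(generated); // e.g. four CC values, one per channel
```

In practice the generated values would feed the same CC mapping configured in step one: keep the results that work, validate them, and fold them back into the training set.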
Founder Insider Beta
Join a 60-day evaluation cycle with direct feedback loops. Founder beta includes private builds, early motion packs, and hands-on onboarding.