SoundForge combines professional DSP effects with a generative AI conductor via MCP (Model Context Protocol). Claude orchestrates the emotional arc of a performance -- the instrument plays the notes, AI shapes what they mean. Nothing else does this.
The problem
"I wanted to explore generative composition. Nothing bridged performance and AI narrative."
Modular synthesis was too complex. Preset-based tools were too shallow. Nothing let a performer shape tone and have AI respond to the emotional intent.
Existing AI music tools generated audio independently. None integrated with a live performance signal or responded to what you're actually playing.
The gap between human performance and AI composition was a wall. SoundForge made it a conversation.
Inside the instrument
Click a knob on the Effects screen to reveal its slider. Switch to AI Conductor and click any phase dot to navigate the emotional arc. Expand to go full screen.
SoundForge -- AI Instrument
Josh M. · AI West
Signal chain: 4 system layers · 2.1 ms latency · ∞ emergence
The architecture
Six professional-grade effects built in Faust DSP -- distortion, delay, reverb, tremolo, chorus, gate. Sub-millisecond latency. Runs standalone or as a VST3/AU plugin.
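The shipped effects are written in Faust; as a rough illustration of how a serial effect chain processes a block of samples, here is a minimal Python sketch. The effect parameters and the three toy effects (tanh distortion, feedback delay, tremolo) are simplified assumptions, not the actual DSP.

```python
import math

def distortion(x, drive=4.0):
    # Soft-clip waveshaper: tanh saturation bounds output to (-1, 1)
    return [math.tanh(drive * s) for s in x]

def feedback_delay(x, sr=48000, time_s=0.25, feedback=0.4):
    # Feedback comb: each sample echoes the sample one delay-time earlier
    d = max(1, int(sr * time_s))
    out = list(x)
    for i in range(d, len(out)):
        out[i] += feedback * out[i - d]
    return out

def tremolo(x, sr=48000, rate_hz=5.0, depth=0.5):
    # Amplitude modulation by a low-frequency sine
    out = []
    for i, s in enumerate(x):
        lfo = 1.0 - depth * (0.5 + 0.5 * math.sin(2 * math.pi * rate_hz * i / sr))
        out.append(s * lfo)
    return out

# Effects run in series: the output of one feeds the next
CHAIN = [distortion, feedback_delay, tremolo]

def process(block):
    for fx in CHAIN:
        block = fx(block)
    return block
```

The chain order matters: distortion before delay means each echo repeats the already-saturated signal rather than re-saturating the sum.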
A harmonic generation system based on neo-Riemannian Tonnetz theory. AI selects chord voicings that respond to the emotional context set by the conductor.
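Neo-Riemannian theory navigates the Tonnetz through three basic transforms: P (parallel), R (relative), and L (leittonwechsel), each swapping a triad between major and minor while moving a single voice. A minimal sketch of those transforms on pitch-class triads -- the representation is illustrative, and SoundForge's voicing logic is assumed to be richer than this:

```python
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def parallel(root, quality):
    # P: C major <-> C minor (same root, opposite quality)
    return root, ("min" if quality == "maj" else "maj")

def relative(root, quality):
    # R: C major <-> A minor (relative major/minor pair)
    if quality == "maj":
        return (root + 9) % 12, "min"
    return (root + 3) % 12, "maj"

def leittonwechsel(root, quality):
    # L: C major <-> E minor (leading-tone exchange)
    if quality == "maj":
        return (root + 4) % 12, "min"
    return (root + 8) % 12, "maj"

def name(root, quality):
    return NOTE_NAMES[root] + ("m" if quality == "min" else "")
```

Chaining these transforms walks smoothly across the Tonnetz, which is what lets a conductor pick harmonically adjacent chords for a given emotional phase instead of jumping at random.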
Claude receives the performance state and emotional arc, then sends real-time instructions back to the instrument. It doesn't generate audio -- it shapes what the instrument does.
Each layer feeds back into the others. The DSP state influences harmonic generation, which influences AI instructions, which influence DSP. The system evolves as you play.
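One cycle of that loop can be sketched in Python. The class names, the toy rules (brightness picks a voicing, the voicing nudges reverb), and the parameter keys are all invented for illustration -- they are not SoundForge's actual API or MCP message schema.

```python
class Harmonizer:
    # DSP state -> harmony: a toy rule stands in for Tonnetz-based voicing
    def next_chord(self, state):
        return "open" if state["brightness"] > 0.5 else "close"

class Conductor:
    # Harmony + state -> instruction: the conductor shapes parameters,
    # it never generates audio itself
    def shape(self, state, chord):
        return {"reverb.mix": 0.6 if chord == "open" else 0.3}

def tick(state, conductor, harmonizer):
    chord = harmonizer.next_chord(state)         # DSP state -> harmonic choice
    instruction = conductor.shape(state, chord)  # harmonic choice -> instruction
    state.update(instruction)                    # instruction -> DSP parameters
    return state
```

Because each `tick` writes back into the same state the next `tick` reads, the three layers form a closed loop: the system's next move always depends on where the last one left it.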
Built with
Real infrastructure.
Not duct tape.
Want something built like this for your business?
This is one of seven systems we've built. Every project starts with understanding what's actually costing you time or money -- then we build the thing that solves it.