AI-BletonGuy
A local desktop copilot for controlling Ableton Live with natural language, structured plans, safety rules, voice input, and visual performance tooling.
Problem
Creative control tools often feel either too brittle for live performance or too vague to trust when a command has real musical consequences.
Why it matters
This is a serious machine-control project with real operator constraints: ambiguity handling, safety validation, local-model routing, and live creative control.
System description
AI-BletonGuy parses natural language into typed action plans, validates them, and routes execution through bridge layers for Ableton-focused control. It also supports popup control, camera-to-MIDI experimentation, pose/visual modes, and safer local execution patterns.
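A minimal sketch of what "typed action plans with safety validation" can look like. The action names, fields, and numeric limits below are illustrative assumptions, not the project's actual schema:

```python
from dataclasses import dataclass

# Hypothetical action-plan schema; names and limits are assumptions,
# not AI-BletonGuy's real rule set.
ALLOWED_ACTIONS = {"set_tempo", "mute_track", "set_volume"}

@dataclass(frozen=True)
class ActionPlan:
    action: str
    target: str        # e.g. a track name, or "global"
    value: float

def validate(plan: ActionPlan) -> list[str]:
    """Return a list of safety-rule violations; empty means the plan may run."""
    errors = []
    if plan.action not in ALLOWED_ACTIONS:
        errors.append(f"unknown action: {plan.action}")
    if plan.action == "set_tempo" and not (20.0 <= plan.value <= 300.0):
        errors.append("tempo outside safe range 20-300 BPM")
    if plan.action == "set_volume" and not (0.0 <= plan.value <= 1.0):
        errors.append("volume must be normalized to 0.0-1.0")
    return errors
```

Keeping the plan a plain typed value means the validator can inspect it fully before any bridge call is made.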
Tools / methods
Constraints
- Must remain safe under live performance conditions.
- Must separate interpretation from execution.
- Needs useful fallback behavior when the bridge or model is uncertain.
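The last two constraints can be sketched as a routing step that sits between interpretation and execution. The confidence threshold and decision names here are illustrative assumptions:

```python
from enum import Enum

class Decision(Enum):
    EXECUTE = "execute"
    CONFIRM = "ask-user-to-confirm"
    REJECT = "reject"

# Hypothetical router: a plan is never executed directly by the
# interpreter; it passes through this decision point first.
def route(plan_confidence: float, bridge_alive: bool) -> Decision:
    """Decide what to do with an interpreted plan before any execution."""
    if not bridge_alive:
        return Decision.REJECT          # never queue commands blindly
    if plan_confidence < 0.8:           # uncertain interpretation -> confirm
        return Decision.CONFIRM
    return Decision.EXECUTE
```

Separating this decision from execution gives the system an obvious place to degrade gracefully: an uncertain model or a dead bridge yields a confirmation prompt or a refusal, not a half-applied command.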
Workspace source
ableton-language-agent/README.md
AI-BletonGuy is a local desktop copilot for controlling Ableton Live with natural language. It interprets a command, converts it into structured JSON, validates it against safety rules, and only then executes it through a bridge layer.
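The interpret-validate-execute flow described above could be glued together roughly as follows. The JSON shape, the allowed-action set, and the bridge callback are assumptions for illustration, not the project's real interfaces:

```python
import json

# Hypothetical allow-list; the real safety rules are richer than this.
SAFE_ACTIONS = {"set_tempo", "mute_track"}

def execute_command(raw_json: str, send_to_bridge) -> str:
    """Parse model output, validate it, and only then call the bridge."""
    try:
        plan = json.loads(raw_json)
    except json.JSONDecodeError:
        return "rejected: model output was not valid JSON"
    action = plan.get("action")
    if action not in SAFE_ACTIONS:
        return f"rejected: '{action}' is not an allowed action"
    send_to_bridge(plan)                 # execution happens only after checks
    return "executed"
```

The key property is that malformed or disallowed model output returns a rejection string and never reaches the bridge, so a confused model cannot produce an unvetted change in a live set.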