Show HN: Ableton Live MCP
A developer created an MCP server that enables voice-controlled operation of Ableton Live through AI agents such as Codex, allowing hands-free music production. The tool leverages Ableton's Live Object Model to execute arbitrary Python code inside the application, supporting a wide range of actions including real-time editing and automation. It is optimized for low latency and reliability, ships predefined tools for common tasks, and works on both macOS and Windows.
- The MCP server allows AI agents to control Ableton Live by executing Python code inside the application.
- It was optimized using Codex CLI with the /goal command to reduce latency and token usage while maintaining flexibility.
- Users are advised to back up their Live Sets before use, as the MCP can directly edit and potentially corrupt projects.
- A demo video shows the system generating an EDM track based on natural language prompts.
- The tool supports integration with third-party plugins, external hardware, and features like vocal sample transcription and dynamic effects control.
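The core trick described above is letting an agent send arbitrary Python to a process running inside Ableton and getting the result back. A minimal sketch of that pattern is a small eval-over-socket bridge; everything here (names like `start_bridge`/`send_code`, the ephemeral-port protocol, the single-shot connection) is illustrative and not the project's actual implementation, which a real deployment would harden and which Ableton exposes via its Remote Script environment.

```python
import socket
import threading

def run_bridge(conn: socket.socket) -> None:
    """Receive one Python expression, eval it, and send back the repr of the result.
    Inside Ableton this eval would have access to the Live Object Model; here it
    only sees builtins, purely to illustrate the round trip."""
    code = conn.recv(65536).decode("utf-8")
    try:
        result = repr(eval(code))  # arbitrary-eval is the flexibility (and the risk)
    except Exception as exc:
        result = f"ERROR: {exc}"
    conn.sendall(result.encode("utf-8"))
    conn.close()

def start_bridge() -> int:
    """Bind to an ephemeral localhost port and serve one connection in the background.
    Returns the port so a client (the MCP server, in the real tool) can connect."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))
    srv.listen(1)
    port = srv.getsockname()[1]

    def handle() -> None:
        conn, _ = srv.accept()
        run_bridge(conn)
        srv.close()

    threading.Thread(target=handle, daemon=True).start()
    return port

def send_code(port: int, code: str) -> str:
    """Client side: ship a snippet to the bridge and return its reply."""
    with socket.create_connection(("127.0.0.1", port)) as cli:
        cli.sendall(code.encode("utf-8"))
        return cli.recv(65536).decode("utf-8")
```

A round trip then looks like `send_code(start_bridge(), "2 + 3")`, which returns the string `"5"`. The "predefined tools for common tasks" mentioned above would sit on top of a channel like this, sending pre-written snippets instead of free-form agent code, which is why they are faster and more reliable.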
Opening excerpt (first ~120 words)
Ever wanted to control Ableton with just your voice? Me too! I made this MCP server so I could just ask Codex to do anything in Ableton Live for me, while I was nap-trapped by my baby. Unlike other Ableton MCPs I tried, this one can do pretty much anything that is possible via Ableton's Live Object Model; the agent can just eval arbitrary Python that runs inside Ableton. It also has some tools defined for common tasks, so those work faster and more reliably. I had Codex CLI optimize this for hours with the new /goal command to prioritize low end-to-end latency, high reliability, and low token usage, while maintaining full flexibility.
…
Excerpt limited to ~120 words for fair-use compliance. The full article is at GitHub.