ai_provider_cli
Bridges the growing ecosystem of AI CLI tools with Drupal's AI module, letting you use locally installed binaries as first-class AI providers — no custom API integrations required.
Instead of calling remote APIs directly, this module pipes prompts to CLI tools installed on your host machine and returns their output as standard Drupal AI responses. This makes any tool that accepts stdin and writes to stdout immediately available to AI-powered Drupal features like content generation, summarisation, and assistants.
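The piping mechanism is simple enough to sketch. The following is an illustrative Python snippet of the general pattern (not the module's actual PHP code): write the prompt to the tool's stdin, capture stdout, and treat that as the response. The function name and timeout default are assumptions for the example.

```python
import subprocess

def run_cli_tool(command: list[str], prompt: str, timeout: int = 120) -> str:
    """Pipe a prompt to a CLI tool's stdin and return its stdout.

    Illustrative only: the real module does this from PHP, but the
    contract is the same -- prompt in via stdin, response out via stdout.
    """
    result = subprocess.run(
        command,
        input=prompt,          # prompt goes to the tool's stdin
        capture_output=True,   # collect stdout/stderr
        text=True,
        timeout=timeout,
        check=True,            # raise if the tool exits non-zero
    )
    return result.stdout.strip()
```

Any binary honouring this contract works, e.g. `run_cli_tool(["cat"], "Hello")` simply echoes the prompt back.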
Supported tools out of the box
- Claude Code (Anthropic)
- Codex (OpenAI)
- Gemini CLI (Google) — free tier, 1,000 requests/day
- OpenCode — open source, supports 75+ providers
- Ollama — run open models locally, no API key needed
- LM Studio — local models via the lms CLI
- llm (Simon Willison) — proxies to 100+ backends including DeepSeek, Groq, Mistral, and Anthropic via plugins
Custom tools can be added through the settings form — any binary that reads from stdin and writes to stdout works.
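To show how small a custom tool can be, here is a hypothetical stub written in Python (any language works): it reads the full prompt from stdin and writes a response to stdout. The respond function is a placeholder where a real tool would call a model.

```python
import sys

def respond(prompt: str) -> str:
    """Placeholder response logic -- a real tool would query a model here."""
    return f"Echo: {prompt.strip()}"

def main() -> None:
    # Read the whole prompt from stdin, write the response to stdout.
    sys.stdout.write(respond(sys.stdin.read()))

if __name__ == "__main__":
    main()
```

Saved as an executable script, this stub could be registered in the settings form like any other tool.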
DDEV container support
For DDEV environments where CLI tools live on the host machine rather than inside the container, install the ddev-cli-relay addon. Once installed, it starts automatically on ddev start and forwards requests from inside the container to the host binaries.
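The addon's actual wire protocol is not documented here, but the general idea of a container-to-host relay can be sketched: a small HTTP server on the host accepts a JSON payload naming a command and a prompt, runs the binary, and returns its output. Everything below (endpoint shape, JSON fields, port choice) is a hypothetical illustration, not ddev-cli-relay's implementation.

```python
import json
import subprocess
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class RelayHandler(BaseHTTPRequestHandler):
    """Accept a JSON payload, run the named host binary, return its output."""

    def do_POST(self) -> None:
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        # Pipe the prompt to the requested host binary via stdin.
        result = subprocess.run(
            payload["command"],
            input=payload.get("prompt", ""),
            capture_output=True,
            text=True,
        )
        body = json.dumps(
            {"stdout": result.stdout, "stderr": result.stderr}
        ).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args) -> None:
        pass  # keep the sketch quiet

def serve(port: int = 0) -> HTTPServer:
    """Start the relay in a background thread; port 0 picks a free port."""
    server = HTTPServer(("127.0.0.1", port), RelayHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

Inside the container, requests to this endpoint stand in for direct execution of the host binaries.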
Requirements
Requires the AI module and Python 3 on the host (for the DDEV relay). CLI tools must be installed separately.