
ai_provider_ollama

Used by 244 sites · Covered by the security advisory policy

Looking for a powerful server to run your local AI models? Ollama is a perfect choice, designed to operate in a headless setup. Integrated with the AI module, Ollama enables efficient management and execution of AI models on your own infrastructure, giving you full control without the need for a graphical interface.

Features

Ollama acts as a robust server solution for running local models, perfect for developers who need headless operations. Integrated with the AI module, it allows you to run and scale AI models without a GUI, making it ideal for production environments where efficiency and control are key.
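As a sketch of what a headless deployment might look like, the unit file below runs `ollama serve` under systemd on a Linux host (paths and the `ollama` user are assumptions; 11434 is Ollama's default port). Binding `OLLAMA_HOST` to all interfaces lets the Drupal site reach the API from another machine:

```ini
[Unit]
Description=Ollama local model server (headless)
After=network-online.target

[Service]
; Expose the HTTP API on all interfaces; 11434 is the Ollama default port.
Environment="OLLAMA_HOST=0.0.0.0:11434"
; Adjust the binary path and user to match your installation.
ExecStart=/usr/local/bin/ollama serve
Restart=always
User=ollama

[Install]
WantedBy=multi-user.target
```

On a single-machine setup you can instead leave `OLLAMA_HOST` unset, keeping the API on localhost only.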

Post-Installation

  1. Set up your API key using the Key module with the details of your local Ollama server.
  2. Visit /admin/config/ai/providers/ollama and set up your server's IP address.
  3. Once configured, you can start running models on your headless Ollama server seamlessly.
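Before pointing the provider at your server, it can help to sanity-check that the Ollama API is reachable. The short Python sketch below is an illustration, not part of the module: the helper names are hypothetical, but `/api/tags` is Ollama's real endpoint for listing installed models, and 11434 is its default port:

```python
import json
import urllib.request
import urllib.error


def api_url(host: str, port: int = 11434, path: str = "tags") -> str:
    """Build an Ollama REST endpoint URL (hypothetical helper)."""
    return f"http://{host}:{port}/api/{path}"


def list_models(host: str = "localhost", port: int = 11434) -> list[str]:
    """Return the names of models installed on the Ollama server,
    or an empty list if the server is unreachable."""
    try:
        with urllib.request.urlopen(api_url(host, port), timeout=5) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError) as exc:
        print(f"Ollama server not reachable: {exc}")
        return []


if __name__ == "__main__":
    print(api_url("localhost"))  # http://localhost:11434/api/tags
    print(list_models("localhost"))
```

If the model you want is missing from the list, pull it on the server first (e.g. `ollama pull <model-name>`).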

Additional Requirements

The AI module.

Supporting this Module

Support the AI module on OpenCollective.

Community Documentation

The documentation is available on the AI module's community documentation site.

Activity

Total releases: 6
First release: Apr 2025
Latest release: 3 months ago
Release cadence: 49 days
Stability: 0% stable

Releases

Version       Type         Release date
1.2.0-rc2     Pre-release  Dec 4, 2025
1.2.0-rc1     Pre-release  Oct 13, 2025
1.2.x-dev     Dev          Jul 7, 2025
1.1.0-beta2   Pre-release  Jun 12, 2025
1.1.0-beta1   Pre-release  May 8, 2025
1.1.x-dev     Dev          Apr 2, 2025