
ai_provider_ollama

268 sites report using this module · Security advisory coverage

Looking for a powerful server to run your local AI models? Ollama is a perfect choice, designed to operate in a headless setup. Integrated with the AI module, Ollama enables efficient management and execution of AI models on your own infrastructure, giving you full control without the need for a graphical interface.

Features

Ollama acts as a robust server solution for running local models, perfect for developers who need headless operations. Integrated with the AI module, it allows you to run and scale AI models without a GUI, making it ideal for production environments where efficiency and control are key.
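Under the hood, a headless Ollama server exposes a plain HTTP API (on port 11434 by default), which is what integrations like the AI module talk to. The sketch below builds, but does not send, a request against Ollama's `/api/generate` endpoint; the model name and prompt are illustrative, and a model must already be pulled on the server for a real call to succeed.

```python
import json
from urllib import request

OLLAMA_HOST = "http://localhost:11434"  # Ollama's default listening address

def build_generate_request(model: str, prompt: str) -> request.Request:
    """Build (but do not send) a request for Ollama's /api/generate endpoint."""
    payload = {
        "model": model,    # illustrative, e.g. "llama3"; must be pulled on the server
        "prompt": prompt,
        "stream": False,   # request a single JSON response instead of a stream
    }
    return request.Request(
        f"{OLLAMA_HOST}/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_generate_request("llama3", "Say hello in one word.")
# request.urlopen(req) would contact the server; it is omitted here so the
# sketch stays runnable without a live Ollama instance.
```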

Post-Installation

  1. Set up your API key using the Key module with the details of your local Ollama server.
  2. Visit /admin/config/ai/providers/ollama and set up your server's IP address.
  3. Once configured, you can start running models on your headless Ollama server seamlessly.
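For step 2, the value you enter is the base URL of your Ollama server. A minimal sketch, assuming Ollama's default port of 11434 (the helper itself is hypothetical, not part of the module):

```python
from urllib.parse import urlparse

DEFAULT_OLLAMA_PORT = 11434  # Ollama's default listening port

def ollama_base_url(host: str, port: int = DEFAULT_OLLAMA_PORT,
                    scheme: str = "http") -> str:
    """Compose and sanity-check the base URL for the provider settings form."""
    url = f"{scheme}://{host}:{port}"
    if not urlparse(url).hostname:
        raise ValueError(f"invalid host: {host!r}")
    return url

print(ollama_base_url("192.168.1.50"))  # → http://192.168.1.50:11434
```

If Ollama runs on the same machine as Drupal, `localhost` (or the container hostname in a containerized setup) is the usual host value.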

Additional Requirements

The AI module.

Supporting this Module

Support the AI module on OpenCollective.

Community Documentation

Documentation is available on the AI module's community documentation site.

Activity

Total releases: 7
First release: Apr 2025
Latest release: 1 week ago
Release cadence: 62 days
Stability: 0% stable

Releases

Version      Type         Release date
1.2.0-rc3    Pre-release  Apr 8, 2026
1.2.0-rc2    Pre-release  Dec 4, 2025
1.2.0-rc1    Pre-release  Oct 13, 2025
1.2.x-dev    Dev          Jul 7, 2025
1.1.0-beta2  Pre-release  Jun 12, 2025
1.1.0-beta1  Pre-release  May 8, 2025
1.1.x-dev    Dev          Apr 2, 2025