langfuse
LangFuse - Comprehensive AI observability and analytics for Drupal applications.
Built on the Dropsolid LangFuse PHP SDK (GitLab repo), this module integrates with LangFuse to provide detailed tracing, analytics, and insights for all your AI interactions in Drupal.
It targets modern PHP (PHP 8.1+, using features like readonly properties) and is ready for production workloads.
It also handles more complex cases, such as AI Agents that call other AI Agents, which in turn call a RAG tool.
Key Features
- Zero-configuration AI tracking: Automatically captures all AI.module interactions – programmatic, UI-based, and integrations.
- Unified request tracing: Group multiple AI operations (e.g., embeddings + chat + tools) into a single, coherent trace.
- Comprehensive analytics: Token usage, response times, error rates, and model performance metrics in one place.
- Multiple authentication methods: API key pair (recommended), bearer token, basic auth, and full support for self-hosted LangFuse.
- Production-ready: Error handling, caching, and performance optimizations built in.
- Developer-friendly: Rich debugging information, extensible architecture, and example implementations.
- Deep ecosystem integration: Dedicated submodules for AI Agents and AI Search/RAG, with detailed span/trace management.
Quick Start
Requirements
- Drupal: 10.x or 11.x
- PHP: 8.1 or higher (uses modern PHP features such as `readonly` properties)
- LangFuse Account: Cloud or self-hosted (free with Docker Compose)
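For self-hosting, LangFuse publishes a Docker Compose setup. The commands below follow the general pattern from the LangFuse self-hosting documentation; verify the repository URL and port against the current docs before relying on them:

```shell
# Clone the LangFuse repository and start the stack with Docker Compose.
git clone https://github.com/langfuse/langfuse.git
cd langfuse
docker compose up -d
# The UI is then typically served at http://localhost:3000.
```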
Installation
```shell
composer require drupal/langfuse

# Enable the core module
drush en langfuse -y

# Optional: AI.module integration (recommended)
drush en langfuse_ai_logging -y

# Optional: AI Agents instrumentation
drush en langfuse_ai_agents_logging -y

# Optional: AI Search / RAG instrumentation
drush en langfuse_ai_search_logging -y

# Optional: Example demonstrations
drush en langfuse_example -y
```
Configuration
- Go to `/admin/config/system/langfuse/settings`.
- Enter your LangFuse URL, e.g. `https://cloud.langfuse.com`, or your self-hosted URL.
- Select an authentication method: API key pair (recommended), bearer token, or basic auth.
- Enter credentials from your LangFuse project settings.
- Save configuration. The connection is tested automatically.
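If you prefer to keep credentials out of the database, Drupal's standard `$config` overrides in `settings.php` can supply them from the environment. The configuration keys used below (`host`, `public_key`, `secret_key`) are assumptions for illustration, not confirmed names from this module's config schema:

```php
<?php

// settings.php — hypothetical key names; check the langfuse.settings
// config schema shipped with the module for the actual keys.
$config['langfuse.settings']['host'] = 'https://cloud.langfuse.com';
$config['langfuse.settings']['public_key'] = getenv('LANGFUSE_PUBLIC_KEY');
$config['langfuse.settings']['secret_key'] = getenv('LANGFUSE_SECRET_KEY');
```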
How It Works
With the langfuse_ai_logging submodule enabled, AI.module interactions are logged automatically; no code changes are required.
- Programmatic calls via AI.module services.
- AI Explorer at `/admin/config/ai/explorer`.
- Any AI.module integration: chat forms, embeddings, moderation, and more.
```php
// This request will be automatically tracked in LangFuse.
$ai_provider = \Drupal::service('ai.provider');
$response = $ai_provider->generateText('Explain quantum computing');
```
AI traces appear in your LangFuse project with full details:
prompts, responses, timing, token usage, tool calls, and nested spans for downstream operations.
Module Architecture
Core Module (langfuse)
- LangFuse Client Service (`langfuse.client`): Main interface to the LangFuse SDK.
- Configuration management: Secure credential storage supporting multiple auth methods.
- Connection testing: Automatic validation of LangFuse connectivity from the UI.
AI.module Integration (langfuse_ai_logging)
- Event-driven architecture: Subscribes to AI.module pre/post generation events.
- Unified request tracing: Creates single traces containing multiple AI operations.
- Zero developer effort: Works automatically with any AI.module integration.
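The event-driven approach can be sketched as a standard Drupal event subscriber. The event names below are hypothetical placeholders for illustration; consult the AI module's own event classes for the actual names:

```php
<?php

namespace Drupal\my_module\EventSubscriber;

use Symfony\Component\EventDispatcher\EventSubscriberInterface;

/**
 * Sketch: subscribe to AI.module generation events and forward data to
 * LangFuse. Event names here are assumptions, not the AI module's
 * confirmed API.
 */
class AiTracingSubscriber implements EventSubscriberInterface {

  public static function getSubscribedEvents(): array {
    return [
      // Hypothetical event names for illustration only.
      'ai.pre_generate_response' => 'onPreGenerate',
      'ai.post_generate_response' => 'onPostGenerate',
    ];
  }

  public function onPreGenerate($event): void {
    // Open a trace/span before the provider call (e.g. via langfuse.client).
  }

  public function onPostGenerate($event): void {
    // Record output, token usage, and timing, then close the span.
  }

}
```

This is the shape langfuse_ai_logging relies on: because the hooks fire around every provider call, any AI.module integration is captured without code changes.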
AI Agents Tool Logging (langfuse_ai_agents_logging)
Instrumentation specifically for the AI Agents module:
- Tool span tracking: Every tool execution becomes a LangFuse span with input/output metadata.
- Delegation awareness: Nested agents inherit the parent runner context, so spans nest correctly when agents delegate to other agents.
- Dependency isolation: Lives in its own submodule so sites not using AI Agents avoid the extra dependency.
AI Search Logging (langfuse_ai_search_logging)
Observability for Search API–based Retrieval-Augmented Generation (RAG) via `search_api` and `ai_search`:
- Search API events: Wraps `search_api` query pre/post execute to capture retrieval spans.
- Embedding parenting: Retrieval spans automatically become the parent of downstream embedding requests.
- Split motivation: Kept as a separate submodule so you don’t pull in `search_api`/`ai_search` unless you actually use them.
Example Implementation (langfuse_example)
- OpenAI demo form: Interactive example with chat completions powered by AI.module.
- Manual trace management: Demonstrates creating and managing custom traces outside AI.module.
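A manual trace outside AI.module might look something like the following. The `trace()`, `span()`, and `end()` method names are assumptions about the Dropsolid LangFuse PHP SDK's API, shown only to illustrate the pattern; check the SDK for the real method signatures:

```php
<?php

// Hypothetical sketch of manual trace management; all method names below
// are assumptions, not the SDK's confirmed API.
$client = \Drupal::service('langfuse.client');

$trace = $client->trace(['name' => 'nightly-import']);   // assumed API
$span = $trace->span(['name' => 'embed-documents']);     // assumed API

// ... run your AI operations here ...

$span->end(['output' => ['embedded' => 120]]);           // assumed API
$trace->end();                                           // assumed API
```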