Ragflow Observability & Tracing
Ragflow (GitHub) is a no-code / low-code framework for designing and running Retrieval-Augmented Generation (RAG) pipelines.
With the native Langfuse integration, you can build complex RAG pipelines in Ragflow and then use Langfuse to monitor, debug, and improve them.
The integration works across all of Ragflow's execution modes: interactive runs in the UI, scheduled jobs, and API calls.
Integration
Get Langfuse API keys
- Create an account and project on cloud.langfuse.com
- Copy the public and secret API keys for your project
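A common pattern is to expose the copied keys as environment variables before starting the process that runs your pipelines. The variable names below follow Langfuse's usual convention (`LANGFUSE_PUBLIC_KEY`, `LANGFUSE_SECRET_KEY`, `LANGFUSE_HOST`); the values shown are placeholders, and whether Ragflow reads these variables or expects the keys in its settings UI depends on your deployment:

```python
import os

# Placeholders only; paste the real keys from your Langfuse project settings.
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."
# EU cloud shown; use the host that matches your Langfuse region or self-hosted instance.
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"
```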
Configure Ragflow
You can configure Langfuse in the Ragflow API settings (scroll to the bottom of the page). Note that you must first configure an LLM in the Ragflow settings.
Run your pipeline and inspect traces
Execute a pipeline from the Ragflow UI or trigger it via the API. Ragflow will automatically send structured trace data to Langfuse.
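As a sketch of the API path, the snippet below assembles what a pipeline-trigger request could look like. The endpoint route (`/v1/pipelines/{id}/run`), the base URL, the pipeline identifier, and the payload fields are all illustrative assumptions, not Ragflow's documented API; consult the Ragflow API reference for the actual route and schema:

```python
import json

RAGFLOW_BASE_URL = "http://localhost:8000"  # assumption: where your Ragflow instance runs
PIPELINE_ID = "my-rag-pipeline"             # assumption: illustrative pipeline identifier


def build_run_request(pipeline_id: str, query: str) -> tuple[str, str]:
    """Assemble an illustrative (url, json_body) pair for triggering a pipeline run."""
    url = f"{RAGFLOW_BASE_URL}/v1/pipelines/{pipeline_id}/run"  # hypothetical route
    body = json.dumps({"inputs": {"query": query}})
    return url, body


url, body = build_run_request(PIPELINE_ID, "What is RAG?")
# Send with any HTTP client, e.g. requests.post(url, data=body, headers=...);
# Ragflow then forwards the resulting trace data to Langfuse automatically.
```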
Version requirements
| Library | Minimum version |
|---|---|
| Ragflow | >= 0.3.0 |
If you are on an older Ragflow version, please upgrade to benefit from the Langfuse integration.
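To check the minimum-version requirement programmatically, a plain dotted-version comparison is enough; how you obtain the installed version string depends on your deployment (the helper below takes it as input rather than guessing a package name):

```python
def meets_minimum(installed: str, minimum: str = "0.3.0") -> bool:
    """Compare dotted version strings numerically (no pre-release handling)."""
    parse = lambda v: tuple(int(p) for p in v.split("."))
    return parse(installed) >= parse(minimum)


print(meets_minimum("0.3.1"))  # True: meets the >= 0.3.0 requirement
print(meets_minimum("0.2.9"))  # False: upgrade needed
```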