LangGraph Observer is an observability dashboard for running and inspecting LangGraph workflows. It pairs a FastAPI backend with an interactive Streamlit dashboard in a single compact environment, surfacing metrics such as toxicity and hallucination scores so you can visualize and improve large language model (LLM) performance.
Key Features
- FastAPI Backend: Executes the underlying graph and provides robust API endpoints for workflow execution and health checks.
- Interactive Streamlit Dashboard: Users can interactively prompt the model, view metrics, and manage the workflow outputs seamlessly.
- Modular Services: Includes functionalities for generation, emoji transformations, toxicity assessment, hallucination scoring, and artifact logging.
- Artifact Management: Each workflow run generates artifacts that document output, scores, token usage, duration, and cost, enabling comprehensive traceability.
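The artifact record described above can be sketched as a small dataclass serialized to a JSON-lines history file. This is an illustrative sketch only: the field names (`toxicity`, `tokens_used`, `cost_usd`, and so on) are assumptions inferred from this description, not the project's actual schema.

```python
import json
import os
import tempfile
from dataclasses import dataclass, asdict

@dataclass
class RunArtifact:
    # Hypothetical per-run artifact; fields mirror the metrics listed above.
    prompt: str
    output: str
    toxicity: float        # 0.0 (clean) .. 1.0 (toxic)
    hallucination: float   # 0.0 (grounded) .. 1.0 (fabricated)
    tokens_used: int
    duration_s: float
    cost_usd: float

def save_artifact(artifact: RunArtifact, path: str) -> None:
    # Append one JSON line per run so history accumulates across runs.
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(asdict(artifact)) + "\n")

artifact = RunArtifact(
    prompt="Explain observability",
    output="Observability is the ability to inspect a system's behavior.",
    toxicity=0.02,
    hallucination=0.10,
    tokens_used=182,
    duration_s=1.4,
    cost_usd=0.0009,
)
path = os.path.join(tempfile.gettempdir(), "langgraph_observer_runs.jsonl")
save_artifact(artifact, path)
```

Appending one JSON object per line keeps each run independently parseable, which makes the recent-runs table on the dashboard cheap to rebuild.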
Workflow Pipeline
The typical workflow consists of the following steps:
- Generate output from the LLM.
- Optionally "emoji-fy" the output.
- Calculate the emoji-ness score.
- Assess toxicity metrics.
- Evaluate for hallucination tendencies.
- Save generated artifacts and append run history.
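The steps above can be sketched as a chain of node functions passing a shared state dictionary, which is roughly how a LangGraph pipeline behaves. This is a simplified plain-Python sketch: the generation step is a placeholder rather than a real LLM call, and the scoring functions are stand-in heuristics, not the project's actual metric services.

```python
import unicodedata

def generate(state):
    # Placeholder for the LLM call that produces the initial output.
    state["output"] = "LangGraph makes agent workflows inspectable."
    return state

def emojify(state):
    # Optional "emoji-fy" transformation of the output.
    state["output"] = "🚀 " + state["output"] + " ✨"
    return state

def score_emojiness(state):
    # Crude proxy: fraction of characters in Unicode category "So" (symbols).
    text = state["output"]
    emoji = sum(1 for ch in text if unicodedata.category(ch) == "So")
    state["emojiness"] = emoji / max(len(text), 1)
    return state

def score_toxicity(state):
    state["toxicity"] = 0.0  # stand-in; the real service would run a classifier
    return state

def score_hallucination(state):
    state["hallucination"] = 0.0  # stand-in as well
    return state

# Run the nodes in pipeline order over a shared state.
state = {"prompt": "Describe LangGraph"}
for node in (generate, emojify, score_emojiness, score_toxicity, score_hallucination):
    state = node(state)
```

In the actual project these steps would be registered as LangGraph nodes with edges between them; the final state is what gets saved as the run artifact.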
Dashboard Overview
The dashboard comprises various interactive features:
- Input prompts and view sample prompts.
- Run Workflow button to execute the pipeline.
- Make More Emoji button for iterative transformations.
- A sidebar displaying essential metrics like toxicity, hallucination scores, and execution duration.
- A table to review recent runs along with full-state JSON outputs.
API Interface
The FastAPI backend exposes two main endpoints:
- POST /run-graph: Triggers the complete workflow.
- GET /health: Provides a health status check for the API.
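A request to POST /run-graph and its response might look like the following. The field names here are assumptions inferred from the pipeline and artifact descriptions above, not the endpoint's documented schema.

```python
import json

# Hypothetical request body for POST /run-graph.
request_body = {"prompt": "Summarize this repo", "emojify": True}

# Hypothetical response: the final workflow state plus run metadata.
response_body = {
    "output": "🚀 A dashboard for LangGraph runs ✨",
    "scores": {"toxicity": 0.01, "hallucination": 0.05, "emojiness": 0.08},
    "tokens_used": 203,
    "duration_s": 1.7,
}

# Both travel as plain JSON on the wire.
payload = json.dumps(request_body)
```

GET /health would typically return a trivial JSON body (for example, a status flag) so load balancers and monitors can probe the server cheaply.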
Getting Started
To launch the Streamlit dashboard, use the command:
uv run streamlit run app/dashboard/ui.py
To start the FastAPI server, run:
uvicorn app.api.server:app --reload
Project Structure
The project is organized as follows:
app/
api/
adapters/
dashboard/
domain/
services/
LangGraph Observer combines ease of use with practical observability features, making it a straightforward way to monitor and interact with LLM workflows.