Arize Phoenix
Integrate agentgateway with Arize Phoenix for LLM tracing and evaluation
Arize Phoenix is an open-source LLM observability platform for tracing, evaluation, and debugging.
Features
- LLM tracing - Trace all LLM calls with full context
- Evaluation - Built-in LLM evaluators for quality assessment
- Embedding analysis - Visualize and debug embeddings
- Dataset management - Create and manage evaluation datasets
- Open source - Self-host or use Arize cloud
Quick start
Run Phoenix locally:

```shell
pip install arize-phoenix
phoenix serve
```

Or with Docker:

```shell
docker run -p 6006:6006 arizephoenix/phoenix:latest
```

Access Phoenix at http://localhost:6006.
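Before wiring up the gateway, it can help to confirm that Phoenix's ports are reachable. A minimal sketch, assuming the defaults above (UI on 6006, OTLP gRPC on 4317):

```python
import socket

def is_port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Phoenix serves its UI on 6006 and accepts OTLP gRPC on 4317 by default.
for port in (6006, 4317):
    status = "open" if is_port_open("localhost", port) else "closed"
    print(f"localhost:{port} is {status}")
```

If either port reports closed, check that the `phoenix serve` process (or the Docker container's port mappings) is running before sending traces.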
Configuration
Phoenix accepts OpenTelemetry traces natively:
```yaml
# otel-collector-config.yaml
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317 # the collector's own OTLP receiver
exporters:
  otlp:
    endpoint: http://localhost:4317 # Phoenix's OTLP gRPC endpoint
    tls:
      insecure: true
service:
  pipelines:
    traces:
      receivers: [otlp]
      exporters: [otlp]
```

Note that if the collector and Phoenix run on the same host, they cannot both bind port 4317; in that case, move the collector's receiver to another port or skip the collector and point agentgateway directly at Phoenix.

Configure agentgateway:
```yaml
# yaml-language-server: $schema=https://agentgateway.dev/schema/config
config:
  tracing:
    otlpEndpoint: http://localhost:4317
    randomSampling: true
binds:
- port: 3000
  listeners:
  - routes:
    - backends:
      - ai:
          name: openai
          provider:
            openAI:
              model: gpt-4o-mini
      policies:
        backendAuth:
          key: "$OPENAI_API_KEY"
```

Docker Compose example
```yaml
version: '3'
services:
  agentgateway:
    image: ghcr.io/agentgateway/agentgateway:latest
    ports:
    - "3000:3000"
    volumes:
    - ./config.yaml:/etc/agentgateway/config.yaml
    environment:
    - OTEL_EXPORTER_OTLP_ENDPOINT=http://phoenix:4317
  phoenix:
    image: arizephoenix/phoenix:latest
    ports:
    - "6006:6006"
    - "4317:4317"
```
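With the stack up, each request through the gateway's AI route should surface as a trace in the Phoenix UI. A minimal sketch of such a request, assuming agentgateway exposes an OpenAI-compatible chat completions path on the port 3000 bind (the exact path is an assumption, not confirmed by this page):

```python
import json
import urllib.request

# Assumed endpoint: agentgateway's bind from config.yaml plus an
# OpenAI-compatible path.
URL = "http://localhost:3000/v1/chat/completions"

payload = {
    "model": "gpt-4o-mini",  # matches the backend model in config.yaml
    "messages": [
        {"role": "user", "content": "Say hello in one word."}
    ],
}

def build_request(url: str, body: dict) -> urllib.request.Request:
    """Build a JSON POST request; the gateway's backendAuth policy
    injects the OpenAI API key, so the client sends none."""
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request(URL, payload)
# urllib.request.urlopen(req) would send it; the resulting LLM call
# should then appear as a trace at http://localhost:6006.
```

Because the gateway holds the credentials, clients never handle the `$OPENAI_API_KEY` directly, and every proxied call is traced without client-side instrumentation.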