LangSmith
LangSmith is LangChain’s platform for debugging, testing, evaluating, and monitoring LLM applications.
Features
- Trace logging - Detailed request/response logging.
- Debugging - Step-through debugging of LLM calls.
- Evaluation - Automated testing and evaluation.
- Monitoring - Production monitoring and alerting.
- Datasets - Build and manage evaluation datasets.
Setup
- Sign up at smith.langchain.com.
- Create a project and get your API key.
- Create a Kubernetes secret with your API key.
```shell
kubectl create secret generic langsmith-api-key \
  --from-literal=api-key=YOUR_LANGSMITH_API_KEY \
  -n telemetry
```
Configuration
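Before configuring the collector, you can optionally confirm that the secret exists and decodes back to your key. This is a quick sanity check and assumes your current `kubectl` context points at the cluster you just created the secret in:

```shell
# Read the secret back and decode it from base64
kubectl get secret langsmith-api-key -n telemetry \
  -o jsonpath='{.data.api-key}' | base64 -d
```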
Configure the OpenTelemetry Collector to forward traces to LangSmith.
```shell
# Update the traces collector
helm upgrade --install opentelemetry-collector-traces opentelemetry-collector \
  --repo https://open-telemetry.github.io/opentelemetry-helm-charts \
  --version 0.127.2 \
  --set mode=deployment \
  --set image.repository="otel/opentelemetry-collector-contrib" \
  --set command.name="otelcol-contrib" \
  --namespace=telemetry \
  --create-namespace \
  -f - <<EOF
extraEnvs:
- name: LANGSMITH_API_KEY
  valueFrom:
    secretKeyRef:
      name: langsmith-api-key
      key: api-key
config:
  receivers:
    otlp:
      protocols:
        grpc:
          endpoint: 0.0.0.0:4317
        http:
          endpoint: 0.0.0.0:4318
  exporters:
    otlphttp/langsmith:
      endpoint: https://api.smith.langchain.com/otel
      headers:
        x-api-key: "\${LANGSMITH_API_KEY}"
    debug:
      verbosity: detailed
  service:
    pipelines:
      traces:
        receivers: [otlp]
        exporters: [debug, otlphttp/langsmith]
EOF
```
Verify integration
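Once the Helm upgrade completes, you can check that the collector is up before sending traffic. The Deployment name below assumes the Helm release name used above:

```shell
kubectl get pods -n telemetry
kubectl rollout status deployment/opentelemetry-collector-traces -n telemetry
```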
Send a request through agentgateway to an LLM backend.
```shell
curl -X POST http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```
Navigate to your LangSmith project and verify that the trace appears with the following information.
- Full prompt and response.
- Token counts (input and output).
- Model information.
- Latency metrics.
- Nested span structure.
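If the trace does not appear, the `debug` exporter configured above prints every exported span to the collector's stdout at `detailed` verbosity, which tells you whether spans are reaching the collector at all (as opposed to failing on the LangSmith export). The Deployment name again assumes the Helm release name used above:

```shell
# Tail collector logs; spans appear here even if the LangSmith export fails
kubectl logs deploy/opentelemetry-collector-traces -n telemetry --tail=100
```

If spans show up in the logs but not in LangSmith, double-check the API key secret and the `x-api-key` header in the exporter configuration.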