LLM providers
Connect agentgateway to LLM providers for AI-powered applications
Agentgateway supports multiple LLM providers, allowing you to route requests to different AI models and manage API keys centrally.
Quick start
To use an LLM provider with agentgateway, configure an `ai` backend on a route.
```yaml
# yaml-language-server: $schema=https://agentgateway.dev/schema/config
binds:
- port: 3000
  listeners:
  - routes:
    - backends:
      - ai:
          name: my-llm
          provider:
            openAI:
              model: gpt-4o-mini
      policies:
        backendAuth:
          key: "$OPENAI_API_KEY"
```

See LLM Consumption for complete documentation on working with LLM providers.
OpenAI
Connect agentgateway to OpenAI's GPT models
Anthropic
Connect agentgateway to Anthropic's Claude models
Azure OpenAI
Connect agentgateway to Azure-hosted OpenAI models
Amazon Bedrock
Connect agentgateway to AWS foundation models via Amazon Bedrock
Google Gemini
Connect agentgateway to Google's Gemini models
Vertex AI
Connect agentgateway to Google Cloud's Vertex AI platform
OpenAI-Compatible Providers
Connect agentgateway to any OpenAI-compatible API (xAI, Cohere, Ollama, etc.)
xAI (Grok)
Connect agentgateway to xAI's Grok models