Providers
Learn how to configure agentgateway for a particular LLM provider. A provider is a service that offers LLM capabilities, such as OpenAI, Anthropic, or Azure. Agentgateway supports multiple LLM providers and can route to different providers based on your configuration.
OpenAI-compatible providers
Popular OpenAI-compatible providers include xAI (Grok), Cohere, Together AI, Groq, DeepSeek, Mistral, Perplexity, and Fireworks AI.
Additional self-hosted solutions like vLLM and LM Studio are also supported through the OpenAI-compatible configuration.
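For example, routing to a local vLLM server might look like the sketch below. The `openAI` provider block follows the schema used elsewhere on this page, but the host-override field name and the local address are assumptions; check the OpenAI-compatible providers page for the exact syntax.

```yaml
backends:
- ai:
    name: vllm-local
    # Field name is an assumption; see the OpenAI-compatible
    # providers page for the exact host-override syntax.
    hostOverride: localhost:8000
    provider:
      openAI:
        model: meta-llama/Llama-3.1-8B-Instruct
```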
Path prefix
When using the advanced binds/listeners/routes configuration, you can set pathPrefix on an AI provider to prepend a custom path to all API requests. Use pathPrefix when routing through a proxy or custom API endpoint that requires a different base path.
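As a rough illustration of how the prefix composes with a provider API path (the exact join and normalization rules inside agentgateway are an assumption here):

```python
def apply_path_prefix(prefix: str, api_path: str) -> str:
    """Prepend a configured pathPrefix to a provider API path.

    Illustrative only: agentgateway's real normalization behavior
    is not specified here, so this join logic is an assumption.
    """
    return prefix.rstrip("/") + "/" + api_path.lstrip("/")

# With pathPrefix set to /custom/v1, a chat completions request
# would be forwarded to /custom/v1/chat/completions.
print(apply_path_prefix("/custom/v1", "/chat/completions"))
```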
```yaml
backends:
- ai:
    name: openai
    pathPrefix: /custom/v1
    provider:
      openAI:
        model: gpt-4o-mini
  policies:
    backendAuth:
      key: "$OPENAI_API_KEY"
```

OpenAI
Configuration and setup for OpenAI LLM provider
OpenAI-compatible providers
Configure agentgateway to route traffic to any LLM provider that implements the OpenAI API format.
Ollama
Configure agentgateway to route LLM traffic to Ollama for local model inference
Vertex AI
Configuration and setup for Google Cloud Vertex AI provider
Gemini
Configuration and setup for Google Gemini provider
Amazon Bedrock
Configuration and setup for Amazon Bedrock provider
Multiple LLM providers
Anthropic
Configuration and setup for Anthropic Claude provider
Azure
Configuration and setup for Azure AI services provider