# Azure

Configuration and setup for the Azure AI services provider.

Configure Microsoft Azure AI as an LLM provider in agentgateway.
## Authentication
Before you can use Azure as an LLM provider, you must authenticate by using one of the standard Azure authentication methods. By default, agentgateway uses DefaultAzureCredential which automatically detects credentials from the environment (Azure CLI, managed identity, workload identity, or environment variables). You can also authenticate with an API key.
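For example, `DefaultAzureCredential` can pick up service principal credentials from environment variables. A minimal sketch using the standard Azure Identity variable names; the values below are placeholders you must replace with your own tenant, client, and secret:

```shell
# Supply service principal credentials through environment variables,
# one of the sources DefaultAzureCredential checks automatically.
# These values are placeholders, not real credentials.
export AZURE_TENANT_ID="00000000-0000-0000-0000-000000000000"
export AZURE_CLIENT_ID="11111111-1111-1111-1111-111111111111"
export AZURE_CLIENT_SECRET="replace-with-your-client-secret"
```

Set these in the environment where agentgateway runs so the credential chain can find them.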
## Configuration
Azure supports two endpoint types:
- Azure AI Foundry (`foundry`): Connect to Azure AI Foundry project endpoints at `{resourceName}-resource.services.ai.azure.com`.
- Azure OpenAI (`openAI`): Connect directly to Azure OpenAI Service deployments at `{resourceName}.openai.azure.com`.
```yaml
# yaml-language-server: $schema=https://agentgateway.dev/schema/config
llm:
  models:
  - name: "*"
    provider: azure
    params:
      azureResourceName: "your-resource-name"
      azureResourceType: foundry
      azureProjectName: "your-project-name"
```

| Setting | Description |
|---|---|
| `name` | The model name to match in incoming requests. When a client sends `"model": "<name>"`, the request is routed to this provider. Use `*` to match any model name. |
| `provider` | The LLM provider. Set to `azure` for Azure AI models. |
| `params.azureResourceName` | The Azure resource name, used to construct the endpoint hostname. |
| `params.azureResourceType` | The endpoint type: `foundry` for Azure AI Foundry, or `openAI` for Azure OpenAI Service. |
| `params.azureProjectName` | The Foundry project name, used with the `foundry` type. If omitted, defaults to the value of `azureResourceName`. |
| `params.azureApiVersion` | Optional API version override. Defaults to `v1`. For legacy deployments, use a dated version such as `2024-04-01-preview`. |
| `params.model` | The specific Azure model to use. If set, this model is used for all requests. If not set, the request must include the model to use. |
| `params.apiKey` | The Azure API key for authentication. If unset, implicit Entra ID authentication is used. You can reference environment variables with the `$VAR_NAME` syntax. |
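For example, to authenticate with an API key instead of implicit Entra ID authentication, reference an environment variable in `params.apiKey`. A sketch that assumes the key is stored in an `AZURE_API_KEY` environment variable; the resource and project names are placeholders:

```yaml
# yaml-language-server: $schema=https://agentgateway.dev/schema/config
llm:
  models:
  - name: "*"
    provider: azure
    params:
      azureResourceName: "your-resource-name"
      azureResourceType: foundry
      azureProjectName: "your-project-name"
      # Resolved from the AZURE_API_KEY environment variable at runtime.
      apiKey: "$AZURE_API_KEY"
```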
## Advanced configuration
For advanced Azure AI scenarios, use the traditional listener/route configuration format.

**Azure AI Foundry with implicit auth**: Use `DefaultAzureCredential` to automatically detect credentials from the environment (Azure CLI, managed identity, workload identity, or environment variables).
Review the following example configuration.

```yaml
# yaml-language-server: $schema=https://agentgateway.dev/schema/config
binds:
- port: 3000
  listeners:
  - routes:
    - matches:
      - path:
          pathPrefix: /azure
      policies:
        urlRewrite:
          authority: auto
        backendAuth:
          azure:
            implicit: {}
        backendTLS: {}
      backends:
      - ai:
          name: azure
          provider:
            azure:
              resourceName: "your-resource-name"
              projectName: "your-project-name"
              resourceType: foundry
              model: gpt-4.1
```
| Setting | Description |
|---|---|
| `ai.name` | The name of the LLM provider for this AI backend. |
| `ai.provider.azure.resourceName` | The Azure resource name, used to construct the endpoint hostname. |
| `ai.provider.azure.resourceType` | The endpoint type: `foundry` for Azure AI Foundry, or `openAI` for Azure OpenAI Service. |
| `ai.provider.azure.projectName` | The Foundry project name. Required for the `foundry` type. |
| `ai.provider.azure.model` | Optionally set the model to use for requests. If set, any model in the request is overwritten. If not set, the request must include the model to use. |
| `backendAuth.azure.implicit` | Use implicit authentication via `DefaultAzureCredential`, which automatically detects credentials from the environment. |
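The same listener/route format can target an Azure OpenAI Service deployment instead of a Foundry project by setting `resourceType` to `openAI` and omitting `projectName`, which applies only to the `foundry` type. A sketch with placeholder names:

```yaml
# yaml-language-server: $schema=https://agentgateway.dev/schema/config
binds:
- port: 3000
  listeners:
  - routes:
    - matches:
      - path:
          pathPrefix: /azure
      policies:
        urlRewrite:
          authority: auto
        backendAuth:
          azure:
            implicit: {}
        backendTLS: {}
      backends:
      - ai:
          name: azure
          provider:
            azure:
              # Targets {resourceName}.openai.azure.com instead of a
              # Foundry project endpoint; projectName is not used here.
              resourceName: "your-resource-name"
              resourceType: openAI
              model: gpt-4.1
```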