Manage API keys
Manage API keys for LLM provider authentication.
Managing API keys is an important security mechanism that prevents unauthorized access to your LLM provider. If an API key is compromised, an attacker can deliberately run expensive queries, such as large or recursive prompts, at your expense.
You can choose between the following options to provide an API key to agentgateway:
- Inline
- Environment variable
- File
- Kubernetes secret or passthrough token
Follow the instructions in this guide to learn how to use these different methods.
Before you begin
- Install the agentgateway binary.
- Configure your agentgateway proxy.
Browse through the tabs to learn about the different ways to provide your API key to agentgateway.
You can provide your API key directly in the agentgateway configuration. This option is the least secure. Only use this option for quick tests.
- Configure the agentgateway proxy and enter your key directly in the `params.apiKey` field.

  ```sh
  cat <<EOF > config.yaml
  # yaml-language-server: $schema=https://agentgateway.dev/schema/config
  llm:
    models:
    - name: "*"
      provider: openAI
      params:
        apiKey: "sk-proj...."
  EOF
  ```
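If you must use the inline option, one way to at least keep the literal key out of your editor and shell history is to expand an environment variable when generating the config file. The following is a minimal sketch, assuming your key is exported as `OPENAI_API_KEY` (the variable name and the fallback placeholder are assumptions for illustration, not something agentgateway reads itself).

```shell
# Sketch: substitute the key from an environment variable at
# config-generation time. OPENAI_API_KEY is an assumed variable name;
# the placeholder fallback only keeps this example runnable.
OPENAI_API_KEY="${OPENAI_API_KEY:-sk-placeholder}"

cat <<EOF > config.yaml
llm:
  models:
  - name: "*"
    provider: openAI
    params:
      apiKey: "${OPENAI_API_KEY}"
EOF

# The generated file still contains the literal key, so restrict its
# permissions and never commit it to version control.
chmod 600 config.yaml
```

Note that this only moves the key out of the config you type by hand; the key still ends up on disk in `config.yaml`. The environment variable, file, and Kubernetes secret options listed above avoid that entirely.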