# Windsurf

Configure Windsurf, the AI code editor by Codeium, to route requests to your LLM through your agentgateway proxy.
## Before you begin
- Install the `agentgateway` binary.
- Install Windsurf.
## Example agentgateway configuration
```sh
cat > /tmp/test-windsurf.yaml << 'EOF'
# yaml-language-server: $schema=https://agentgateway.dev/schema/config
llm:
  port: 3000
  models:
    - name: "*"
      provider: openAI
      params:
        apiKey: "$OPENAI_API_KEY"
EOF
```

## Configure Windsurf
Configure Windsurf to route LLM requests through agentgateway. For more information, review the Windsurf documentation.
1. Open the Windsurf settings.
   - macOS: `Cmd + ,` or Windsurf > Settings
   - Windows/Linux: `Ctrl + ,` or File > Preferences > Settings
2. Search for `Http: Proxy`.
3. Enter your agentgateway URL: `http://localhost:3000`
4. Save the settings.
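Because Windsurf is built on VS Code, the same proxy option can also be set directly in its `settings.json` file. The key below follows the standard VS Code `http.proxy` setting; this is an assumption about Windsurf's settings schema, so adjust it if Windsurf names the option differently.

```json
{
  "http.proxy": "http://localhost:3000"
}
```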
## Verify the connection
- Open the Windsurf chat panel.
- Send a message such as “test”.
- If the proxy is configured correctly, Windsurf's response is routed through your agentgateway backend.
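You can also verify the gateway independently of the editor by sending a chat completion request straight to the proxy. The sketch below assumes agentgateway exposes an OpenAI-compatible `/v1/chat/completions` endpoint on the port from the example config; the model name is only an illustration, since the wildcard route (`name: "*"`) forwards any model name to the provider.

```python
import json
import urllib.request

# Address of the agentgateway LLM listener from the example config above.
GATEWAY_URL = "http://localhost:3000/v1/chat/completions"

def build_chat_request(prompt: str, model: str = "gpt-4o-mini") -> bytes:
    """Build an OpenAI-compatible chat completion request body.

    The default model name is a placeholder; the wildcard route in the
    example config forwards whatever model name the client sends.
    """
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")

def send_test_message(prompt: str = "test") -> str:
    """POST the request through the gateway and return the reply text."""
    req = urllib.request.Request(
        GATEWAY_URL,
        data=build_chat_request(prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# With the gateway running, try: print(send_test_message())
```

If this request succeeds, the editor-side steps above only need to point Windsurf's proxy setting at the same address.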