# Gemini

Configure Google Gemini as an LLM provider in agentgateway.

## Configuration

Review the following example configuration.
```yaml
binds:
- port: 3000
  listeners:
  - routes:
    - backends:
      - ai:
          name: gemini
          provider:
            gemini:
              # Optional; overrides the model in requests
              model: gemini-1.5-flash
      policies:
        backendAuth:
          key: "$GEMINI_API_KEY"
```
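To try out the configuration, you might send a request through the gateway as in the following sketch. The endpoint path and payload shape assume that the route accepts OpenAI-compatible chat completion requests on the configured port; adjust them to match your setup.

```shell
# Assumes agentgateway is already running with the configuration above.
# The GEMINI_API_KEY environment variable is read by the backendAuth policy.
export GEMINI_API_KEY="your-api-key"

# Illustrative request; the path and body follow the OpenAI chat
# completions format, which the gateway translates for Gemini.
curl -s http://localhost:3000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gemini-1.5-flash",
    "messages": [
      {"role": "user", "content": "Say hello in one sentence."}
    ]
  }'
```

Because the example configuration sets `ai.provider.gemini.model`, the `model` field in the request body is overwritten with `gemini-1.5-flash` before the request reaches Gemini.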
The following table describes the settings in this configuration.
| Setting | Description |
| --- | --- |
| `ai.name` | The name of the LLM provider for this AI backend. |
| `ai.provider.gemini.model` | Optionally set the model to use for requests. If set, this value overwrites any model in the request. If not set, each request must specify the model to use. |
| `backendAuth` | Gemini uses API keys for authentication. Optionally configure a policy that attaches an API key to outgoing requests to authenticate to the LLM provider. If you do not include an API key, each request must pass a valid API key. |
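For example, if you omit the optional `model` setting, each request must name a Gemini model itself. The following variation is a sketch based on the configuration above, not an additional documented mode.

```yaml
# Backend without a model override; requests must include "model".
- backends:
  - ai:
      name: gemini
      provider:
        gemini: {}
  policies:
    backendAuth:
      key: "$GEMINI_API_KEY"
```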