
Amazon Bedrock

Configure Amazon Bedrock as an LLM provider in agentgateway.

Authentication

Before you can use Bedrock as an LLM provider, you must authenticate by using the standard AWS credential sources, such as environment variables, the shared credentials file, or an IAM role attached to the environment that runs agentgateway.
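
For example, the following shell commands provide credentials through the standard AWS environment variables. This is a minimal sketch with placeholder values; any other standard credential source works the same way.

export AWS_ACCESS_KEY_ID=<access-key-id>
export AWS_SECRET_ACCESS_KEY=<secret-access-key>
# Only required for temporary credentials
export AWS_SESSION_TOKEN=<session-token>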

Configuration

Review the following example configuration.
binds:
- port: 3000
  listeners:
  - routes:
    - backends:
      - ai:
          name: bedrock
          provider:
            bedrock:
              region: us-west-2
              # Optional; overrides the model in requests
              model: amazon.titan-text-express-v1
The following table describes the settings in this configuration.
| Setting | Description |
| --- | --- |
| ai.name | The name of the LLM provider for this AI backend. |
| bedrock.region | The AWS region to send requests to. |
| bedrock.model | Optionally set the model to use for requests. If set, this value overwrites any model in the request. If not set, the request must include the model to use. |
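
With this configuration in place, you can send requests to the gateway on port 3000. The following request is a sketch that assumes the route serves the OpenAI-compatible /v1/chat/completions path; because bedrock.model is set in the example configuration, the model field in the request body is overridden.

curl -X POST http://localhost:3000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "amazon.titan-text-express-v1",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'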

Token counting

Bedrock supports token counting for Anthropic models through the count_tokens endpoint. Agentgateway automatically handles the formatting that Bedrock's count_tokens API requires, including adding the max_tokens: 1 parameter and Base64-encoding the request body.

curl -X POST http://localhost:3000/v1/messages/count_tokens \
  -H "Content-Type: application/json" \
  -d '{
    "model": "anthropic.claude-3-5-sonnet-20241022-v2:0",
    "messages": [{"role": "user", "content": "Hello!"}],
    "system": "You are a helpful assistant."
  }'

Example response:

{
  "input_tokens": 15
}