LLM Providers

Agentgateway supports multiple LLM providers, allowing you to route requests to different AI models and manage API keys centrally.

Quick start

To use an LLM provider with agentgateway, configure an ai backend. The example below routes requests arriving on port 3000 to OpenAI's gpt-4o-mini model, authenticating with an API key read from an environment variable.

# yaml-language-server: $schema=https://agentgateway.dev/schema/config
binds:
- port: 3000
  listeners:
  - routes:
    - backends:
      - ai:
          name: my-llm
          provider:
            openAI:
              model: gpt-4o-mini
      policies:
        backendAuth:
          key: "$OPENAI_API_KEY"
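With this configuration applied and agentgateway running locally, a request like the following can verify the route. This is a sketch: it assumes agentgateway exposes an OpenAI-compatible chat completions endpoint at /v1/chat/completions on the bound port, which you should confirm against your release.

```shell
# Hypothetical smoke test; assumes an OpenAI-compatible
# chat completions path on the port bound above (3000).
curl http://localhost:3000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```

A successful response should be the provider's JSON completion, confirming that routing and key injection are working end to end.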

See LLM Consumption for complete documentation on working with LLM providers.
