Chat route with Llama 2

Compatible with Kong Gateway 3.6 and later.

Configure a chat route that uses a local Llama 2 model served in the Ollama format.

Prerequisites

  • A local Llama 2 instance, for example one served with Ollama (see the sketch below)
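
If you don't already have a local instance, one common way to run Llama 2 locally is with Ollama. The commands below are a minimal sketch assuming a default Ollama installation, which listens on port 11434, the same port used in the plugin configuration below:

# Start the Ollama server if it isn't already running as a service;
# by default it listens on http://localhost:11434.
ollama serve &

# Download the Llama 2 model so it can answer /api/chat requests.
ollama pull llama2

Make sure the host and port in the plugin's upstream_url match wherever your instance is actually reachable.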

Set up the plugin

Add this section to your declarative configuration file:

_format_version: "3.0"
plugins:
  - name: ai-proxy
    config:
      route_type: llm/v1/chat
      model:
        provider: llama2
        name: llama2
        options:
          llama2_format: ollama
          upstream_url: http://llama2-server.local:11434/api/chat
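
Once the configuration is applied, the route accepts requests in the OpenAI-style chat format, and the plugin translates them into Ollama /api/chat calls against the configured upstream_url. The request below is a sketch: it assumes Kong's proxy is listening on the default localhost:8000 and that the plugin is attached to a route matching the path /llama2-chat; the route definition itself is not shown in the snippet above.

# Send an OpenAI-style chat request through the Kong route.
curl -s http://localhost:8000/llama2-chat \
  -H 'Content-Type: application/json' \
  -d '{
    "messages": [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "Say hello in one short sentence."}
    ]
  }'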
