This guide walks you through setting up the AI Proxy plugin with Azure OpenAI Service.
For all providers, the Kong AI Proxy plugin attaches to route entities.
It can be installed into one route per operation, for example:
- OpenAI chat route
- Cohere chat route
- Cohere completions route
Each of these AI-enabled routes must point to a null service. This service doesn't need to map to any real upstream URL; it can point somewhere empty (for example, http://localhost:32000), because the AI Proxy plugin overwrites the upstream URL.
This requirement will be removed in a later Kong revision.
Prerequisites
- Azure OpenAI Service account and subscription
- You need a service to contain the route for the LLM provider. Create a service first:
curl -X POST http://localhost:8001/services \
--data "name=ai-proxy" \
--data "url=http://localhost:32000"
Remember that the upstream URL can point anywhere empty, as it won’t be used by the plugin.
Provider configuration
Create or locate OpenAI instance
Log in to your Azure account, and (if necessary) create an OpenAI instance. Record the following values:
- Name as the azure_instance
- Access key as the header_value
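If you use the Azure CLI, you can also retrieve the access keys there. This is a sketch, assuming your instance is named azure_instance and lives in a resource group called my-resource-group (a placeholder for your own resource group):

az cognitiveservices account keys list \
  --name azure_instance \
  --resource-group my-resource-group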
Create or locate model deployment
Once it has instantiated, create (if necessary) a model deployment in this instance. Record its name as your azure_deployment_id.
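Similarly, you can list the deployments in an instance from the Azure CLI; a sketch using the same placeholder resource group:

az cognitiveservices account deployment list \
  --name azure_instance \
  --resource-group my-resource-group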
Set up route and plugin
Now you can create an AI Proxy route and plugin configuration:
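The exact plugin schema can vary between Kong versions; the Admin API calls below are a sketch. The route name azure-chat and its path are illustrative, gpt-35-turbo stands in for whichever model your deployment serves, and the angle-bracketed values are the ones you recorded above:

curl -X POST http://localhost:8001/services/ai-proxy/routes \
  --data "name=azure-chat" \
  --data "paths[]=/azure-chat"

curl -X POST http://localhost:8001/routes/azure-chat/plugins \
  --data "name=ai-proxy" \
  --data "config.route_type=llm/v1/chat" \
  --data "config.auth.header_name=api-key" \
  --data "config.auth.header_value=<header_value>" \
  --data "config.model.provider=azure" \
  --data "config.model.name=gpt-35-turbo" \
  --data "config.model.options.azure_instance=<azure_instance>" \
  --data "config.model.options.azure_deployment_id=<azure_deployment_id>"

Note that the plugin is attached to the route rather than the service, matching the one-route-per-operation pattern described above.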
Test the configuration
Make an llm/v1/chat type request to test your new endpoint:
curl -X POST http://localhost:8000/azure-chat \
-H 'Content-Type: application/json' \
--data-raw '{ "messages": [ { "role": "system", "content": "You are a mathematician" }, { "role": "user", "content": "What is 1+1?"} ] }'
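A successful request returns the model's reply in the OpenAI-compatible chat completion format that AI Proxy normalizes responses to. The body below is an illustrative sketch, not a captured response:

{
  "choices": [
    {
      "finish_reason": "stop",
      "index": 0,
      "message": {
        "content": "1 + 1 = 2.",
        "role": "assistant"
      }
    }
  ],
  "model": "gpt-35-turbo",
  "object": "chat.completion"
}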