You are browsing documentation for an outdated plugin version.
This feature requires Kong Gateway Enterprise.
This guide walks you through setting up the AI Proxy plugin with a cloud-hosted model, using the cloud’s native authentication mechanism.
Overview
When running software on a cloud-hosted virtual machine or container instance, the provider offers a keyless role-based access mechanism, allowing you to call services native to that cloud provider without having to store any keys inside the running instance (or in the Kong configuration).
This operates like a single-sign-on (SSO) mechanism for your cloud applications.
Kong’s AI Gateway (AI Proxy) can take advantage of the authentication mechanisms for many different cloud providers and, where available, can also use this authentication to call LLM-based services using those same methods.
Supported providers
Kong’s AI Gateway currently supports the following cloud authentication mechanisms:
| AI-Proxy LLM Provider | Cloud Provider | Type |
|---|---|---|
| `azure` | Azure OpenAI | Entra / Managed Identity Authentication |
Azure OpenAI
When hosting your LLMs with Azure OpenAI Service and running them through AI Proxy, it is possible to use the assigned Azure Managed Identity or User-Assigned Identity of the VM, Kubernetes service account, or ACS container to call the Azure OpenAI models.
You can also use an Entra principal or App Registration (`client_id`, `client_secret`, and `tenant_id` triplet) when Kong is hosted outside of Azure.
How you do this depends on where and how you are running Kong Gateway.
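As an illustrative sketch (the display name below is hypothetical), such a triplet can be produced with the Azure CLI: `az ad sp create-for-rbac` prints `appId`, `password`, and `tenant` values, which map to the plugin's `azure_client_id`, `azure_client_secret`, and `azure_tenant_id` respectively.

```shell
# Hypothetical display name; requires an authenticated Azure CLI session (az login).
az ad sp create-for-rbac --name kong-ai-proxy
# The appId, password, and tenant fields in the JSON output map to
# azure_client_id, azure_client_secret, and azure_tenant_id.
```

The resulting principal still needs a Cognitive Services role assignment on the target Azure OpenAI resource, as described under Prerequisites.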
Prerequisites
You must be running a Kong Gateway Enterprise instance.
Ensure that the Azure principal that you have assigned to the Compute resource running your Kong Gateway has the necessary Entra or IAM permissions to execute commands on the desired OpenAI instances. It must have one of the following roles:
- Cognitive Services OpenAI User
- Cognitive Services OpenAI Contributor
See Azure’s documentation on managed identity to set this up.
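As a hedged example (the resource names and subscription ID below are placeholders), the role can be granted to a VM's system-assigned identity with the Azure CLI:

```shell
# Look up the principal ID of the VM's system-assigned managed identity.
# All names below are hypothetical placeholders.
PRINCIPAL_ID=$(az vm identity show \
  --resource-group my-rg --name my-kong-vm \
  --query principalId --output tsv)

# Grant it the "Cognitive Services OpenAI User" role, scoped to the
# target Azure OpenAI resource.
az role assignment create \
  --assignee-object-id "$PRINCIPAL_ID" \
  --assignee-principal-type ServicePrincipal \
  --role "Cognitive Services OpenAI User" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/my-rg/providers/Microsoft.CognitiveServices/accounts/my-openai-instance"
```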
Configuring the AI Proxy Plugin to use Azure Identity
When running Kong inside of your Azure subscription, AI Proxy is usually able to detect the designated Managed Identity or User-Assigned Identity of that Azure Compute resource, and use it accordingly.
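Under the hood, this works because Azure's Instance Metadata Service (IMDS) issues tokens to code running on the instance. A rough sketch of the kind of request this resolution performs:

```shell
# Runs only from inside an Azure compute instance. IMDS returns a bearer
# token for the Cognitive Services audience; no credentials are stored
# on the instance or in the Kong configuration.
curl -s -H "Metadata: true" \
  "http://169.254.169.254/metadata/identity/oauth2/token?api-version=2018-02-01&resource=https://cognitiveservices.azure.com/"
```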
Azure-Assigned Managed Identity
To use an Azure-Assigned Managed Identity, set up your plugin config like this:
Make the following request:
curl -X POST http://localhost:8001/routes/{routeName|Id}/plugins \
--header "accept: application/json" \
--header "Content-Type: application/json" \
--data '
{
"name": "ai-proxy",
"config": {
"route_type": "llm/v1/chat",
"auth": {
"azure_use_managed_identity": true
},
"model": {
"provider": "azure",
"name": "gpt-35-turbo",
"options": {
"azure_instance": "my-openai-instance",
"azure_deployment_id": "kong-gpt-3-5"
}
}
}
}
'
Replace `ROUTE_NAME|ID` with the `id` or `name` of the route that this plugin configuration will target.
Make the following request, substituting your own access token, region, control plane ID, and route ID:
curl -X POST \
https://{us|eu}.api.konghq.com/v2/control-planes/{controlPlaneId}/core-entities/routes/{routeId}/plugins \
--header "accept: application/json" \
--header "Content-Type: application/json" \
--header "Authorization: Bearer TOKEN" \
--data '{"name":"ai-proxy","config":{"route_type":"llm/v1/chat","auth":{"azure_use_managed_identity":true},"model":{"provider":"azure","name":"gpt-35-turbo","options":{"azure_instance":"my-openai-instance","azure_deployment_id":"kong-gpt-3-5"}}}}'
See the Konnect API reference to learn about region-specific URLs and personal access tokens.
First, create a KongPlugin resource:
echo "
apiVersion: configuration.konghq.com/v1
kind: KongPlugin
metadata:
  name: ai-proxy-example
plugin: ai-proxy
config:
  route_type: llm/v1/chat
  auth:
    azure_use_managed_identity: true
  model:
    provider: azure
    name: gpt-35-turbo
    options:
      azure_instance: my-openai-instance
      azure_deployment_id: kong-gpt-3-5
" | kubectl apply -f -
Next, apply the `KongPlugin` resource to an ingress by annotating the ingress as follows:
kubectl annotate ingress INGRESS_NAME konghq.com/plugins=ai-proxy-example
Replace `INGRESS_NAME` with the name of the ingress that this plugin configuration will target. You can see your available ingresses by running `kubectl get ingress`.
Note: The KongPlugin resource only needs to be defined once and can be applied to any service, consumer, or route in the namespace. If you want the plugin to be available cluster-wide, create the resource as a `KongClusterPlugin` instead of a `KongPlugin`.
Add this section to your declarative configuration file:
plugins:
- name: ai-proxy
  route: ROUTE_NAME|ID
  config:
    route_type: llm/v1/chat
    auth:
      azure_use_managed_identity: true
    model:
      provider: azure
      name: gpt-35-turbo
      options:
        azure_instance: my-openai-instance
        azure_deployment_id: kong-gpt-3-5
Replace `ROUTE_NAME|ID` with the `id` or `name` of the route that this plugin configuration will target.
Prerequisite: Configure your Personal Access Token
terraform {
required_providers {
konnect = {
source = "kong/konnect"
}
}
}
provider "konnect" {
personal_access_token = "kpat_YOUR_TOKEN"
server_url = "https://us.api.konghq.com/"
}
Add the following to your Terraform configuration to create a Konnect Gateway Plugin:
resource "konnect_gateway_plugin_ai_proxy" "my_ai_proxy" {
enabled = true
config = {
route_type = "llm/v1/chat"
auth = {
azure_use_managed_identity = true
}
model = {
provider = "azure"
name = "gpt-35-turbo"
options = {
azure_instance = "my-openai-instance"
azure_deployment_id = "kong-gpt-3-5"
}
}
}
control_plane_id = konnect_gateway_control_plane.my_konnect_cp.id
route = {
id = konnect_gateway_route.my_route.id
}
}
Make the following request:
curl -X POST http://localhost:8001/consumer_groups/{consumerGroupName|Id}/plugins \
--header "accept: application/json" \
--header "Content-Type: application/json" \
--data '
{
"name": "ai-proxy",
"config": {
"route_type": "llm/v1/chat",
"auth": {
"azure_use_managed_identity": true
},
"model": {
"provider": "azure",
"name": "gpt-35-turbo",
"options": {
"azure_instance": "my-openai-instance",
"azure_deployment_id": "kong-gpt-3-5"
}
}
}
}
'
Replace `CONSUMER_GROUP_NAME|ID` with the `id` or `name` of the consumer group that this plugin configuration will target.
Make the following request, substituting your own access token, region, control plane ID, and consumer group ID:
curl -X POST \
https://{us|eu}.api.konghq.com/v2/control-planes/{controlPlaneId}/core-entities/consumer_groups/{consumerGroupId}/plugins \
--header "accept: application/json" \
--header "Content-Type: application/json" \
--header "Authorization: Bearer TOKEN" \
--data '{"name":"ai-proxy","config":{"route_type":"llm/v1/chat","auth":{"azure_use_managed_identity":true},"model":{"provider":"azure","name":"gpt-35-turbo","options":{"azure_instance":"my-openai-instance","azure_deployment_id":"kong-gpt-3-5"}}}}'
See the Konnect API reference to learn about region-specific URLs and personal access tokens.
First, create a KongPlugin resource:
echo "
apiVersion: configuration.konghq.com/v1
kind: KongPlugin
metadata:
  name: ai-proxy-example
plugin: ai-proxy
config:
  route_type: llm/v1/chat
  auth:
    azure_use_managed_identity: true
  model:
    provider: azure
    name: gpt-35-turbo
    options:
      azure_instance: my-openai-instance
      azure_deployment_id: kong-gpt-3-5
" | kubectl apply -f -
Next, apply the `KongPlugin` resource to the consumer group by annotating the `KongConsumerGroup` object as follows:
kubectl annotate KongConsumerGroup CONSUMER_GROUP_NAME konghq.com/plugins=ai-proxy-example
Replace `CONSUMER_GROUP_NAME` with the name of the consumer group that this plugin configuration will target. You can see your available consumer groups by running `kubectl get KongConsumerGroup`.
Note: The KongPlugin resource only needs to be defined once and can be applied to any service, consumer, consumer group, or route in the namespace. If you want the plugin to be available cluster-wide, create the resource as a `KongClusterPlugin` instead of a `KongPlugin`.
Add this section to your declarative configuration file:
plugins:
- name: ai-proxy
  consumer_group: CONSUMER_GROUP_NAME|ID
  config:
    route_type: llm/v1/chat
    auth:
      azure_use_managed_identity: true
    model:
      provider: azure
      name: gpt-35-turbo
      options:
        azure_instance: my-openai-instance
        azure_deployment_id: kong-gpt-3-5
Replace `CONSUMER_GROUP_NAME|ID` with the `id` or `name` of the consumer group that this plugin configuration will target.
Prerequisite: Configure your Personal Access Token
terraform {
required_providers {
konnect = {
source = "kong/konnect"
}
}
}
provider "konnect" {
personal_access_token = "kpat_YOUR_TOKEN"
server_url = "https://us.api.konghq.com/"
}
Add the following to your Terraform configuration to create a Konnect Gateway Plugin:
resource "konnect_gateway_plugin_ai_proxy" "my_ai_proxy" {
enabled = true
config = {
route_type = "llm/v1/chat"
auth = {
azure_use_managed_identity = true
}
model = {
provider = "azure"
name = "gpt-35-turbo"
options = {
azure_instance = "my-openai-instance"
azure_deployment_id = "kong-gpt-3-5"
}
}
}
control_plane_id = konnect_gateway_control_plane.my_konnect_cp.id
consumer_group = {
id = konnect_gateway_consumer_group.my_consumer_group.id
}
}
Make the following request:
curl -X POST http://localhost:8001/plugins/ \
--header "accept: application/json" \
--header "Content-Type: application/json" \
--data '
{
"name": "ai-proxy",
"config": {
"route_type": "llm/v1/chat",
"auth": {
"azure_use_managed_identity": true
},
"model": {
"provider": "azure",
"name": "gpt-35-turbo",
"options": {
"azure_instance": "my-openai-instance",
"azure_deployment_id": "kong-gpt-3-5"
}
}
}
}
'
Make the following request, substituting your own access token, region, and control plane ID:
curl -X POST \
https://{us|eu}.api.konghq.com/v2/control-planes/{controlPlaneId}/core-entities/plugins/ \
--header "accept: application/json" \
--header "Content-Type: application/json" \
--header "Authorization: Bearer TOKEN" \
--data '{"name":"ai-proxy","config":{"route_type":"llm/v1/chat","auth":{"azure_use_managed_identity":true},"model":{"provider":"azure","name":"gpt-35-turbo","options":{"azure_instance":"my-openai-instance","azure_deployment_id":"kong-gpt-3-5"}}}}'
See the Konnect API reference to learn about region-specific URLs and personal access tokens.
Create a KongClusterPlugin resource and label it as global:
apiVersion: configuration.konghq.com/v1
kind: KongClusterPlugin
metadata:
  name: global-ai-proxy
  annotations:
    kubernetes.io/ingress.class: kong
  labels:
    global: "true"
config:
  route_type: llm/v1/chat
  auth:
    azure_use_managed_identity: true
  model:
    provider: azure
    name: gpt-35-turbo
    options:
      azure_instance: my-openai-instance
      azure_deployment_id: kong-gpt-3-5
plugin: ai-proxy
Add a `plugins` entry in the declarative configuration file:
plugins:
- name: ai-proxy
  config:
    route_type: llm/v1/chat
    auth:
      azure_use_managed_identity: true
    model:
      provider: azure
      name: gpt-35-turbo
      options:
        azure_instance: my-openai-instance
        azure_deployment_id: kong-gpt-3-5
Prerequisite: Configure your Personal Access Token
terraform {
required_providers {
konnect = {
source = "kong/konnect"
}
}
}
provider "konnect" {
personal_access_token = "kpat_YOUR_TOKEN"
server_url = "https://us.api.konghq.com/"
}
Add the following to your Terraform configuration to create a Konnect Gateway Plugin:
resource "konnect_gateway_plugin_ai_proxy" "my_ai_proxy" {
enabled = true
config = {
route_type = "llm/v1/chat"
auth = {
azure_use_managed_identity = true
}
model = {
provider = "azure"
name = "gpt-35-turbo"
options = {
azure_instance = "my-openai-instance"
azure_deployment_id = "kong-gpt-3-5"
}
}
}
control_plane_id = konnect_gateway_control_plane.my_konnect_cp.id
}
User-Assigned Identity
To use a User-Assigned Identity, specify its client ID like this:
Make the following request:
curl -X POST http://localhost:8001/routes/{routeName|Id}/plugins \
--header "accept: application/json" \
--header "Content-Type: application/json" \
--data '
{
"name": "ai-proxy",
"config": {
"route_type": "llm/v1/chat",
"auth": {
"azure_use_managed_identity": true,
"azure_client_id": "aabdecea-fc38-40ca-9edd-263878b290fe"
},
"model": {
"provider": "azure",
"name": "gpt-35-turbo",
"options": {
"azure_instance": "my-openai-instance",
"azure_deployment_id": "kong-gpt-3-5"
}
}
}
}
'
Replace `ROUTE_NAME|ID` with the `id` or `name` of the route that this plugin configuration will target.
Make the following request, substituting your own access token, region, control plane ID, and route ID:
curl -X POST \
https://{us|eu}.api.konghq.com/v2/control-planes/{controlPlaneId}/core-entities/routes/{routeId}/plugins \
--header "accept: application/json" \
--header "Content-Type: application/json" \
--header "Authorization: Bearer TOKEN" \
--data '{"name":"ai-proxy","config":{"route_type":"llm/v1/chat","auth":{"azure_use_managed_identity":true,"azure_client_id":"aabdecea-fc38-40ca-9edd-263878b290fe"},"model":{"provider":"azure","name":"gpt-35-turbo","options":{"azure_instance":"my-openai-instance","azure_deployment_id":"kong-gpt-3-5"}}}}'
See the Konnect API reference to learn about region-specific URLs and personal access tokens.
First, create a KongPlugin resource:
echo "
apiVersion: configuration.konghq.com/v1
kind: KongPlugin
metadata:
  name: ai-proxy-example
plugin: ai-proxy
config:
  route_type: llm/v1/chat
  auth:
    azure_use_managed_identity: true
    azure_client_id: aabdecea-fc38-40ca-9edd-263878b290fe
  model:
    provider: azure
    name: gpt-35-turbo
    options:
      azure_instance: my-openai-instance
      azure_deployment_id: kong-gpt-3-5
" | kubectl apply -f -
Next, apply the `KongPlugin` resource to an ingress by annotating the ingress as follows:
kubectl annotate ingress INGRESS_NAME konghq.com/plugins=ai-proxy-example
Replace `INGRESS_NAME` with the name of the ingress that this plugin configuration will target. You can see your available ingresses by running `kubectl get ingress`.
Note: The KongPlugin resource only needs to be defined once and can be applied to any service, consumer, or route in the namespace. If you want the plugin to be available cluster-wide, create the resource as a `KongClusterPlugin` instead of a `KongPlugin`.
Add this section to your declarative configuration file:
plugins:
- name: ai-proxy
  route: ROUTE_NAME|ID
  config:
    route_type: llm/v1/chat
    auth:
      azure_use_managed_identity: true
      azure_client_id: aabdecea-fc38-40ca-9edd-263878b290fe
    model:
      provider: azure
      name: gpt-35-turbo
      options:
        azure_instance: my-openai-instance
        azure_deployment_id: kong-gpt-3-5
Replace `ROUTE_NAME|ID` with the `id` or `name` of the route that this plugin configuration will target.
Prerequisite: Configure your Personal Access Token
terraform {
required_providers {
konnect = {
source = "kong/konnect"
}
}
}
provider "konnect" {
personal_access_token = "kpat_YOUR_TOKEN"
server_url = "https://us.api.konghq.com/"
}
Add the following to your Terraform configuration to create a Konnect Gateway Plugin:
resource "konnect_gateway_plugin_ai_proxy" "my_ai_proxy" {
enabled = true
config = {
route_type = "llm/v1/chat"
auth = {
azure_use_managed_identity = true
azure_client_id = "aabdecea-fc38-40ca-9edd-263878b290fe"
}
model = {
provider = "azure"
name = "gpt-35-turbo"
options = {
azure_instance = "my-openai-instance"
azure_deployment_id = "kong-gpt-3-5"
}
}
}
control_plane_id = konnect_gateway_control_plane.my_konnect_cp.id
route = {
id = konnect_gateway_route.my_route.id
}
}
Make the following request:
curl -X POST http://localhost:8001/consumer_groups/{consumerGroupName|Id}/plugins \
--header "accept: application/json" \
--header "Content-Type: application/json" \
--data '
{
"name": "ai-proxy",
"config": {
"route_type": "llm/v1/chat",
"auth": {
"azure_use_managed_identity": true,
"azure_client_id": "aabdecea-fc38-40ca-9edd-263878b290fe"
},
"model": {
"provider": "azure",
"name": "gpt-35-turbo",
"options": {
"azure_instance": "my-openai-instance",
"azure_deployment_id": "kong-gpt-3-5"
}
}
}
}
'
Replace `CONSUMER_GROUP_NAME|ID` with the `id` or `name` of the consumer group that this plugin configuration will target.
Make the following request, substituting your own access token, region, control plane ID, and consumer group ID:
curl -X POST \
https://{us|eu}.api.konghq.com/v2/control-planes/{controlPlaneId}/core-entities/consumer_groups/{consumerGroupId}/plugins \
--header "accept: application/json" \
--header "Content-Type: application/json" \
--header "Authorization: Bearer TOKEN" \
--data '{"name":"ai-proxy","config":{"route_type":"llm/v1/chat","auth":{"azure_use_managed_identity":true,"azure_client_id":"aabdecea-fc38-40ca-9edd-263878b290fe"},"model":{"provider":"azure","name":"gpt-35-turbo","options":{"azure_instance":"my-openai-instance","azure_deployment_id":"kong-gpt-3-5"}}}}'
See the Konnect API reference to learn about region-specific URLs and personal access tokens.
First, create a KongPlugin resource:
echo "
apiVersion: configuration.konghq.com/v1
kind: KongPlugin
metadata:
  name: ai-proxy-example
plugin: ai-proxy
config:
  route_type: llm/v1/chat
  auth:
    azure_use_managed_identity: true
    azure_client_id: aabdecea-fc38-40ca-9edd-263878b290fe
  model:
    provider: azure
    name: gpt-35-turbo
    options:
      azure_instance: my-openai-instance
      azure_deployment_id: kong-gpt-3-5
" | kubectl apply -f -
Next, apply the `KongPlugin` resource to the consumer group by annotating the `KongConsumerGroup` object as follows:
kubectl annotate KongConsumerGroup CONSUMER_GROUP_NAME konghq.com/plugins=ai-proxy-example
Replace `CONSUMER_GROUP_NAME` with the name of the consumer group that this plugin configuration will target. You can see your available consumer groups by running `kubectl get KongConsumerGroup`.
Note: The KongPlugin resource only needs to be defined once and can be applied to any service, consumer, consumer group, or route in the namespace. If you want the plugin to be available cluster-wide, create the resource as a `KongClusterPlugin` instead of a `KongPlugin`.
Add this section to your declarative configuration file:
plugins:
- name: ai-proxy
  consumer_group: CONSUMER_GROUP_NAME|ID
  config:
    route_type: llm/v1/chat
    auth:
      azure_use_managed_identity: true
      azure_client_id: aabdecea-fc38-40ca-9edd-263878b290fe
    model:
      provider: azure
      name: gpt-35-turbo
      options:
        azure_instance: my-openai-instance
        azure_deployment_id: kong-gpt-3-5
Replace `CONSUMER_GROUP_NAME|ID` with the `id` or `name` of the consumer group that this plugin configuration will target.
Prerequisite: Configure your Personal Access Token
terraform {
required_providers {
konnect = {
source = "kong/konnect"
}
}
}
provider "konnect" {
personal_access_token = "kpat_YOUR_TOKEN"
server_url = "https://us.api.konghq.com/"
}
Add the following to your Terraform configuration to create a Konnect Gateway Plugin:
resource "konnect_gateway_plugin_ai_proxy" "my_ai_proxy" {
enabled = true
config = {
route_type = "llm/v1/chat"
auth = {
azure_use_managed_identity = true
azure_client_id = "aabdecea-fc38-40ca-9edd-263878b290fe"
}
model = {
provider = "azure"
name = "gpt-35-turbo"
options = {
azure_instance = "my-openai-instance"
azure_deployment_id = "kong-gpt-3-5"
}
}
}
control_plane_id = konnect_gateway_control_plane.my_konnect_cp.id
consumer_group = {
id = konnect_gateway_consumer_group.my_consumer_group.id
}
}
Make the following request:
curl -X POST http://localhost:8001/plugins/ \
--header "accept: application/json" \
--header "Content-Type: application/json" \
--data '
{
"name": "ai-proxy",
"config": {
"route_type": "llm/v1/chat",
"auth": {
"azure_use_managed_identity": true,
"azure_client_id": "aabdecea-fc38-40ca-9edd-263878b290fe"
},
"model": {
"provider": "azure",
"name": "gpt-35-turbo",
"options": {
"azure_instance": "my-openai-instance",
"azure_deployment_id": "kong-gpt-3-5"
}
}
}
}
'
Make the following request, substituting your own access token, region, and control plane ID:
curl -X POST \
https://{us|eu}.api.konghq.com/v2/control-planes/{controlPlaneId}/core-entities/plugins/ \
--header "accept: application/json" \
--header "Content-Type: application/json" \
--header "Authorization: Bearer TOKEN" \
--data '{"name":"ai-proxy","config":{"route_type":"llm/v1/chat","auth":{"azure_use_managed_identity":true,"azure_client_id":"aabdecea-fc38-40ca-9edd-263878b290fe"},"model":{"provider":"azure","name":"gpt-35-turbo","options":{"azure_instance":"my-openai-instance","azure_deployment_id":"kong-gpt-3-5"}}}}'
See the Konnect API reference to learn about region-specific URLs and personal access tokens.
Create a KongClusterPlugin resource and label it as global:
apiVersion: configuration.konghq.com/v1
kind: KongClusterPlugin
metadata:
  name: global-ai-proxy
  annotations:
    kubernetes.io/ingress.class: kong
  labels:
    global: "true"
config:
  route_type: llm/v1/chat
  auth:
    azure_use_managed_identity: true
    azure_client_id: aabdecea-fc38-40ca-9edd-263878b290fe
  model:
    provider: azure
    name: gpt-35-turbo
    options:
      azure_instance: my-openai-instance
      azure_deployment_id: kong-gpt-3-5
plugin: ai-proxy
Add a `plugins` entry in the declarative configuration file:
plugins:
- name: ai-proxy
  config:
    route_type: llm/v1/chat
    auth:
      azure_use_managed_identity: true
      azure_client_id: aabdecea-fc38-40ca-9edd-263878b290fe
    model:
      provider: azure
      name: gpt-35-turbo
      options:
        azure_instance: my-openai-instance
        azure_deployment_id: kong-gpt-3-5
Prerequisite: Configure your Personal Access Token
terraform {
required_providers {
konnect = {
source = "kong/konnect"
}
}
}
provider "konnect" {
personal_access_token = "kpat_YOUR_TOKEN"
server_url = "https://us.api.konghq.com/"
}
Add the following to your Terraform configuration to create a Konnect Gateway Plugin:
resource "konnect_gateway_plugin_ai_proxy" "my_ai_proxy" {
enabled = true
config = {
route_type = "llm/v1/chat"
auth = {
azure_use_managed_identity = true
azure_client_id = "aabdecea-fc38-40ca-9edd-263878b290fe"
}
model = {
provider = "azure"
name = "gpt-35-turbo"
options = {
azure_instance = "my-openai-instance"
azure_deployment_id = "kong-gpt-3-5"
}
}
}
control_plane_id = konnect_gateway_control_plane.my_konnect_cp.id
}
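The User-Assigned flow above can be sketched end to end with the Azure CLI (all resource names are hypothetical): create the identity, attach it to the VM running Kong Gateway, and read back the client ID that goes into `azure_client_id`.

```shell
# Hypothetical resource names. Create a user-assigned identity and
# attach it to the VM that runs Kong Gateway.
az identity create --resource-group my-rg --name kong-identity
az vm identity assign --resource-group my-rg --name my-kong-vm \
  --identities kong-identity

# The clientId value is what the plugin's azure_client_id expects.
az identity show --resource-group my-rg --name kong-identity \
  --query clientId --output tsv
```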
Using Entra or app registration
When running Kong outside of Azure, you can use an Entra principal or app registration by specifying all of its properties:
Make the following request:
curl -X POST http://localhost:8001/routes/{routeName|Id}/plugins \
--header "accept: application/json" \
--header "Content-Type: application/json" \
--data '
{
"name": "ai-proxy",
"config": {
"route_type": "llm/v1/chat",
"auth": {
"azure_use_managed_identity": true,
"azure_client_id": "aabdecea-fc38-40ca-9edd-263878b290fe",
"azure_client_secret": "be0c34b6-b5f1-4343-99a3-140df73e0c1c",
"azure_tenant_id": "1e583ecd-9293-4db1-b1c0-2b6126cb5fdd"
},
"model": {
"provider": "azure",
"name": "gpt-35-turbo",
"options": {
"azure_instance": "my-openai-instance",
"azure_deployment_id": "kong-gpt-3-5"
}
}
}
}
'
Replace `ROUTE_NAME|ID` with the `id` or `name` of the route that this plugin configuration will target.
Make the following request, substituting your own access token, region, control plane ID, and route ID:
curl -X POST \
https://{us|eu}.api.konghq.com/v2/control-planes/{controlPlaneId}/core-entities/routes/{routeId}/plugins \
--header "accept: application/json" \
--header "Content-Type: application/json" \
--header "Authorization: Bearer TOKEN" \
--data '{"name":"ai-proxy","config":{"route_type":"llm/v1/chat","auth":{"azure_use_managed_identity":true,"azure_client_id":"aabdecea-fc38-40ca-9edd-263878b290fe","azure_client_secret":"be0c34b6-b5f1-4343-99a3-140df73e0c1c","azure_tenant_id":"1e583ecd-9293-4db1-b1c0-2b6126cb5fdd"},"model":{"provider":"azure","name":"gpt-35-turbo","options":{"azure_instance":"my-openai-instance","azure_deployment_id":"kong-gpt-3-5"}}}}'
See the Konnect API reference to learn about region-specific URLs and personal access tokens.
First, create a KongPlugin resource:
echo "
apiVersion: configuration.konghq.com/v1
kind: KongPlugin
metadata:
name: ai-proxy-example
plugin: ai-proxy
config:
route_type: llm/v1/chat
auth:
azure_use_managed_identity: true
azure_client_id: aabdecea-fc38-40ca-9edd-263878b290fe
azure_client_secret: be0c34b6-b5f1-4343-99a3-140df73e0c1c
azure_tenant_id: 1e583ecd-9293-4db1-b1c0-2b6126cb5fdd
model:
provider: azure
name: gpt-35-turbo
options:
azure_instance: my-openai-instance
azure_deployment_id: kong-gpt-3-5
" | kubectl apply -f -
Next, apply the KongPlugin
resource to an ingress by annotating the ingress
as follows:
kubectl annotate ingress INGRESS_NAME konghq.com/plugins=ai-proxy-example
Replace INGRESS_NAME
with the name of the ingress that this plugin configuration will target.
You can see your available ingresses by running kubectl get ingress
.
Note: The KongPlugin resource only needs to be defined once and can be applied to any service, consumer, or route in the namespace. If you want the plugin to be available cluster-wide, create the resource as aKongClusterPlugin
instead ofKongPlugin
.
Add this section to your declarative configuration file:
plugins:
- name: ai-proxy
  route: ROUTE_NAME|ID
  config:
    route_type: llm/v1/chat
    auth:
      azure_use_managed_identity: true
      azure_client_id: aabdecea-fc38-40ca-9edd-263878b290fe
      azure_client_secret: be0c34b6-b5f1-4343-99a3-140df73e0c1c
      azure_tenant_id: 1e583ecd-9293-4db1-b1c0-2b6126cb5fdd
    model:
      provider: azure
      name: gpt-35-turbo
      options:
        azure_instance: my-openai-instance
        azure_deployment_id: kong-gpt-3-5
Replace `ROUTE_NAME|ID` with the `id` or `name` of the route that this plugin configuration will target.
Prerequisite: Configure your Personal Access Token
terraform {
required_providers {
konnect = {
source = "kong/konnect"
}
}
}
provider "konnect" {
personal_access_token = "kpat_YOUR_TOKEN"
server_url = "https://us.api.konghq.com/"
}
Add the following to your Terraform configuration to create a Konnect Gateway Plugin:
resource "konnect_gateway_plugin_ai_proxy" "my_ai_proxy" {
enabled = true
config = {
route_type = "llm/v1/chat"
auth = {
azure_use_managed_identity = true
azure_client_id = "aabdecea-fc38-40ca-9edd-263878b290fe"
azure_client_secret = "be0c34b6-b5f1-4343-99a3-140df73e0c1c"
azure_tenant_id = "1e583ecd-9293-4db1-b1c0-2b6126cb5fdd"
}
model = {
provider = "azure"
name = "gpt-35-turbo"
options = {
azure_instance = "my-openai-instance"
azure_deployment_id = "kong-gpt-3-5"
}
}
}
control_plane_id = konnect_gateway_control_plane.my_konnect_cp.id
route = {
id = konnect_gateway_route.my_route.id
}
}
Make the following request:
curl -X POST http://localhost:8001/consumer_groups/{consumerGroupName|Id}/plugins \
--header "accept: application/json" \
--header "Content-Type: application/json" \
--data '
{
"name": "ai-proxy",
"config": {
"route_type": "llm/v1/chat",
"auth": {
"azure_use_managed_identity": true,
"azure_client_id": "aabdecea-fc38-40ca-9edd-263878b290fe",
"azure_client_secret": "be0c34b6-b5f1-4343-99a3-140df73e0c1c",
"azure_tenant_id": "1e583ecd-9293-4db1-b1c0-2b6126cb5fdd"
},
"model": {
"provider": "azure",
"name": "gpt-35-turbo",
"options": {
"azure_instance": "my-openai-instance",
"azure_deployment_id": "kong-gpt-3-5"
}
}
}
}
'
Replace `CONSUMER_GROUP_NAME|ID` with the `id` or `name` of the consumer group that this plugin configuration will target.
Make the following request, substituting your own access token, region, control plane ID, and consumer group ID:
curl -X POST \
https://{us|eu}.api.konghq.com/v2/control-planes/{controlPlaneId}/core-entities/consumer_groups/{consumerGroupId}/plugins \
--header "accept: application/json" \
--header "Content-Type: application/json" \
--header "Authorization: Bearer TOKEN" \
--data '{"name":"ai-proxy","config":{"route_type":"llm/v1/chat","auth":{"azure_use_managed_identity":true,"azure_client_id":"aabdecea-fc38-40ca-9edd-263878b290fe","azure_client_secret":"be0c34b6-b5f1-4343-99a3-140df73e0c1c","azure_tenant_id":"1e583ecd-9293-4db1-b1c0-2b6126cb5fdd"},"model":{"provider":"azure","name":"gpt-35-turbo","options":{"azure_instance":"my-openai-instance","azure_deployment_id":"kong-gpt-3-5"}}}}'
See the Konnect API reference to learn about region-specific URLs and personal access tokens.
First, create a KongPlugin resource:
echo "
apiVersion: configuration.konghq.com/v1
kind: KongPlugin
metadata:
name: ai-proxy-example
plugin: ai-proxy
config:
route_type: llm/v1/chat
auth:
azure_use_managed_identity: true
azure_client_id: aabdecea-fc38-40ca-9edd-263878b290fe
azure_client_secret: be0c34b6-b5f1-4343-99a3-140df73e0c1c
azure_tenant_id: 1e583ecd-9293-4db1-b1c0-2b6126cb5fdd
model:
provider: azure
name: gpt-35-turbo
options:
azure_instance: my-openai-instance
azure_deployment_id: kong-gpt-3-5
" | kubectl apply -f -
Next, apply the KongPlugin
resource to an ingress by annotating the KongConsumerGroup
object as follows:
kubectl annotate KongConsumerGroup CONSUMER_GROUP_NAME konghq.com/plugins=ai-proxy-example
Replace CONSUMER_GROUP_NAME
with the name of the consumer group that this plugin configuration will target.
You can see your available consumer groups by running kubectl get KongConsumerGroup
.
Note: The KongPlugin resource only needs to be defined once and can be applied to any service, consumer, consumer group, or route in the namespace. If you want the plugin to be available cluster-wide, create the resource as aKongClusterPlugin
instead ofKongPlugin
.
Add this section to your declarative configuration file:
plugins:
- name: ai-proxy
  consumer_group: CONSUMER_GROUP_NAME|ID
  config:
    route_type: llm/v1/chat
    auth:
      azure_use_managed_identity: true
      azure_client_id: aabdecea-fc38-40ca-9edd-263878b290fe
      azure_client_secret: be0c34b6-b5f1-4343-99a3-140df73e0c1c
      azure_tenant_id: 1e583ecd-9293-4db1-b1c0-2b6126cb5fdd
    model:
      provider: azure
      name: gpt-35-turbo
      options:
        azure_instance: my-openai-instance
        azure_deployment_id: kong-gpt-3-5
Replace `CONSUMER_GROUP_NAME|ID` with the `id` or `name` of the consumer group that this plugin configuration will target.
Prerequisite: Configure your Personal Access Token
terraform {
required_providers {
konnect = {
source = "kong/konnect"
}
}
}
provider "konnect" {
personal_access_token = "kpat_YOUR_TOKEN"
server_url = "https://us.api.konghq.com/"
}
Add the following to your Terraform configuration to create a Konnect Gateway Plugin:
resource "konnect_gateway_plugin_ai_proxy" "my_ai_proxy" {
enabled = true
config = {
route_type = "llm/v1/chat"
auth = {
azure_use_managed_identity = true
azure_client_id = "aabdecea-fc38-40ca-9edd-263878b290fe"
azure_client_secret = "be0c34b6-b5f1-4343-99a3-140df73e0c1c"
azure_tenant_id = "1e583ecd-9293-4db1-b1c0-2b6126cb5fdd"
}
model = {
provider = "azure"
name = "gpt-35-turbo"
options = {
azure_instance = "my-openai-instance"
azure_deployment_id = "kong-gpt-3-5"
}
}
}
control_plane_id = konnect_gateway_control_plane.my_konnect_cp.id
consumer_group = {
id = konnect_gateway_consumer_group.my_consumer_group.id
}
}
Make the following request:
curl -X POST http://localhost:8001/plugins/ \
--header "accept: application/json" \
--header "Content-Type: application/json" \
--data '
{
"name": "ai-proxy",
"config": {
"route_type": "llm/v1/chat",
"auth": {
"azure_use_managed_identity": true,
"azure_client_id": "aabdecea-fc38-40ca-9edd-263878b290fe",
"azure_client_secret": "be0c34b6-b5f1-4343-99a3-140df73e0c1c",
"azure_tenant_id": "1e583ecd-9293-4db1-b1c0-2b6126cb5fdd"
},
"model": {
"provider": "azure",
"name": "gpt-35-turbo",
"options": {
"azure_instance": "my-openai-instance",
"azure_deployment_id": "kong-gpt-3-5"
}
}
}
}
'
Make the following request, substituting your own access token, region, and control plane ID:
curl -X POST \
https://{us|eu}.api.konghq.com/v2/control-planes/{controlPlaneId}/core-entities/plugins/ \
--header "accept: application/json" \
--header "Content-Type: application/json" \
--header "Authorization: Bearer TOKEN" \
--data '{"name":"ai-proxy","config":{"route_type":"llm/v1/chat","auth":{"azure_use_managed_identity":true,"azure_client_id":"aabdecea-fc38-40ca-9edd-263878b290fe","azure_client_secret":"be0c34b6-b5f1-4343-99a3-140df73e0c1c","azure_tenant_id":"1e583ecd-9293-4db1-b1c0-2b6126cb5fdd"},"model":{"provider":"azure","name":"gpt-35-turbo","options":{"azure_instance":"my-openai-instance","azure_deployment_id":"kong-gpt-3-5"}}}}'
See the Konnect API reference to learn about region-specific URLs and personal access tokens.
Create a KongClusterPlugin resource and label it as global:
apiVersion: configuration.konghq.com/v1
kind: KongClusterPlugin
metadata:
  name: global-ai-proxy
annotations:
kubernetes.io/ingress.class: kong
labels:
global: "true"
config:
route_type: llm/v1/chat
auth:
azure_use_managed_identity: true
azure_client_id: aabdecea-fc38-40ca-9edd-263878b290fe
azure_client_secret: be0c34b6-b5f1-4343-99a3-140df73e0c1c
azure_tenant_id: 1e583ecd-9293-4db1-b1c0-2b6126cb5fdd
model:
provider: azure
name: gpt-35-turbo
options:
azure_instance: my-openai-instance
azure_deployment_id: kong-gpt-3-5
plugin: ai-proxy
Add a plugins entry in the declarative configuration file:
plugins:
- name: ai-proxy
config:
route_type: llm/v1/chat
auth:
azure_use_managed_identity: true
azure_client_id: aabdecea-fc38-40ca-9edd-263878b290fe
azure_client_secret: be0c34b6-b5f1-4343-99a3-140df73e0c1c
azure_tenant_id: 1e583ecd-9293-4db1-b1c0-2b6126cb5fdd
model:
provider: azure
name: gpt-35-turbo
options:
azure_instance: my-openai-instance
azure_deployment_id: kong-gpt-3-5
Prerequisite: Configure your Personal Access Token
terraform {
required_providers {
konnect = {
source = "kong/konnect"
}
}
}
provider "konnect" {
personal_access_token = "kpat_YOUR_TOKEN"
server_url = "https://us.api.konghq.com/"
}
Add the following to your Terraform configuration to create a Konnect Gateway Plugin:
resource "konnect_gateway_plugin_ai_proxy" "my_ai_proxy" {
enabled = true
config = {
route_type = "llm/v1/chat"
auth = {
azure_use_managed_identity = true
azure_client_id = "aabdecea-fc38-40ca-9edd-263878b290fe"
azure_client_secret = "be0c34b6-b5f1-4343-99a3-140df73e0c1c"
azure_tenant_id = "1e583ecd-9293-4db1-b1c0-2b6126cb5fdd"
}
model = {
provider = "azure"
name = "gpt-35-turbo"
options = {
azure_instance = "my-openai-instance"
azure_deployment_id = "kong-gpt-3-5"
}
}
}
control_plane_id = konnect_gateway_control_plane.my_konnect_cp.id
}
Environment variables
You can also specify some (or all) of these properties as environment variables. For example:
Environment variable:
AZURE_CLIENT_SECRET="be0c34b6-b5f1-4343-99a3-140df73e0c1c"
Plugin configuration:
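With AZURE_CLIENT_SECRET set in Kong Gateway's environment, you can omit azure_client_secret from the plugin configuration entirely. A minimal sketch of the resulting config block, mirroring the examples below:

```yaml
# azure_client_secret is intentionally omitted: Kong Gateway reads it
# from the AZURE_CLIENT_SECRET environment variable instead, keeping
# the secret out of the declarative or database configuration.
config:
  route_type: llm/v1/chat
  auth:
    azure_use_managed_identity: true
    azure_client_id: aabdecea-fc38-40ca-9edd-263878b290fe
    azure_tenant_id: 1e583ecd-9293-4db1-b1c0-2b6126cb5fdd
  model:
    provider: azure
    name: gpt-35-turbo
    options:
      azure_instance: my-openai-instance
      azure_deployment_id: kong-gpt-3-5
```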
Make the following request:
curl -X POST http://localhost:8001/routes/{routeName|Id}/plugins \
--header "accept: application/json" \
--header "Content-Type: application/json" \
--data '
{
"name": "ai-proxy",
"config": {
"route_type": "llm/v1/chat",
"auth": {
"azure_use_managed_identity": true,
"azure_client_id": "aabdecea-fc38-40ca-9edd-263878b290fe",
"azure_tenant_id": "1e583ecd-9293-4db1-b1c0-2b6126cb5fdd"
},
"model": {
"provider": "azure",
"name": "gpt-35-turbo",
"options": {
"azure_instance": "my-openai-instance",
"azure_deployment_id": "kong-gpt-3-5"
}
}
}
}
'
Replace {routeName|Id} in the request URL with the id or name of the route that this plugin configuration will target.
Make the following request, substituting your own access token, region, control plane ID, and route ID:
curl -X POST \
https://{us|eu}.api.konghq.com/v2/control-planes/{controlPlaneId}/core-entities/routes/{routeId}/plugins \
--header "accept: application/json" \
--header "Content-Type: application/json" \
--header "Authorization: Bearer TOKEN" \
--data '{"name":"ai-proxy","config":{"route_type":"llm/v1/chat","auth":{"azure_use_managed_identity":true,"azure_client_id":"aabdecea-fc38-40ca-9edd-263878b290fe","azure_tenant_id":"1e583ecd-9293-4db1-b1c0-2b6126cb5fdd"},"model":{"provider":"azure","name":"gpt-35-turbo","options":{"azure_instance":"my-openai-instance","azure_deployment_id":"kong-gpt-3-5"}}}}'
See the Konnect API reference to learn about region-specific URLs and personal access tokens.
First, create a KongPlugin resource:
echo "
apiVersion: configuration.konghq.com/v1
kind: KongPlugin
metadata:
name: ai-proxy-example
plugin: ai-proxy
config:
route_type: llm/v1/chat
auth:
azure_use_managed_identity: true
azure_client_id: aabdecea-fc38-40ca-9edd-263878b290fe
azure_tenant_id: 1e583ecd-9293-4db1-b1c0-2b6126cb5fdd
model:
provider: azure
name: gpt-35-turbo
options:
azure_instance: my-openai-instance
azure_deployment_id: kong-gpt-3-5
" | kubectl apply -f -
Next, apply the KongPlugin
resource to an ingress by annotating the ingress
as follows:
kubectl annotate ingress INGRESS_NAME konghq.com/plugins=ai-proxy-example
Replace INGRESS_NAME with the name of the ingress that this plugin configuration will target. You can see your available ingresses by running kubectl get ingress.
Note: The KongPlugin resource only needs to be defined once and can be applied to any service, consumer, or route in the namespace. If you want the plugin to be available cluster-wide, create the resource as a KongClusterPlugin instead of a KongPlugin.
Add this section to your declarative configuration file:
plugins:
- name: ai-proxy
route: ROUTE_NAME|ID
config:
route_type: llm/v1/chat
auth:
azure_use_managed_identity: true
azure_client_id: aabdecea-fc38-40ca-9edd-263878b290fe
azure_tenant_id: 1e583ecd-9293-4db1-b1c0-2b6126cb5fdd
model:
provider: azure
name: gpt-35-turbo
options:
azure_instance: my-openai-instance
azure_deployment_id: kong-gpt-3-5
Replace ROUTE_NAME|ID with the id or name of the route that this plugin configuration will target.
Prerequisite: Configure your Personal Access Token
terraform {
required_providers {
konnect = {
source = "kong/konnect"
}
}
}
provider "konnect" {
personal_access_token = "kpat_YOUR_TOKEN"
server_url = "https://us.api.konghq.com/"
}
Add the following to your Terraform configuration to create a Konnect Gateway Plugin:
resource "konnect_gateway_plugin_ai_proxy" "my_ai_proxy" {
enabled = true
config = {
route_type = "llm/v1/chat"
auth = {
azure_use_managed_identity = true
azure_client_id = "aabdecea-fc38-40ca-9edd-263878b290fe"
azure_tenant_id = "1e583ecd-9293-4db1-b1c0-2b6126cb5fdd"
}
model = {
provider = "azure"
name = "gpt-35-turbo"
options = {
azure_instance = "my-openai-instance"
azure_deployment_id = "kong-gpt-3-5"
}
}
}
control_plane_id = konnect_gateway_control_plane.my_konnect_cp.id
route = {
id = konnect_gateway_route.my_route.id
}
}
Make the following request:
curl -X POST http://localhost:8001/consumer_groups/{consumerGroupName|Id}/plugins \
--header "accept: application/json" \
--header "Content-Type: application/json" \
--data '
{
"name": "ai-proxy",
"config": {
"route_type": "llm/v1/chat",
"auth": {
"azure_use_managed_identity": true,
"azure_client_id": "aabdecea-fc38-40ca-9edd-263878b290fe",
"azure_tenant_id": "1e583ecd-9293-4db1-b1c0-2b6126cb5fdd"
},
"model": {
"provider": "azure",
"name": "gpt-35-turbo",
"options": {
"azure_instance": "my-openai-instance",
"azure_deployment_id": "kong-gpt-3-5"
}
}
}
}
'
Replace {consumerGroupName|Id} in the request URL with the id or name of the consumer group that this plugin configuration will target.
Make the following request, substituting your own access token, region, control plane ID, and consumer group ID:
curl -X POST \
https://{us|eu}.api.konghq.com/v2/control-planes/{controlPlaneId}/core-entities/consumer_groups/{consumerGroupId}/plugins \
--header "accept: application/json" \
--header "Content-Type: application/json" \
--header "Authorization: Bearer TOKEN" \
--data '{"name":"ai-proxy","config":{"route_type":"llm/v1/chat","auth":{"azure_use_managed_identity":true,"azure_client_id":"aabdecea-fc38-40ca-9edd-263878b290fe","azure_tenant_id":"1e583ecd-9293-4db1-b1c0-2b6126cb5fdd"},"model":{"provider":"azure","name":"gpt-35-turbo","options":{"azure_instance":"my-openai-instance","azure_deployment_id":"kong-gpt-3-5"}}}}'
See the Konnect API reference to learn about region-specific URLs and personal access tokens.
First, create a KongPlugin resource:
echo "
apiVersion: configuration.konghq.com/v1
kind: KongPlugin
metadata:
name: ai-proxy-example
plugin: ai-proxy
config:
route_type: llm/v1/chat
auth:
azure_use_managed_identity: true
azure_client_id: aabdecea-fc38-40ca-9edd-263878b290fe
azure_tenant_id: 1e583ecd-9293-4db1-b1c0-2b6126cb5fdd
model:
provider: azure
name: gpt-35-turbo
options:
azure_instance: my-openai-instance
azure_deployment_id: kong-gpt-3-5
" | kubectl apply -f -
Next, apply the KongPlugin resource to a consumer group by annotating the KongConsumerGroup object as follows:
kubectl annotate KongConsumerGroup CONSUMER_GROUP_NAME konghq.com/plugins=ai-proxy-example
Replace CONSUMER_GROUP_NAME with the name of the consumer group that this plugin configuration will target. You can see your available consumer groups by running kubectl get KongConsumerGroup.
Note: The KongPlugin resource only needs to be defined once and can be applied to any service, consumer, consumer group, or route in the namespace. If you want the plugin to be available cluster-wide, create the resource as a KongClusterPlugin instead of a KongPlugin.
Add this section to your declarative configuration file:
plugins:
- name: ai-proxy
consumer_group: CONSUMER_GROUP_NAME|ID
config:
route_type: llm/v1/chat
auth:
azure_use_managed_identity: true
azure_client_id: aabdecea-fc38-40ca-9edd-263878b290fe
azure_tenant_id: 1e583ecd-9293-4db1-b1c0-2b6126cb5fdd
model:
provider: azure
name: gpt-35-turbo
options:
azure_instance: my-openai-instance
azure_deployment_id: kong-gpt-3-5
Replace CONSUMER_GROUP_NAME|ID with the id or name of the consumer group that this plugin configuration will target.
Prerequisite: Configure your Personal Access Token
terraform {
required_providers {
konnect = {
source = "kong/konnect"
}
}
}
provider "konnect" {
personal_access_token = "kpat_YOUR_TOKEN"
server_url = "https://us.api.konghq.com/"
}
Add the following to your Terraform configuration to create a Konnect Gateway Plugin:
resource "konnect_gateway_plugin_ai_proxy" "my_ai_proxy" {
enabled = true
config = {
route_type = "llm/v1/chat"
auth = {
azure_use_managed_identity = true
azure_client_id = "aabdecea-fc38-40ca-9edd-263878b290fe"
azure_tenant_id = "1e583ecd-9293-4db1-b1c0-2b6126cb5fdd"
}
model = {
provider = "azure"
name = "gpt-35-turbo"
options = {
azure_instance = "my-openai-instance"
azure_deployment_id = "kong-gpt-3-5"
}
}
}
control_plane_id = konnect_gateway_control_plane.my_konnect_cp.id
consumer_group = {
id = konnect_gateway_consumer_group.my_consumer_group.id
}
}
Make the following request:
curl -X POST http://localhost:8001/plugins/ \
--header "accept: application/json" \
--header "Content-Type: application/json" \
--data '
{
"name": "ai-proxy",
"config": {
"route_type": "llm/v1/chat",
"auth": {
"azure_use_managed_identity": true,
"azure_client_id": "aabdecea-fc38-40ca-9edd-263878b290fe",
"azure_tenant_id": "1e583ecd-9293-4db1-b1c0-2b6126cb5fdd"
},
"model": {
"provider": "azure",
"name": "gpt-35-turbo",
"options": {
"azure_instance": "my-openai-instance",
"azure_deployment_id": "kong-gpt-3-5"
}
}
}
}
'
Make the following request, substituting your own access token, region, and control plane ID:
curl -X POST \
https://{us|eu}.api.konghq.com/v2/control-planes/{controlPlaneId}/core-entities/plugins/ \
--header "accept: application/json" \
--header "Content-Type: application/json" \
--header "Authorization: Bearer TOKEN" \
--data '{"name":"ai-proxy","config":{"route_type":"llm/v1/chat","auth":{"azure_use_managed_identity":true,"azure_client_id":"aabdecea-fc38-40ca-9edd-263878b290fe","azure_tenant_id":"1e583ecd-9293-4db1-b1c0-2b6126cb5fdd"},"model":{"provider":"azure","name":"gpt-35-turbo","options":{"azure_instance":"my-openai-instance","azure_deployment_id":"kong-gpt-3-5"}}}}'
See the Konnect API reference to learn about region-specific URLs and personal access tokens.
Create a KongClusterPlugin resource and label it as global:
apiVersion: configuration.konghq.com/v1
kind: KongClusterPlugin
metadata:
  name: global-ai-proxy
annotations:
kubernetes.io/ingress.class: kong
labels:
global: "true"
config:
route_type: llm/v1/chat
auth:
azure_use_managed_identity: true
azure_client_id: aabdecea-fc38-40ca-9edd-263878b290fe
azure_tenant_id: 1e583ecd-9293-4db1-b1c0-2b6126cb5fdd
model:
provider: azure
name: gpt-35-turbo
options:
azure_instance: my-openai-instance
azure_deployment_id: kong-gpt-3-5
plugin: ai-proxy
Add a plugins entry in the declarative configuration file:
plugins:
- name: ai-proxy
config:
route_type: llm/v1/chat
auth:
azure_use_managed_identity: true
azure_client_id: aabdecea-fc38-40ca-9edd-263878b290fe
azure_tenant_id: 1e583ecd-9293-4db1-b1c0-2b6126cb5fdd
model:
provider: azure
name: gpt-35-turbo
options:
azure_instance: my-openai-instance
azure_deployment_id: kong-gpt-3-5
Prerequisite: Configure your Personal Access Token
terraform {
required_providers {
konnect = {
source = "kong/konnect"
}
}
}
provider "konnect" {
personal_access_token = "kpat_YOUR_TOKEN"
server_url = "https://us.api.konghq.com/"
}
Add the following to your Terraform configuration to create a Konnect Gateway Plugin:
resource "konnect_gateway_plugin_ai_proxy" "my_ai_proxy" {
enabled = true
config = {
route_type = "llm/v1/chat"
auth = {
azure_use_managed_identity = true
azure_client_id = "aabdecea-fc38-40ca-9edd-263878b290fe"
azure_tenant_id = "1e583ecd-9293-4db1-b1c0-2b6126cb5fdd"
}
model = {
provider = "azure"
name = "gpt-35-turbo"
options = {
azure_instance = "my-openai-instance"
azure_deployment_id = "kong-gpt-3-5"
}
}
}
control_plane_id = konnect_gateway_control_plane.my_konnect_cp.id
}