The AI Prompt Template plugin lets you provide tuned AI prompts to users.
Users only need to fill in the blanks, which the template marks with variable placeholders in the following format: {{variable}}.
This lets admins set up templates, which can then be used by anyone in the organization. It also allows admins to present an LLM
as an API in its own right - for example, a bot that can provide software class examples and/or suggestions.
This plugin also sanitizes string inputs to ensure that JSON control characters are escaped, preventing arbitrary prompt injection.
When activated, the plugin restricts LLM usage to the predefined templates. Templates are defined in the following format, shown here as decK declarative configuration:
_format_version: "3.0"
plugins:
  - name: ai-prompt-template
    config:
      templates:
        - name: sample-template
          template: |-
            {
              "messages": [
                {
                  "role": "user",
                  "content": "Explain to me what {{thing}} is."
                }
              ]
            }
To configure the plugin with the Kong Admin API, make the following request:

curl -i -X POST http://localhost:8001/plugins/ \
  --header "Accept: application/json" \
  --header "Content-Type: application/json" \
  --data '
{
  "name": "ai-prompt-template",
  "config": {
    "templates": [
      {
        "name": "sample-template",
        "template": "{\n \"messages\": [\n {\n \"role\": \"user\",\n \"content\": \"Explain to me what {{thing}} is.\"\n }\n ]\n}"
      }
    ]
  }
}
'
To configure the plugin with the Konnect API, make the following request:
curl -X POST https://{region}.api.konghq.com/v2/control-planes/{controlPlaneId}/core-entities/plugins/ \
  --header "accept: application/json" \
  --header "Content-Type: application/json" \
  --header "Authorization: Bearer $KONNECT_TOKEN" \
  --data '
{
  "name": "ai-prompt-template",
  "config": {
    "templates": [
      {
        "name": "sample-template",
        "template": "{\n \"messages\": [\n {\n \"role\": \"user\",\n \"content\": \"Explain to me what {{thing}} is.\"\n }\n ]\n}"
      }
    ]
  }
}
'
echo "apiVersion:configuration.konghq.com/v1kind:KongClusterPluginmetadata:name:ai-prompt-templatenamespace:kongannotations:kubernetes.io/ingress.class:konglabels:global:'true'config:templates:name:sample-templatetemplate:|-{'messages': [{'role': 'user','content': 'Explain to me what {{thing}} is.'}]}plugin:ai-prompt-template"|kubectlapply-f-
resource"konnect_gateway_plugin_ai_prompt_template""my_ai_prompt_template"{enabled=trueconfig={templates={name="sample-template"template=<<EOF{"messages":[{"role":"user","content":"Explain to me what {{thing}} is."}]}EOF}}control_plane_id=konnect_gateway_control_plane.my_konnect_cp.id}
When calling a template, replace the content of messages (llm/v1/chat) or prompt (llm/v1/completions) with a reference to the template by name.
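For example, a request to a route served by this plugin might look like the following sketch, assuming a route exposed at /my-route (a placeholder path): the template is referenced by name, and values for its placeholders are supplied alongside it in a properties object.

curl -X POST http://localhost:8000/my-route \
  --header "Content-Type: application/json" \
  --data '
{
  "messages": "{template://sample-template}",
  "properties": {
    "thing": "gravity"
  }
}
'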
By default, requests that don’t use a template are still passed to the LLM. You can change this behavior with the config.allow_untemplated_requests parameter: if it is set to false, requests that don’t use a template return a 400 Bad Request response.
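For instance, a minimal sketch of the earlier decK configuration, adjusted to reject untemplated requests, might look like this:

_format_version: "3.0"
plugins:
  - name: ai-prompt-template
    config:
      allow_untemplated_requests: false
      templates:
        - name: sample-template
          template: |-
            {
              "messages": [
                {
                  "role": "user",
                  "content": "Explain to me what {{thing}} is."
                }
              ]
            }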