Looking for the plugin's configuration parameters? You can find them in the AI Prompt Template configuration reference doc.
The AI Prompt Template plugin lets you provide tuned AI prompts to users.
Users only need to fill in the blanks with variable placeholders in the following format: `{{variable}}`.
This lets admins set up templates, which can then be used by anyone in the organization. It also allows admins to present an LLM as an API in its own right - for example, a bot that can provide software class examples and/or suggestions.
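For example, an admin might define a template like the following. This is a sketch only: `sample-template` and the template body are hypothetical, and the exact field names should be checked against the configuration reference.

```yaml
plugins:
  - name: ai-prompt-template
    config:
      templates:
        - name: sample-template
          template: |
            {
              "messages": [
                {
                  "role": "user",
                  "content": "Explain to me how {{thing}} works"
                }
              ]
            }
```

Users then call the template by name and supply only the `{{thing}}` value.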
This plugin also sanitizes string inputs to ensure that JSON control characters are escaped, preventing arbitrary prompt injection.
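To illustrate why this escaping matters, here is a conceptual sketch (not the plugin's actual implementation, which is written for Kong Gateway) of how escaping JSON control characters neutralizes an injection attempt:

```python
import json


def sanitize(value: str) -> str:
    """Escape JSON control characters (quotes, backslashes, newlines)
    in a user-supplied value so it cannot break out of the surrounding
    JSON string. Conceptual sketch only."""
    # json.dumps produces a quoted, fully escaped JSON string;
    # strip the outer quotes it adds.
    return json.dumps(value)[1:-1]


# A hypothetical template body with a {{thing}} placeholder.
template = '{"prompt": "Explain to me how {{thing}} works"}'

# A malicious value that tries to inject an extra JSON field.
malicious = 'gravity", "injected": "x'

rendered = template.replace("{{thing}}", sanitize(malicious))

# The result is still a single, valid JSON object with one "prompt" key;
# the attempted injection is inert text inside the prompt string.
parsed = json.loads(rendered)
print(parsed)
```

Without the `sanitize` step, the unescaped quote in the malicious value would terminate the prompt string early and add an attacker-controlled field to the request.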
This plugin extends the functionality of the AI Proxy plugin, and requires AI Proxy to be configured first. Check out the AI Gateway quickstart to get an AI proxy up and running within minutes!
How it works
When calling a template, simply replace the `messages` (`llm/v1/chat`) or `prompt` (`llm/v1/completions`) field with a template reference, in the following format: `{template://TEMPLATE_NAME}`
When activated, the plugin restricts LLM usage to just those pre-defined templates, which are called in the following example format:
```json
{
  "prompt": "{template://sample-template}",
  "properties": {
    "thing": "gravity"
  }
}
```
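Assuming `sample-template` wraps a prompt such as `"Explain to me how {{thing}} works"` (a hypothetical template body), the plugin substitutes the supplied properties and forwards roughly the following request to the upstream LLM:

```json
{
  "prompt": "Explain to me how gravity works"
}
```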
Get started with the AI Prompt Template plugin
- AI Gateway quickstart: Set up AI Proxy
- Configuration reference
- Basic configuration example
- Learn how to use the plugin