Looking for the plugin's configuration parameters? You can find them in the AI Prompt Decorator configuration reference doc.
The AI Prompt Decorator plugin adds an array of `llm/v1/chat` messages to the start or end of an LLM consumer's chat history. This lets you pre-engineer complex prompts, or steer (and guard) prompts in a way that is completely transparent to the consumer.
You can use this plugin to pre-set a system prompt, set up specific prompt history, add words and phrases, or otherwise have more control over how an LLM service is used when called via Kong Gateway.
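For example, here is a minimal sketch of the plugin's `config.prompts` block in declarative (decK) YAML. The message contents are placeholders only; see the configuration reference for the full schema.

```yaml
plugins:
  - name: ai-prompt-decorator
    config:
      prompts:
        prepend:                 # injected before the consumer's chat history
          - role: system
            content: "You are a polite assistant. Only answer questions about Kong Gateway."
        append:                  # injected after the consumer's chat history
          - role: user
            content: "Keep the answer under 100 words."
```

A consumer sending a normal `llm/v1/chat` request to the route gets a response shaped by these extra messages, without seeing them in their own request or response.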
This plugin extends the functionality of the AI Proxy plugin, and requires AI Proxy to be configured first. Check out the AI Gateway quickstart to get an AI proxy up and running within minutes!
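Putting the two together, a hedged decK-style sketch might attach both plugins to the same route. The service name, route path, upstream URL, model, and credential below are placeholder values, not requirements:

```yaml
_format_version: "3.0"
services:
  - name: llm-service
    url: http://localhost:32000        # placeholder upstream; AI Proxy routes traffic to the provider
    routes:
      - name: openai-chat
        paths:
          - /openai-chat
        plugins:
          - name: ai-proxy             # configure AI Proxy on the route first
            config:
              route_type: llm/v1/chat
              auth:
                header_name: Authorization
                header_value: "Bearer <OPENAI_API_KEY>"
              model:
                provider: openai
                name: gpt-4o           # example model name
          - name: ai-prompt-decorator  # then decorate prompts on the same route
            config:
              prompts:
                prepend:
                  - role: system
                    content: "You are a helpful internal support assistant."
```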
## Get started with the AI Prompt Decorator plugin
- AI Gateway quickstart: Set up AI Proxy
- Configuration reference
- Basic configuration example
- Learn how to use the plugin