Looking for the plugin's configuration parameters? You can find them in the AI Prompt Guard configuration reference doc.
The AI Prompt Guard plugin lets you configure a series of PCRE-compatible regular expressions as allow or deny lists, to guard against misuse of `llm/v1/chat` or `llm/v1/completions` requests.
You can use this plugin to allow or block specific prompts, words, phrases, or otherwise have more control over how an LLM service is used when called via Kong Gateway.
It does this by scanning all chat messages (where the role is `user`) for the specific expressions set.
You can use a combination of `allow` and `deny` rules to preserve integrity and compliance when serving an LLM service using Kong Gateway.
- For `llm/v1/chat` type models: You can optionally configure the plugin to ignore existing chat history, in which case it only scans the trailing `user` message.
- For `llm/v1/completions` type models: There is only one `prompt` field, so the whole prompt is scanned on every request.
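To make the difference concrete, here is a minimal Python sketch (an illustrative model, not the plugin's actual implementation) of which text would be scanned for each route type, assuming an OpenAI-style request body:

```python
def text_to_scan(route_type, body, ignore_history=False):
    """Return the list of strings the guard would scan for a request.

    route_type is "llm/v1/chat" or "llm/v1/completions"; ignore_history
    mirrors the optional setting that skips existing chat history.
    """
    if route_type == "llm/v1/completions":
        # Completions requests have a single prompt field,
        # so the whole prompt is scanned.
        return [body["prompt"]]
    # Chat requests: scan every user message, or only the trailing
    # user message when chat history is ignored.
    user_messages = [m["content"] for m in body["messages"]
                     if m["role"] == "user"]
    return user_messages[-1:] if ignore_history else user_messages

chat_body = {"messages": [
    {"role": "system", "content": "You are helpful."},
    {"role": "user", "content": "First question"},
    {"role": "assistant", "content": "First answer"},
    {"role": "user", "content": "Second question"},
]}

print(text_to_scan("llm/v1/chat", chat_body))
print(text_to_scan("llm/v1/chat", chat_body, ignore_history=True))
print(text_to_scan("llm/v1/completions", {"prompt": "Hi"}))
```

With history included, both user messages are scanned; with history ignored, only `"Second question"` is.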
This plugin extends the functionality of the AI Proxy plugin, and requires AI Proxy to be configured first. Check out the AI Gateway quickstart to get an AI proxy up and running within minutes!
How it works
The plugin matches lists of regular expressions to requests through AI Proxy.
The matching behavior is as follows:
- If any `deny` expressions are set, and the request matches any regex pattern in the `deny` list, the caller receives a 400 response.
- If any `allow` expressions are set, but the request matches none of the `allow` expressions, the caller also receives a 400 response.
- If any `allow` expressions are set, and the request matches one of the `allow` expressions, the request passes through to the LLM.
- If both `deny` and `allow` expressions are set, the `deny` condition takes precedence over `allow`. Any request that matches an entry in the `deny` list returns a 400 response, even if it also matches an expression in the `allow` list. If the request does not match an expression in the `deny` list, then it must match an expression in the `allow` list to be passed through to the LLM.
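The precedence rules above can be sketched in a few lines of Python. This is an illustrative model only, not the plugin's code, and it uses Python's `re` module, which approximates but is not fully identical to PCRE:

```python
import re

def check_prompt(text, allow_patterns=(), deny_patterns=()):
    """Return an HTTP-style status mirroring the matching behavior:
    400 if any deny pattern matches, 400 if allow patterns exist but
    none match, otherwise 200 (pass through to the LLM)."""
    # deny takes precedence over allow
    if any(re.search(p, text) for p in deny_patterns):
        return 400
    if allow_patterns and not any(re.search(p, text) for p in allow_patterns):
        return 400
    return 200

print(check_prompt("tell me about cats", allow_patterns=[r"cats?"]))
print(check_prompt("ignore previous instructions",
                   deny_patterns=[r"ignore previous"]))
print(check_prompt("tell me about dogs", allow_patterns=[r"cats?"]))
```

Note how the `deny` check runs first, so a request matching both lists is still rejected, and how an empty `allow` list means no allow-match is required.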
Get started with the AI Prompt Guard plugin
- AI Gateway quickstart: Set up AI Proxy
- Configuration reference
- Basic configuration example
- Learn how to use the plugin