The AI Response Transformer plugin uses a configured LLM service to transform the upstream's HTTP(S) response before returning it to the client. It can also terminate or nullify the response if, for example, the response fails a compliance or formatting check performed by the configured LLM service.
This plugin supports `llm/v1/chat` requests for the same providers as the AI Proxy plugin.
It also uses the same configuration and tuning parameters as the AI Proxy plugin, set in the `config.llm` block.
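For illustration, a minimal declarative configuration might look like the sketch below. The `config.llm` fields mirror the AI Proxy plugin's conventions, but the specific field names, provider, model, and prompt shown here are assumptions for the example and should be verified against the plugin's configuration reference.

```yaml
# Sketch only: field names and values are illustrative and should be
# checked against the AI Response Transformer configuration reference.
plugins:
  - name: ai-response-transformer
    config:
      # Instruction sent to the LLM describing how to transform the upstream response
      prompt: "Redact any personally identifiable information from this API response."
      llm:
        route_type: llm/v1/chat          # same route type supported as AI Proxy
        auth:
          header_name: Authorization
          header_value: Bearer <API_KEY>  # assumed OpenAI-style bearer auth
        model:
          provider: openai                # assumed provider for this example
          name: gpt-4o
```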
The AI Response Transformer plugin runs after the AI Proxy plugin and after the request has been proxied to the upstream service, so it can also transform responses coming from a different LLM.