Transform a response using OpenAI in Kong Gateway

Uses: Kong Gateway, AI Gateway, decK
Minimum version: Kong Gateway 3.6
TL;DR

Enable the AI Response Transformer plugin, configure the parameters under config.llm to access your LLM, and describe the transformation to perform with the config.prompt parameter.

Prerequisites

This is a Konnect tutorial and requires a Konnect personal access token.

  1. Create a new personal access token by opening the Konnect PAT page and selecting Generate Token.

  2. Export your token to an environment variable:

     export KONNECT_TOKEN='YOUR_KONNECT_PAT'
    
  3. Run the quickstart script to automatically provision a Control Plane and Data Plane, and configure your environment:

     curl -Ls https://get.konghq.com/quickstart | bash -s -- -k $KONNECT_TOKEN --deck-output
    

    This sets up a Konnect Control Plane named quickstart, provisions a local Data Plane, and prints out the following environment variable exports:

     export DECK_KONNECT_TOKEN=$KONNECT_TOKEN
     export DECK_KONNECT_CONTROL_PLANE_NAME=quickstart
     export KONNECT_CONTROL_PLANE_URL=https://us.api.konghq.com
     export KONNECT_PROXY_URL='http://localhost:8000'
    

    Copy and paste these into your terminal to configure your session.
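
  4. Export your OpenAI API key. The plugin configuration later in this guide reads it from the DECK_OPENAI_API_KEY environment variable using decK's env substitution (the value below is a placeholder; replace it with your own key):

     export DECK_OPENAI_API_KEY='YOUR_OPENAI_API_KEY'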

Enable the AI Response Transformer plugin

In this example, we want to inject a new header into the response after it’s proxied and before it’s returned to the client. To do this, we describe the header to add in the config.prompt parameter and ask the LLM to return the modified response as JSON.

We also want to make sure that the LLM only returns the JSON content and doesn’t add extra text around it. There are two ways to do this:

  • Include the instruction in the prompt, for example by adding “Return only the JSON message, no extra text”.
  • Specify a regex in the config.transformation_extract_pattern parameter to extract only the data we need. This is the option we’ll use in this example.
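
To get a sense of what the extract pattern does, here is a small local illustration. It assumes GNU grep with PCRE support (-P); the sample LLM reply and the /tmp/llm-reply.txt path are made up for the demo, and Kong Gateway applies the pattern internally rather than through grep:

 # Hypothetical LLM reply that wraps the JSON object in extra text
 printf '%s\n' \
   'Sure! Here is the transformed response:' \
   '{ "headers": { "new-header": "header-value" }, "status": 201, "body": "new response body" }' \
   'Let me know if you need anything else.' > /tmp/llm-reply.txt

 # The pattern keeps everything from the first "{" to the last "}"
 grep -Pzo '\{(.|\n)*\}' /tmp/llm-reply.txt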

Configure the AI Response Transformer plugin with the required LLM details, the transformation prompt, and the expected response body pattern to extract:

echo '
_format_version: "3.0"
plugins:
  - name: ai-response-transformer
    config:
      prompt: |
        Add a new header named "new-header" with the value "header-value" to the response. Format the JSON response as follows:
        {
          "headers":
            {
              "new-header": "header-value"
            },
          "status": 201,
          "body": "new response body"
        }
      # Extract only the JSON object from the LLM's reply
      transformation_extract_pattern: "{((.|\\n)*)}"
      # Parse the returned JSON and apply its headers, status, and body to the client response
      parse_llm_response_json_instructions: true
      llm:
        route_type: llm/v1/chat
        auth:
          header_name: Authorization
          header_value: Bearer ${{ env "DECK_OPENAI_API_KEY" }}
        model:
          provider: openai
          name: gpt-4
' | deck gateway apply -
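
Optionally, confirm that the plugin was created by dumping the current configuration from your Control Plane. This assumes decK's gateway dump command is available in your decK version; passing - writes the output to stdout:

 deck gateway dump -o -

You should see an ai-response-transformer entry under plugins.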

Validate

To check that the response transformation is working, send a request:

 curl -i "$KONNECT_PROXY_URL/anything" \
     -H "Accept: application/json"

The response headers should include new-header: header-value.
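
As a quick check from the command line, you can filter the response headers for the new one (this simply pipes the same request through grep; it isn't a required step):

 curl -si "$KONNECT_PROXY_URL/anything" \
     -H "Accept: application/json" | grep -i "new-header"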

Cleanup

If you created a new control plane and want to conserve your free trial credits or avoid unnecessary charges, delete the new control plane used in this tutorial.
