Use LangChain with AI Proxy in Kong Gateway
You can configure LangChain scripts to use your AI Gateway Route by replacing the base_url
parameter in the LangChain model instantiation with your proxy URL.
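For example, the change amounts to pointing an existing ChatOpenAI client at Kong Gateway instead of api.openai.com. A minimal sketch, assuming the proxy listens on http://localhost:8000 and the Route path is /anything (the full script is built later in this guide):
from langchain_openai import ChatOpenAI

# Point LangChain at the Kong Gateway Route instead of the OpenAI API
llm = ChatOpenAI(
    base_url="http://localhost:8000/anything",
    model="gpt-4o",
    api_key="my-api-key"  # credential expected by the Gateway, not an OpenAI key
)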
Prerequisites
Kong Konnect
This is a Konnect tutorial and requires a Konnect personal access token.
- Create a new personal access token by opening the Konnect PAT page and selecting Generate Token.
- Export your token to an environment variable:
  export KONNECT_TOKEN='YOUR_KONNECT_PAT'
- Run the quickstart script to automatically provision a Control Plane and Data Plane, and configure your environment:
  curl -Ls https://get.konghq.com/quickstart | bash -s -- -k $KONNECT_TOKEN --deck-output
  This sets up a Konnect Control Plane named quickstart, provisions a local Data Plane, and prints out the following environment variable exports:
  export DECK_KONNECT_TOKEN=$KONNECT_TOKEN
  export DECK_KONNECT_CONTROL_PLANE_NAME=quickstart
  export KONNECT_CONTROL_PLANE_URL=https://us.api.konghq.com
  export KONNECT_PROXY_URL='http://localhost:8000'
  Copy and paste these into your terminal to configure your session.
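To confirm the local Data Plane is reachable before continuing, you can send a request to the proxy URL; Kong Gateway should answer, typically with a 404 "no Route matched" message for a path that has no Route:
curl -i $KONNECT_PROXY_URL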
Configure the AI Proxy plugin
Enable the AI Proxy plugin with your OpenAI API key and the model details. In this example, we'll use the GPT-4o model. The configuration below reads the key from the DECK_OPENAI_API_KEY environment variable, so make sure it is exported in your session.
echo '
_format_version: "3.0"
plugins:
  - name: ai-proxy
    config:
      route_type: llm/v1/chat
      auth:
        header_name: Authorization
        header_value: Bearer ${{ env "DECK_OPENAI_API_KEY" }}
      model:
        provider: openai
        name: gpt-4o
' | deck gateway apply -
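Before wiring up LangChain, you can optionally test the proxy with an OpenAI-style chat request. This sketch assumes the quickstart's example-route is exposed at the /anything path (the same path the LangChain script uses below); no Authorization header is needed yet, because authentication is only added in the next step:
curl -s -X POST "$KONNECT_PROXY_URL/anything" \
  -H "Content-Type: application/json" \
  -d '{"messages":[{"role":"user","content":"Say hello"}]}'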
Add authentication
To secure access to your Route, create a Consumer and set up an authentication plugin.
Note that LangChain sends authentication as an Authorization header whose value starts with Bearer. You can use plugins like OAuth 2.0 Authentication or OpenID Connect to generate Bearer tokens. In this example, for testing purposes, we'll recreate this pattern using the Key Authentication plugin.
echo '
_format_version: "3.0"
plugins:
  - name: key-auth
    route: example-route
    config:
      key_names:
        - Authorization
consumers:
  - username: ai-user
    keyauth_credentials:
      - key: Bearer my-api-key
' | deck gateway apply -
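You can check the credential by repeating the direct request with the header LangChain will send. The /anything path is again assumed from the quickstart's example-route, and the key must include the Bearer prefix because Key Authentication matches the full header value:
curl -s -X POST "$KONNECT_PROXY_URL/anything" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer my-api-key" \
  -d '{"messages":[{"role":"user","content":"Say hello"}]}'
The same request without the Authorization header should now be rejected with a 401.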
Install LangChain
Load the LangChain SDK into your Python dependencies:
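For example, with pip (the langchain-openai package provides the ChatOpenAI class used in the next step):
pip install langchain-openai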
Create a LangChain script
Use the following command to create a file named app.py
containing a LangChain Python script:
echo 'from langchain_openai import ChatOpenAI
import os

# Use the Kong Gateway proxy URL exported by the quickstart script
kong_url = os.environ["KONNECT_PROXY_URL"]
kong_route = "anything"

llm = ChatOpenAI(
    base_url=f"{kong_url}/{kong_route}",
    model="gpt-4o",
    api_key="my-api-key"
)

response = llm.invoke("What are you?")
print(f"$ ChainAnswer:> {response.content}")' > app.py
With the base_url parameter, we override the OpenAI base URL that LangChain uses by default with the URL of our Kong Gateway Route. This way, we can proxy requests and apply Kong Gateway plugins while still using LangChain integrations and tools.
In the api_key parameter, we add the API key we created, without the Bearer prefix, which is added automatically by LangChain.
Validate
Run your script to validate that LangChain can access the Route:
python3 ./app.py
The response should look like this:
ChainAnswer:> I am an AI language model created by OpenAI, designed to assist with understanding and generating human-like text based on the input I receive. I can help answer questions, provide explanations, and assist with a variety of tasks involving language. What would you like to know or discuss today?
Cleanup
Clean up Konnect environment
If you created a new control plane and want to conserve your free trial credits or avoid unnecessary charges, delete the new control plane used in this tutorial.
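One possible way to do this from the command line, assuming the Konnect control planes API is available at /v2/control-planes (you can also delete the control plane from the Konnect UI):
# Look up the ID of the quickstart control plane (assumes the /v2/control-planes API)
export CONTROL_PLANE_ID=$(curl -s \
  -H "Authorization: Bearer $KONNECT_TOKEN" \
  "$KONNECT_CONTROL_PLANE_URL/v2/control-planes" \
  | jq -r '.data[] | select(.name == "quickstart") | .id')

# Delete the control plane
curl -s -X DELETE \
  -H "Authorization: Bearer $KONNECT_TOKEN" \
  "$KONNECT_CONTROL_PLANE_URL/v2/control-planes/$CONTROL_PLANE_ID"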