Improve Performance with Proxy Caching

In this topic, you'll learn how to improve response efficiency with the Proxy Caching plugin.

If you are following the getting started workflow, make sure you have completed Protect your Services before continuing.

What is Proxy Caching?

Kong Gateway delivers fast performance through caching. The Proxy Caching plugin provides this performance using a reverse proxy cache implementation. It caches response entities based on the request method and on configurable response codes and content types, and it can cache per Consumer or per API.

Cache entities are stored for a configurable period of time. When that timeout is reached, Kong Gateway forwards the next request to the Upstream service, caches the fresh result, and then responds from cache again until the next timeout. The plugin can store cached data in memory or, for improved performance, in Redis.
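For example, to cache responses for only one API, the plugin can be scoped to a single Service instead of being enabled globally. The following is a minimal sketch using the Admin API; it assumes a Service named example_service already exists (the same name used in the decK example later on this page):

curl -i -X POST http://<admin-hostname>:8001/services/example_service/plugins \
  --data name=proxy-cache \
  --data config.cache_ttl=30 \
  --data config.strategy=memory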

Why use Proxy Caching?

Use proxy caching so that Upstream services are not bogged down with repeated requests. With proxy caching, Kong Gateway can respond with cached results for better performance.

Set up the Proxy Caching plugin

You can set up the plugin with Kong Manager, the Admin API, or decK. All three methods are shown below.

Using Kong Manager:
  1. Access your Kong Manager instance and your default workspace.

  2. Go to API Gateway and click Plugins.

  3. Click New Plugin.

  4. Scroll down to the Traffic Control section and find the Proxy Caching plugin.

  5. Click Enable.

  6. Apply the plugin as Global, which means that proxy caching applies to all requests.

  7. Scroll down and complete only the following fields with the parameters listed.
    1. config.cache_ttl: 30
    2. config.content_type: application/json; charset=utf-8
    3. config.strategy: memory

    Besides the above fields, there may be others populated with default values. For this example, leave the rest of the fields as they are.

  8. Click Create.

Using the Admin API:

Call the /plugins endpoint on the Admin API (port 8001) to enable in-memory caching globally, with a timeout of 30 seconds for the Content-Type application/json; charset=utf-8.

Using cURL:

curl -i -X POST http://<admin-hostname>:8001/plugins \
  --data name=proxy-cache \
  --data config.content_type="application/json; charset=utf-8" \
  --data config.cache_ttl=30 \
  --data config.strategy=memory

Or using HTTPie:

http -f :8001/plugins \
  name=proxy-cache \
  config.strategy=memory \
  config.cache_ttl=30 \
  config.content_type="application/json; charset=utf-8"
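To confirm the plugin was created, you can list the configured plugins through the same Admin API; the response should include an entry named proxy-cache:

curl -i http://<admin-hostname>:8001/plugins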
Using decK (YAML):

  1. In the plugins section of your kong.yaml file, add the proxy-cache plugin with a timeout of 30 seconds for Content-Type application/json; charset=utf-8.

     plugins:
     - name: proxy-cache
       config:
         content_type:
         - "application/json; charset=utf-8"
         cache_ttl: 30
         strategy: memory
    

    Your file should now look like this:

     _format_version: "1.1"
     services:
     - host: mockbin.org
       name: example_service
       port: 80
       protocol: http
       routes:
       - name: mocking
         paths:
         - /mock
         strip_path: true
     plugins:
     - name: rate-limiting
       config:
         minute: 5
         policy: local
     - name: proxy-cache
       config:
         content_type:
         - "application/json; charset=utf-8"
         cache_ttl: 30
         strategy: memory
    
  2. Sync the configuration:

     deck sync
    
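As an optional check, decK can also show the difference between your kong.yaml file and the running configuration; run immediately after a successful sync, it should report no outstanding changes:

deck diff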

Validate Proxy Caching

Let's check that proxy caching works. You'll send requests through Kong Gateway's proxy (port 8000), and later in this step use the Admin API (port 8001) to clear the cache.

Access the /mock route through the proxy and note the response headers:

Using cURL:

curl -i -X GET http://<admin-hostname>:8000/mock/request

Or using HTTPie:

http :8000/mock/request

In particular, pay close attention to the values of X-Cache-Status, X-Kong-Proxy-Latency, and X-Kong-Upstream-Latency:

HTTP/1.1 200 OK
...
X-Cache-Key: d2ca5751210dbb6fefda397ac6d103b1
X-Cache-Status: Miss
X-Content-Type-Options: nosniff
...
X-Kong-Proxy-Latency: 25
X-Kong-Upstream-Latency: 37

Next, access the /mock route one more time.

This time, notice the differences in the values of X-Cache-Status, X-Kong-Proxy-Latency, and X-Kong-Upstream-Latency. Cache status is a hit, which means Kong Gateway is responding to the request directly from cache instead of proxying the request to the Upstream service.

Further, notice the minimal latency in the response now that Kong Gateway serves the result directly from cache:

HTTP/1.1 200 OK
...
X-Cache-Key: d2ca5751210dbb6fefda397ac6d103b1
X-Cache-Status: Hit
...
X-Kong-Proxy-Latency: 0
X-Kong-Upstream-Latency: 1
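If you want to check whether a specific response is currently cached, the Proxy Caching plugin also exposes per-entry Admin API endpoints keyed by the X-Cache-Key value (see the plugin reference for your version). As a sketch, using the cache key from the example responses above, a 200 means the entry is still cached and a 404 means it has expired or was purged:

curl -i -X GET http://<admin-hostname>:8001/proxy-cache/d2ca5751210dbb6fefda397ac6d103b1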

To test repeatedly without waiting for the cache TTL to expire, you can purge the cache by calling the Admin API:

Using cURL:

curl -i -X DELETE http://<admin-hostname>:8001/proxy-cache

Or using HTTPie:

http delete :8001/proxy-cache
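A single entry can also be purged by its X-Cache-Key instead of clearing the entire cache, for example using the key from the responses above:

curl -i -X DELETE http://<admin-hostname>:8001/proxy-cache/d2ca5751210dbb6fefda397ac6d103b1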

Summary and Next Steps

In this section, you:

  • Set up the Proxy Caching plugin, then accessed the /mock route multiple times to see caching in effect.
  • Observed the difference in latency between cached and uncached responses.

Next, you’ll learn about securing services.
