Publish request and response logs to an Apache Kafka topic. For more information, see Kafka topics.
Kong also provides a Kafka plugin for request transformations. See Kafka Upstream.
Configuration Reference
This plugin is compatible with DB-less mode.
In DB-less mode, you configure Kong Gateway declaratively. Therefore, the Admin API is mostly read-only. The only tasks it can perform are all related to handling the declarative config, including:
- Setting a target's health status in the load balancer
- Validating configurations against schemas
- Uploading the declarative configuration using the `/config` endpoint
Example plugin configuration
A plugin that is not associated with any service, route, or consumer is considered global, and runs on every request. Read the Plugin Reference and the Plugin Precedence sections for more information.
The following examples provide some typical configurations for enabling the `kafka-log` plugin globally.
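In DB-less mode, a global `kafka-log` entry in the declarative configuration file would look roughly like this (a minimal sketch: `localhost:9092` and the `kong-log` topic are placeholder values, and `_format_version` should match your Gateway version):

```yaml
_format_version: "2.1"

plugins:
  - name: kafka-log
    config:
      bootstrap_servers:
        - host: localhost
          port: 9092
      topic: kong-log
```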
Parameters
Here's a list of all the parameters which can be used in this plugin's configuration:
| Form Parameter | Description |
|---|---|
| `name` <br> *required* <br> Type: string | The name of the plugin, in this case `kafka-log`. |
| `enabled` <br> Type: boolean <br> Default value: `true` | Whether this plugin will be applied. |
| `config.bootstrap_servers` <br> *required* <br> Type: set of record elements | Set of bootstrap brokers in a `{host: host, port: port}` list format. |
| `config.topic` <br> *required* <br> Type: string | The Kafka topic to publish to. |
| `config.authentication.strategy` <br> *optional* <br> Type: string | The authentication strategy for the plugin; the only option for the value is `sasl`. |
| `config.authentication.mechanism` <br> *optional* <br> Type: string | The SASL authentication mechanism. Supported options: `PLAIN`, `SCRAM-SHA-256`, or `SCRAM-SHA-512`. |
| `config.authentication.user` <br> *optional* <br> Type: string | Username for SASL authentication. If keyring database encryption is enabled, this value will be encrypted. This field is referenceable, which means it can be securely stored as a secret in a vault. References must follow a specific format. |
| `config.authentication.password` <br> *optional* <br> Type: string | Password for SASL authentication. If keyring database encryption is enabled, this value will be encrypted. This field is referenceable, which means it can be securely stored as a secret in a vault. References must follow a specific format. |
| `config.authentication.tokenauth` <br> *optional* <br> Type: boolean <br> Default value: `false` | Enable this to indicate `DelegationToken` authentication. |
| `config.security.ssl` <br> *optional* <br> Type: boolean <br> Default value: `false` | Enables TLS. |
| `config.security.certificate_id` <br> *optional* <br> Type: string | UUID of certificate entity for mTLS authentication. |
| `config.timeout` <br> *optional* <br> Type: integer <br> Default value: `10000` | Socket timeout in milliseconds. |
| `config.keepalive` <br> *optional* <br> Type: integer <br> Default value: `60000` | Keepalive timeout in milliseconds. |
| `config.cluster_name` <br> *optional* <br> Type: string <br> Default value: `<autogenerated-value>` | An identifier for the Kafka cluster. By default, this field generates a random string. You can also set your own custom cluster identifier. If more than one Kafka plugin is configured without a `cluster_name`, these plugins will use the same producer, and there might be unexpected behavior. |
| `config.producer_request_acks` <br> *optional* <br> Type: integer <br> Default value: `1` | The number of acknowledgments the producer requires the leader to have received before considering a request complete. Allowed values: `0` for no acknowledgments; `1` for only the leader; and `-1` for the full ISR (In-Sync Replica set). |
| `config.producer_request_timeout` <br> *optional* <br> Type: integer <br> Default value: `2000` | Time to wait for a Produce response in milliseconds. |
| `config.producer_request_limits_messages_per_request` <br> *optional* <br> Type: integer <br> Default value: `200` | Maximum number of messages to include in a single Produce request. |
| `config.producer_request_limits_bytes_per_request` <br> *optional* <br> Type: integer <br> Default value: `1048576` | Maximum size of a Produce request in bytes. |
| `config.producer_request_retries_max_attempts` <br> *optional* <br> Type: integer <br> Default value: `10` | Maximum number of retry attempts per single Produce request. |
| `config.producer_request_retries_backoff_timeout` <br> *optional* <br> Type: integer <br> Default value: `100` | Backoff interval between retry attempts in milliseconds. |
| `config.producer_async` <br> *optional* <br> Type: boolean <br> Default value: `true` | Flag to enable asynchronous mode. |
| `config.producer_async_flush_timeout` <br> *optional* <br> Type: integer <br> Default value: `1000` | Maximum time interval in milliseconds between buffer flushes in asynchronous mode. |
| `config.producer_async_buffering_limits_messages_in_memory` <br> *optional* <br> Type: integer <br> Default value: `50000` | Maximum number of messages that can be buffered in memory in asynchronous mode. |
Quickstart
The following guidelines assume that both Kong Enterprise and Kafka have been installed on your local machine.
Note: We use `zookeeper` in the following example; it is not required by all Kafka versions and has been removed from newer ones. Refer to the Kafka ZooKeeper documentation for more information.
1. Create a `kong-log` topic in your Kafka cluster:

   ```shell
   ${KAFKA_HOME}/bin/kafka-topics.sh --create \
     --zookeeper localhost:2181 \
     --replication-factor 1 \
     --partitions 10 \
     --topic kong-log
   ```

2. Add the `kafka-log` plugin globally:

   ```shell
   curl -X POST http://localhost:8001/plugins \
     --data "name=kafka-log" \
     --data "config.bootstrap_servers[1].host=localhost" \
     --data "config.bootstrap_servers[1].port=9092" \
     --data "config.topic=kong-log"
   ```

3. Make sample requests:

   ```shell
   for i in {1..50} ; do curl http://localhost:8000/request/$i ; done
   ```

4. Verify the contents of the Kafka `kong-log` topic:

   ```shell
   ${KAFKA_HOME}/bin/kafka-console-consumer.sh \
     --bootstrap-server localhost:9092 \
     --topic kong-log \
     --partition 0 \
     --from-beginning \
     --timeout-ms 1000
   ```
Log format
The log format is similar to that of the HTTP Log plugin.
Implementation details
This plugin uses the lua-resty-kafka client.
When encoding request bodies, several things happen:
- For requests with a `content-type` header of `application/x-www-form-urlencoded`, `multipart/form-data`, or `application/json`, this plugin passes the raw request body in the `body` attribute, and tries to return a parsed version of those arguments in `body_args`. If this parsing fails, an error message is returned and the message is not sent.
- If the `content-type` is not `text/plain`, `text/html`, `application/xml`, `text/xml`, or `application/soap+xml`, then the body will be base64-encoded to ensure that the message can be sent as JSON. In such a case, the message has an extra attribute called `body_base64` set to `true`.
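Because such bodies arrive base64-encoded, a consumer has to decode them before use. A minimal sketch (the JSON below is a hypothetical, abbreviated log message, not the full log format):

```shell
# Hypothetical, abbreviated log message with a base64-encoded body
msg='{"body":"aGVsbG8gd29ybGQ=","body_base64":true}'

# Pull out the "body" field (a JSON tool like jq is the robust choice;
# sed is enough for this sketch)
body=$(printf '%s' "$msg" | sed -E 's/.*"body":"([^"]*)".*/\1/')

# Recover the original request body bytes
printf '%s' "$body" | base64 --decode
```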
TLS
Enable TLS by setting `config.security.ssl` to `true`.
mTLS
Enable mTLS by setting a valid UUID of a certificate in `config.security.certificate_id`. Note that this option requires `config.security.ssl` to be set to `true`. See the Certificate Object in the Admin API documentation for information on how to set up Certificates.
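Putting those pieces together, the flow would look roughly like this (a sketch, not a definitive recipe: it assumes the Admin API at `localhost:8001`, and `client.crt`/`client.key` are placeholder file names for a certificate your brokers trust):

```shell
# Upload the client certificate and key, and note the "id" in the response
curl -X POST http://localhost:8001/certificates \
  --form "cert=@client.crt" \
  --form "key=@client.key"

# Enable the plugin with TLS on and the certificate's UUID
# ($CERT_ID is a placeholder for the id returned above)
curl -X POST http://localhost:8001/plugins \
  --data "name=kafka-log" \
  --data "config.topic=kong-log" \
  --data "config.bootstrap_servers[1].host=localhost" \
  --data "config.bootstrap_servers[1].port=9092" \
  --data "config.security.ssl=true" \
  --data "config.security.certificate_id=$CERT_ID"
```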
SASL Authentication
This plugin supports the following authentication mechanisms:
- PLAIN: Enable this mechanism by setting `config.authentication.mechanism` to `PLAIN`. You also need to provide a username and password with the config options `config.authentication.user` and `config.authentication.password`, respectively.
- SCRAM: In cryptography, the Salted Challenge Response Authentication Mechanism (SCRAM) is a family of modern, password-based challenge-response authentication mechanisms providing authentication of a user to a server. The Kafka Log plugin supports the following:
  - SCRAM-SHA-256: Enable this mechanism by setting `config.authentication.mechanism` to `SCRAM-SHA-256`. You also need to provide a username and password with the config options `config.authentication.user` and `config.authentication.password`, respectively.
  - SCRAM-SHA-512: Enable this mechanism by setting `config.authentication.mechanism` to `SCRAM-SHA-512`. You also need to provide a username and password with the config options `config.authentication.user` and `config.authentication.password`, respectively.
- Delegation Tokens: Delegation tokens can be generated in Kafka and then used to authenticate this plugin. Delegation tokens leverage the `SCRAM-SHA-256` authentication mechanism. The `tokenID` is provided with the `config.authentication.user` field and the `token-hmac` is provided with the `config.authentication.password` field. To indicate that a token is used, you must set `config.authentication.tokenauth` to `true`. Read more on how to create, renew, and revoke delegation tokens.
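For example, enabling the plugin with SCRAM-SHA-256 over the Admin API would look roughly like this (a sketch: `localhost:8001`, the broker address, and the `admin`/`admin-secret` credentials are placeholder values, and `sasl` is assumed as the authentication strategy):

```shell
curl -X POST http://localhost:8001/plugins \
  --data "name=kafka-log" \
  --data "config.topic=kong-log" \
  --data "config.bootstrap_servers[1].host=localhost" \
  --data "config.bootstrap_servers[1].port=9092" \
  --data "config.security.ssl=true" \
  --data "config.authentication.strategy=sasl" \
  --data "config.authentication.mechanism=SCRAM-SHA-256" \
  --data "config.authentication.user=admin" \
  --data "config.authentication.password=admin-secret"
```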
Known issues and limitations
Known limitations:
- Message compression is not supported.
- The message format is not customizable.
Changelog
Kong Gateway 2.8.x
- Added support for the `SCRAM-SHA-512` authentication mechanism.
- Added the `cluster_name` configuration parameter.
- The `authentication.user` and `authentication.password` configuration fields are now marked as referenceable, which means they can be securely stored as secrets in a vault. References must follow a specific format.
Kong Gateway 2.7.x
- Starting with Kong Gateway 2.7.0.0, if keyring encryption is enabled, the `config.authentication.user` and `config.authentication.password` parameter values will be encrypted. There's a bug in Kong Gateway that prevents keyring encryption from working on deeply nested fields, so the `encrypted=true` setting does not currently have any effect in this plugin.
Kong Gateway 2.6.x
- The Kafka Log plugin now supports TLS, mTLS, and SASL auth. SASL auth includes support for PLAIN, SCRAM-SHA-256, and delegation tokens.