
Kafka integration

TBMQ Kafka Integration enables communication with Apache Kafka, allowing TBMQ to publish messages to external Kafka clusters. This is useful for the following scenarios:

  • Streaming IoT Data — forwarding device telemetry, logs, or events to Kafka for processing and storage.
  • Event-Driven Architectures — publishing messages to Kafka topics for real-time analytics and monitoring.
  • Decoupled System Communication — using Kafka as a buffer between TBMQ and downstream applications.

When a Kafka integration is configured, messages flow through the system as follows:

  1. Device (client) publishes an MQTT message to a topic matching the integration’s topic filters.
  2. TBMQ broker receives the message and forwards it to the TBMQ Integration Executor.
  3. Integration Executor processes the message and sends it to a configured Kafka topic.
  4. Kafka consumers process the message in downstream systems.
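The wrapping performed in step 3 can be sketched as a pure function. This is an illustrative sketch only, not TBMQ's actual implementation; the field names are taken from the sample forwarded message shown later on this page, and it assumes the "Send only message payload" option is disabled:

```python
import base64
import json
import time

def wrap_mqtt_message(payload: bytes, topic: str, client_id: str,
                      qos: int = 0, retain: bool = False) -> str:
    """Illustrative sketch: wrap an incoming MQTT publish into the JSON
    envelope that the Integration Executor forwards to a Kafka topic."""
    envelope = {
        "payload": base64.b64encode(payload).decode("ascii"),
        "topicName": topic,
        "clientId": client_id,
        "eventType": "PUBLISH_MSG",
        "qos": qos,
        "retain": retain,
        "ts": int(time.time() * 1000),  # receive time in milliseconds
        "props": {},
    }
    return json.dumps(envelope)

msg = wrap_mqtt_message(b'{"temperature":25}', "tbmq/kafka-integration",
                        "tbmq_df52bNUQ", qos=1)
```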

Prerequisites:

  • A running TBMQ instance.
  • An external Kafka cluster ready to receive messages (e.g., Confluent Cloud).
  • A client capable of publishing MQTT messages (e.g., TBMQ WebSocket Client).

To create the integration:

  1. Go to the Integrations page and click the “+” button.
  2. Select Kafka as the integration type and click Next.
  3. On the Topic Filters step, click Next to subscribe to the default topic tbmq/#.
  4. In the Configuration step, enter the Bootstrap servers (Kafka broker addresses). See below for common and Confluent Cloud configurations.
  5. Click Add to save the integration.

Specify the bootstrap server address (e.g., localhost:9092 for a local broker). The screenshot below shows the basic configuration for establishing a connection between TBMQ and a Kafka broker.
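For Confluent Cloud specifically, the connection is typically secured with SASL/SSL. These are standard Kafka producer settings that can be supplied through the Other properties field; the API key and secret below are placeholders to replace with your own credentials:

```properties
# Standard Kafka client security settings for a SASL/SSL cluster
# (replace <API_KEY> and <API_SECRET> with your Confluent Cloud credentials)
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="<API_KEY>" password="<API_SECRET>";
```

The cluster endpoint itself goes into the Bootstrap servers field, not into Other properties.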

Topic filters define MQTT-based subscriptions that trigger the integration. When TBMQ receives a message matching a configured topic filter, the integration processes it and forwards the data to the Kafka cluster.

For example, with the topic filter tbmq/devices/+/status, any of the following messages will trigger the integration:

tbmq/devices/device-01/status
tbmq/devices/gateway-01/status
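The matching rules above follow standard MQTT wildcard semantics: + matches exactly one topic level, # matches the current level and everything below it. A minimal sketch of this matching logic (not TBMQ's implementation; edge cases such as $-prefixed topics are ignored):

```python
def topic_matches(filter_: str, topic: str) -> bool:
    """Check an MQTT topic against a filter with '+' (single-level)
    and '#' (multi-level, last position only) wildcards."""
    f_parts = filter_.split("/")
    t_parts = topic.split("/")
    for i, f in enumerate(f_parts):
        if f == "#":
            return True  # '#' matches this level and all deeper levels
        if i >= len(t_parts):
            return False  # topic has fewer levels than the filter
        if f != "+" and f != t_parts[i]:
            return False  # literal level mismatch
    return len(f_parts) == len(t_parts)

topic_matches("tbmq/devices/+/status", "tbmq/devices/device-01/status")  # True
```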
The integration configuration includes the following fields:

  • Send only message payload: If enabled, only the raw message payload is forwarded. If disabled, TBMQ wraps the payload in a JSON object with additional metadata.
  • Bootstrap servers: Kafka broker addresses (a comma-separated list of hostnames/IPs and ports).
  • Topic: The Kafka topic where messages will be published.
  • Key (optional): Used for partitioning. Kafka hashes the key to consistently assign messages to the same partition.
  • Client ID prefix (optional): Prefix for the Kafka client ID. Default: tbmq-ie-kafka-producer.
  • Automatically retry times if fails: Number of retries before marking a message as failed.
  • Produces batch size in bytes: Maximum batch size accumulated before messages are sent to Kafka.
  • Time to buffer locally (ms): How long messages are buffered locally before sending.
  • Client buffer max size in bytes: Maximum memory allocated for buffering messages before sending.
  • Number of acknowledgments: Kafka acknowledgment mode: all, 1, or 0.
  • Compression: Compression algorithm: none, gzip, snappy, lz4, or zstd.
  • Other properties: Additional Kafka producer configuration as key-value pairs.
  • Kafka headers: Custom headers added to Kafka messages.
  • Metadata: Custom metadata attached to forwarded messages.
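The effect of the Key field can be illustrated with a simplified sketch. Kafka's Java producer actually uses murmur2 hashing; the MD5-based hash here is only to demonstrate the principle that equal keys always land on the same partition, which keeps messages from one device in order:

```python
import hashlib

def pick_partition(key: str, num_partitions: int) -> int:
    """Simplified illustration of key-based partitioning: a stable hash
    of the key modulo the partition count. (Kafka's Java producer uses
    murmur2 rather than MD5; the principle is the same.)"""
    digest = hashlib.md5(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# The same key always maps to the same partition:
assert pick_partition("device-01", 6) == pick_partition("device-01", 6)
```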

TBMQ logs integration-related events for debugging and troubleshooting:

  • Lifecycle Events — logs events such as Started, Created, Updated, Stopped.
  • Statistics — insights into integration performance, including processed message counts and error rates.
  • Errors — captures failures related to authentication, timeouts, payload formatting, or connectivity issues.

To test the integration:

  1. Navigate to the WebSocket Client page.
  2. Select WebSocket Default Connection (or any working connection) and click Connect. Verify that the connection status shows Connected.
  3. Set the Topic field to tbmq/kafka-integration so that it matches the integration’s topic filter tbmq/#, and enter a message payload (e.g., {"temperature": 25}).
  4. Click the Send icon to publish the message.

If successful, the message should be available in your Kafka service under the topic tbmq.messages:

{
  "payload": "eyJ0ZW1wZXJhdHVyZSI6MjV9",
  "topicName": "tbmq/kafka-integration",
  "clientId": "tbmq_df52bNUQ",
  "eventType": "PUBLISH_MSG",
  "qos": 1,
  "retain": false,
  "tbmqIeNode": "tbmq_ie_node",
  "tbmqNode": "tbmq_node",
  "ts": 1742554969254,
  "props": {},
  "metadata": {
    "integrationName": "Kafka integration"
  }
}

Message field descriptions:

  • payload: Base64-encoded content of the MQTT message (e.g., "eyJ0ZW1wZXJhdHVyZSI6MjV9" decodes to {"temperature": 25}).
  • topicName: MQTT topic to which the message was published.
  • clientId: ID of the MQTT client that published the message.
  • eventType: Type of MQTT event. PUBLISH_MSG is the only supported type.
  • qos: Quality of Service level of the incoming message.
  • retain: Whether the message has the Retain flag set.
  • tbmqIeNode: Node ID of the Integration Executor that handled the message.
  • tbmqNode: Node ID of the TBMQ broker that received the message.
  • ts: Timestamp (milliseconds) when the message was received.
  • props: MQTT 5.0 user properties or other MQTT properties.
  • metadata: Additional metadata from the integration configuration (e.g., the integration name).
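Because payload is Base64-encoded, downstream consumers have to decode it before use. For example, given the sample message above:

```python
import base64
import json

# The Kafka record value is the JSON envelope forwarded by TBMQ
# (trimmed to the fields needed for this example)
record_value = '{"payload": "eyJ0ZW1wZXJhdHVyZSI6MjV9", "topicName": "tbmq/kafka-integration"}'

envelope = json.loads(record_value)
raw = base64.b64decode(envelope["payload"])  # b'{"temperature":25}'
data = json.loads(raw)
print(data["temperature"])                   # prints 25
```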