Kafka Bridge

PubNub's Kafka Bridge connects real‑time messaging with Apache Kafka stream processing.

Use it to receive Kafka events in PubNub and to stream events from PubNub to Kafka. This bridge lets you build responsive, scalable apps with low latency.

Kafka Bridge diagram showing PubNub ↔ Kafka event flow

Receive events from Kafka into PubNub

The PubNub Kafka Sink Connector streams events from an Apache Kafka cluster into PubNub with minimal latency.

This helps when your app must react immediately to data processed in Kafka, such as chat updates, live dashboards, or IoT alerts.

Set up the PubNub Kafka Sink Connector:

  1. Compile the connector from source.
  2. Configure the connector with PubNub publish/subscribe keys, Kafka topic details, and connection info.
  3. Deploy the connector to Kafka Connect to bridge Kafka events to target PubNub channels.
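The deployment in step 3 is typically done through the Kafka Connect REST API. Below is a sketch of what the connector configuration might look like; the property names (such as `pubnub.publish_key`) and the connector class are illustrative assumptions here, so check the connector's own documentation for the exact keys.

```json
{
  "name": "pubnub-sink",
  "config": {
    "connector.class": "com.pubnub.kafka.connect.PubNubKafkaSinkConnector",
    "tasks.max": "1",
    "topics": "orders,alerts",
    "pubnub.publish_key": "pub-c-your-publish-key",
    "pubnub.subscribe_key": "sub-c-your-subscribe-key",
    "pubnub.channel": "kafka-events"
  }
}
```

You would POST this JSON to your Kafka Connect worker's `/connectors` endpoint to start the connector.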

Stream events from PubNub to Kafka

The Kafka Action in Events & Actions sends data from PubNub channels to an external Kafka cluster, and it works with any Kafka hosting option.

Use this when you need real‑time processing in Kafka for analytics, storage, or big‑data pipelines.

To configure the Kafka Action:

  1. Prepare your Kafka environment: topic, key, broker URLs, and authentication.
  2. In the Admin Portal, create the Kafka Action. Specify the topic and key, auth mechanism, username, password, and broker URLs.
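The Kafka-side preparation in step 1 can be sketched with the standard Kafka CLI. This example assumes a broker reachable at `localhost:9092` and a topic name of `pubnub-events`; substitute your own broker URLs and topic.

```shell
# Create the topic the Kafka Action will publish to
kafka-topics.sh --create \
  --topic pubnub-events \
  --partitions 3 \
  --replication-factor 1 \
  --bootstrap-server localhost:9092

# Confirm the topic exists and inspect its configuration
kafka-topics.sh --describe \
  --topic pubnub-events \
  --bootstrap-server localhost:9092
```

With the topic in place, gather the broker URLs and the authentication mechanism, username, and password your cluster uses, since the Admin Portal form in step 2 asks for all of them.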

You can create a Kafka Action whether you already have a Kafka cluster or are setting up a new one. Supported hosting options include self-hosted Kafka, Amazon MSK (for example, provisioned with Terraform), and Confluent Cloud. For the step-by-step instructions, see Kafka Action.
