PubNub Kafka Sink Connector
This guide provides steps for deploying the PubNub Kafka Sink Connector to integrate Apache Kafka with the PubNub real-time messaging service, letting you forward data from Apache Kafka topics directly to PubNub channels.
You can choose between two paths: testing with a preconfigured Docker Compose setup or deploying the connector to a production Kafka environment.
Stream events from PubNub to Kafka
Apart from receiving Kafka events in PubNub, you can also stream PubNub events to Kafka through the Kafka Action.
Prerequisites
I'm just testing

You need these to get started:

- Docker
- Access to PubNub's Admin Portal

Use for production

You need these to get started:

- Apache Kafka and Kafka Connect setup
- Maven 3.8.6+
- Java 11+
- Access to PubNub's Admin Portal
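Before you start, it can help to confirm the prerequisites are in place. A quick sanity check from a shell, using the standard version flags of each tool:

```
# Test path: confirm Docker and Docker Compose are available
docker --version
docker compose version

# Production path: confirm Maven 3.8.6+ and Java 11+
mvn -v
java -version
```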
Steps
Follow the installation steps for test or production environments.
I'm just testing
1. Clone the `pubnub-kafka-sink-connector` repository to your local machine.

   ```
   git clone git@github.com:pubnub/pubnub-kafka-sink-connector.git
   ```

2. Change your current directory to the one that contains the cloned source code.

   ```
   cd pubnub-kafka-sink-connector
   ```

3. Log in to the Admin Portal and get your development publish and subscribe keys from your app's keyset.

4. Modify the default configuration options of the PubNub Kafka Sink Connector in the `examples/pubnub-sink-connector.json` file and place your keys from the Admin Portal under `publish_key` and `subscribe_key`.

5. Use Docker Compose to build and start the preconfigured Kafka and Kafka Connect images.

   ```
   docker compose up
   ```

6. Deploy the connector to the Kafka Connect cluster.

   ```
   curl -X POST \
     -d @examples/pubnub-sink-connector.json \
     -H "Content-Type: application/json" \
     http://localhost:8083/connectors
   ```

7. Verify that the connector works.

   A sample producer included in the PubNub Kafka Sink Connector generates test messages (like `{"timestamp":1705689453}`) every few seconds. You can use the Debug Console to verify that these messages are published on one of the predefined PubNub channels. Make sure to provide the publish and subscribe keys you specified during configuration, `pubnub` as the channel, and a user ID. To double-check from the Kafka side, see the status-check sketch after this list.

8. Once you're done testing, undeploy the connector from Kafka Connect.

   ```
   curl -X DELETE \
     http://localhost:8083/connectors/pubnub-sink-connector
   ```

9. Stop the Kafka and Kafka Connect containers using Docker Compose.

   ```
   docker compose down
   ```
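As part of verifying the connector in step 7, you can also ask Kafka Connect itself whether the connector and its tasks are healthy. A minimal sketch against the standard Kafka Connect REST API on the Docker Compose setup's default port:

```
# List deployed connectors; "pubnub-sink-connector" should appear
curl http://localhost:8083/connectors

# Inspect the connector's state; the connector and its tasks should
# report "RUNNING"
curl http://localhost:8083/connectors/pubnub-sink-connector/status
```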
Use for production

1. Clone the `pubnub-kafka-sink-connector` repository to your local machine.

   ```
   git clone git@github.com:pubnub/pubnub-kafka-sink-connector.git
   ```

2. Change your current directory to the one that contains the cloned source code.

   ```
   cd pubnub-kafka-sink-connector
   ```

3. Log in to the Admin Portal and get your production publish and subscribe keys from your app's keyset.

4. Edit the default configuration options in `examples/pubnub-sink-connector-test.json` and put in your production details, such as the Kafka topics and PubNub API keys.

5. Compile the PubNub Kafka Sink Connector locally.

   Run the following command in the root directory of your connector source code to compile it into a JAR file:

   ```
   mvn clean package
   ```

   After running the command, a file named `pubnub-kafka-connector-1.x.jar` will be created in the `target` directory.

6. Add the packaged connector as a Kafka Connect plugin.

   Use the created JAR file to deploy the connector to a Kafka Connect cluster by copying it to the appropriate directory and configuring the cluster to load it. The exact steps depend on your specific Kafka Connect setup and requirements; see the sketch after this list for one common approach.

7. Fill in your Kafka Connect host address (`your_kafka_connect_host`) and run the command to deploy the connector to your production Kafka Connect cluster.

   ```
   curl -X POST \
     -d @examples/pubnub-sink-connector.json \
     -H "Content-Type: application/json" \
     http://your_kafka_connect_host:8083/connectors
   ```

   Any new data sent to the Kafka topics will be copied to the target devices listening on the PubNub channels defined in the configuration file.
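To complete step 6 on a self-managed cluster, one common approach is to place the packaged JAR on the Kafka Connect worker's plugin path and restart the worker. The paths below are assumptions for illustration (`/opt/kafka/plugins` as the plugin directory); substitute the locations your cluster actually uses:

```
# Copy the packaged connector into a directory on the worker's plugin path
# (/opt/kafka/plugins is an assumed location; adjust for your setup)
mkdir -p /opt/kafka/plugins/pubnub-kafka-sink-connector
cp target/pubnub-kafka-connector-1.x.jar /opt/kafka/plugins/pubnub-kafka-sink-connector/

# The worker configuration must point at that directory, for example:
#   plugin.path=/opt/kafka/plugins
# Restart the Kafka Connect worker afterwards so it discovers the new plugin.
```

Once the connector is deployed, you can confirm end-to-end delivery by producing a message to one of the configured topics and watching it appear on the matching PubNub channel (for example, in the Debug Console). A minimal sketch using the stock Kafka console producer, assuming a broker at `localhost:9092` and the `pubnub` topic from the example configuration:

```
# Send a test message to a topic the connector reads from
echo '{"text":"hello from Kafka"}' | \
  kafka-console-producer.sh --bootstrap-server localhost:9092 --topic pubnub
```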
Configuration options
The configuration from the `examples/pubnub-sink-connector-test.json` file defines how the PubNub Kafka Sink Connector is set up.
Here's a breakdown of each parameter, detailing its purpose and whether it's required or optional.
```
{
  "name": "pubnub-sink-connector",
  "config": {
    "topics": "pubnub,pubnub1,pubnub2",
    "topics.regex": "",
    "pubnub.user_id": "myUserId123",
    "pubnub.publish_key": "demo",
    "pubnub.subscribe_key": "demo",
    "pubnub.secret_key": "demo",
    "connector.class": "com.pubnub.kafka.connect.PubNubKafkaSinkConnector",
    "tasks.max": "3",
    "value.deserializer": "custom.class.serialization.JsonDeserializer",
    "value.serializer": "custom.class.serialization.JsonSerializer",
    "errors.tolerance": "all",
    "errors.deadletterqueue.topic.name": "my_dlq_topic",
    ...
  }
}
```
| Parameter | Description |
| --- | --- |
| `name` * | Unique name for the connector instance used to identify it. |
| `topics` | Comma-separated list of Kafka topics to read messages from. |
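If you need to change any of these options after deployment, Kafka Connect also lets you update a connector's configuration in place through its REST API. A minimal sketch, assuming the local test setup from this guide:

```
# Update (or create) the connector configuration in place.
# The payload is the "config" object only, not the wrapper with "name".
curl -X PUT \
  -H "Content-Type: application/json" \
  -d '{
        "connector.class": "com.pubnub.kafka.connect.PubNubKafkaSinkConnector",
        "topics": "pubnub,pubnub1",
        "tasks.max": "3",
        "pubnub.user_id": "myUserId123",
        "pubnub.publish_key": "demo",
        "pubnub.subscribe_key": "demo"
      }' \
  http://localhost:8083/connectors/pubnub-sink-connector/config
```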