Sinks
This page details the configuration options for the Stream Reactor Kafka Connect sink connectors.
Sink connectors read data from Kafka and write to an external system.

AWS S3
Sink data from Kafka to AWS S3, including backing up topics and offsets.

Azure CosmosDB
Sink data from Kafka to Azure CosmosDB.

Azure Data Lake Gen2
Sink data from Kafka to Azure Data Lake Gen2, including backing up topics and offsets.

Azure Event Hubs
Sink data from Kafka to Azure Event Hubs.

Azure Service Bus
Sink data from Kafka to Azure Service Bus topics and queues.

Cassandra
Sink data from Kafka to Cassandra.

Elasticsearch
Sink data from Kafka to Elasticsearch.

GCP PubSub
Sink data from Kafka to GCP PubSub.

GCP Storage
Sink data from Kafka to GCP Storage.

HTTP Sink
Sink data from Kafka to an HTTP endpoint.

InfluxDB
Sink data from Kafka to InfluxDB.

JMS
Sink data from Kafka to JMS.

MongoDB
Sink data from Kafka to MongoDB.

MQTT
Sink data from Kafka to MQTT.

Redis
Sink data from Kafka to Redis.

FAQ
Can the datalake sinks lose data?
Kafka topic retention policies determine how long a message is retained in a topic before it is deleted. If the retention period expires before the connector has processed the messages (for example, because the connector was not running), the unprocessed records are deleted as per the retention policy. This can lead to significant data loss, since those messages are no longer available for the connector to sink to the target system.
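To reduce this risk, size the topic's retention window to comfortably exceed any expected connector downtime. As a hedged sketch, the relevant topic-level Kafka setting is `retention.ms` (the 7-day value below is only an illustration):

```
# Illustrative topic-level config: keep records for 7 days
# (604800000 ms) so a stopped sink connector can catch up
# before Kafka deletes unprocessed messages.
retention.ms=604800000
```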
Do the datalake sinks support exactly-once semantics?
Yes, the datalake connectors natively support exactly-once guarantees.
How do I escape dots in field names in KCQL?
Field names in Kafka message headers or values may contain dots (`.`). To access these fields correctly, enclose the entire target in backticks (`` ` ``) and each segment that is a field name in single quotes (`'`):
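For example, a minimal sketch (the topic `orders`, the target `target-table`, and the field `customer.name` are hypothetical):

```
-- Hypothetical KCQL: the whole target is wrapped in backticks and
-- the dotted field name 'customer.name' is wrapped in single quotes.
INSERT INTO target-table
SELECT `_value.'customer.name'`
FROM orders
```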
How do I escape other special characters in field names in KCQL?
For field names with spaces or special characters, use a similar escaping strategy:
Field name with a space:
`_value.'full name'`
Field name with special characters:
`_value.'$special_characters!'`
This ensures the connector correctly extracts the intended fields and avoids parsing errors.
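Putting both together, a hedged end-to-end sketch (the topic `events` and the target `target-table` are hypothetical) might look like:

```
-- Hypothetical KCQL combining the escaped field names above.
INSERT INTO target-table
SELECT `_value.'full name'`, `_value.'$special_characters!'`
FROM events
```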