Stream Reactor sink connectors support KCQL.
Field names in Kafka message headers or values may contain dots (.). To access these correctly, enclose the entire target in backticks (`) and wrap each field-name segment that contains a dot in single quotes ('):
INSERT INTO `_value.'customer.name'.'first.name'` SELECT * FROM topicA
For field names with spaces or special characters, use a similar escaping strategy:
`_value.'full name'`
`_value.'$special_characters!'`
This ensures the connector correctly extracts the intended fields and avoids parsing errors.
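For example, the escaped targets above can be used in full statements like the following (the topic names `customers` and `events` are illustrative, not from an actual configuration):

```sql
-- Hypothetical examples: field names with spaces or special
-- characters are quoted inside a backtick-enclosed target.
INSERT INTO `_value.'full name'` SELECT * FROM customers
INSERT INTO `_value.'$special_characters!'` SELECT * FROM events
```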
Azure CosmosDB
A Kafka Connect sink connector for writing records from Kafka to Azure CosmosDB using the SQL API.
AWS S3
A Kafka Connect sink connector for writing records from Kafka to AWS S3 buckets.
Azure Datalake
A Kafka Connect sink connector for writing records from Kafka to Azure Datalake buckets.
Cassandra
A Kafka Connect sink connector for writing records from Kafka to Cassandra.
GCP Storage
A Kafka Connect sink connector for writing records from Kafka to GCP Storage buckets.
HTTP
A Kafka Connect sink connector for writing records from Kafka to HTTP endpoints.
Elastic
A set of Kafka Connect sink connectors for writing records from Kafka to Elastic.
InfluxDB
A Kafka Connect sink connector for writing records from Kafka to InfluxDB.
JMS
A Kafka Connect sink connector for writing records from Kafka to JMS.
MongoDB
A Kafka Connect sink connector for writing records from Kafka to MongoDB.
MQTT
A Kafka Connect sink connector for writing records from Kafka to MQTT.
Service Bus
A Kafka Connect sink connector for writing records from Kafka to Azure Service Bus.
Redis
A Kafka Connect sink connector for writing records from Kafka to Redis.
Payload support
Error policies