Sinks

This page details the configuration options for the Stream Reactor Kafka Connect sink connectors.

Sink connectors read data from Kafka and write to an external system.


FAQ

Can the datalake sinks lose data?

Kafka topic retention policies determine how long a message is retained in a topic before it is deleted. If the retention period expires before the connector has processed the messages (for example, because the connector was not running), the unprocessed data is deleted as per the retention policy. This can lead to significant data loss, since those messages are no longer available for the connector to sink to the target system.
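One mitigation is to ensure the topic's retention window comfortably exceeds any expected connector downtime. A sketch of the relevant topic-level settings (the values below are illustrative, not recommendations):

```
# Topic-level retention configuration (illustrative values)
retention.ms=604800000    # retain messages for 7 days
retention.bytes=-1        # no size-based retention limit
```

Monitoring connector lag and alerting when it approaches the retention window helps catch the problem before data is deleted.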

Do the datalake sinks support exactly-once semantics?

Yes, the datalake connectors natively support exactly-once guarantees.

How do I escape dots in field names in KCQL?

Field names in Kafka message headers or values may contain dots (.). To access these correctly, enclose the entire target in backticks (`) and each segment that is a field name in single quotes ('):

INSERT INTO `_value.'customer.name'.'first.name'` SELECT * FROM topicA

How do I escape other special characters in field names in KCQL?

For field names with spaces or special characters, use a similar escaping strategy:

  • Field name with a space: `_value.'full name'`

  • Field name with special characters: `_value.'$special_characters!'`

This ensures the connector correctly extracts the intended fields and avoids parsing errors.
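Putting this together, a hypothetical KCQL statement targeting a field whose name contains a space would follow the same pattern as the dotted-field example above (the topic name is illustrative):

```
INSERT INTO `_value.'full name'` SELECT * FROM topicA
```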

2024 © Lenses.io Ltd. Apache, Apache Kafka, Kafka and associated open source project names are trademarks of the Apache Software Foundation.