Sink converters & different data formats

This page describes configuring sink converters for Kafka Connect.

You can configure the converters either at the Connect worker level or at the Connector instance level.
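A converter set in the worker properties file acts as the default for every connector on that worker, while the same keys in a connector's own configuration override it for that connector only. A minimal sketch (the file names and connector name below are illustrative):

# connect-worker.properties -- worker-level default for all connectors
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter

# my-sink.properties -- connector-level override, wins for this connector only
name=my-sink
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://localhost:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081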

Schema.Struct and a Struct Payload

If you follow best practice when producing events, each message should carry its schema information. The best option is to send Avro. In this case, set the connector configuration to use the Avro converter:

key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://localhost:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081

This requires the Schema Registry.
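On the producing side this typically means using the Confluent Avro serializer, which registers the schema with the Schema Registry and embeds its id in every record. A minimal sketch of the producer properties (the broker and registry URLs are illustrative):

# producer.properties -- illustrative Avro producer settings
bootstrap.servers=localhost:9092
key.serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
value.serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
schema.registry.url=http://localhost:8081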

Schema.String and a JSON Payload

Sometimes it is easier for the producer to send a message with Schema.String and a JSON string as the payload. In this case, set the connector configuration to use the JSON converter:

key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter

This doesn't require the Schema Registry.
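Note that with the JsonConverter's default setting of schemas.enable=true, each message is expected to carry an inline schema envelope alongside the payload. A minimal illustrative message (the field names are made up):

{"schema":{"type":"struct","fields":[{"type":"string","field":"name","optional":false}],"optional":false},"payload":{"name":"example"}}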

No schema and a JSON Payload

Many existing systems publish plain JSON over Kafka, and bringing them in line with best practices is quite a challenge, hence we added support for this case. To enable it, change the converters in the connector configuration and disable schemas:

key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
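With schemas.enable=false the converter accepts bare JSON with no schema envelope, so an existing message such as the following (an illustrative payload) can be consumed as-is:

{"id": 1, "created": "2016-05-06 13:53:00", "product": "OP-DAX-P-20150201-95.7", "price": 94.2}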