Source converters with incoming JSON or Avro

This page describes how to use converters with source systems sending JSON and Avro.

Source converters depend on the source system you are reading data from. The Connect SourceTask class requires you to supply a List of SourceRecords. Those records can carry a schema, but how the schema is translated from the source system to a Connect Struct depends on the connector.

We provide a set of converters out of the box, but you can also plug in your own. The converter is selected with the WITHCONVERTER keyword in the connector's KCQL statement. Converters are useful when source systems such as MQTT or JMS send records as JSON or Avro.

Not all connectors support source converters. Check the option reference for your connector.

Before records are passed back to Connect, they go through the converter, if one is specified.
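For example, the MQTT source connector selects the converter per KCQL statement through its connect.mqtt.kcql property. A minimal sketch, where the Kafka topic telemetry and the MQTT topic /sensors/+ are placeholders:

connect.mqtt.kcql=INSERT INTO telemetry SELECT * FROM /sensors/+ WITHCONVERTER=`io.lenses.streamreactor.connect.converters.source.JsonSimpleConverter`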

AvroConverter

io.lenses.streamreactor.connect.converters.source.AvroConverter

The payload is an Avro message. In this case, you need to provide the path to the Avro schema file so the converter can decode it.
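A sketch for the MQTT source connector: the KCQL statement selects the AvroConverter, and a separate property maps the source to its schema file. The schema-mapping property name, topic names and path below are assumptions for illustration; check the option reference for your connector.

connect.mqtt.kcql=INSERT INTO measurements SELECT * FROM /devices/temperature WITHCONVERTER=`io.lenses.streamreactor.connect.converters.source.AvroConverter`
connect.source.converter.avro.schemas=/devices/temperature=/etc/schemas/measurement.avsc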

JsonPassThroughConverter

io.lenses.streamreactor.connect.converters.source.JsonPassThroughConverter 

The incoming payload is JSON. The resulting Kafka message value will be of type string, and its contents will be the incoming JSON.
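For example, given the hypothetical incoming payload below, the Kafka message value is that JSON text as a string, exactly as it arrived:

{"deviceId": "sensor-1", "temperature": 21.5}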

JsonSimpleConverter

io.lenses.streamreactor.connect.converters.source.JsonSimpleConverter

The payload is a JSON message. This converter parses the JSON and creates an Avro record for it, which is then sent to Kafka.
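With the same hypothetical payload, the converter infers a schema from the JSON fields, so the record value is structured rather than a plain string. The exact inferred types depend on the converter's parsing rules; roughly:

{"deviceId": "sensor-1", "temperature": 21.5}  ->  record with fields deviceId (string) and temperature (numeric)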

JsonConverterWithSchemaEvolution

io.lenses.streamreactor.connect.converters.source.JsonConverterWithSchemaEvolution

An experimental converter for translating JSON messages to Avro. As the JSON payload evolves and new fields appear, they are added to the Avro schema in a fully compatible way.
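Conceptually, and only as a sketch: if the first payload is {"id": 1} and a later payload is {"id": 2, "status": "ok"}, the schema is extended with the new status field while staying compatible with records produced under the earlier schema.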

BytesConverter

io.lenses.streamreactor.connect.converters.source.BytesConverter

The payload is treated as raw bytes and passed through to Kafka unchanged.
