# Sink converters & different data formats

{% hint style="info" %}
You can configure the converters either at the Connect worker level or at the Connector instance level.
{% endhint %}

### Schema.Struct and a Struct Payload  <a href="#schemastruct-and-a-struct-payload" id="schemastruct-and-a-struct-payload"></a>

Following best practice, each message you produce should carry its schema information. The recommended option is to send Avro.

```properties
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://localhost:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081
```

This setup requires a running Schema Registry.
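With the AvroConverter, the schema is registered in the Schema Registry and each message carries only a schema id. A minimal sketch of what such a registered value schema might look like (the `Order` record and its fields are hypothetical, not part of this documentation):

```python
import json

# Hypothetical Avro value schema for the topic. The AvroConverter reads the
# schema id embedded in each message and fetches this definition from the
# Schema Registry to build the Connect Struct.
value_schema = json.dumps({
    "type": "record",
    "name": "Order",
    "fields": [
        {"name": "id", "type": "string"},
        {"name": "amount", "type": "double"},
    ],
})
print(value_schema)
```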

### Schema.String and a JSON Payload  <a href="#schemastring-and-a-json-payload" id="schemastring-and-a-json-payload"></a>

Sometimes it is easier for the producer to send a message with a **`Schema.String`** schema and a JSON string as the payload. In this case, set **`value.converter=org.apache.kafka.connect.json.JsonConverter`** in your connector configuration. This does not require the Schema Registry.

```properties
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
```
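With the JsonConverter and `schemas.enable=true` (its default), each message is expected to be an envelope that carries its own schema alongside the payload. A sketch of such an envelope for a `Schema.String` value whose payload is itself a JSON string (the field names are hypothetical):

```python
import json

# JsonConverter envelope with schemas enabled: the "schema" describes the
# Connect type of "payload". Here the payload is a plain JSON string.
envelope = {
    "schema": {"type": "string", "optional": False},
    "payload": json.dumps({"id": "order-1", "amount": 9.99}),
}
print(json.dumps(envelope))
```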

### No schema and a JSON Payload  <a href="#no-schema-and-a-json-payload" id="no-schema-and-a-json-payload"></a>

Many existing systems publish plain JSON over Kafka, and bringing them in line with best practices can be quite a challenge, so we added support for schemaless JSON. To enable it, change the converters in the connector configuration and disable schemas:

```properties
key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
```
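With `schemas.enable=false`, the message value is simply the raw JSON document, with no schema envelope. A sketch of such a message (the field names are hypothetical):

```python
import json

# Schemaless JSON: the message value is the document itself, no
# {"schema": ..., "payload": ...} wrapper around it.
message = json.dumps({"id": "order-1", "amount": 9.99})
print(message)
```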
