Creating & managing a connector
This page describes managing a basic connector instance in your Connect cluster.
Creating your Kafka Connector Configuration
To deploy a connector into your Kafka Connect cluster, follow the steps below:

- Make sure the connector JARs are available on the `plugin.path` of your Kafka Connect cluster.
- Each connector has mandatory configurations that must be set and validated; other configurations are optional. Always read the connector documentation first.
- Deploy the connector using the Kafka Connect REST API, or let Lenses manage it for you.
Sample connector: the AWS S3 Sink Connector from Stream Reactor.
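A minimal illustrative configuration for this connector might look as follows. This is a sketch, not the exact sample from the original page: the connector class name, the `Credentials` auth mode, and the KCQL syntax depend on your Stream Reactor version, and the topic, bucket, region, and credential values are hypothetical placeholders.

```json
{
  "name": "aws-s3-sink",
  "config": {
    "connector.class": "io.lenses.streamreactor.connect.aws.s3.sink.S3SinkConnector",
    "tasks.max": "1",
    "topics": "my-topic",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "key.converter.schemas.enable": "false",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "false",
    "connect.s3.aws.auth.mode": "Credentials",
    "connect.s3.aws.access.key": "<ACCESS_KEY>",
    "connect.s3.aws.secret.key": "<SECRET_KEY>",
    "connect.s3.aws.region": "eu-west-1",
    "connect.s3.kcql": "INSERT INTO my-bucket:my-folder SELECT * FROM my-topic STOREAS `JSON`"
  }
}
```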
Let's drill into this connector configuration and what happens when we deploy it:
- `connector.class` is the plugin class the connector will use.
- `tasks.max` is how many tasks will run in the Kafka Connect cluster. In this example, we will have 1 task; the topic has 9 partitions, so the consumer group of this connector will have 1 instance, and this one task will consume all 9 partitions. To scale, simply increase the number of tasks.
- `name` is the name of the connector in the Kafka Connect cluster; it must be unique within each cluster.
- `topics` is the topic that will be consumed; all its data will be written to AWS S3, as our example describes.
- `value.converter` is the format type used to serialize or deserialize the value. In this case, our value will be in `json` format.
- `key.converter` is the format type used to serialize or deserialize the key. In this case, our key will be in `string` format.
- `key.converter.schemas.enable` tells Kafka Connect whether to include the key schema in the message. In our example it is `false`: we don't want to include the key schema.
- `value.converter.schemas.enable` tells Kafka Connect whether to include the value schema in the message. In our example it is `false`: we don't want to include the value schema.
- `connect.s3.aws.auth.mode` is the type of authentication we will use to connect to the AWS S3 bucket.
- `connect.s3.aws.access.key` is the access key used to authenticate to the AWS S3 bucket.
- `connect.s3.aws.secret.key` is the secret key used to authenticate to the AWS S3 bucket.
- `connect.s3.aws.region` is the region where the AWS S3 bucket is deployed.
- `connect.s3.kcql` is the Kafka Connect Query Language statement that configures the bucket name, the folder, the storage format, and how frequently new files are added to the bucket.
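With these fields in place, the connector can be deployed through the Kafka Connect REST API. A minimal sketch, assuming the worker's REST endpoint listens on `localhost:8083` (the connector name, class, and topic below are illustrative placeholders):

```shell
# Write a minimal connector payload (values are illustrative placeholders).
cat > s3-sink.json <<'EOF'
{
  "name": "aws-s3-sink",
  "config": {
    "connector.class": "io.lenses.streamreactor.connect.aws.s3.sink.S3SinkConnector",
    "tasks.max": "1",
    "topics": "my-topic"
  }
}
EOF

# POST it to the Connect REST API (assumes the worker listens on localhost:8083).
curl -s -X POST -H "Content-Type: application/json" \
  --data @s3-sink.json \
  http://localhost:8083/connectors \
  || echo "Connect REST API not reachable"
```

Kafka Connect validates the payload on submission and rejects it if a mandatory configuration is missing.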
For the complete configuration reference, see the AWS S3 Sink connector documentation.
Managing your Connector
After you deploy a connector into your Kafka Connect cluster, the cluster manages it for you.
To show how Kafka Connect manages your connectors, we will use the Lenses UI.
The image below is the Lenses Connectors list:
The following image delves into the details of a single Kafka connector.
- Consumer Group: when we use Lenses, we can see which consumer group is reading and consuming the topic.
- Connector tasks: we can see which tasks are running, their status, and how many records flow in and out of the topic.
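If you are not using Lenses, the same task-level status information is available from the Kafka Connect REST API. A sketch, assuming a worker on `localhost:8083` and a connector named `aws-s3-sink` (both assumptions):

```shell
# Query the status of a connector and its tasks.
# The response lists the connector state (RUNNING, FAILED, PAUSED)
# and the state of each task. The connector name here is hypothetical.
curl -s http://localhost:8083/connectors/aws-s3-sink/status \
  || echo "Connect REST API not reachable"
```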