Installation¶
Prerequisites¶
A Kafka Connect instance is required to run the connectors in standalone or distributed mode. Lenses Box contains a preconfigured setup for both Kafka Connect and Stream Reactor, as well as a number of third-party connectors. To install Kafka Connect outside of Lenses Box, follow the instructions at Kafka.
You will also need the Kafka brokers, ZooKeeper and, optionally, the Schema Registry installed and running. If you are using the Schema Registry, set the key and value converters in the worker properties:
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://localhost:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081
If you require support for the Schema Registry, you can download it from Confluent.
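If you do not run the Schema Registry, a common alternative (shown here as a sketch, not part of the original setup) is the JSON converter that ships with Kafka Connect:

```
# JSON converter bundled with Kafka Connect; no Schema Registry required
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
```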
Stream Reactor Install¶
To install in a production environment or on Connect clusters, download the entire release, or cherry-pick the individual connectors you are interested in from here.
The advised installation method uses the classpath isolation feature (plugin.path) introduced in Kafka Connect 0.11.
All Connectors Archive¶
For the entire collection, unpack the archive under /opt or your preferred location:
wget https://github.com/lensesio/stream-reactor/releases/download/x.x.x/stream-reactor-x.x.x-x.x.x.tar.gz
tar xvf stream-reactor-x.x.x-x.x.x.tar.gz -C /opt
Within the unpacked directory you will find the following structure:
/opt/stream-reactor-x.x.x-x.x.x
├── bin
├── conf
├── libs
├── LICENSE
└── README.md
The libs directory contains all the Stream Reactor connector jars. Edit your Connect worker properties and append the libs directory to plugin.path. As an example:
plugin.path=/usr/share/connectors,/opt/stream-reactor-x.x.x-x.x.x/libs
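The edit above can also be scripted. The sketch below appends the Stream Reactor libs directory to an existing plugin.path entry; the file name and paths are placeholders, so adjust them to your installation (GNU sed syntax; on macOS use `sed -i ''`):

```shell
# Placeholders: adjust to your worker properties file and install path
PROPS=worker.properties
LIBS=/opt/stream-reactor-x.x.x-x.x.x/libs

# Example worker file with an existing plugin.path entry:
printf 'plugin.path=/usr/share/connectors\n' > "$PROPS"

# Append the libs directory to the plugin.path line:
sed -i "s|^plugin.path=.*|&,$LIBS|" "$PROPS"

cat "$PROPS"
# plugin.path=/usr/share/connectors,/opt/stream-reactor-x.x.x-x.x.x/libs
```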
Next, restart your Connect worker. How you do this depends on how Kafka Connect was installed on your server.
Finally, repeat this process on all the Connect workers in your cluster; the connectors must be available to every worker.
Single Connector Archive¶
For a single connector, create a directory first, then extract the archive:
wget https://github.com/lensesio/stream-reactor/releases/download/x.x.x/kafka-connect-cassandra-x.x.x-x.x.x-all.tar.gz
mkdir -p /opt/stream-reactor
tar xzvf kafka-connect-cassandra-x.x.x-x.x.x-all.tar.gz -C /opt/stream-reactor
Within the unpacked directory you will find the following structure:
/opt/stream-reactor
├── kafka-connect-cassandra-x.x.x-x.x.x-all.jar
└── LICENSE
The stream-reactor directory contains the Stream Reactor connector jar. Edit your Connect worker properties and append the stream-reactor directory to the plugin.path setting. As an example:
plugin.path=/usr/share/connectors,/opt/stream-reactor
Next, restart your Connect worker. How you do this depends on how Kafka Connect was installed on your server.
Finally, repeat this process on all the Connect workers in your cluster; the connector must be available to every worker.
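One way to confirm that a restarted worker picked the connector up is the Connect REST API's `/connector-plugins` endpoint (host and port below assume the default REST listener on 8083). The block demonstrates the check offline against a sample response; the connector class name shown is illustrative:

```shell
# Against a live worker you would run:
#   curl -s http://localhost:8083/connector-plugins
# which returns a JSON array of the plugins the worker discovered.
#
# Offline demonstration against a sample response (class name illustrative):
cat > /tmp/connector-plugins.json <<'EOF'
[{"class":"com.datamountaineer.streamreactor.connect.cassandra.sink.CassandraSinkConnector","type":"sink","version":"x.x.x"}]
EOF
grep -qi 'cassandra' /tmp/connector-plugins.json && echo "connector visible"
# connector visible
```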
Tip
The more connectors you add to the plugin.path, the longer the Connect workers will take to start and the more memory they will use out of the box. This is because Connect scans all the jars in these directories for classes implementing the Kafka Connect Source and Sink interfaces. You may want to restrict which connectors are loaded by default.
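One hedged way to restrict loading is to create a dedicated directory containing only the connector jars you actually need, and point plugin.path there instead of at the full libs directory (the paths below are illustrative):

```
# /opt/connect-plugins holds only the connector jars you copied into it,
# rather than the entire Stream Reactor libs directory (paths illustrative):
plugin.path=/usr/share/connectors,/opt/connect-plugins
```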
Helm Charts¶
Helm is a package manager for Kubernetes. Helm charts are available for the connectors here and are targeted toward use with the Landscaper.
Even if you use Docker, your landscape can still be complex: handling multi-tenancy, inspecting and managing Docker files, handling service discovery, environment variables, and promotion to production. Helm and the Landscaper help you manage this.
To add the Stream Reactor Helm charts simply add our repository to your Helm instance:
helm repo add lenses https://lensesio.github.io/kafka-helm-charts/
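After adding the repository, you can refresh the chart index and browse the available charts. These commands assume a working Helm installation with network access (Helm 3 syntax shown; older Helm 2 clients use `helm search lenses` instead):

```shell
# Refresh the local chart index and list the Stream Reactor charts
helm repo update
helm search repo lenses
```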
Cloudera Integration¶
When Landoop’s FAST DATA CSD is used, the Stream Reactor connectors can be installed and provisioned in a few seconds. More information and step-by-step instructions on how to install the parcel can be found in the FAST DATA docs.