Multi-Cluster Apache Kafka Monitoring¶
Many organisations manage multiple Apache Kafka deployments across on-premises and cloud environments. Keeping track of the health and state of each Apache Kafka cluster, as well as understanding its deployment type, can be challenging for operations teams.
Lenses offers a portal to provide a global view across multiple Lenses instances (known as “workspaces”) connected to different Apache Kafka environments. All Apache Kafka deployments are supported including: open-source Kafka, Confluent Cloud, AWS Managed Streaming for Apache Kafka (MSK), Aiven and Azure HDInsight.
The portal gives data platform engineers a unified view to monitor the global health and availability of different Apache Kafka environments, simplifying operations.
Lenses Multi-Cluster is deployed as a separate package from your Lenses instances. The package can be downloaded for free from here, but it requires connections to licensed Lenses instances, each defined as a “workspace”.
Workspaces can be added from within the Multi-Cluster UI, provided there is HTTP connectivity and authentication to your different Lenses instances.
Configuration for Multi-Cluster is held in a database. The Multi-Cluster portal ships with a SQLite database, ready to run on a 64-bit Linux machine.
After downloading the package, it can be installed with:

tar -xzf lensesio-portal_1.0.3_Linux_64bit.tar.gz
cd lensesio-portal_1.0.3_Linux_64bit
./lensesio-portal
Following installation, the portal can be accessed via http://localhost:3000
Remember to back up the .db file to avoid losing configuration information.
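For example, a dated copy of the database file can be taken with a short shell snippet (the file name below is an assumption; substitute the path of your actual .db file):

```shell
# A minimal backup sketch; the .db file name is an assumption -- substitute
# the path of your actual Multi-Cluster SQLite file.
DB_FILE="lenses-multicluster.db"
BACKUP_FILE="${DB_FILE%.db}-$(date +%Y%m%d).db"
cp "$DB_FILE" "$BACKUP_FILE"
```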
A SQLite deployment should only be used for quick installations in development environments. For production environments, Multi-Cluster should be connected to a PostgreSQL database. You will be required to set up your own PostgreSQL instance and connect Multi-Cluster to it.
Postgres Config with env variables:
LENSES_DB_TYPE="postgres"
LENSES_DB_PORT=5432
LENSES_DB_NAME="multicluster" # You need to have created this before running Multi-Cluster
LENSES_DB_USERNAME="<db-username>"
LENSES_DB_PASSWORD="<db-password>"
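As the comment above notes, the database must exist before Multi-Cluster starts. A minimal sketch using the standard PostgreSQL client tools (the host, port, and role are placeholders matching the config above):

```shell
# Create the "multicluster" database before starting Multi-Cluster.
createdb -h localhost -p 5432 -U <db-username> multicluster

# Equivalent with psql:
# psql -h localhost -p 5432 -U <db-username> -c 'CREATE DATABASE multicluster;'
```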
Multi-cluster ensures sensitive data (service token accounts) is encrypted. There are two options for encryption:
Use a base64-encoded key of length 32 bytes as the encryption key (referred to as “local”)
Local Config with env variables:

LENSES_SECRETS_TYPE="local"
LENSES_SECRETS_SECRET="<your-base64-encoded-key-of-length-32-bytes>"
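A suitable value for the local key can be generated with OpenSSL (a sketch; assumes `openssl` is available on the machine):

```shell
# Emit 32 random bytes, base64-encoded, for use as the "local" encryption key.
openssl rand -base64 32
```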
Use Vault Transit Secrets Engine to encrypt the sensitive data
Vault Config with env variables:
LENSES_SECRETS_TYPE="vault"
LENSES_SECRETS_SECRET="<the-key-name-in-vault-transit-engine>"
LENSES_SECRETS_VAULT_HOST="http://<your-vault-instance>"
LENSES_SECRETS_VAULT_TOKEN="<your-vault-token>"
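If the Vault option is used, the Transit engine and the key referenced by LENSES_SECRETS_SECRET must already exist. A minimal sketch with the Vault CLI (assumes an authenticated `vault` session; the key name is a placeholder):

```shell
# Enable the Transit secrets engine (once per Vault instance).
vault secrets enable transit

# Create the encryption key that LENSES_SECRETS_SECRET refers to.
vault write -f transit/keys/<the-key-name-in-vault-transit-engine>
```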
HTTP settings can be changed to adjust the port that Multi-Cluster binds to, as well as cross-origin (CORS) behaviour:
HTTP Config with env variables:
LENSES_HTTP_ADDRESS=":3000"
LENSES_HTTP_CORS_ENABLE=true
LENSES_HTTP_CORS_ORIGIN="*"
Instead of environment variables, a YAML configuration file can be used.
http:
  cors:
    enable: true
    origin: "*"
secrets:
  type: local
  secret: "<your-base64-encoded-key-of-length-32-bytes>"
db:
  type: "postgres"
  host: "localhost"
  port: 5432
  name: "multicluster"
  username: "<your-db-username>"
  password: "<your-db-password>"
Multiple Workspaces can be connected to your Multi-Cluster instance. Each Workspace represents a different licensed Lenses instance connected to an Apache Kafka cluster.
Here is a quick animated screenshot demonstrating how to use Multi-Cluster:
Multi-Cluster leverages Lenses service accounts to authenticate with a Lenses workspace and collect data.
To add a new Workspace:
- In the Lenses instance you want to connect, create a group with read-only permissions
- In the same Lenses instance, create a service account and assign it to the group just created. Keep the token safe.
- In your Multi-Cluster, click Lenses Workspace and enter a name for the instance, the URL and port of your Lenses instance, the service account token, and the name of your service account
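Before adding a Workspace, it can help to confirm the Multi-Cluster host has HTTP connectivity to the Lenses instance (the URL and port below are placeholders for your own Lenses address):

```shell
# Print the HTTP status code returned by the Lenses instance; a 2xx/3xx
# response confirms basic connectivity from the Multi-Cluster host.
curl -s -o /dev/null -w "%{http_code}\n" "http://<your-lenses-host>:<port>"
```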