Install

This page describes configuring and starting Lenses HQ and the Lenses Agent against your Kafka cluster.


This guide uses the Lenses docker-compose file. For non-dev installations and automation, see the Installation section.

Configure HQ

HQ is configured via a single file, config.yaml. The docker-compose file loads the content of the hq.config.yaml key and mounts it as the HQ config.yaml file.

Adding a Database Connection

You only need to follow this step if you do not want to use the local Postgres instance started by the docker-compose file.

You must create a database and role in your Postgres instance for HQ to use. See Database Role.

Edit docker-compose.yaml and set the credentials for your database in the hq.config.yaml section.

docker-compose.yaml
 hq.config.yaml:
    content: |
      # ACCEPT THE LENSES EULA
      license:
        acceptEULA: true
      database:
        host: postgres:5432
        username: [YOUR_POSTGRES_USERNAME]
        password: lenses
        database: hq
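
If you are not using the bundled Postgres, a minimal sketch of creating the HQ role and database follows; the role name, password, and host are illustrative placeholders, and you should see Database Role for the exact privileges required:

terminal
psql -h [YOUR_POSTGRES_HOST] -U postgres \
  -c "CREATE ROLE hq WITH LOGIN PASSWORD 'changeme';" \
  -c "CREATE DATABASE hq OWNER hq;"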

Authentication

Currently HQ supports:

  1. Basic Authentication (default)

  2. SAML

For this example, we will use basic authentication. For information on configuring other methods, see Authentication, and configure the hq.config.yaml key accordingly for SAML.

Start HQ

To start HQ, run the following Docker command:

terminal
docker-compose up hq

You can now log in via your browser with admin/admin.
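
If the UI does not come up, standard docker-compose commands can show HQ's status and logs:

terminal
docker-compose ps hq
docker-compose logs -f hq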

Create an Environment for your Kafka Cluster

To create an environment in HQ:

  1. Log in to HQ and create an environment via Environments->New Environment.

  2. At the end of the process, you will be shown an Agent Key. Copy it and keep it safe!

The environment will be disconnected until the Agent is up and configured with the key.

You can also manage environments using the CLI.

terminal
➜  lenses environments
Manage Environments.

Usage:
  lenses environments [command]

Aliases:
  environments, e, env, envs

Available Commands:
  create      Creates a new environment.
  delete      Deletes an environment.
  get         Retrieves a single environment by name.
  list        Lists all environments
  metadata    Manages environment metadata.
  update      Updates an environment.
  watch       Watch live environment updates.
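
For example, after creating an environment in the UI you can confirm it is registered using the list subcommand shown in the help output above:

terminal
lenses environments list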

Configure the Agent

The Agent is configured via two files:

  • provisioning.yaml - holds the connection details to your Kafka cluster and supporting systems. You can set this via the agent.provisioning.yaml key in the docker-compose file.

  • lenses.conf - holds low-level options for the Agent and the database connection. You can set this via the agent.lenses.conf key in the docker-compose file.

Adding an Agent Database Connection

You only need to follow this step if you do not want to use the local Postgres instance started by the docker-compose file.

You must create a database and role in your Postgres instance for the Agent to use. See Database Role.

Update the agent.lenses.conf key in the docker-compose file with the credentials for your Postgres instance.

docker-compose.yaml
 agent.lenses.conf:
    content: |
      lenses.storage.postgres.host=[YOUR_POSTGRES_INSTANCE]
      lenses.storage.postgres.port=[YOUR_POSTGRES_PORT]
      lenses.storage.postgres.database=agent
      lenses.storage.postgres.username=lenses
      lenses.storage.postgres.password=lenses
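
As with HQ, a minimal illustrative sketch of creating the Agent's role and database; the names below match the defaults in the block above, and Database Role documents the exact privileges:

terminal
psql -h [YOUR_POSTGRES_INSTANCE] -U postgres \
  -c "CREATE ROLE lenses WITH LOGIN PASSWORD 'lenses';" \
  -c "CREATE DATABASE agent OWNER lenses;"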

Adding an HQ Connection

The Agent Key for an environment needs to be added to the agent.provisioning.yaml key in the docker-compose file.

docker-compose.yaml
agent.provisioning.yaml:
    content: |
      lensesHq:
        - configuration:
            agentKey:
              value: ${LENSESHQ_AGENT_KEY}
            port:
              value: 10000
            server:
              value: lenses-hq
          name: lenses-hq
          tags: ['hq']
          version: 1

Replace ${LENSESHQ_AGENT_KEY} with the Agent Key for the environment that you are linking to. For more information on configuring the connection to HQ, see Provisioning.
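
If you prefer not to paste the key into the file, docker-compose can substitute it from your shell environment or from a .env file next to docker-compose.yaml (this applies to the ${LENSESHQ_AGENT_KEY} form shown above):

terminal
# Option 1: export the key in the shell that runs docker-compose
export LENSESHQ_AGENT_KEY='<your-agent-key>'
# Option 2: keep it in a .env file that docker-compose reads automatically
echo "LENSESHQ_AGENT_KEY=<your-agent-key>" > .env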

Adding a Kafka Connection

By default, the agent is configured to connect to Kafka on localhost. To change this, update the agent.provisioning.yaml key. The information required here depends on how you want the Agent to authenticate against Kafka.

  1. Add the following for a basic plaintext connection to a Kafka broker; if you are using a different authentication mechanism, adjust accordingly.

  2. Remove or adjust the Kafka (kafka-demo), Schema Registry and Connect services in the default docker-compose file.

docker-compose.yaml
agent.provisioning.yaml:
    content: |
      kafka:
        - name: kafka
          version: 1
          tags: [ 'kafka', 'dev' ]
          configuration:
            metricsType:
              value: JMX
            metricsPort:
              value: 9581
            kafkaBootstrapServers:
              value: PLAINTEXT://[YOUR_BOOTSTRAP_BROKER:PORT] 
            protocol:
              value: PLAINTEXT
      lensesHq:
        - configuration:
            agentKey:
              value: ${LENSESHQ_AGENT_KEY}
            port:
              value: 10000
            server:
              value: lenses-hq
          name: lenses-hq
          tags: ['hq']
          version: 1

Replace [YOUR_BOOTSTRAP_BROKER:PORT] with the bootstrap brokers and ports for the Kafka cluster you want the Agent to connect to. See Provisioning for examples of different authentication types for Kafka, and for examples of adding other services such as Schema Registries and Kafka Connect.
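
As an illustration only, a SASL/PLAIN variant might look like the sketch below; the saslMechanism and saslJaasConfig key names are assumptions following the same configuration/value pattern, so check the Provisioning reference for the exact schema:

docker-compose.yaml
agent.provisioning.yaml:
    content: |
      kafka:
        - name: kafka
          version: 1
          tags: [ 'kafka' ]
          configuration:
            kafkaBootstrapServers:
              value: SASL_PLAINTEXT://[YOUR_BOOTSTRAP_BROKER:PORT]
            protocol:
              value: SASL_PLAINTEXT
            saslMechanism:   # assumed key name
              value: PLAIN
            saslJaasConfig:  # assumed key name
              value: org.apache.kafka.common.security.plain.PlainLoginModule required username="[YOUR_USER]" password="[YOUR_PASSWORD]";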

Start the Agent

To start the Agent, run the following Docker command:

For non-dev environments, install the Agent as close as possible to your Kafka clusters and automate the installation.

terminal
docker-compose up agent
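
To follow the Agent's startup and its connection to HQ, tail its logs:

terminal
docker-compose logs -f agent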

Once the agent fully starts, it will report as connected in HQ, allowing you to explore your Kafka environments.

