Features Overview

Lenses is the Streaming Data Management Platform for Apache Kafka. It delivers the core elements of Apache Kafka along with a web user interface and rich enterprise-ready capabilities. This leads to an improved user experience for engineering teams, business users, data scientists, and administrators while using Apache Kafka.

Flexibility: View and store any type of data, and manipulate it using batch processing or interactive SQL.
Integration: Connect to popular data stores using one of the 100+ open-source Kafka connectors.
Security: Process and control sensitive data with SSL, LDAP, and Kerberos support.
Scalability: Provision SQL Processors and scale them to suit your requirements with Kubernetes and other scalable modes.
High Availability: Perform real-time operational business tasks with confidence.
Monitoring: Get infrastructure and application topology KPIs.
Alerting: Receive notifications by phone, email, or Slack.
Compatibility: Leverage your existing IT infrastructure.
Auditing: Role-based access and auditing for compliance with regulations.

Explore data in motion

The platform provides a rich Web Interface allowing you to explore Kafka topics in real-time or browse, search and filter historical data with full access to partition/offset/timestamp information.
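As a sketch of what such a browsing query looks like in Lenses SQL (the topic name, the payload fields, and the exact metadata field names are illustrative and may vary between Lenses versions):

```sql
-- Browse recent records of a topic with partition/offset/timestamp detail,
-- filtering on a payload field (topic and field names are illustrative)
SELECT _partition, _offset, _ts, vehicle_id, speed
FROM position_reports
WHERE speed > 100
LIMIT 50
```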


LSQL Processors

Create boundless data queries with LSQL processors to aggregate, join and/or transform Apache Kafka streams. Through the Web Interface, the topologies can be visualized, monitored or even scaled out for more throughput. Furthermore, the execution plan viewer allows you to optimize for performance.
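For instance, a continuous aggregation over a Kafka stream might be registered as a processor along these lines (topic names, fields, and the windowing syntax shown are illustrative of the general shape, not a definitive reference):

```sql
-- Continuously compute the average speed per vehicle over 1-minute tumbling
-- windows and write results to a target topic (all names are illustrative)
SET autocreate = true;

INSERT INTO average_speeds
SELECT vehicle_id, AVG(speed) AS avg_speed
FROM position_reports
GROUP BY TUMBLE(1, m), vehicle_id
```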


Stream Topologies

With Lenses, you can now build and operate complex streaming topologies with confidence: ETL data pipelines, stream processing, and analytics that combine multiple Connectors, Processors, and Topics with full data lineage.

The platform provides more than just the data flow graph of individual LSQL processors. It can also display a global, high-level view of entire data pipelines in one interactive graph containing topics, connectors, and processors.


AVRO Support

Full support for AVRO messages, including decimal type (for financial institutions), is available in Lenses SQL Engine. Lenses platform integrates with your schema registry and provides a rich user interface to create, edit and track schemas.
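For reference, an Avro schema using the decimal logical type looks like this (the record and field names are illustrative; the logical type itself is defined by the Avro specification):

```json
{
  "type": "record",
  "name": "Payment",
  "namespace": "com.example",
  "fields": [
    { "name": "id", "type": "string" },
    {
      "name": "amount",
      "type": { "type": "bytes", "logicalType": "decimal", "precision": 12, "scale": 2 }
    }
  ]
}
```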


Kafka Connect

With Lenses, you can manage multiple Connect clusters and build streaming ETL data pipelines with ease, while monitoring connectors and their tasks. Lenses contains the largest collection of Apache Kafka connectors with Lenses SQL support, covering all major data sources and sinks, including Cassandra, Elasticsearch, InfluxDB, Azure CosmosDB, MQTT, JMS, and more.


See how British Gas streams 4+ billion entries every day into Elasticsearch with a simple SQL statement:

# Upsert into elastic with auto create and index suffix
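The full statement is not reproduced here; as a sketch, a KCQL upsert for the Elasticsearch sink generally takes this shape (the index, topic, and key names are assumptions, not the actual British Gas configuration):

```sql
-- Illustrative KCQL: upsert into an auto-created Elasticsearch index
-- with a date-based index suffix (all names are assumptions)
UPSERT INTO meter-readings
SELECT * FROM meter-readings-topic
PK meter_id
AUTOCREATE
WITHINDEXSUFFIX=_{YYYY-MM-dd}
```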

Kafka Connect Metrics

With Kafka 1.0 and above, additional metrics are supported and available for your Connect clusters.


Consumer Lag

Monitor Kafka consumer lag in real time and manage the consumer offsets. Set up alerts to make sure you can scale or take the appropriate action.


Monitor Services

Monitor the core services and infrastructure via JMX for Kafka Brokers, Zookeeper, Schema Registry and Connect. Prevent and react to potential issues and get insights into your clusters’ performance.

(Screenshots: Lenses monitoring dashboard and offline-partitions view.)


Alerting

Set up alerts to catch potential issues early.



Auditing

Track all changes to your Kafka cluster: topic creation, configuration modification, and deletion (for Schemas, Connectors, and Processors). A single point of reference for who did what and when across your data streaming pipeline, with LDAP support.



Security

Basic and LDAP authentication are supported. Additionally, different operator roles restrict user actions, giving operators a way to secure Kafka. Users can manage topic ACLs (Access Control Lists) to restrict who can read and write data.


Lenses SQL Engine

Lenses SQL is a multi-purpose engine: it fully supports AVRO and JSON payload types and can execute both batch and real-time streaming SQL over Apache Kafka.
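The same SELECT logic can be run interactively against data already in a topic, or wrapped in an INSERT INTO to become a continuous streaming processor. A sketch of the two modes (topic and field names are illustrative):

```sql
-- Interactive (batch) query over existing topic data
SELECT customer_id, amount
FROM payments
WHERE amount > 100
LIMIT 10;

-- The same filter as a continuous streaming processor writing to a new topic
INSERT INTO large_payments
SELECT customer_id, amount
FROM payments
WHERE amount > 100
```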