Data applications & integration


In Lenses, you can create and monitor your streaming applications. You get an overview of your application ecosystem: Kafka Connectors, SQL Processors, discovered consumers, and self-registered applications, together with a generated topology of your flows that shows how your apps interact with the data. You can also monitor and manage the underlying Kafka consumer groups.

Apps permissions 

For every application, Lenses offers operations such as create, delete, scale, and other management actions, all governed by the permission system. Application permissions are scoped by the namespaces of your groups.

Types of streaming apps 

SQL Processors 


SQL Processors continuously process data in Kafka topics. They use Lenses SQL streaming syntax to create and deploy applications that process continuous streams of data. With the Lenses application deployment framework, you get a seamless way to create scalable processing components that filter, aggregate, join, and transform Kafka topics, and that scale natively on Kubernetes or other deployment targets. It is a simple way to start building stream processing flows on Kafka immediately, without worrying about the development and deployment lifecycle, while still expressing your stream processing logic declaratively. To create a SQL Processor you use Lenses SQL streaming syntax; if this is your first time using it, it is also worth getting familiar with streaming concepts and semantics.
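
For illustration, here is a minimal sketch of what deploying such a processor could look like. The SQL statement uses standard Lenses SQL streaming syntax to filter one topic into another; the topic names are made up, and the REST endpoint, payload fields, and token in the Python wrapper are assumptions for illustration, not the documented Lenses API.

```python
# Minimal sketch: deploying a SQL Processor programmatically.
# ASSUMPTIONS: the endpoint path, payload fields, and token below are
# hypothetical placeholders -- check the Lenses API reference for the
# actual contract. The SQL itself is standard Lenses SQL streaming syntax.
import requests

LENSES_URL = "http://localhost:3030"  # hypothetical Lenses host
TOKEN = "changeme"                    # hypothetical service token

# Continuous query: filter the 'orders' stream into a new topic.
PROCESSOR_SQL = """
SET defaults.topic.autocreate=true;

INSERT INTO orders_large
SELECT STREAM customer_id, total
FROM orders
WHERE total > 100;
"""

resp = requests.post(
    f"{LENSES_URL}/api/v1/streams",   # hypothetical endpoint
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"name": "orders-large", "sql": PROCESSOR_SQL, "runners": 1},
)
resp.raise_for_status()
```

The same statement can equally be entered in the SQL Processor screen in the Lenses UI; the point is that the processing logic lives in one declarative statement, while scaling is a matter of the runner count.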

Kafka Connectors 

Connectors are reusable components that bring data into and out of a Kafka cluster. A variety of connectors is available in the community, licensed or open source, ready to be used to integrate data streams. At Lenses we have developed a collection of Apache 2.0 open-source connectors which you can make available in your clusters, and you can also integrate any other off-the-shelf or custom connector. You need a Connect cluster with the relevant connector plugins installed so you can start instantiating them and get data flowing, as sketched below. You can configure multiple Connect clusters, and if you already have connectors running, Lenses will pick them up for monitoring and render a visual topology of the interacting topics, as long as they follow standard configurations.
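
As a sketch of that flow, the example below registers a connector instance through a Connect worker's standard REST API (POST /connectors with a name and a config). The worker URL, connector name, topic, and file path are assumptions; the FileStreamSinkConnector class ships with Apache Kafka, and any other plugin class installed on your Connect cluster could be used in its place.

```python
# Minimal sketch: instantiating a connector via the Kafka Connect REST API.
# ASSUMPTIONS: the worker URL, connector name, topic, and file path are
# made up for illustration.
import requests

CONNECT_URL = "http://localhost:8083"  # hypothetical Connect worker

connector = {
    "name": "orders-file-sink",
    "config": {
        # FileStreamSinkConnector ships with Apache Kafka; substitute any
        # connector plugin class installed on your Connect cluster.
        "connector.class": "org.apache.kafka.connect.file.FileStreamSinkConnector",
        "tasks.max": "1",
        "topics": "orders",
        "file": "/tmp/orders.out",
    },
}

resp = requests.post(f"{CONNECT_URL}/connectors", json=connector)
resp.raise_for_status()
print(resp.json())
```

Once the connector is running with standard configurations, Lenses picks it up for monitoring and places it in the topology alongside the topics it reads from or writes to.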

External Apps 

If you have a custom microservice or any other streaming application producing, consuming, or processing data in Kafka, you can self-register it so that you can catalog, monitor, and visualize it in your topology. There are two ways to achieve this in Lenses:

  • If you have a JVM-based application, you can use our JVM client to instrument the app;
  • or you can use a REST endpoint to register metadata, runners, lineage information, etc., for example from your CI/CD or automation scripts (see the sketch after this list).
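
A minimal sketch of that REST-based registration, for example as a step in a CI/CD pipeline, could look like the following. The endpoint path, payload fields, application name, and token are all hypothetical placeholders; consult the Lenses API documentation for the actual registration contract.

```python
# Minimal sketch: self-registering an external app with Lenses from CI/CD.
# ASSUMPTIONS: the endpoint path and every payload field below are
# hypothetical placeholders, not the documented Lenses API.
import requests

LENSES_URL = "http://localhost:3030"  # hypothetical Lenses host
TOKEN = "changeme"                    # hypothetical service token

app = {
    "name": "payments-enricher",                  # made-up app name
    "metadata": {"owner": "payments-team", "version": "1.4.2"},
    "input": [{"topic": "payments_raw"}],         # lineage: topics it reads
    "output": [{"topic": "payments_enriched"}],   # lineage: topics it writes
    "runners": [{"host": "worker-1", "state": "RUNNING"}],
}

resp = requests.post(
    f"{LENSES_URL}/api/v1/apps/external",         # hypothetical endpoint
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=app,
)
resp.raise_for_status()
```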