Streaming SQL

Stream processing is fully supported by the Streaming mode of the Lenses SQL engine.

SQL Processors offer:

  • A no-code, stand-alone application, executing a given Lenses SQL query on current and future data
  • Query graph visualisation
  • Fully integrated experience within Lenses
  • ACLs and Monitoring functionality out-of-the-box
  • Ability to scale up and down workflows via Kubernetes

Streaming queries read from Kafka topics, select and calculate new fields on the fly, aggregate data based on specific fields and windows, and finally write the output to the desired Kafka topics. The query runs continuously.

Getting started 

What queries look like 

Below are two Lenses SQL queries that will be used as examples for the rest of this section.

Here is the first query:

SET defaults.topic.autocreate=true;

INSERT INTO daily-item-purchases-stats
SELECT STREAM
    itemId
    , COUNT(*) AS dailyPurchases
    , AVG(price / quantity) AS average_per_unit
FROM purchases
GROUP BY itemId;
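Since the output topic collects daily statistics, the same aggregation can also be scoped to a time window, so each item gets one aggregate per day rather than a single running total. A hedged sketch of such a variant, assuming Lenses SQL's `WINDOW BY TUMBLE` syntax and a 1-day duration (the window clause is an assumption, not part of the example above):

```sql
SET defaults.topic.autocreate=true;

INSERT INTO daily-item-purchases-stats
SELECT STREAM
    itemId
    , COUNT(*) AS dailyPurchases
    , AVG(price / quantity) AS average_per_unit
FROM purchases
WINDOW BY TUMBLE 1d  -- assumed syntax: one aggregate per item per day
GROUP BY itemId;
```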

Here is the second one:

SET defaults.topic.autocreate=true;
SET auto.offset.reset='earliest';

WITH countriesStream AS (
  SELECT STREAM *
  FROM countries
);

WITH merchantsStream AS (
  SELECT STREAM *
  FROM merchants
);

WITH merchantsWithCountryInfoStream AS (
  SELECT STREAM
    m._key AS l_key
    , CONCAT(surname, ', ', name) AS fullname
    , country
    , language
    , platform
  FROM merchantsStream AS m JOIN countriesStream AS c
        ON m.country = c._key
);

WITH merchantsCorrectKey AS (
  SELECT STREAM
    l_key AS _key
    , fullname
    , country
    , language
    , platform
  FROM merchantsWithCountryInfoStream
);

INSERT INTO currentMerchants
SELECT STREAM *
FROM merchantsCorrectKey;

INSERT INTO merchantsPerPlatform
SELECT STREAM
  COUNT(*) AS merchants
FROM merchantsCorrectKey
GROUP BY platform;

Details about the features used in the above queries can be found in Stream, Table, Projections and Aggregations.
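As a quick orientation before following those links: `SELECT STREAM` treats a topic as an unbounded sequence of events, whereas `SELECT TABLE` treats it as a changelog where only the latest value per key matters. A minimal hedged sketch of the contrast (the output topic names are illustrative, not from the examples above):

```sql
-- Stream semantics: every purchase event flows through to the output.
INSERT INTO all-purchase-events
SELECT STREAM *
FROM purchases;

-- Table semantics: the topic is read as a changelog,
-- keeping the latest record per key.
INSERT INTO latest-merchant-state
SELECT TABLE *
FROM merchants;
```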

Lenses SQL Streaming mode allows for streaming queries that read from Kafka topics (e.g. merchants and purchases), select and calculate new fields on the fly (e.g. fullname, and platform), aggregate data based on specific fields and windows, and finally write the output to the desired Kafka topics (e.g. currentMerchants, merchantsPerPlatform and daily-item-purchases-stats).

Queries are their own applications: SQL Processors 

As mentioned above, queries that are meant to be run on streaming data are treated by Lenses, via Lenses SQL Streaming, as stand-alone applications.

These applications, in the context of the Lenses platform, are referred to as SQL Processors.

An SQL Processor encapsulates a specific Lenses SQL query, its details and everything else Lenses needs to be able to run the query continuously.

M-N topologies 

The UI can visualise any SQL Processor out of the box. For the second example query above, the following will be shown:

Multiple Kafka topic topology and SQL

This visualisation helps to highlight that Lenses SQL fully supports M-N topologies.

This means that multiple input topics can be read at the same time, their data manipulated in different ways, and the corresponding results stored in several output topics, all as part of the same Processor's topology.

All processing can therefore be done in one go, without having to split parts of a topology across different Processors (which could result in more data being stored and shuffled by Kafka).
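The pattern is visible in miniature below: one intermediate result is computed once in-process and fanned out to two output topics by the same Processor, with no extra Kafka topic needed to hold the intermediate data. A hedged sketch (topic and alias names are illustrative):

```sql
SET defaults.topic.autocreate=true;

-- One shared intermediate, computed once inside the Processor...
WITH enriched AS (
  SELECT STREAM *
  FROM input-events
);

-- ...fanned out to two output topics by the same Processor.
INSERT INTO output-a
SELECT STREAM *
FROM enriched;

INSERT INTO output-b
SELECT STREAM *
FROM enriched;
```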

Part of the Lenses platform 

SQL Processors are fully integrated into the Lenses platform. This means that all the management and monitoring tools that Lenses offers can be used with SQL Processors out-of-the-box.


Each SQL Processor exposes metrics that Lenses picks up out-of-the-box. These are visible from the UI and allow the user to see how a processor is performing.

Kafka SQL monitoring