In this guide you will learn about SQL Processor apps in Lenses: how to use them to quickly create streaming flows with SQL streaming, and how to deploy and scale them.
For a full reference see the Lenses SQL section.
SQL Processors continuously process data in Kafka topics. They use Lenses SQL streaming syntax to create and deploy applications that process continuous streams of data.
Under the hood, the Kafka Streams API is used; combined with the internal Lenses application deployment framework, it gives you a seamless experience for creating scalable processing components that
filter, aggregate, join, and transform Kafka topics, and that scale natively on Kubernetes.
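As a minimal sketch of what such a processor looks like (the topic and field names `sensor_readings`, `high_temp_readings`, `sensorId`, and `temperature` are hypothetical examples, not part of any fixed schema):

```sql
-- Auto-create the target topic if it does not already exist
SET defaults.topic.autocreate=true;

-- Continuously read from sensor_readings and write matching
-- records to high_temp_readings (hypothetical topic names)
INSERT INTO high_temp_readings
SELECT STREAM
    sensorId,
    temperature
FROM sensor_readings
WHERE temperature > 30;
```

Once deployed, the processor keeps running, filtering every new record that arrives on the source topic.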
You can only process data within your data namespaces. If a topic is outside your namespace, you will not be able to create a processor that reads from or writes to that topic.
To create a SQL Processor you will need to use Lenses SQL Streaming syntax.
If this is your first time using it, it is also recommended to familiarise yourself with streaming concepts and semantics.
Streaming data are endless, unbounded datasets, so some of the concepts you may have worked with before will not behave the same way.
If you are already familiar with SQL syntax, you should be able to quickly pick up the concepts and start processing data in real-time!
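For example, because a stream never ends, aggregations are typically scoped to time windows rather than computed over the whole dataset. A sketch of a windowed aggregation, using the same hypothetical topic and field names as above (check the Lenses SQL documentation for the exact windowing syntax supported by your version):

```sql
-- Average temperature per sensor over one-minute tumbling windows
-- (avg_temperature_per_minute is a hypothetical target topic)
INSERT INTO avg_temperature_per_minute
SELECT STREAM
    sensorId,
    AVG(temperature) AS avgTemperature
FROM sensor_readings
WINDOW BY TUMBLE 1m
GROUP BY sensorId;
```

Each window emits its own result, so the output topic receives a continuous series of per-minute averages instead of a single final value.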
You can also extend the capabilities of SQL streaming with UDFs to use as part of your processor.
Find out more about UDFs
SQL Processors are also supported by the CLI to enable automation scenarios.
CLI - API
Is Lenses SQL the same as KSQL?
No, it’s a different syntax and different technology.
Make sure you follow Lenses SQL documentation when creating a SQL Processor in Lenses.
Can I edit a SQL Processor?
You cannot edit an existing processor, but you can re-create it with the same processor ID. However, if you change the SQL significantly, this may affect the results.
Is there a way to automatically migrate to 4x processors?
No. Processors created with older versions will still be visible in a newer version of Lenses and will allow some basic operations; however, you will need to follow the SQL documentation to migrate them to the new syntax manually.
Can I see historic metrics for my processor?
No, currently you can only get instant metrics from the UI or the API.
Can I get an alert if my processor or its runners are failing?
Currently you cannot get a processor alert. However, if you are using Kafka Connect, you can enable connector alerts. You can also add custom alerts for consumer lag or topic data produced (see alerts).