If you are upgrading from an older version, make sure to check the upgrade notes.
Your data is much more than its content. With this release, Lenses brings a new unified experience for discovering and exploring data across multiple data stores. Beyond listing Kafka topics or Elasticsearch indices, you can now search for keywords across data names and descriptions, schema fields and their descriptions, and applications.
The new SQL streaming support for Apache Kafka has been in the making for more than a year. The release unifies data exploration and stream processing by providing the same syntax and function set across the two SQL engine modes: Streaming and Snapshot.
With this release, Lenses makes it much easier to process your Apache Kafka streaming data with SQL. The engine is still built on Apache Kafka Streams, and it offers many new capabilities compared with the previous version:
- conversion between storage formats such as JSON and AVRO
- using a message field as the message event time
- support for User Defined Functions (UDFs) and User Defined Aggregate Functions (UDAFs)
- support for non-equi joins
- simplified syntax for describing time windows
- dedicated SQL syntax to distinguish between stateful and stateless processing
- support for multiple inserts in one statement
- support for field unwrapping
- support for projecting nested payloads
- support for re-keying a topic based on a field from the message key or value content
- enhanced type checking for projections and functions, avoiding runtime errors
- stricter schema requirements for topics; for example, a JSON topic requires a schema before it can be processed
- fine-grained control over the output topics
- the same function set as the Snapshot engine
- control over the application's underlying Kafka Streams identifier
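To give a flavour of the capabilities above, here is a minimal sketch of a streaming processor that aggregates over a tumbling time window and uses a payload field as the event time. The topic and field names (`purchases`, `purchase_ts`, `product_id`) are hypothetical, and the exact syntax should be checked against the Lenses SQL documentation for your version:

```sql
-- Count purchases per product over a 1-minute tumbling window,
-- using the purchase_ts payload field as the record's event time.
-- Topic and field names are illustrative.
INSERT INTO purchases_per_minute
SELECT STREAM
    product_id,
    COUNT(*) AS purchase_count
FROM purchases
EVENTTIME BY purchase_ts
WINDOW BY TUMBLE 1m
GROUP BY product_id;
```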
Lenses SQL processors, the Kafka Streams applications built with Lenses SQL, can be deployed:
- within the Lenses process (aimed at low-volume data)
You can read more about the SQL engine.
Navigate to the help center for tutorials on how to use the Lenses SQL Streaming for Apache Kafka.
Data availability is paramount for critical business processes. With this new functionality, you can monitor and alert on data volume metrics. Whenever there is a spike or a drop in data, Lenses can raise an alert.
Previous releases added support for registering your applications with Lenses and including them in the data lineage topology. Starting with this release, Lenses can also monitor the health status of your HTTP-registered applications.
The open-source connector brings a new level of reliable streaming ETL integration with S3. The sink supports storing these formats: Avro, JSON, Parquet, CSV (including headers), and Bytes. Archiving Apache Kafka data is a common use case. Apart from storing the Kafka records optimally (avoiding small files), the sink supports partitioning the data by record field(s), the key, key field(s), or headers. To round the functionality up, it can also partition by a combination of these.
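As a sketch, the sink is configured with KCQL, the connector's SQL-like configuration syntax. The bucket, prefix, topic, and field names below are hypothetical, and the exact option names should be verified against the connector documentation:

```sql
-- Archive the payments topic to S3 as Parquet,
-- partitioning objects by the region field of each record.
-- Bucket, prefix, topic, and field names are illustrative.
INSERT INTO my-bucket:payments-archive
SELECT * FROM payments
STOREAS `Parquet`
PARTITIONBY region
```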
You can read more about the S3 connectors here: sink.
There should be no compromise when it comes to security. Kafka Connect provides a way to plug in a secret provider, and with this release we announce open-source support for Connect Secret Providers integrating with:
- AWS Secret Manager
- Azure Key Vault
- Environment variables
- Hashicorp Vault
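In practice, a secret provider is registered in the Connect worker configuration and then referenced from connector properties, so credentials never appear in plain text. The sketch below assumes the AWS Secret Manager provider; the provider class name, secret path, and key are illustrative and should be checked against the secret provider project's documentation:

```properties
# Worker configuration: register the AWS secret provider
# (class name and parameters are assumptions based on the open-source plugin).
config.providers=aws
config.providers.aws.class=io.lenses.connect.secrets.providers.AWSSecretProvider

# Connector configuration: resolve the secret at runtime instead of
# embedding it in plain text (secret path and key are hypothetical).
connect.s3.aws.secret.key=${aws:my/secret/path:secret-key}
```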
You can read more about Kafka connectors and secrets.
- Improved the Lenses upgrade process.
- The license can be updated either via the `PUT /api/v1/license` API or a CLI command.
- Fixed log viewing for SQL processors when running in Kubernetes.
- SQL IntelliSense hints and validation.
- New SQL functions.
- Added support for Data SLA alerts.
- Added support for updating the license.