Viewing topic data

When navigating to a Kafka topic, you can query existing data and the live stream. Lenses immediately presents a snapshot of the latest data in table browsing mode, allowing you to:

  1. Filter by timestamp, offset, or partition.
  2. Drill into the data, expanding the key and value payloads and any nested fields.
  3. Work with any data type to query, insert, and delete records.

Preview and quick filter 

When navigating to a topic, you are automatically presented with the data preview:

Topic data tab

Records can be inspected via the quick filter or the SQL editor. The result set is available in two formats: Tree and Grid view. You can switch views using the right-hand icons.

Topic data flat

Use the Download button to export a JSON representation of the messages in the result set.

Quick filtering

The quick filter provides an interactive way to filter data. Available filters include partition, start offset or timestamp, and a limit on the number of records returned.

Topic data filter
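The same constraints can also be expressed in Lenses SQL. The sketch below assumes a topic named `payments` and that record metadata is exposed under the `_meta` prefix; check the Lenses SQL documentation for the metadata fields available in your version:

```sql
-- Hedged sketch: restrict a snapshot query by partition and offset.
-- `payments` and the `_meta` fields are illustrative assumptions.
SELECT *
FROM payments
WHERE _meta.partition = 0
  AND _meta.offset >= 100
LIMIT 50;
```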

Drill down to message

From the result set, you can drill down to a message to view its details:

Topic message view button
Topic message view

Live stream 

Switch to live continuous query mode by selecting Live Sample.

Topic data live

Query with Lenses SQL 

To access the data in Kafka Topics, you can run Snapshot queries in the SQL Studio.

Topic data filter 2

We recommend going through the Lenses SQL documentation to follow the syntax guidelines and understand the Kafka semantics when querying data.
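As an illustration, a snapshot query in SQL Studio might look like the following; the topic name `payments` and the field names are assumptions, not part of any fixed schema:

```sql
-- Hedged sketch of a snapshot query over existing topic data.
SELECT _key, amount, currency
FROM payments
WHERE currency = 'EUR'
LIMIT 100;
```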

lensesio sql studio

See Tips for querying Kafka Topics.

Continuously process the data 

In SQL Studio you can fire queries on demand against data already stored in a topic. To continuously process data and react to events in real time, use SQL Processors instead. SQL Processors are powered by the Lenses SQL Streaming engine and create an application component that continuously reacts to events. You can transform, filter, enrich, or join the data with streaming semantics, use your own UDFs, and store the output in a new topic. Under the hood, processors leverage the Kafka Streams API to process the data.
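A minimal SQL Processor might look like the sketch below. The topic names `payments` and `large_payments`, the field names, and the threshold are illustrative; the `SET defaults.topic.autocreate` option is shown as commonly documented for auto-creating the output topic:

```sql
-- Hedged sketch of a continuous query: filter a stream of payments
-- and write matching records to a new topic.
SET defaults.topic.autocreate = true;

INSERT INTO large_payments
SELECT STREAM amount, currency
FROM payments
WHERE amount > 1000;
```

Unlike a snapshot query, this processor keeps running: every new record arriving on the source topic is evaluated and, if it matches, written to the output topic.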

Last modified: July 17, 2024