Hands-On Walk Through of Community Edition
Simple walk through to introduce you to the Lenses 6 user interface.
After you've run your docker compose command you can access Lenses running locally at http://localhost:9991. CE will ask you to log in:
User: admin Pass: admin
The very first time you log in, Lenses will ask you for your authorization code. This is easy to set up; just follow the link from the login screen:
That link will take you to the setup page where you can enter your email address.
Click Sign in and Lenses will send you an email with your access code (be sure to check your junk folder if it doesn't arrive). Once you have your code, go back to the http://localhost:9991 login page and enter it. You can reuse that same code whenever you spin up Lenses Community Edition.
Once you've entered your code for the first time, Lenses CE shows a first-start help screen, which includes a video to watch as well as links to these docs and other resources. Click on Let's Start to access the Lenses UI. The first view you'll see is the Environments view. This is where Lenses displays all of your connected Kafka environments, which can include the Kafka clusters themselves, Kafka Connect, Schema Registry, connectors, consumers, even the Kubernetes clusters it's all running in. An environment means your entire Kafka ecosystem, not just the clusters. For our demo setup we only have one environment connected, but you can connect up to two at no charge with Community Edition.
Click on the link below the Environments view to switch to the Topics view. Here you'll see all the topics in your connected environments. We are currently logged in as admin, so we can see every topic. If we were logged in with a more restricted role we might only see the topics we have permission to view.
Use the bottom scroll bar to scroll to the right so you can see further information about each topic.
You can see what type of schema it uses, how many partitions it uses, and much more.
The Topics view is fully searchable. For example, if we wanted to build a "Customer Location View" for our web page using Kafka data, we could search for keywords like latitude or longitude to see which topics include location data. Let's do a search for "latitude" in the Topics view and see what comes up:
Three topics appear to have data about latitude, but let's dig a bit deeper. Tick the "Search in Schema" checkbox to have Lenses display the actual names of the keys in each schema.
This will surface the actual schema names that match your search.
Based on what we've discovered, it seems like the nyc_yellow_taxi_trip_data topic might be useful for our theoretical project. Let's use Lenses to dive a bit deeper into that topic and view the actual data flowing through it using SQL Studio. To get to SQL Studio from this view, simply hover your mouse over the nyc_yellow_taxi_trip_data topic so the interactive controls appear, then click on the SQL shortcut when it pops up:
Clicking that button automatically opens the topic in SQL Studio. You can now interact directly with the data flowing through that topic using SQL statements. Note that when you first access SQL Studio it appears with both side "drawers" open. You can click the drawer close icons on either side to make more room to work directly with your data and SQL.
You can reopen those drawers as needed later on, but closing them gives you the whole screen to view and work with your data. Toggle your view from Grid to List to see your data in a more JSON-style format. Expand the JSON to view the individual key/value pairs. Across the top you'll see the metadata for each event: partition, offset, and timestamp; below that you can examine the key/value pairs themselves. As you can see, we've got plenty of longitude and latitude data to work with for our customer location visualization.
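If you'd rather pull that data with an explicit query than rely on the pre-populated one, the query is very simple. A minimal sketch, assuming the default Lenses SQL snapshot syntax (the query Lenses generates for you may differ slightly):

```sql
-- Browse a sample of events from the taxi topic
SELECT *
FROM nyc_yellow_taxi_trip_data
LIMIT 100;
```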
Now let's move on from data discovery to troubleshooting. Using the same taxi data topic, we can troubleshoot a "live" problem: several drivers are reporting errors with credit card transactions in the last 15 minutes. Let's use SQL Studio and a SQL search to examine taxi transactions from that window:
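The exact query isn't critical here. A minimal sketch that pulls just the payment-related fields (field names as they appear in this topic's schema):

```sql
-- Pull the payment fields for recent taxi trips; the 15-minute window
-- is set with the Time Range picker rather than in the query itself
SELECT vendorID, payment_type, fare_amount
FROM nyc_yellow_taxi_trip_data;
```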
Copy the query and paste it into the SQL box in SQL Studio. Then, from the Time Range picker, select the last 15 minutes to set your time frame and hit Run.
Next, let's clean up our view so the data is a bit easier to read. Go to the Columns button and get rid of the timestamp, partition, and offset columns, leaving just vendorID, fare_amount, and payment_type. Assuming payment_type = 1 means the customer paid cash and payment_type = 2 means card, scroll down and notice that both types of payments seem to be going through. Maybe the problem is with a particular driver, so let's filter our results on vendorID. Select the Filters button and create a filter to show only vendorID = 1.
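If you prefer to stay in SQL rather than use the Filters panel, a WHERE clause gives the same effect. A sketch, using the same fields as above:

```sql
-- Show only vendor 1's transactions (change to vendorID = 2 to compare)
SELECT vendorID, payment_type, fare_amount
FROM nyc_yellow_taxi_trip_data
WHERE vendorID = 1;
```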
Toggle the filter back and forth between vendorID = 1 and vendorID = 2 and you'll see that transactions of both types seem to be flowing through. So perhaps the drivers' reported problem isn't here; maybe it's a wireless connectivity issue? We could check our wireless telecom topic to troubleshoot this theoretical issue further. There's a detailed guide to Lenses SQL and SQL Studio in our docs.

But for now, let's move on to other Lenses features. Switch back to the Environments view. Hover your mouse over our environment and some controls will appear. Click on the large arrow to drill down into the specifics for this environment.
Now we're on the details page for this specific environment. Here we can quickly see the health of all the components of the Kafka ecosystem. We can use any of the specific views on the left side, or drill down interactively from the details on the main dashboard. Take a moment to look around at all the stats and data presented on this page before we move on.
On the left-hand side, switch to the Topics view and select the backblaze_smart topic. That will open the topic view, where we can see samples of the data as well as much more detailed information about the topic. Be sure to click the button to close the right-side drawer to free up some screen space. Take a moment to toggle through the different topic views listed across the top, then come back to the Data view.
Back in the Data view you'll notice that a serial_number field is displayed. This field is tied to registered owners and can be considered personally identifiable data. Luckily, Lenses can block the display of this sensitive data; we just need to set up a Data Policy. Make a note of the name of the field we want to obscure: serial_number.
Click on the Policy view on the left-hand side and click New Policy. Then fill out the form:
Name: serial-number-blocker
Redaction: last 3 (this means we'll mask out everything but the last 3 characters of the number)
Category: private_info (note after you type this in you'll need to hit enter to make it stick)
Impact Type: medium
Affected Datasets: don't change
Add Fields: serial_number (you'll need to hit return here as well to make it stick)
Once you're done it should look like this:
Then click "Create New Policy"
Now you'll see your new policy in the list. You can go back to the Topics page, click on the backblaze_smart topic again, and verify that the serial_number field has been obfuscated.
It should look like this:
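You can also spot-check the policy from SQL Studio, assuming the redaction is applied to query results as well as the topic Data view. A quick sketch:

```sql
-- serial_number values should come back masked except for their last 3 characters
SELECT serial_number
FROM backblaze_smart
LIMIT 10;
```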
Congrats, you've completed a basic introduction to Lenses 6. There's lots more to learn and more features to explore. Look for more tutorials to come soon.