Hands-On Walk Through of Community Edition

A simple walk-through to introduce you to the Lenses 6 user interface.


After you've run your docker compose command, you can access Lenses running locally at http://localhost:9991. CE will ask you to log in:

User: admin
Pass: admin

The very first time you log in, Lenses will ask you for your authorization code. This is easy to set up: just follow the link from the login screen:

That link will take you to the setup page where you can enter your email address.

Click Sign in and Lenses will send you an email with your access code - be sure to check your junk folder if it doesn't arrive. Once you have your login code, go back to the http://localhost:9991 login URL and provide the code. You can continue to use that same code whenever you spin up Lenses Community Edition.

The very first time you log in to Lenses CE you will also see our first-start help screen. There is a video to watch, as well as links to these docs, our AskMarios user forum, our YouTube channel, and our Community Slack.

Click on Let's Start to access the Lenses UI. The first view you'll see is the Environments view, where Lenses displays all of your connected Kafka environments. An environment can include the Kafka clusters themselves, Kafka Connect, Schema Registry, connectors, consumers, even the Kubernetes clusters it's all running in. Environments mean your entire Kafka ecosystem, not just the clusters. For our demo setup we only have one environment connected, but you can have up to two at no charge with Community Edition.

Click on the link below the Environments view to switch to the Topics view. Here you'll see all the topics in your connected environments. We are currently logged in as Admin, so we can see all the topics in our environments. If we were logged in with a more restricted role, we might only see the topics we have permission to view.

Use the bottom scroll bar to scroll to the right so you can see further information about each topic.

You can see what type of schema each topic uses, how many partitions it has, and much more.

The Topics view is fully searchable. So, for example, if we wanted to build a "Customer Location View" for our web page using Kafka data, we could search for a keyword like latitude here and see which topics include location data. Let's do a search for "latitude" in the Topics view and see what comes up:

Three topics appear to have data about latitude, but let's dig a bit deeper. Tick the "Search in Schema" checkbox to get Lenses to display the actual names of the keys in the schema.

This will surface the actual schema names that match your search.
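If you prefer working from a query prompt, you can also inspect a topic's shape directly from SQL Studio. A minimal sketch, assuming the SHOW TABLES and DESCRIBE TABLE statements from the Lenses SQL reference:

-- List the topics visible to your account
SHOW TABLES;

-- Print the field names and types for one topic, so you can confirm
-- which topics really carry latitude/longitude keys
DESCRIBE TABLE nyc_yellow_taxi_trip_data;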

Based on what we've discovered, it seems like the nyc_yellow_taxi_trip_data topic might be useful for our theoretical project. Let's use Lenses to dive a bit deeper into that topic and view the actual data flowing through it using SQL Studio. To get to SQL Studio from this view, simply hover your mouse over the nyc_yellow_taxi_trip_data topic. That will cause the interactive controls to appear. Click on the SQL shortcut when it pops up:

Clicking that button automatically opens that topic in SQL Studio. You can now interact directly with the data flowing through the topic using SQL statements. Note that when you first access SQL Studio it appears with both side "drawers" open. You can click the drawer close icons on either side to make more room to work directly with your data and SQL; you can reopen them as needed later on, but this gives you the whole screen to view and work with your data.
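When the topic opens, SQL Studio pre-fills a starter query for you. Exact wording aside, it amounts to something along these lines, which you can run as-is:

SELECT *
FROM nyc_yellow_taxi_trip_data
LIMIT 100;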

Toggle your view from Grid to List. Now you have your data in a more JSON-style format. Expand the JSON to view the individual key/value pairs. Across the top you'll see the metadata for each event: Partition, Offset, and Timestamp. Below, you can examine the key/value pairs. As you can see, we've got plenty of longitude and latitude data to work with for our customer location visualization.
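That metadata is queryable too. A small sketch, assuming the _meta fields described on the Metadata fields page of the SQL reference:

-- Project Kafka metadata alongside the payload fields
SELECT _meta.partition, _meta.offset, _meta.timestamp,
       VendorID, fare_amount
FROM nyc_yellow_taxi_trip_data
LIMIT 10;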

Now let's move on from data discovery to troubleshooting. Using the same taxi data topic, we can troubleshoot a "live" problem. Several drivers are reporting errors with credit card transactions in the last 15 minutes. Let's use SQL Studio to examine taxi transactions from the last 15 minutes with a SQL search:

SELECT VendorID, fare_amount, payment_type
FROM nyc_yellow_taxi_trip_data

Copy that text and paste it into the SQL box in SQL Studio. Then, from the Time Range picker, select the last 15 minutes to set your time frame and hit Run.
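The Time Range picker is just a convenience; you can express the same window in the query itself. A sketch based on the Filter by timestamp or offset page of the SQL reference (the exact duration literal is an assumption, so double-check it there):

SELECT VendorID, fare_amount, payment_type
FROM nyc_yellow_taxi_trip_data
WHERE _meta.timestamp > NOW() - "15m";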

Next up, let's clean up our view so the data is a bit easier to see. Go to the Columns button and get rid of the timestamp, partition, and offset columns. Now we just have our VendorID, fare_amount, and payment_type. Assuming payment_type = 1 means the customer paid cash and payment_type = 2 means card, scroll down and notice that both types of payments seem to be going through. Maybe the problem is with a particular driver. Let's filter our results on VendorID. Select the Filters button and create a filter to show just VendorID = 1.

Toggle the filter back and forth between VendorID = 1 and 2 and see that transactions of both types seem to be flowing through. So perhaps the drivers' reported problem is not here; maybe it's a wireless connectivity issue? We could check our wireless telecom topic to further troubleshoot this theoretical issue. We have a detailed guide to Lenses SQL and using SQL Studio in our docs here: https://docs.lenses.io/latest/user-guide/sql-studio
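The Filters button builds the equivalent of a SQL WHERE clause. If you'd rather type it, a sketch of the same filter (swap the value to 2 to toggle):

SELECT VendorID, fare_amount, payment_type
FROM nyc_yellow_taxi_trip_data
WHERE VendorID = 1;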

But for now, let's move on to other Lenses features. Let's switch back to the Environments view. Hover your mouse over our environment and some controls should appear. Click on the big arrow that appears to drill down into the specifics for this environment.

Now we're on the details page of this specific environment. We can quickly see the health of all the components of the Kafka ecosystem. We can use any of the specific views on the left side, or drill down more interactively from the details that appear on the main dashboard. Take a moment to look around at all the stats and data presented on this page before we move on.

On the left-hand side, switch to the Topics view and select the backblaze_smart topic. That will open the Topic view. Here we can see examples of the data, but we can also view much more detailed information about the topic. Be sure to click on the button to close the right side drawer to free up some screen space. Take a moment to toggle through the different topic views listed across the top, but then come back to the Data view.

Coming back to the Data view, you'll notice that we have a serial_number field displayed. This field is tied to registered owners and can be considered personally identifiable data. Luckily, Lenses has the capability to block the view of this sensitive data. We need to set up a Data Policy to block it. Make a note of the name of the field we want to obscure: serial_number.

Click on the Policy view on the left-hand side and click on New Policy. Then fill out the form:

Name: serial-number-blocker

Redaction: last 3 (this means we'll mask out everything but the last 3 characters of the serial number)

Category: private_info (note that after you type this in, you'll need to hit Enter to make it stick)

Impact Type: medium

Affected Datasets: don't change

Add Fields: serial_number (you'll need to hit Enter here as well to make it stick)

Once you're done, it should look like this:

Then click "Create New Policy".

Now you'll see your new policy in the list. You can go back to the Topics page, click on the backblaze_smart topic again, and verify that the serial_number field has been obfuscated.

It should look like this:

Congrats, you've completed a basic introduction to Lenses 6. There's lots more to learn and more features to use. Look for more tutorials to come soon.
