Lenses Box

[Image: Lenses Box]

Lenses Box is a Docker image that contains Lenses and a full installation of Kafka with all its relevant components. The image also packs Lenses’ Stream Reactor connector collection. Everything is pre-configured, and the only requirement is having Docker installed. It contains Lenses, a Kafka Broker, Schema Registry, Kafka Connect, 25+ Kafka connectors with SQL support, and various CLI tools.

Prerequisites

Before you get started, you will need to install Docker. The Lenses Development Environment is a Docker image that contains not only Lenses but also all the relevant services, pre-configured for your local development.

The container works even on a low-memory machine with 8GB of RAM. It takes about 30-45 seconds for the services to become available and a couple of minutes for Kafka Connect to load all the available connectors.

Don’t have Docker? Check installation instructions here.

Getting Started

Step 1: Get Lenses Development Environment

To run the Lenses Development Environment you will need your personalized token. Get it by registering at:

Lenses Development Environment

Step 2: Single command, Up and Running with Examples

docker run --rm -p 3030:3030 --name=lenses-dev -e EULA="https://dl.lenses.stream/d/?id=CHECK_YOUR_EMAIL_FOR_KEY" lensesio/box

Please note that the value of the --name parameter can be anything you want. Just remember to replace lenses-dev with your own value in the docker exec commands found on this page.

You can periodically check for new versions of Lenses:

docker pull lensesio/box

Note

Kafka Connect requires a few minutes to start up since it iterates and loads all the available connectors.
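
While you wait, you can follow the container logs to watch the startup progress; this assumes the container was named lenses-dev as in the command above:

docker logs -f lenses-dev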

Access Lenses UI

To access the Lenses web user interface, open your browser and navigate to http://localhost:3030. Log in with admin / admin and enjoy streaming!

[Image: Lenses Development Environment]

Access Command Line

To access the various Kafka command-line tools, such as the console producer and the console consumer, from a terminal inside the Docker container, execute the following command:

$ docker exec -it lenses-dev bash
root@fast-data-dev / $ kafka-topics --zookeeper localhost:2181 --list

Alternatively, you can directly execute a Kafka command such as kafka-topics as follows:

docker exec -it lenses-dev kafka-topics --zookeeper localhost:2181 --list

If you enter the container, you will discover that we even provide bash auto-completion for some of the tools!
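
As a quick sketch, assuming the broker listens on its default port 9092 inside the container, you can produce and consume a few test messages with the console tools (test-topic is a placeholder topic name):

docker exec -it lenses-dev kafka-console-producer --broker-list localhost:9092 --topic test-topic
docker exec -it lenses-dev kafka-console-consumer --bootstrap-server localhost:9092 --topic test-topic --from-beginning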

Kafka Cheat Sheet

Check our cheat sheet for interacting with Kafka from the command line:

https://github.com/lensesio/kafka-cheat-sheet

Access Kafka Broker

Making full use of the development environment means being able to access the Kafka instance inside the Docker container, and the way Docker and Kafka brokers work requires a few extra settings. A Kafka broker advertises the endpoint that accepts client connections, and this endpoint must be reachable by your client. On macOS or Windows, Docker runs inside a virtual machine, which means there is an extra networking layer in place.

If you run Docker on macOS or Windows, you may need to find the address of the VM running Docker and export it as the advertised address for the broker (on macOS it is usually 192.168.99.100). At the same time, you should give the lensesio/box container access to the VM’s network:

docker run -e EULA="CHECK_YOUR_EMAIL_FOR_KEY" \
           -e ADV_HOST="192.168.99.100" \
           --net=host --name=lenses-dev \
           lensesio/box
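
If you use docker-machine, one way to find the VM’s address is the docker-machine ip command (default is a placeholder for your machine name):

docker-machine ip default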

If you run on Linux you don’t have to set ADV_HOST, but you can do something cool with it: if you set it to your machine’s IP address, you will be able to access Kafka from any client in your network. If you decide to run lensesio/box in the cloud, you (and your whole team) will be able to access Kafka from your development machines. Just remember to provide the public IP of your server!

Persist Data

If you want your data to persist between runs of the Development Environment, give your container a name and do not set it to be removed automatically (drop the --rm flag). For example:

docker run -p 3030:3030 -e EULA="CHECK_YOUR_EMAIL_FOR_KEY" \
       --name=lenses-dev lensesio/box

When you want to free up resources, just press Control-C. Then you have two options: either remove the Development Environment:

docker rm lenses-dev

Or use it at a later time and continue from where you left off:

docker start -a lenses-dev

Custom Hostname

If you are using docker-machine, setting this up in a cloud environment, or your DOCKER_HOST is a custom IP address such as 192.168.99.100, you will need to use the parameters --net=host -e ADV_HOST=192.168.99.100.

docker run --rm --net=host -e ADV_HOST=192.168.99.100 -e EULA="https://dl.lenses.stream/d/?id=CHECK_YOUR_EMAIL_FOR_KEY" lensesio/box

Running Examples

Lenses provides a SQL engine for Apache Kafka that handles both batch and streaming queries. You can find more details about it here.
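
For example, a simple batch query to browse one of the sample topics might look like the following sketch; the topic name comes from the built-in data generators described below:

SELECT *
FROM `sea_vessel_position_reports`
LIMIT 100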

View Example Topics

The Docker container has been set up to create and produce data to a handful of Kafka topics. The producers (data generators) are enabled by default. To disable the examples, set the environment variable -e SAMPLEDATA=0 in the docker run command.
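
For instance, a run with the data generators disabled might look like this sketch (EULA value abbreviated as elsewhere on this page):

docker run --rm -p 3030:3030 -e SAMPLEDATA=0 \
       -e EULA="CHECK_YOUR_EMAIL_FOR_KEY" \
       --name=lenses-dev lensesio/box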

Run SQL Processors

To see the SQL processors in action, run the following Lenses SQL statements against the example topics. Each block defines a separate processor:

SET autocreate = true;
INSERT INTO position_reports_Accurate
SELECT * FROM `sea_vessel_position_reports`
WHERE Accuracy IS true

SET autocreate = true;
INSERT INTO position_reports_latitude_filter
SELECT Speed, Heading, Latitude, Longitude, Radio
FROM `sea_vessel_position_reports`
WHERE Latitude > 58

SET autocreate = true;
INSERT INTO position_reports_MMSI_large
SELECT *
FROM `sea_vessel_position_reports`
WHERE MMSI > 100000

Of course, there is support for JSON messages as well:

SET autocreate = true;
INSERT INTO backblaze_smart_result
SELECT (smart_1_normalized + smart_3_normalized) AS sum1_3_normalized, serial_number
FROM `backblaze_smart`
WHERE _key.serial_number LIKE 'Z%'

Note

If you use custom serdes, you can add your jar files under /opt/lenses/plugins.
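
For instance, a hedged way to make a custom serde jar available is to mount it into that directory when starting the container (my-serde.jar is a placeholder file name):

docker run --rm -p 3030:3030 -e EULA="CHECK_YOUR_EMAIL_FOR_KEY" \
       -v /path/to/my-serde.jar:/opt/lenses/plugins/my-serde.jar \
       --name=lenses-dev lensesio/box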

Topology

If you now visit the Topology page in the Lenses UI, you will be able to view your first data pipelines:

[Image: high-level topology view]

Lenses SQL processors translate to Kafka Streams applications, and Lenses SQL covers the entire Kafka Streams API functionality. Setting consumer, producer and streams properties, joining, and aggregating can all be seen in this example:

SET `autocreate`=true;
SET `auto.offset.reset`='earliest';
SET `commit.interval.ms`='120000';
SET `compression.type`='snappy';

INSERT INTO `cc_payments_fraud`
WITH tableCards AS (
  SELECT *
  FROM `cc_data` )

SELECT STREAM
  p.currency,
  sum(p.amount) as total,
  count(*) usage
FROM `cc_payments` AS p LEFT JOIN tableCards AS c ON p._key = c._key
WHERE c.blocked is true
GROUP BY tumble(1,m), p.currency

View the topology

The processor detail view displays the stream topology, so you can see how your Kafka Streams application works. By clicking on any node of the topology, you can review the node’s configuration or view the topic data. You can find more in-depth information on the above query here.

[Image: streaming SQL topology]

Connectors

The Docker image ships with 25+ Kafka connectors. You get one Kafka Connect worker with all the connectors pre-set up on the classpath, ready to use. Follow the instructions for each connector to set up its configuration, then launch it via the Lenses UI or the Kafka Connect REST endpoints.

[Image: Kafka connectors in Lenses]
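
To verify which connector plugins the worker has loaded, you can query the standard Kafka Connect REST API. This sketch assumes the worker’s REST interface listens on its default port 8083 and that curl is available inside the container:

docker exec lenses-dev curl -s localhost:8083/connector-plugins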

FAQ

1. How can I run the Development Environment offline?

To run the Docker image offline, save your download access token to a file (license.json below) when downloading from https://lenses.io/downloads/lenses/ and run this command:

LFILE=$(cat license.json)
docker run --rm -it -p 3030:3030 -e LICENSE="$LFILE" lensesio/box:latest

2. How much memory does Lenses require?

Lenses has been built with “mechanical sympathy” in mind. It can operate within a 4GB memory limit while handling a cluster of 30 Kafka brokers, more than 10K Avro schemas, more than 1,000 topics, and tens of Kafka Connect connector instances.

The Development Environment runs multiple services in a single container: Kafka, ZooKeeper, Schema Registry, Kafka Connect, synthetic data generators and, of course, Lenses. The recommendation is therefore 5GB of RAM, although it can operate with even less than 4GB (your mileage may vary).
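
If you want to enforce such a limit explicitly, Docker’s --memory flag can cap the container; a sketch assuming the 5GB recommendation:

docker run --rm -p 3030:3030 --memory=5g \
       -e EULA="CHECK_YOUR_EMAIL_FOR_KEY" \
       --name=lenses-dev lensesio/box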

To reduce the memory footprint, you can disable some connectors and shrink the Kafka Connect heap size by adding these options to the docker run command (trim the DISABLE list to keep the connectors you need):

-e DISABLE=azure-documentdb,blockchain,bloomberg,cassandra,coap,elastic,elastic5,ftp,hazelcast,hbase,influxdb,jms,kudu,mongodb,mqtt,redis,rethink,voltdb,yahoo,hdfs,jdbc,elasticsearch,s3,twitter
-e CONNECT_HEAP=512m
[Image: Docker memory settings]
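
Putting these together, a reduced-footprint run might look like the following sketch:

docker run --rm -p 3030:3030 \
       -e EULA="CHECK_YOUR_EMAIL_FOR_KEY" \
       -e DISABLE=azure-documentdb,blockchain,bloomberg,cassandra,coap,elastic,elastic5,ftp,hazelcast,hbase,influxdb,jms,kudu,mongodb,mqtt,redis,rethink,voltdb,yahoo,hdfs,jdbc,elasticsearch,s3,twitter \
       -e CONNECT_HEAP=512m \
       --name=lenses-dev lensesio/box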

3. How can I connect Lenses to my existing Apache Kafka cluster?

To run Lenses against your existing Apache Kafka cluster, you will need to contact us to receive the Enterprise Edition, which adds Kubernetes support, extra monitoring, additional Lenses SQL processor execution modes, and more.

4. Why does my Developer Environment have topics?

To make your experience better, we have pre-configured a set of synthetic data generators that provide streaming data out of the box. By default the Docker image launches the data generators; you can turn them off by setting the environment variable -e SAMPLEDATA=0 in the docker run command.

5. Is the Development Environment free?

Developer licenses are free. You can get one or more licenses from our website. A license expires six months from the moment you obtain it. Renewing your license is free: you just have to re-register. You may start your Lenses instance with a different license file and your setup will not be affected.

There are three ways to provide the license file:

  • Use the EULA environment variable, as shown earlier:

-e EULA="[EULA]"

  • Save the license file locally and mount it into the container:

-v /path/to/license.json:/license.conf

  • Provide the license contents as an environment variable:

-e LICENSE="$(cat /path/to/license.json)"
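
As an illustrative sketch, a complete command using a locally saved license file might look like this (the file path is a placeholder):

docker run --rm -p 3030:3030 \
       -v /path/to/license.json:/license.conf \
       --name=lenses-dev lensesio/box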