Plugins

This page describes how to install plugins in Lenses.

The following plugin types can be provided:

  1. Serializers/Deserializers: plug in your serializer and deserializer to enable observability over any data format (e.g., Protobuf or Thrift).

  2. Custom authentication: authenticate users on your proxy and inject HTTP headers carrying their permissions.

  3. LDAP lookup: use multiple LDAP servers or your own group-mapping logic.

  4. SQL UDFs: User Defined Functions (UDFs) that extend SQL and streaming SQL capabilities.

Once built, the jar files and any plugin dependencies should be added to Lenses and, in the case of Serializers and UDFs, to the SQL Processors if required.

Adding plugins

On startup, Lenses loads plugins from the $LENSES_HOME/plugins/ directory and from any location set in the environment variable LENSES_PLUGINS_CLASSPATH_OPTS. Lenses watches these locations, so dropping in a new plugin will hot-reload it. For the Lenses Docker image (and Helm chart), use /data/plugins.

Any first-level directories under the paths above that exist at startup will also be monitored for new files. During startup, the list of monitored locations is printed in the logs to help confirm the setup.

...
Initializing (pre-run) Lenses
Installation directory autodetected: /opt/lenses
Current directory: /data
Logback configuration file autodetected: logback.xml
These directories will be monitored for new jar files:
 - /opt/lenses/plugins
 - /data/plugins
 - /opt/lenses/serde
Starting application
...

Whilst all jar files may be added to the same directory (e.g. /data/plugins), it is suggested to use a directory hierarchy to make management and maintenance easier.

An example hierarchy for a set of plugins:

├── security
│   └── sso_header_decoder.jar
├── serde
│   ├── protobuf_actions.jar
│   └── protobuf_clients.jar
└── udf
    ├── eu_vat.jar
    ├── reverse_geocode.jar
    └── summer_sale_discount.jar
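A hierarchy like the one above can be staged locally and then copied into a watched location. A minimal sketch, where the jar names simply mirror the example and are purely illustrative:

```shell
# Stage the example hierarchy (jar names are illustrative placeholders).
mkdir -p plugins/security plugins/serde plugins/udf
touch plugins/security/sso_header_decoder.jar
touch plugins/serde/protobuf_actions.jar plugins/serde/protobuf_clients.jar
touch plugins/udf/eu_vat.jar plugins/udf/reverse_geocode.jar plugins/udf/summer_sale_discount.jar

# Copy the tree into a watched location, e.g. /data/plugins for Docker:
# cp -r plugins/. /data/plugins/
```

Because first-level directories are monitored, a new jar dropped later into, say, plugins/udf/ is picked up without a restart.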

SQL Processors in Kubernetes

There are two ways to add custom plugins (UDFs and Serializers) to the SQL Processors: (1) by serving a tar.gz archive at an HTTP(S) address, or (2) by building a custom Docker image.

Archive served via HTTP

With this method, create a tar archive, compressed with gzip, that contains all plugin jars and their dependencies. Upload this archive to a web server that the SQL Processor containers can access, and set its address with the option lenses.kubernetes.processor.extra.jars.url.

Step by step:

  1. Create a tar.gz file that includes all required jars at its root:

    tar -czf [FILENAME.tar.gz] -C /path/to/jars/ .
  2. Upload it to a web server, e.g. https://example.net/myfiles/FILENAME.tar.gz

  3. Set

    lenses.kubernetes.processor.extra.jars.url=https://example.net/myfiles/FILENAME.tar.gz

    For the docker image, set the corresponding environment variable

    LENSES_KUBERNETES_PROCESSOR_EXTRA_JARS_URL=https://example.net/myfiles/FILENAME.tar.gz
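The archive step can be sanity-checked before uploading. A short sketch, assuming a local jars/ directory (the jar names are placeholders):

```shell
# Build a gzip-compressed tar with the jars at its root, then list the
# contents to confirm no intermediate directory was captured.
mkdir -p jars
touch jars/eu_vat.jar jars/protobuf_actions.jar   # placeholder jars
tar -czf plugins.tar.gz -C jars/ .
tar -tzf plugins.tar.gz   # entries such as ./eu_vat.jar should sit at the root
```

Using `-C jars/ .` makes tar change into the directory before archiving, so the jars land at the archive root rather than under a jars/ prefix.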

Custom Docker image

The SQL Processors that run inside Kubernetes use the Docker image lensesio-extra/sql-processor. It is possible to build a custom image, add all the required jar files under the /plugins directory, and then set the lenses.kubernetes.processor.image.name and lenses.kubernetes.processor.image.tag options to point to the custom image.

Step by step:

  1. Create a Docker image using lensesio-extra/sql-processor:VERSION as a base, adding all required jar files under /plugins. With a Dockerfile such as:

    FROM lensesio-extra/sql-processor:4.2
    ADD jars/* /plugins/

    build the image:

    docker build -t example/sql-processor:4.2 .
  2. Upload the docker image to a registry:

    docker push example/sql-processor:4.2
  3. Set

    lenses.kubernetes.processor.image.name=example/sql-processor
    lenses.kubernetes.processor.image.tag=4.2

    For the docker image, set the corresponding environment variables

    LENSES_KUBERNETES_PROCESSOR_IMAGE_NAME=example/sql-processor
    LENSES_KUBERNETES_PROCESSOR_IMAGE_TAG=4.2

2024 © Lenses.io Ltd. Apache, Apache Kafka, Kafka and associated open source project names are trademarks of the Apache Software Foundation.