Environment

This page describes how to retrieve secrets from environment variables for use in Kafka Connect.

Environment variables hold the secrets, and the provider resolves them wherever a connector configuration references them.

Secrets are only reloaded when the connector restarts.

Configuration

Example Worker Properties:

worker.props
config.providers=env
config.providers.env.class=io.lenses.connect.secrets.providers.ENVSecretProvider
config.providers.env.param.file.dir=my-secret-dir
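
The provider reads variables from the Connect worker's own environment, so they must be set before the worker JVM starts. The following is a minimal sketch, assuming a standard Apache Kafka distribution, the properties above saved as worker.props, and the example variable names used in the Usage section below; paths are illustrative and depend on your installation:

# Make the secrets visible to the Connect worker JVM
export MY_ENV_VAR_USERNAME=lenses
export MY_ENV_VAR_PASSWORD=my-secret-password

# Start a distributed worker with the properties shown above
bin/connect-distributed.sh worker.props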

Usage

To use this provider in a connector, reference the environment variable that holds the value of the connector property through an indirect reference.

The indirect reference is in the form ${provider::key} where:

  • provider is the name given to the provider in the worker properties above (env in this example)

  • key is the name of the environment variable holding the secret.

For example, if we store two secrets as environment variables:

  • MY_ENV_VAR_USERNAME with the value lenses and

  • MY_ENV_VAR_PASSWORD with the value my-secret-password

we would set:

connector.props
name=my-sink
class=my-class
topics=mytopic
username=${env::MY_ENV_VAR_USERNAME}
password=${env::MY_ENV_VAR_PASSWORD}

This would resolve at runtime to:

name=my-sink
class=my-class
topics=mytopic
username=lenses
password=my-secret-password
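
The same placeholders work when the connector is created through the Kafka Connect REST API; the worker resolves them at runtime, and the API continues to show the placeholder rather than the resolved secret. The following is a sketch only, assuming a distributed worker listening on localhost:8083 and using the standard connector.class key in place of the simplified class key above:

curl -X POST http://localhost:8083/connectors \
  -H "Content-Type: application/json" \
  -d '{
    "name": "my-sink",
    "config": {
      "connector.class": "my-class",
      "topics": "mytopic",
      "username": "${env::MY_ENV_VAR_USERNAME}",
      "password": "${env::MY_ENV_VAR_PASSWORD}"
    }
  }'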

Data encoding

This provider inspects the value of the environment variable to determine how to process it. The value can optionally carry metadata to support base64 decoding and writing values to files.

To provide metadata, the environment variable should contain both a metadata marker and the value, where value is the actual payload and metadata can be one of the following:

  • ENV-base64 - the provider will attempt to base64 decode the value string

  • ENV-mounted-base64 - the provider will attempt to base64 decode the value string and write the decoded value to a file

  • ENV-mounted - the provider will write the value to a file

If no metadata is found, the value of the environment variable is returned as-is.
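
For illustration only, here is one way the markers might be combined with a payload. This is a sketch based on the descriptions above: the exact separator between metadata and value (shown here as a colon) and the target directory for mounted values (the file.dir configured in worker.props) are assumptions, so verify them against your provider version:

# Assumed form: <metadata>:<payload>
# Base64-encoded secret; the provider decodes it before handing it to the connector
export MY_ENV_VAR_PASSWORD="ENV-base64:$(echo -n 'my-secret-password' | base64)"

# Plain value that the provider should write out to a file
# (assumed to land under the file.dir configured in worker.props)
export MY_TLS_KEY="ENV-mounted:$(cat client.key)"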
