Data Flows & Microservices

A modern data platform within an organization must accommodate both traditional use cases, such as microservices, transactions, and data pipelines that bridge different systems, and modern use cases, such as high-volume data sets (IoT, click streams, and so on), machine learning, and stream processing. Apache Kafka is an open-source messaging system that meets these criteria, but building on top of it brings both business and technical challenges.

Microservices bring a new, event-driven paradigm, and Apache Kafka is an ideal backbone for the highly scalable architectures they require.

You can build your streaming flows using pre-built components such as connectors to move data in and out of well-known data stores. Stream processing addresses the problem of continually reacting to and processing data as it flows through a business process.
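
To make the stream-processing model concrete, here is a minimal sketch using the standard Kafka Streams Java API rather than Lenses itself; the topic names and the JSON matching are hypothetical.

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;

    import java.util.Properties;

    public class FailedPaymentsFilter {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "failed-payments-filter");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            // Continuously route failed payments to their own topic as events arrive.
            StreamsBuilder builder = new StreamsBuilder();
            KStream<String, String> payments = builder.stream("payments");
            payments
                .filter((key, value) -> value != null && value.contains("\"status\":\"FAILED\""))
                .to("payments-failed");

            new KafkaStreams(builder.build(), props).start();
        }
    }

The application never finishes: it reacts to each record the moment it arrives, which is the defining property of stream processing.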

Lenses supports this architectural and development paradigm shift in several ways:

  • Provides the building blocks to create visual streaming data flows
  • Monitors data pipelines end to end and raises alerts so you never miss an SLA
  • Supports multiple data types out of the box, such as Avro, JSON, XML, CSV, and Protobuf
  • Accepts custom serializers and deserializers for custom data sets (see the sketch after this list)
  • Exposes custom applications in the Lenses topology view so they can leverage its monitoring capabilities
  • Provides the Lenses SQL Engine to power your data pipelines, from building processors to customizing connectors
  • Monitors producer and consumer rates as well as deployment metrics
  • Offers an intuitive web interface for accessing data and debugging your apps, where you can inspect and insert records
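
On the custom serializer and deserializer point above: Lenses loads serde plugins through its own interface, packaged as a JAR (see the Lenses documentation for the exact contract), but the underlying idea follows the standard Kafka one. Below is a minimal sketch of a Kafka-style deserializer; the class and the pipe-delimited payload format are invented for illustration.

    import org.apache.kafka.common.serialization.Deserializer;

    import java.nio.charset.StandardCharsets;

    // Decodes a hypothetical payload of the form "deviceId|temperature|timestamp"
    // into its individual fields.
    public class PipeDelimitedDeserializer implements Deserializer<String[]> {
        @Override
        public String[] deserialize(String topic, byte[] data) {
            if (data == null) {
                return null;
            }
            return new String(data, StandardCharsets.UTF_8).split("\\|");
        }
    }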

Kubernetes Integration

Lenses integrates with Kubernetes to deploy your data flows. The Lenses SQL Engine lets you register SQL processors, which are automatically deployed and scaled on your Kubernetes cluster.

Learn more about Scaling SQL Processors in Kubernetes Mode.

Data Types

Lenses and the Lenses SQL Engine support multiple data types out of the box, while also allowing you to register custom serializers and deserializers.

You can browse, query, and register processors for Avro, JSON, Protobuf, XML, CSV, and array payloads, and query plain text with regular expressions. For Avro payloads, Lenses integrates with a schema registry (Confluent or Hortonworks) but adds lineage and role-based access on top, to meet the demands of a corporate environment.
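
As a sketch of what producing Avro data against a schema registry looks like, here is the standard Confluent serializer in use; the broker address, registry URL, schema, and topic name are placeholders rather than values from Lenses.

    import io.confluent.kafka.serializers.KafkaAvroSerializer;
    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericData;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    import java.util.Properties;

    public class AvroReadingProducer {
        public static void main(String[] args) {
            // A hypothetical schema for an IoT sensor reading.
            Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"Reading\",\"fields\":["
                + "{\"name\":\"deviceId\",\"type\":\"string\"},"
                + "{\"name\":\"temperature\",\"type\":\"double\"}]}");

            GenericRecord reading = new GenericData.Record(schema);
            reading.put("deviceId", "sensor-42");
            reading.put("temperature", 21.5);

            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", KafkaAvroSerializer.class.getName());
            // The serializer registers and fetches schemas from the registry.
            props.put("schema.registry.url", "http://localhost:8081");

            try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("iot-readings", "sensor-42", reading));
            }
        }
    }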

Lenses Topology View

Microservices architectures can be chaotic and require constant monitoring. Lenses visualizes end-to-end flows as interactive nodes, giving you a global view of which services are currently connected to your systems.

Access Data

Accessing data is a vital capability when you are building a central data platform for your organization. Lenses provides security and data governance features that make data accessible for both development and business needs. You can insert data to debug and test your microservices via the user interface or the command line. Lenses also has a rich ecosystem for delivering data to other systems with your favorite tools: it integrates with Kafka Connect to move data between your systems, offers a JDBC driver to integrate with BI tools, provides a Python client so data scientists can work in their favorite notebooks such as Jupyter, and ships a Reduxjs library to get real-time data straight to your frontend application.
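
For example, a BI-style lookup through the JDBC driver could look like the sketch below. The JDBC calls themselves are standard; the connection URL format, credentials, and the topic and field names are assumptions, so consult the Lenses JDBC driver documentation for the exact form.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class LensesJdbcQuery {
        public static void main(String[] args) throws Exception {
            // Hypothetical URL and credentials, for illustration only.
            try (Connection conn = DriverManager.getConnection(
                     "jdbc:lsql:kafka:http://lenses-host:3030", "user", "password");
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery(
                     "SELECT deviceId, temperature FROM `iot-readings` LIMIT 10")) {
                while (rs.next()) {
                    System.out.printf("%s -> %.1f%n",
                        rs.getString("deviceId"), rs.getDouble("temperature"));
                }
            }
        }
    }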

Producers and Consumers

Monitor your apps like a pro! Lenses provides in-app features to monitor your applications and configure alerts so you never miss an SLA. Lenses also ships with rich Kafka-specific dashboards for monitoring your low-level metrics.
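
The low-level metrics shown in those dashboards are the same ones the Kafka clients expose in-process. A minimal sketch that prints a consumer's rate metrics with the standard Java client follows; the topic and group names are placeholders.

    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;

    public class ConsumerMetricsPeek {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "metrics-peek");
            props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList("iot-readings"));
                consumer.poll(Duration.ofSeconds(5));

                // Every client metric is available via metrics(); filter for the rates.
                consumer.metrics().forEach((name, metric) -> {
                    if (name.name().contains("rate")) {
                        System.out.printf("%s/%s = %s%n",
                            name.group(), name.name(), metric.metricValue());
                    }
                });
            }
        }
    }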
