Data Flows & Microservices¶
A modern data platform within an organisation must accommodate both traditional use cases, such as microservices, transactions, and data pipelines that bridge different systems, and modern ones, such as high-volume data sets (IoT, click streams, etc.), machine learning, and stream processing. Building on top of Apache Kafka, an open-source messaging system that meets these requirements, presents both business and technical challenges.
Microservices introduce a new paradigm of event-driven design, and Apache Kafka is ideal for highly scalable architectures.
You can build your streaming flows using pre-built components such as Connectors to move data in and out of well-known data stores. Stream processing addresses the problem of continually reacting to and processing data as it flows through a business process.
Lenses supports you in this architectural and development paradigm shift in various ways.
- Provides the building blocks to create visual streaming data flows
- Monitors data pipelines end to end and alerts you so you never miss an SLA
- Supports multiple data types out of the box, such as AVRO, JSON, XML, CSV, and Protobuf
- Lets you register custom serializers and deserializers to support custom data sets
- Exposes custom applications in the Lenses Topology so they can leverage its monitoring capabilities
- Provides the Lenses SQL Engine to build Processors and customise Connectors for your data pipelines
- Monitors producer and consumer rates as well as deployment metrics
- Gives you an intuitive web interface to access data and debug your applications, where you can inspect and insert data
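As a sketch of the custom serializer/deserializer idea above, the pair below encodes records as UTF-8 JSON bytes, the same shape of callable that Kafka clients accept (for example, kafka-python's `KafkaProducer(value_serializer=...)`). The record fields are hypothetical:

```python
import json

def serialize(record: dict) -> bytes:
    """Encode a record as UTF-8 JSON bytes before writing to a topic."""
    return json.dumps(record, sort_keys=True).encode("utf-8")

def deserialize(payload: bytes) -> dict:
    """Decode the bytes read back from a topic."""
    return json.loads(payload.decode("utf-8"))

# Round-trip: what a producer writes, a consumer can read back unchanged.
event = {"device_id": "sensor-42", "temperature": 21.5}
assert deserialize(serialize(event)) == event
```

The same two-function shape applies whatever the wire format is; swapping JSON for AVRO or Protobuf only changes the bodies of the two functions.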
Lenses integrates with Kubernetes to deploy your data flows. The Lenses SQL Engine allows you to register SQL processors, which are automatically deployed and scaled on your Kubernetes cluster.
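As an illustration, a SQL processor registered with Lenses might look like the following; the topic and field names are hypothetical, and the exact syntax depends on your Lenses version:

```sql
-- Hypothetical processor: continuously filter high-temperature
-- readings from one topic into another. Once registered, Lenses
-- deploys and scales it on the Kubernetes cluster.
INSERT INTO iot_alerts
SELECT STREAM device_id, temperature
FROM iot_readings
WHERE temperature > 30
```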
Lenses and the Lenses SQL Engine support multiple data types out of the box, while also allowing you to register custom serializers and deserializers.
You can browse, query and register processors for AVRO, JSON, Protobuf, XML, CSV and array payloads, and query plain text with regular expressions. For AVRO payloads, Lenses integrates with a Schema Registry (Confluent or Hortonworks) and adds lineage and role-based access on top, to meet the demands of a corporate environment.
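To illustrate what supporting several payload formats involves, the stdlib-only sketch below decodes the same logical record from JSON, CSV, and XML. This is a conceptual illustration, not Lenses' internal implementation, and the record fields are hypothetical:

```python
import csv
import io
import json
import xml.etree.ElementTree as ET

def decode_json(payload: bytes) -> dict:
    return json.loads(payload)

def decode_csv(payload: bytes, fields: list) -> dict:
    # CSV carries no field names, so the schema is supplied externally.
    row = next(csv.reader(io.StringIO(payload.decode("utf-8"))))
    return dict(zip(fields, row))

def decode_xml(payload: bytes) -> dict:
    root = ET.fromstring(payload)
    return {child.tag: child.text for child in root}

# The same record arriving in three different wire formats.
as_json = b'{"device": "sensor-42", "status": "ok"}'
as_csv = b"sensor-42,ok"
as_xml = b"<event><device>sensor-42</device><status>ok</status></event>"

record = {"device": "sensor-42", "status": "ok"}
assert decode_json(as_json) == record
assert decode_csv(as_csv, ["device", "status"]) == record
assert decode_xml(as_xml) == record
```

Once payloads are normalised into a common record shape, the same queries and processors can run regardless of the original format.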
Lenses Topology View¶
Microservices architectures can be chaotic and require constant monitoring. Lenses visualises end-to-end flows as interactive nodes, giving you a global view of which services are currently connected to your systems.
Accessing data is a vital capability when you are building a central data platform within your organisation. Lenses provides security and data governance features that allow access to data for both development and business needs. You can insert data to debug and test your microservices via the user interface or the command line. Lenses also has a rich ecosystem for delivering data to other systems and integrating with your favourite tools: Kafka Connect to move data between your systems, a JDBC driver to integrate with BI tools, a Python client for data scientists to use in notebooks such as Jupyter, and a Redux.js library to stream real-time data straight to your frontend application.
Producers and Consumers¶
Monitor your apps like a pro! Lenses provides in-app features to monitor your applications and set up alerts so you never miss an SLA. Moreover, Lenses ships with rich Kafka-specific dashboards for monitoring your low-level metrics.
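As a concrete example of the kind of consumer metric such dashboards track, the sketch below computes per-partition consumer lag from end offsets and committed offsets. In a live system these numbers come from the Kafka APIs; the values here are hypothetical:

```python
def consumer_lag(end_offsets: dict, committed: dict) -> dict:
    """Lag per partition: messages produced but not yet consumed."""
    return {
        partition: end_offsets[partition] - committed.get(partition, 0)
        for partition in end_offsets
    }

# Hypothetical snapshot for a consumer group on a 3-partition topic.
end_offsets = {0: 1500, 1: 1480, 2: 1510}  # latest offset per partition
committed = {0: 1500, 1: 1200, 2: 1505}    # group's committed offsets

lag = consumer_lag(end_offsets, committed)
assert lag == {0: 0, 1: 280, 2: 5}
assert sum(lag.values()) == 285  # total lag, a common SLA alert trigger
```

Growing lag on any partition means the consumer is falling behind the producers, which is exactly the condition an SLA alert on consumption rates is meant to catch.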