Memory & CPU
This page describes the memory and CPU prerequisites for Lenses. It provides memory recommendations based on the number of Kafka topics, the number of schemas, and the complexity of those schemas (measured by the number of fields per schema). Proper memory allocation ensures optimal performance and stability of Lenses in various environments.
Key Considerations
Number of Topics: Kafka topics require memory for indexing, metadata, and state management.
Schemas and Their Complexity: The memory impact of schemas is influenced by both the number of schemas and the number of fields within each schema. Each schema field contributes to the creation of Lucene indexes, which affects memory usage.
Baseline Memory Requirements
For a basic setup with minimal topics and schemas (a heap-configuration sketch follows the lists below):
Minimum Memory: 4 GB
Recommended Memory: 8 GB
This setup assumes:
Fewer than 100 topics
Fewer than 100 schemas
Small schemas with few fields (fewer than 10 fields per schema)
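To apply the recommendation, the heap for the Lenses JVM is pinned at startup. Below is a minimal Python sketch, assuming the `LENSES_HEAP_OPTS` environment variable is the one your Lenses startup script honors; verify the variable name against your own deployment before relying on it.

```python
import os

def heap_opts(memory_gb: int) -> str:
    """Build JVM heap flags for the recommended memory allocation.

    Pinning -Xms to -Xmx avoids heap-resizing pauses. Leave additional
    host memory free for the OS page cache and off-heap buffers.
    """
    return f"-Xms{memory_gb}g -Xmx{memory_gb}g"

# Baseline setup: 8 GB recommended.
os.environ["LENSES_HEAP_OPTS"] = heap_opts(8)  # assumed variable name
print(os.environ["LENSES_HEAP_OPTS"])          # -Xms8g -Xmx8g
```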
Scaling with Topics
Memory requirements increase with the number of topics. Topics serve as the primary reference for memory scaling, with additional adjustments for schemas; a lookup sketch follows the table.
| Number of Topics / Partitions | Recommended Memory |
| --- | --- |
| Up to 1,000 / 10,000 partitions | 12 GB |
| 1,001 to 10,000 / 100,000 partitions | 24 GB |
| 10,001 to 30,000 / 300,000 partitions | 64 GB |
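The thresholds above can be expressed as a simple lookup function. This is a minimal sketch of the table, nothing more; topic counts beyond 30,000 are deliberately not extrapolated.

```python
def base_memory_gb(topics: int) -> int:
    """Recommended base memory (GB) by topic count, per the table above."""
    if topics <= 1_000:
        return 12
    if topics <= 10_000:
        return 24
    if topics <= 30_000:
        return 64
    raise ValueError("Beyond 30,000 topics, size the deployment case by case.")

print(base_memory_gb(5_000))  # 24
```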
Impact of Schemas and Their Complexity
Schemas have a significant impact on memory usage, particularly as the number of fields within each schema increases. The memory impact is determined by both the number of schemas and their complexity (number of fields); the table and sketch below express these rules.
Memory Addition Based on Schema Complexity
| Schema Complexity | Number of Fields per Schema | Memory Addition |
| --- | --- | --- |
| Low to Moderate | Up to 50 fields | None |
| High | 51 - 100 fields | 1 GB for every 1,000 schemas |
| Very High | 100+ fields | 2 GB for every 1,000 schemas |
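The same rules as a sketch. One assumption is made explicit in the code: partial blocks are rounded up to the next full 1,000 schemas, since the table only speaks in whole thousands.

```python
import math

def schema_memory_addition_gb(schemas: int, fields_per_schema: int) -> int:
    """Extra memory (GB) for schema complexity, per the table above."""
    if fields_per_schema <= 50:
        return 0  # low to moderate complexity: no addition
    gb_per_1000 = 1 if fields_per_schema <= 100 else 2
    # Assumption: partial blocks of 1,000 schemas are rounded up.
    return math.ceil(schemas / 1_000) * gb_per_1000

print(schema_memory_addition_gb(10_000, 75))   # 10
print(schema_memory_addition_gb(30_000, 120))  # 60
```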
Cross-Reference Table for Topics and Schema Complexity
| Number of Topics | Number of Schemas | Number of Fields per Schema | Base Memory | Additional Memory | Total Recommended Memory |
| --- | --- | --- | --- | --- | --- |
| 1,000 | 1,000 | Up to 10 | 8 GB | None | 12 GB |
| 1,000 | 1,000 | 11 - 50 | 8 GB | None | 12 GB |
| 5,000 | 5,000 | Up to 10 | 12 GB | None | 16 GB |
| 5,000 | 5,000 | 11 - 50 | 12 GB | None | 16 GB |
| 10,000 | 10,000 | Up to 10 | 16 GB | None | 24 GB |
| 10,000 | 10,000 | 51 - 100 | 24 GB | 10 GB | 34 GB |
| 30,000 | 30,000 | Up to 10 | 64 GB | None | 64 GB |
| 30,000 | 30,000 | 51 - 100 | 64 GB | 30 GB | 94 GB |
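Reusing the two sketches above reproduces the rows of this table that follow the scaling rules directly, for example the 10,000-topic row with 51-100 fields per schema:

```python
# Requires base_memory_gb and schema_memory_addition_gb from the sketches above.
topics, schemas, fields = 10_000, 10_000, 75

total = base_memory_gb(topics) + schema_memory_addition_gb(schemas, fields)
print(f"{total} GB")  # 24 + 10 = 34 GB, matching the row above
```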
Example Configurations
The following example configurations illustrate how to apply these recommendations, taking both topic count and schema complexity into account:
Small Setup
Topics: 500
Schemas: 100 (average size 50 KB, 8 fields per schema)
Base Memory: 8 GB
Schema Complexity: Low → No additional memory needed.
Total Recommended Memory: 8 GB
Medium Setup
Topics: 5,000
Schemas: 1,000 (average size 200 KB, 25 fields per schema)
Base Memory: 12 GB
Schema Complexity: Moderate → No additional memory needed.
Total Recommended Memory: 16 GB
Large Setup
Topics: 15,000
Schemas: 3,000 (average size 500 KB, 70 fields per schema)
Base Memory: 32 GB
Schema Complexity: High → Add 3 GB for schema complexity.
Total Recommended Memory: 35 GB
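The 3 GB figure here follows directly from the complexity rule (3,000 schemas at high complexity); reusing the earlier sketch:

```python
print(schema_memory_addition_gb(3_000, 70))  # 3 (GB), high complexity
```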
High-Volume Setup Examples
30,000 Topics
Schemas: 5,000 (average size 300 KB, 30 fields per schema)
Base Memory: 64 GB
Schema Complexity: Moderate → No additional memory needed.
Total Recommended Memory: 64 GB
Additional Considerations
High Throughput: If your Kafka cluster is expected to handle high throughput, add 20-30% more memory than recommended.
Complex Queries and Joins: If you use Lenses.io for complex data queries and joins, increase the memory allocation by 10-15% to accommodate the additional processing (a sketch follows this list).
Monitoring and Adjustment: Regularly monitor memory usage and adjust based on actual load and performance.
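These two adjustments can be folded into any recommendation as simple multipliers. A sketch, assuming you take the upper bound of each suggested range:

```python
def adjusted_memory_gb(recommended_gb: float,
                       high_throughput: bool = False,
                       complex_queries: bool = False) -> float:
    """Apply the headroom suggestions above to a recommended figure."""
    factor = 1.0
    if high_throughput:
        factor += 0.30  # upper end of the suggested 20-30%
    if complex_queries:
        factor += 0.15  # upper end of the suggested 10-15%
    return recommended_gb * factor

# 34 GB recommendation under high throughput with complex queries:
print(adjusted_memory_gb(34, high_throughput=True, complex_queries=True))  # ~49.3
```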
Conclusion
Proper memory allocation is crucial for the performance and reliability of Lenses.io, especially in environments with a large number of topics and complex schemas. While topics provide a solid baseline for memory recommendations, the complexity of schemas—particularly the number of fields—can also significantly impact memory usage. Regular monitoring and adjustments are recommended to ensure that your Lenses.io setup remains performant as your Kafka environment scales.