
Kafka Hardware Requirement

Kafka typically uses 24 GB of memory on dual quad-core machines. A back-of-the-envelope memory estimate is write_throughput * 30, that is, enough memory to buffer roughly 30 seconds of writes. Kafka needs a lot of memory to buffer active readers and writers, and most of it is used by the operating system's page cache rather than the JVM heap, so provisioning the machine with plenty of memory is essential.
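The write_throughput * 30 rule of thumb above can be sketched as a small helper; the function name and the 30-second buffering window default are illustrative, not part of any Kafka API:

```python
def estimate_buffer_memory(write_throughput_mb_per_s: float,
                           buffer_seconds: int = 30) -> float:
    """Back-of-the-envelope memory estimate (in MB) for a Kafka broker:
    enough memory to buffer `buffer_seconds` worth of incoming writes."""
    return write_throughput_mb_per_s * buffer_seconds

# A broker sustaining 100 MB/s of writes needs roughly 3 GB just for buffering.
print(estimate_buffer_memory(100))  # 3000.0
```

Remember this covers only the write buffer; the page cache benefits from any additional memory you can give the machine.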

Kafka runs on Unix-like systems such as Linux and Solaris. Because a broker hosts many topics, partitions, and client connections, it needs a large number of file descriptors, so the operating system's file-descriptor limit should be raised. The socket buffer sizes should also be enlarged for high-throughput data transfer between data centers.
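To see why the descriptor limit matters, a rough estimate can be computed from the broker's load; the accounting below (one handle per log segment plus one per client socket) is a simplifying assumption for illustration:

```python
def estimate_file_descriptors(partitions: int,
                              segments_per_partition: int,
                              client_connections: int) -> int:
    """Rough count of file descriptors a broker needs:
    each log segment keeps an open file handle, and every
    client connection consumes a socket descriptor."""
    return partitions * segments_per_partition + client_connections

# 1,000 partitions x 10 segments each, plus 500 connections,
# already exceeds the common default ulimit of 1024 by a wide margin.
print(estimate_file_descriptors(1000, 10, 500))  # 10500
```

An estimate like this is useful when deciding how high to raise the per-process file-descriptor limit on the broker hosts.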

The drives that hold Kafka's data should not be shared with other applications, because disk contention will reduce throughput. Rather than sharing drives, you can either RAID them together with mirroring into a single volume, or dedicate each drive to Kafka as its own data directory.

The data should be arranged into proper partitions and the partitions assigned to different data directories, so that load stays evenly balanced across the drives.
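The balancing idea above can be sketched as a simple round-robin assignment of partitions to data directories; Kafka's actual placement policy differs (it favors the directory with the fewest partitions), so this is only an illustration, and the directory paths are made up:

```python
def assign_partitions(partitions: list[str], log_dirs: list[str]) -> dict:
    """Spread partitions round-robin across data directories so that
    no single drive ends up with a disproportionate share of the data."""
    assignment = {d: [] for d in log_dirs}
    for i, partition in enumerate(partitions):
        assignment[log_dirs[i % len(log_dirs)]].append(partition)
    return assignment

dirs = ["/data/kafka1", "/data/kafka2"]  # hypothetical mount points, one per drive
print(assign_partitions(["topicA-0", "topicA-1", "topicB-0"], dirs))
# {'/data/kafka1': ['topicA-0', 'topicB-0'], '/data/kafka2': ['topicA-1']}
```

In practice you achieve the same effect by listing one directory per drive in the broker's log.dirs setting and letting Kafka place partitions across them.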


When you arrange the drives in RAID, the load imbalance between drives is reduced, but a lot of disk space is consumed by redundancy. In return, RAID has high fault tolerance. However, because it sacrifices disk space, and because a RAID rebuild is so I/O-intensive that it can effectively take the broker out of service, it can end up reducing availability rather than improving it.
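The disk-space cost of mirroring is easy to quantify; the sketch below compares usable capacity under RAID 10 (every drive mirrored) against plain JBOD, with the drive counts and sizes chosen purely as an example:

```python
def usable_capacity_tb(drive_count: int, drive_tb: float,
                       raid10: bool = True) -> float:
    """Usable capacity of a drive set: RAID 10 mirrors every drive,
    halving usable space; JBOD exposes the full raw capacity."""
    raw = drive_count * drive_tb
    return raw / 2 if raid10 else raw

# Twelve 4 TB drives: mirroring halves the 48 TB of raw capacity.
print(usable_capacity_tb(12, 4))         # 24.0 (RAID 10)
print(usable_capacity_tb(12, 4, False))  # 48 (JBOD)
```

This is the trade-off in a nutshell: RAID 10 buys fault tolerance and load balancing at the price of half the raw capacity, while JBOD keeps all the space but leaves fault handling to Kafka's own replication.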


About the Author

Technical Research Analyst - Big Data Engineering

Abhijit is a Technical Research Analyst specialising in Big Data and Azure Data Engineering. He has 4+ years of experience in the Big Data domain and provides consultancy services to several Fortune 500 companies. His expertise includes breaking down highly technical concepts into easy-to-understand content.