Hardware Requirement for Kafka

Kafka typically runs on dual quad-core machines with 24 GB of memory. A broker needs plenty of memory to buffer active readers and writers; a back-of-the-envelope estimate of the memory requirement is write_throughput * 30, which corresponds to buffering roughly 30 seconds of writes. Kafka runs on any Unix-like system, including Linux and Solaris.

Because a broker holds a file open for every log segment and every client connection, it needs a large number of file descriptors, so the file descriptor limit should be raised. The socket buffer sizes should also be increased to allow high-performance data transfer between data centers.

The drives that hold Kafka data should not be shared with other applications, because disk contention will reduce throughput. You can either RAID the drives together into a single volume, or mount each drive as its own data directory and let Kafka spread partitions across the directories. With multiple data directories, partitions are assigned to directories round-robin, so the load can still end up uneven across the drives. A RAID arrangement balances the load between drives better, but it consumes more of the available disk space. RAID also has high fault tolerance against disk failures; however, because it reduces usable disk space and rebuilding the array is intensive enough to effectively take the broker out of service, the practical gain in availability is limited.
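To make the write_throughput * 30 estimate concrete, here is a minimal Python sketch of the calculation; the 100 MB/s write rate and the 30-second buffering window used below are illustrative assumptions, not measurements from any real cluster.

# Back-of-the-envelope memory estimate for a Kafka broker:
# buffer roughly 30 seconds of incoming writes.

def estimate_buffer_memory_mb(write_throughput_mb_per_s, buffer_seconds=30):
    """Return the estimated buffering memory in megabytes."""
    return write_throughput_mb_per_s * buffer_seconds

if __name__ == "__main__":
    # Hypothetical write rate of 100 MB/s; substitute your own measured value.
    needed_mb = estimate_buffer_memory_mb(100)
    print(f"Estimated buffer memory: {needed_mb} MB (about {needed_mb / 1024:.1f} GB)")

At 100 MB/s this works out to about 3 GB of buffering memory; on a 24 GB machine like the one described above, the remaining memory is available to the operating system's page cache and the broker process.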
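The file descriptor requirement can be sanity-checked from the operating system. The Python sketch below reads the current process limit with the standard resource module and compares it against rough counts of log segments and client connections; the 20,000 and 5,000 figures are hypothetical placeholders, not recommendations.

import resource

# Current soft and hard limits on open file descriptors for this process.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"File descriptor limits: soft={soft}, hard={hard}")

# A broker needs roughly one descriptor per log segment plus one per
# client connection; the counts below are hypothetical placeholders.
estimated_segments = 20_000
estimated_connections = 5_000
if soft < estimated_segments + estimated_connections:
    print("Soft limit looks low; consider raising it (for example with "
          "ulimit -n or /etc/security/limits.conf) before starting the broker.")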