0 votes
in Big Data Hadoop & Spark by (6.5k points)
Is there an ideal machine configuration for running Hadoop, or can we run it on any hardware?

1 Answer

0 votes
by (11.3k points)
edited by

The ideal setup for running Hadoop is a cluster of machines with dual-core processors (physical cores preferred) and 4 GB to 8 GB of ECC RAM per server/node. Good memory specifications matter because the smooth operation of HDFS depends heavily on memory reliability: ECC memory catches and corrects bit errors that could otherwise silently corrupt data passing through the DataNodes and NameNode.
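As a rough sketch of how a node's RAM actually gets used, the memory available on each worker is usually surfaced to Hadoop through YARN's per-node settings in yarn-site.xml. The property names below are the standard YARN ones, but the 8 GB node size and the 6144 MB split are assumptions for illustration, not values from the answer above:

```xml
<!-- yarn-site.xml sketch: assumes an 8 GB node, reserving ~2 GB for the OS and Hadoop daemons -->
<property>
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>6144</value> <!-- total memory YARN may allocate to containers on this node -->
</property>
<property>
  <name>yarn.scheduler.maximum-allocation-mb</name>
  <value>6144</value> <!-- upper bound for any single container request -->
</property>
```

With 4 GB nodes the same reasoning applies with smaller values; the point is that the RAM you buy per node translates directly into how many containers (map/reduce tasks) each node can run.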

