The ideal setup for running Hadoop operations is a set of machines with at least a dual-core configuration (preferably physical rather than virtual) and 4GB to 8GB of RAM per server/node, using ECC memory. Good memory specifications are important because HDFS runs smoothly only when memory is efficient and robust.
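As a rough illustration, on a node in that 4GB–8GB range you would typically cap the memory YARN hands out to containers in yarn-site.xml, leaving headroom for the OS and Hadoop daemons. The values below are a hedged sketch for an ~8GB node, not a definitive recommendation; tune them to your workload:

```xml
<!-- yarn-site.xml: illustrative values for an ~8GB node -->
<configuration>
  <!-- Total memory YARN may allocate to containers on this node,
       reserving roughly 2GB for the OS and Hadoop daemons -->
  <property>
    <name>yarn.nodemanager.resource.memory-mb</name>
    <value>6144</value>
  </property>
  <!-- Largest single container the scheduler will grant -->
  <property>
    <name>yarn.scheduler.maximum-allocation-mb</name>
    <value>6144</value>
  </property>
</configuration>
```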