in Big Data Hadoop & Spark

From any node in a Hadoop cluster, what is the command to

  • identify the running namenode?
  • identify all running datanodes?

I have looked through the commands manual and have not found this.

1 Answer


First, run the jps command to see which Hadoop daemons (NameNode, SecondaryNameNode, DataNode, etc.) are running. Note that jps only lists the JVMs on the local machine, so it tells you what is running on that node, not across the cluster.
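As a quick sketch (jps ships with the JDK, so this assumes a JDK is on the PATH), you can filter the jps output for the HDFS daemon names:

```shell
# jps lists the Java processes on the *local* node only, so run this on
# each machine you want to check; the daemon names below are the
# standard HDFS ones.
jps | grep -E 'NameNode|SecondaryNameNode|DataNode'
```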

To get a cluster-wide report of your HDFS daemons, including an entry for every live and dead DataNode (which answers the second part of your question), run:

bin/hdfs dfsadmin -report
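Since the report prints one block per DataNode, a handy sketch (assuming a reachable cluster; each DataNode block in the report contains a Hostname: line) is to filter out just the hostnames of the live DataNodes:

```shell
# List only live DataNodes; -live is a standard option of
# "hdfs dfsadmin -report". Each DataNode block includes a
# "Hostname:" line, so grep reduces the report to one host per line.
hdfs dfsadmin -report -live | grep 'Hostname:'
```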


To identify the NameNode(s) from the client configuration on any node, execute these commands:

hdfs getconf -namenodes

hdfs getconf -secondaryNameNodes
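getconf prints the configured NameNode hostnames space-separated on a single line. For scripting, a small sketch that puts one host per line (the actual hostnames come from your hdfs-site.xml, not from this example):

```shell
# One configured NameNode hostname per line (useful in shell loops);
# on an HA cluster, getconf lists every configured NameNode.
hdfs getconf -namenodes | tr ' ' '\n'
```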


