0 votes
2 views
in Big Data Hadoop & Spark by (11.4k points)

From any node in a Hadoop cluster, what is the command to

  • identify the running namenode?
  • identify all running datanodes?

I have looked through the commands manual and have not found this.

1 Answer

0 votes
by (32.3k points)
edited by

First, use the jps command to check which Hadoop daemons (NameNode, DataNode, etc.) are running on a node. Note that jps lists only the Java processes on the node where you run it, so run it on each node you want to inspect.
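For instance, a quick check on the node you suspect hosts the NameNode might look like this (process IDs and the exact daemon mix will differ on your cluster):

jps

2801 NameNode
3012 ResourceManager
3455 Jps

If NameNode does not appear in the output, that node is not currently running the NameNode daemon.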

To get a report of HDFS, including every live and dead DataNode, you can execute this command from any node:

bin/hdfs dfsadmin -report
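The report prints a cluster summary followed by one block per DataNode. As a rough way to list just the DataNode addresses, you can filter the report (the exact field labels can vary between Hadoop versions):

hdfs dfsadmin -report | grep "Name:"

Each matching line corresponds to one DataNode, for example Name: 10.0.0.12:50010 (worker1), where the address and hostname here are only placeholders.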

Or 

To identify the running NameNode(s), try these commands:

hdfs getconf -namenodes

hdfs getconf -secondaryNameNodes
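Both getconf commands read the HDFS configuration, so they work from any node that has the client configuration files, and they print one hostname per line. For example, with a placeholder hostname:

hdfs getconf -namenodes
nn1.example.com

If you also need the NameNode RPC address as host:port, hdfs getconf -nnRpcAddresses prints it.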


