
When I type in the start-all.sh command, it shows all the processes initializing properly as follows:

starting namenode, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-namenode-jawwadtest1.out
jawwadtest1: starting datanode, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-datanode-jawwadtest1.out
jawwadtest2: starting datanode, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-datanode-jawwadtest2.out
jawwadtest1: starting secondarynamenode, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-secondarynamenode-jawwadtest1.out
starting jobtracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-jobtracker-jawwadtest1.out
jawwadtest1: starting tasktracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-tasktracker-jawwadtest1.out
jawwadtest2: starting tasktracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-tasktracker-jawwadtest2.out

However, when I type the jps command, I get the following output:

31057 NameNode
4001 RunJar
6182 RunJar
31328 SecondaryNameNode
31411 JobTracker
32119 Jps
31560 TaskTracker

As you can see, there's no DataNode process running. I tried configuring a single-node cluster but got the same problem. Would anyone have any idea what could be going wrong here? Are there any configuration files that aren't mentioned in the tutorial, or that I may have overlooked? I'm new to Hadoop and a bit lost, so any help would be greatly appreciated.

EDIT: hadoop-root-datanode-jawwadtest1.log:

STARTUP_MSG:   args = []
STARTUP_MSG:   version = 1.0.3
STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/$
************************************************************/
2012-08-09 23:07:30,717 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loa$
2012-08-09 23:07:30,734 INFO org.apache.hadoop.metrics2.impl.MetricsSourceAdapt$
2012-08-09 23:07:30,735 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl:$
2012-08-09 23:07:30,736 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl:$
2012-08-09 23:07:31,018 INFO org.apache.hadoop.metrics2.impl.MetricsSourceAdapt$
2012-08-09 23:07:31,024 WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl:$
2012-08-09 23:07:32,366 INFO org.apache.hadoop.ipc.Client: Retrying connect to $
2012-08-09 23:07:37,949 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: $
        at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(Data$
        at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransition$
        at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNo$
        at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java$
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNod$
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode($
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataN$
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.$
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1$
2012-08-09 23:07:37,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: S$
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at jawwadtest1/198.101.220.90
************************************************************/

1 Answer

You need to do something like this (note that the rm step wipes the HDFS block data stored on that node, and the path should match the hadoop.tmp.dir / data directory configured for your cluster):

bin/stop-all.sh (or stop-dfs.sh and stop-yarn.sh in the 2.x series)
rm -Rf /app/tmp/hadoop-your-username/*
bin/hadoop namenode -format (or hdfs namenode -format in the 2.x series)
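
For context: the stack trace in your datanode log (an error thrown from DataStorage.doTransition) is the usual sign of a namespaceID mismatch, which happens when the namenode has been reformatted while the datanode's storage directory still holds the old ID. Since the log lines above are truncated, treat that as the most likely cause rather than a certainty. Here is a minimal sketch of the full recovery sequence, assuming a Hadoop 1.x install under /usr/local/hadoop and a datanode data directory under /app/tmp/hadoop-root (both of these paths are assumptions; substitute your own):

cd /usr/local/hadoop
bin/stop-all.sh                    # stop the HDFS and MapReduce daemons
rm -rf /app/tmp/hadoop-root/*      # clear the stale datanode storage (repeat on every datanode)
bin/hadoop namenode -format        # reformat the namenode; answer Y if prompted
bin/start-all.sh                   # start everything again
jps                                # DataNode should now appear in the list

If you need to keep the data already in HDFS, an alternative to reformatting is to edit the namespaceID in each datanode's current/VERSION file so it matches the namenode's, then restart only the datanodes. With the assumed paths above, you can compare the two values with:

cat /app/tmp/hadoop-root/dfs/name/current/VERSION
cat /app/tmp/hadoop-root/dfs/data/current/VERSION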
