in Big Data Hadoop & Spark by (11.4k points)

Whenever I try to create a directory in HDFS, it shows me an error:

root# bin/hadoop fs -mkdir temp

mkdir: org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create directory /user/root/temp. Name node is in safe mode.

It tells me that the NameNode is in safe mode. How can I get out of it?

I have tried the -safemode leave command, but it still makes no difference:

hdfs# bin/hadoop fs -safemode leave

safemode: Unknown command


1 Answer

by (32.3k points)

The command you are using here is a Hadoop fs command. An fs command can refer to any supported file system, either HDFS or local, and -safemode is not a valid argument of hadoop fs; that is why you get the "Unknown command" error. Safe mode is an HDFS administration setting, managed through the dfsadmin tool, which talks directly to the HDFS NameNode rather than a generic file system. So run the command through dfsadmin instead:

hdfs dfsadmin -safemode leave
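As a quick sketch, the -safemode option of dfsadmin takes several subcommands (get, enter, leave, wait) that let you inspect and control this state; the exact output wording may vary between Hadoop versions:

```shell
# Check the current safe mode status first
# (prints something like "Safe mode is ON" or "Safe mode is OFF")
hdfs dfsadmin -safemode get

# Force the NameNode to leave safe mode immediately
hdfs dfsadmin -safemode leave

# Alternatively, block until the NameNode exits safe mode on its own
# (it leaves automatically once enough DataNode blocks have been reported)
hdfs dfsadmin -safemode wait
```

Note that the NameNode enters safe mode on startup and normally leaves it by itself once block reports reach the configured threshold, so -safemode leave is usually only needed when it stays stuck.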

