in Big Data Hadoop & Spark by (6.5k points)
I installed Hadoop 3.2.0 to /home/angad/, but when I try to run any hadoop fs operation, it says the command isn't found.

1 Answer

by (11.3k points)

Whenever you get this specific error, it means the shell cannot find the executable you are trying to run on its PATH. Either the path to the program's executables was never set, or the path that is set is broken and the target executable no longer exists there — for example, because the program was moved to a different location after the path was configured.
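As a quick diagnostic, you can ask the shell where (or whether) it resolves the command — a minimal sketch using standard POSIX tools:

```shell
# Show the directories the shell searches for executables.
echo "$PATH"

# Ask the shell to resolve the command; if hadoop is not on PATH,
# 'command -v' prints nothing and the fallback message is shown instead.
command -v hadoop || echo "hadoop is not on PATH"
```

If the second command prints a path, the executable is found and the problem lies elsewhere; if it prints the fallback message, PATH needs to be fixed as described below.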

In your case, you need to set the Path for your Hadoop Ecosystem:

Open a terminal and open ~/.bashrc in an editor:

vi ~/.bashrc

Add the following lines: 

export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.212.b04-0.el7_6.x86_64/jre   ## Change it according to your system
export HADOOP_HOME=/home/hadoop/hadoop   ## Change it according to your system
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export HADOOP_YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin

Save the file and exit.

Run 'source ~/.bashrc' to set the paths.

Now, try the same Hadoop statements again.
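To confirm the new environment took effect, you can run a quick check (the variable names match the exports above; the exact version output depends on your install):

```shell
# Confirm the exported variable is visible in this shell session.
echo "HADOOP_HOME=$HADOOP_HOME"

# Confirm the hadoop launcher now resolves via PATH, then try it out.
if command -v hadoop >/dev/null 2>&1; then
    hadoop version        # prints the installed Hadoop version
    hadoop fs -ls /       # the kind of command that previously failed
else
    echo "hadoop still not found - re-check the exports, then run: source ~/.bashrc"
fi
```

Note that `source ~/.bashrc` only updates the current shell; any terminal opened after saving the file picks up the new PATH automatically.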

Hadoop environments can be tricky to set up. This Hadoop tutorial really makes it an easy affair, so you can get started with your big data training.
