+13 votes
4 views
in Big Data Hadoop & Spark by (250 points)

I installed Hadoop on a server running CentOS. When I run start-dfs.sh or stop-dfs.sh, I get the following error:

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

I am using Hadoop version 2.2.0. In hadoop-env.sh I have also added these two environment variables:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib/"

export HADOOP_COMMON_LIB_NATIVE_DIR="/usr/local/hadoop/lib/native/"

Any idea what I should do?

4 Answers

+12 votes
by (13.2k points)
edited by

This warning arises because the native Hadoop library was built for a 32-bit system, while your system is probably 64-bit.

It is only a warning, and most Hadoop functionality is unaffected. You can ignore it, or if you want to remove it:

Either replace the 32-bit library by downloading the Hadoop source code and recompiling it on your 64-bit system.

Or, in hadoop-env.sh where you added the environment variables, append native to the library path like this:

  export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"
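Putting the fix together as a minimal sketch (assuming HADOOP_HOME=/usr/local/hadoop, the install path from the question; adjust it for your system):

```shell
# Point java.library.path at lib/native, not lib/ -- that is the whole fix.
HADOOP_HOME=/usr/local/hadoop
export HADOOP_COMMON_LIB_NATIVE_DIR="$HADOOP_HOME/lib/native"
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"
echo "$HADOOP_OPTS"

# After restarting the daemons, `hadoop checknative -a` (a stock Hadoop 2.x
# command) reports whether libhadoop, zlib, snappy, etc. actually loaded;
# it is commented out here because it needs a Hadoop install to run:
# hadoop checknative -a
```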

+1 vote
by (32.3k points)
edited by

There can be two reasons for this warning:

1. You have installed the wrong Java JDK8 package.

Please ensure that you have downloaded the 64-bit JDK8 and remove your current 32-bit JDK8.

2. It's a warning because the Hadoop libraries were compiled for 32-bit, while you are probably running a 64-bit OS.

It's safe to ignore this warning.
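To tell which of the two cases applies, a couple of quick checks (standard Linux commands, no Hadoop needed):

```shell
# Is the OS 64-bit?
uname -m             # x86_64 means a 64-bit kernel
getconf LONG_BIT     # prints 64 (or 32) for the userland

# Is the JDK 64-bit? HotSpot's version banner says "64-Bit Server VM":
java -version 2>&1 | grep -o '64-Bit' || echo "JVM may be 32-bit (or java not on PATH)"
```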


0 votes
by (37.3k points)

This warning indicates that Hadoop is unable to load its native libraries. It is just a warning and should not affect the working of the cluster, so you can ignore it; but if you want to fix it, here is the solution.
 

  1. First, check if the native libraries exist in the specified directory (/usr/local/hadoop/lib/native/). If they are missing, build them.

  2. Ensure that the version of the native libraries matches the Hadoop version you are using.

  3. Check the permissions of the directory and files where the native libraries are located to ensure that the user running Hadoop has read and execute permissions.

  4. Once the above steps are completed, you can restart the Hadoop services:

$HADOOP_HOME/sbin/stop-dfs.sh
$HADOOP_HOME/sbin/start-dfs.sh
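The checks in steps 1-3 can be sketched as a small script (the HADOOP_HOME path is the one from the question; adjust it for your install):

```shell
HADOOP_HOME=/usr/local/hadoop
NATIVE="$HADOOP_HOME/lib/native"

# 1. Do the native libraries exist?
if [ -e "$NATIVE/libhadoop.so" ] || [ -e "$NATIVE/libhadoop.so.1.0.0" ]; then
    # 2. Check the build details (32- vs 64-bit ELF; should match your JVM):
    file "$NATIVE"/libhadoop.so*
    # 3. Readable and executable by the user running Hadoop?
    if [ -r "$NATIVE/libhadoop.so" ] && [ -x "$NATIVE/libhadoop.so" ]; then
        echo "permissions OK"
    else
        echo "fix permissions, e.g. chmod a+rx on the files"
    fi
else
    echo "libhadoop.so missing under $NATIVE -- build or install it"
fi
```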

0 votes
by (1.1k points)

This warning appears when Hadoop cannot find the native libraries it needs for peak performance. Hadoop still works, but some operations fall back to slower pure-Java implementations.

Here are steps to address this problem:

1. Set Up Native Libraries 

You need the native libraries for Hadoop (like libhadoop.so). These libraries are typically included with the Hadoop binaries, but you may have to build or install them if they’re absent. You can try:

Install the hadoop-native libraries with yum or whichever package manager your distribution provides (package availability varies by repository):

sudo yum install hadoop-native

If the native libraries are not available in your package manager, you may build them from source. Refer to Hadoop's native library building guidelines for how to do this.
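As a hedged sketch of the build-from-source route: Hadoop 2.2.0's native Maven profile needs protobuf 2.5, cmake, gcc, and zlib/openssl dev headers in addition to a JDK and Maven. BUILD_CMD below is just the command line; run it yourself from the unpacked source tree:

```shell
# The -Pdist,native profile is what compiles libhadoop.so.
BUILD_CMD="mvn package -Pdist,native -DskipTests -Dtar"
echo "run inside hadoop-2.2.0-src: $BUILD_CMD"
# The built libraries land under hadoop-dist/target/hadoop-2.2.0/lib/native;
# copy them over your install's lib/native directory.
echo "then: cp hadoop-dist/target/hadoop-2.2.0/lib/native/* /usr/local/hadoop/lib/native/"
```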

2. Confirm the Path to Native Libraries

Ensure that the paths specified in hadoop-env.sh correctly point to the location of the native folder that contains libhadoop.so. Normally, the native libraries can be found in the lib/native subdirectory of Hadoop’s installation directory, not merely in the lib folder.

export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib/native"

export HADOOP_COMMON_LIB_NATIVE_DIR="/usr/local/hadoop/lib/native"

3. Check the Environment Variables

Verify that Hadoop picks up the changes you made in hadoop-env.sh. You can inspect the values of HADOOP_OPTS and HADOOP_COMMON_LIB_NATIVE_DIR by running:

echo "$HADOOP_OPTS"
echo "$HADOOP_COMMON_LIB_NATIVE_DIR"

Start Hadoop with:

start-dfs.sh
