in Big Data Hadoop & Spark by (11.5k points)

I am new to the Hadoop Distributed File System (HDFS). I have completed a single-node Hadoop installation on my machine, but when I try to upload data to HDFS it gives a "Permission denied" error.

Terminal output for the command:

hduser@ubuntu:/usr/local/hadoop$ hadoop fs -put /usr/local/input-data/ /input
put: /usr/local/input-data (Permission denied)

hduser@ubuntu:/usr/local/hadoop$


After using sudo and adding hduser to the sudoers file:

hduser@ubuntu:/usr/local/hadoop$ sudo bin/hadoop fs -put /usr/local/input-data/ /input
put: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="":hduser:supergroup:rwxr-xr-x

hduser@ubuntu:/usr/local/hadoop$

1 Answer

by (25.6k points)

When you run this command:

hduser@ubuntu:/usr/local/hadoop$ hadoop fs -put /usr/local/input-data/ /input

put: /usr/local/input-data (Permission denied)

Here, the user (hduser) does not have read access to the local directory /usr/local/input-data. That is, your local filesystem permissions are too restrictive.
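You can reproduce and fix this kind of local failure with ordinary shell commands. A minimal sketch, using an illustrative /tmp path rather than the asker's real directory:

```shell
# Hypothetical demo directory; a mode of 700 means only the owner can
# read it, which is what makes "hadoop fs -put" fail before HDFS is
# even contacted.
mkdir -p /tmp/input-data-demo
chmod 700 /tmp/input-data-demo      # only the owner can list or read it
ls -ld /tmp/input-data-demo
# One possible fix: grant read+execute to group and others so another
# user (e.g. hduser) can read the data to be uploaded.
chmod 755 /tmp/input-data-demo
ls -ld /tmp/input-data-demo
```

Alternatively, `sudo chown -R hduser /usr/local/input-data` would make hduser the owner outright; either route removes the local "Permission denied".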

And in the second command:

hduser@ubuntu:/usr/local/hadoop$ sudo bin/hadoop fs -put /usr/local/input-data/ /input
put: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="":hduser:supergroup:rwxr-xr-x

Here, the user root doesn't have write access to the HDFS directory (/input).

As you can see: Permission denied: user=root, access=WRITE, inode="":hduser:supergroup:rwxr-xr-x

The permission string rwxr-xr-x means that only the owner, hduser, has write access; the group and others (which includes root here, since root is not the HDFS superuser) can only read and execute. So either run the -put command as hduser without sudo, or change the ownership/permissions of the target HDFS directory.
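The same permission pattern can be mimicked on a local directory to see who the `w` bit actually covers. A hedged sketch with a hypothetical /tmp directory name:

```shell
# Reproduce the rwxr-xr-x pattern from the error message locally to show
# why a non-owner (such as root in the question) is denied WRITE access.
mkdir -p /tmp/hdfs-perm-demo
chmod 755 /tmp/hdfs-perm-demo   # rwxr-xr-x: owner rwx, group r-x, others r-x
ls -ld /tmp/hdfs-perm-demo
# Only the owner holds the w bit; any other user's write attempt into
# this directory fails, exactly as HDFS rejected user=root above.
```

HDFS applies the same POSIX-style owner/group/other model, so reading the inode string in the exception tells you immediately which identity you must act as.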

I would suggest you try this approach:

sudo -u hdfs hadoop fs -mkdir <dir path>

sudo -u hdfs hadoop fs -chown hduser <dir path>

 

Then try -put command:

$ hadoop fs -put <source-path> <destination-path>
