
HDFS Operations


Starting HDFS

Format the configured HDFS file system: open the namenode (the HDFS server) and execute the following command.

 $ hadoop namenode -format

Start the distributed file system. The following command starts the namenode as well as the data nodes in the cluster.

$ start-dfs.sh

Listing Files in HDFS

You can list the files in a directory, or check the status of a file, using the ‘ls’ command. ls takes either a directory or a filename as an argument:

$ $HADOOP_HOME/bin/hadoop fs -ls <args>
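For example, assuming a file intellipaat.txt already exists under a directory /user/input (both paths hypothetical), ls can be pointed at either the directory or the file itself. This requires a running HDFS cluster:

```shell
# List all entries in a directory (hypothetical path /user/input)
$HADOOP_HOME/bin/hadoop fs -ls /user/input

# Pass a filename instead to see the status of that single file
$HADOOP_HOME/bin/hadoop fs -ls /user/input/intellipaat.txt
```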


Inserting Data into HDFS

Follow the steps below to insert the required file into the Hadoop file system.

Step 1 : Create an input directory

$ $HADOOP_HOME/bin/hadoop fs -mkdir /user/input

Step 2 : Use the put command to transfer and store the data file from the local system to HDFS

$ $HADOOP_HOME/bin/hadoop fs -put /home/intellipaat.txt /user/input

Step 3 : Verify the file using the ls command.

$ $HADOOP_HOME/bin/hadoop fs -ls /user/input


Retrieving Data from HDFS

For instance, suppose we have a file in HDFS called intellipaat. We can retrieve the required file from the Hadoop file system as follows:

Step 1 : View the data in HDFS using the cat command.

$ $HADOOP_HOME/bin/hadoop fs -cat /user/output/intellipaat

Step 2 : Get the file from HDFS to the local file system using the get command

$ $HADOOP_HOME/bin/hadoop fs -get /user/output/ /home/hadoop_tp/
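Taken together, the insert and retrieve steps above can be sketched as one end-to-end session. This is a sketch, not a definitive recipe: it assumes a running HDFS cluster, a local file /home/intellipaat.txt, and a local target directory /home/hadoop_tp/:

```shell
# Create a target directory in HDFS
$HADOOP_HOME/bin/hadoop fs -mkdir /user/input

# Upload the local file into it
$HADOOP_HOME/bin/hadoop fs -put /home/intellipaat.txt /user/input

# Print the file's contents directly from HDFS
$HADOOP_HOME/bin/hadoop fs -cat /user/input/intellipaat.txt

# Copy the file back from HDFS to the local file system
$HADOOP_HOME/bin/hadoop fs -get /user/input/intellipaat.txt /home/hadoop_tp/
```

Note that put fails if the destination file already exists, and mkdir fails if the directory is missing a parent, so re-running the session may require cleaning up first.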


Shutting Down the HDFS

Shut down HDFS by using the following command:

$ stop-dfs.sh
