Hadoop-Basic HDFS Commands

  • date 21st February, 2019 |
  • by Prwatech |


♦ Open a terminal window. The current working directory is your local home directory ⇒ /home/training

♦ Print the Hadoop version ⇒ hadoop version

♦ List the contents of the root directory in HDFS ⇒  hadoop fs -ls /

♦ Count the number of directories, files, and bytes under the paths that match the specified file pattern (the output columns are DIR_COUNT, FILE_COUNT, CONTENT_SIZE, and PATHNAME) ⇒  hadoop fs -count hdfs:/

♦ Run a DFS filesystem checking utility ⇒  hadoop fsck /

♦ Run a cluster balancing utility ⇒  hadoop balancer

♦ Create a new directory named “hadoop” below the /user/training directory in HDFS. Since you’re currently logged in with the “training” user ID, /user/training is your home directory in HDFS ⇒  hadoop fs -mkdir /user/training/hadoop

♦ Add a sample text file from the local directory named “data” to the new directory you created in HDFS during the previous step ⇒  hadoop fs -put data/sample.txt /user/training/hadoop


♦ List the contents of this new directory in HDFS ⇒  hadoop fs -ls /user/training/hadoop

♦ Add the entire local directory called “retail” to the hadoop directory you created in HDFS ⇒  hadoop fs -put data/retail /user/training/hadoop

♦ Since /user/training is your home directory in HDFS, any command that does not have an absolute path is interpreted as relative to that directory. The next command will therefore list your home directory, and should show the items you’ve just added there ⇒  hadoop fs -ls

♦ Delete the file “customers” from the “retail” directory ⇒  hadoop fs -rm hadoop/retail/customers

♦ Ensure this file is no longer in HDFS ⇒  hadoop fs -ls hadoop/retail/customers

♦ Delete all files from the “retail” directory using a wildcard ⇒  hadoop fs -rm hadoop/retail/*

♦ When trash is enabled, deleted files are moved to your .Trash directory rather than removed immediately. To empty the trash ⇒  hadoop fs -expunge

♦ Finally, remove the entire retail directory and all of its contents in HDFS ⇒  hadoop fs -rm -r hadoop/retail

♦ List the hadoop directory again ⇒  hadoop fs -ls hadoop

♦ Add the purchases.txt file from the local directory named “/home/training/” to the hadoop directory you created in HDFS ⇒  hadoop fs -copyFromLocal /home/training/purchases.txt hadoop/

♦ View the contents of the text file purchases.txt in your hadoop directory ⇒  hadoop fs -cat hadoop/purchases.txt

♦ Copy the purchases.txt file from the “hadoop” directory in HDFS to the “data” directory on your local filesystem ⇒  hadoop fs -copyToLocal hadoop/purchases.txt /home/training/data

♦ cp is used to copy files between directories in HDFS ⇒  hadoop fs -cp /user/training/*.txt /user/training/hadoop

♦ The ‘-get’ command can be used as an alternative to ‘-copyToLocal’ ⇒  hadoop fs -get hadoop/sample.txt /home/training/

♦ Display the last kilobyte of the file “purchases.txt” to stdout ⇒  hadoop fs -tail hadoop/purchases.txt
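Like the local `tail -c`, ‘-tail’ always shows the last 1024 bytes of the file regardless of its size. A quick local-shell sketch of the same idea (illustration only, not an HDFS command; /tmp/demo.txt is a throwaway file):

```shell
# Local analogue of 'hadoop fs -tail': print the last 1024 bytes of a file.
head -c 2048 /dev/zero | tr '\0' 'x' > /tmp/demo.txt   # make a 2048-byte file
tail -c 1024 /tmp/demo.txt | wc -c                     # prints 1024
```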

♦ In HDFS, new files start from a base mode of 666, reduced by the configured umask (typically 022, giving 644).
Use the ‘-chmod’ command to change the permissions of a file ⇒  hadoop fs -ls hadoop/purchases.txt

sudo -u hdfs hadoop fs -chmod 600 hadoop/purchases.txt
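The octal mode digits work exactly as in Linux: one read/write/execute triple each for owner, group, and others. The base-mode-minus-umask arithmetic above can be checked in a local shell (a sketch of the arithmetic, not an HDFS command):

```shell
# Effective default permission = base mode AND NOT umask.
# 666 (rw-rw-rw-) masked by umask 022 yields 644 (rw-r--r--).
printf '%o\n' $(( 0666 & ~0022 ))   # prints 644
# The 600 used with -chmod above means owner read/write only (rw-------).
```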

♦ The default owner and group are both “training”. Use ‘-chown’ to change the owner name and group name simultaneously ⇒
hadoop fs -ls hadoop/purchases.txt

sudo -u hdfs hadoop fs -chown root:root hadoop/purchases.txt


♦ The default group name is “training”. Use the ‘-chgrp’ command to change the group name ⇒  hadoop fs -ls hadoop/purchases.txt

sudo -u hdfs hadoop fs -chgrp training hadoop/purchases.txt

♦ Move a directory from one location to another ⇒
hadoop fs -mv hadoop apache_hadoop

♦ The default replication factor for a file is 3. Use the ‘-setrep’ command to change the replication factor of a file ⇒  hadoop fs -setrep -w 2 apache_hadoop/sample.txt
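‘-setrep’ changes the factor for an existing file; the cluster-wide default comes from the dfs.replication property in hdfs-site.xml (a minimal fragment; 3 is the shipped default):

```xml
<!-- hdfs-site.xml: cluster-wide default replication factor -->
<property>
  <name>dfs.replication</name>
  <value>3</value>
</property>
```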

♦ Command to make the NameNode leave safe mode ⇒  sudo -u hdfs hdfs dfsadmin -safemode leave

♦ List all the hadoop file system shell commands ⇒  hadoop fs

♦ Last but not least, always ask for help! ⇒  hadoop fs -help
