Hadoop Commands with Examples

Commands in Big Data Hadoop with examples.

Hadoop commands structure

All commands for the Hadoop file system are written using the following syntax:

hadoop fs -command

ls Command

The ls command lists the files and directories stored in HDFS.

For example:

	$ hadoop fs -ls
		

mkdir

This command creates a directory in HDFS.

		$ hadoop fs -mkdir techaltum

Note: techaltum is the directory name.

cat

In the Linux file system, the cat command is used both to read and to create files. In HDFS, however, we cannot create files this way; we can only load data into it. So in HDFS the cat command is used only to read a file.

		$ hadoop fs -cat wc.txt

Note: wc.txt is the file name; this command prints all of its contents to the screen.

How to load data from Local to Hadoop

Now comes the most important topic: how to load data from the local file system into Hadoop. Unless data is loaded into HDFS, it cannot be processed. There are many ways to load data into Hadoop, and using commands is one of them.

copyFromLocal

This command copies data from the local file system to the Hadoop file system.

Syntax
$ hadoop fs -copyFromLocal source-path destination-path

In this command, source-path is the path of the local file(s) and destination-path is the Hadoop file system path.

For Example

Create a file in the local file system using the cat command:
$ cat > data.txt
Start typing your data, and press Ctrl+D to save the file.
Now use the following command to show the file's contents:
$ cat data.txt
Note: data.txt is the file name.
Now copy this file into Hadoop using the following command:




		$ hadoop fs -copyFromLocal data.txt db.txt
		

This command copies data.txt into Hadoop under the name db.txt.

If you want to load the file into a particular Hadoop directory, first create the directory in HDFS and then use the following command:

		$ hadoop fs -copyFromLocal data.txt /techaltum/data.txt
		

put

The put command also copies data from the local file system to the Hadoop file system.

		$ hadoop fs -put source-path destination-path

Note: to copy more than one file into HDFS at once, the destination must be a directory.

For Example


$ hadoop fs -put file1 file2 hadoop-dir
$ hadoop fs -put abc.txt wc.txt techaltum
		


moveFromLocal

This command also loads data from the local file system into HDFS, but it removes the file from the local file system after copying.

$ hadoop fs -moveFromLocal local-source-path hdfs-destination-path

Note: more than one file can be moved at a time.
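For example, assuming data.txt exists in the current local directory and /techaltum is an existing HDFS directory (the same names used in the earlier examples), the following moves the file into HDFS:

	$ hadoop fs -moveFromLocal data.txt /techaltum/data.txt

After this command, data.txt no longer exists in the local directory; it has been moved into HDFS.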

In this way we can load data from the local system into Hadoop.

How to load data from Hadoop to Local

In the commands above we saw how to load data from the local file system into Hadoop, that is, into HDFS. Hadoop also provides the facility to copy HDFS data back into the local file system, using the following commands.

copyToLocal


$ hadoop fs -copyToLocal source-hadoop-path destination-local-path
		

In this command, the source path is the HDFS path and the destination path is the local file system path.
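For example, to copy db.txt (the file created in the copyFromLocal example above) from HDFS into the current local directory:

	$ hadoop fs -copyToLocal db.txt .

The file remains in HDFS; copyToLocal only creates a local copy.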

get


$ hadoop fs -get source-hadoop-path destination-local-path
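The get command works in the same way as copyToLocal. For example, assuming data.txt was already loaded into the /techaltum HDFS directory as in the earlier examples, the following copies it into the current local directory:

	$ hadoop fs -get /techaltum/data.txt .

As with copyToLocal, the file remains in HDFS; only a local copy is created.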