Commands in Big Data Hadoop with examples.
All commands for the Hadoop file system follow this general syntax:
hadoop fs -command
We use the ls command to list the files and directories stored in HDFS.
$ hadoop fs -ls
This command creates a directory in HDFS:
$ hadoop fs -mkdir techaltum
Note: - techaltum is my directory name.
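The steps above can be sketched as a short session (assuming a running HDFS setup with hadoop on the PATH; techaltum is the example directory name used here, and the -p flag is an extra convenience not shown above):

```shell
# Create a directory in HDFS (fails if the parent does not exist).
hadoop fs -mkdir techaltum

# -p creates parent directories as needed, like mkdir -p in Linux.
hadoop fs -mkdir -p /techaltum/input/raw

# Verify that the directories were created.
hadoop fs -ls
```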
In the Linux file system we use the cat command both to read and to create files. But in Hadoop we cannot create files directly in HDFS; we can only load data into it. So in HDFS we use the cat command only to read a file.
$ hadoop fs -cat wc.txt
Note: - wc.txt is my file name; this command prints all of its contents on the screen.
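Because -cat streams the whole file to the screen, it is common to pipe it through head for large files. A sketch, assuming a running HDFS setup and that wc.txt already exists in HDFS:

```shell
# Print the whole file stored in HDFS.
hadoop fs -cat wc.txt

# For a large file, show only the first ten lines instead.
hadoop fs -cat wc.txt | head -n 10
```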
Now comes the most important topic: how to load data from the local file system into Hadoop. Until we load data into HDFS, we cannot process it. There are many ways to load data into Hadoop, and using commands is one of them.
This command copies data from the local file system to the Hadoop file system:
$ hadoop fs -copyFromLocal source-path destination-path
In this command source-path is the local file and destination-path is the path in the Hadoop file system.
Create a file in the local file system using the cat command:
$ cat > data.txt
Start typing your data, and press Ctrl+D to save the file.
Now use the following command to show the file's contents:
$ cat data.txt
Note: - data.txt is my file name.
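The two local steps above can be combined into a small script (a heredoc stands in for typing the text interactively and pressing Ctrl+D; data.txt and its contents are just example values):

```shell
# Create data.txt in the local file system; the heredoc plays the role of
# typing text at the prompt and ending input with Ctrl+D.
cat > data.txt <<'EOF'
hello hadoop
EOF

# Show the file's contents.
cat data.txt
```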
Now copy this file into Hadoop using the following command:
$ hadoop fs -copyFromLocal data.txt db.txt
This command copies data.txt into Hadoop under the name db.txt.
If you want to load the file into a particular Hadoop directory, first create the directory in HDFS and then use the following command:
$ hadoop fs -copyFromLocal data.txt /techaltum/data.txt
We can also use the put command to copy data from the local file system to the Hadoop file system:
$ hadoop fs -put source-path destination-path
Note: - if we want to copy more than one file into HDFS, the destination must be a directory.
$ hadoop fs -put file1 file2 hadoop-dir
$ hadoop fs -put abc.txt wc.txt techaltum
The moveFromLocal command also loads data from local to HDFS, but it removes the file from the local file system:
$ hadoop fs -moveFromLocal local-source-path hdfs-destination-path
Note: - we can also move more than one file at a time.
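The difference between -put and -moveFromLocal can be sketched as follows (assuming a running HDFS setup; notes.txt is a hypothetical file name used only for this example):

```shell
# Create a throwaway local file.
echo "sample data" > notes.txt

# -moveFromLocal uploads the file and then deletes the local copy.
hadoop fs -moveFromLocal notes.txt /techaltum/notes.txt

# The local copy is now gone, so this ls reports that the file is missing.
ls notes.txt
```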
In these ways we can load data from the local system into Hadoop.
In the commands above we saw how to load data from the local file system into HDFS. Hadoop also lets us copy HDFS data back into the local file system, using the following commands.
$ hadoop fs -copyToLocal source-hadoop-path destination-local-path
In this command the source path is the HDFS path and the destination path is the local file system path.
The get command works the same way:
$ hadoop fs -get source-hadoop-path destination-local-path
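Putting the upload and download commands together as a round trip (a sketch, assuming a running HDFS setup; the file and directory names follow the examples above, and restored.txt is a hypothetical local name):

```shell
# Upload a local file into an HDFS directory.
hadoop fs -copyFromLocal data.txt /techaltum/data.txt

# Download it back to the local file system under a new name.
hadoop fs -copyToLocal /techaltum/data.txt restored.txt

# -get does the same job as -copyToLocal.
hadoop fs -get /techaltum/data.txt restored2.txt
```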