All commands that operate on the Hadoop file system follow this general syntax:
$ hadoop fs -command
The ls command lists the files and directories stored in HDFS.
$ hadoop fs -ls
The mkdir command creates a directory in HDFS:
$ hadoop fs -mkdir techaltum
Note: techaltum is the directory name used in this example.
In the Linux file system, the cat command can both read and create files. In HDFS, however, files cannot be created directly; data can only be loaded into it. So in HDFS the cat command is used only to read a file.
$ hadoop fs -cat wc.txt
Note: wc.txt is the file name used here; this command prints all of its contents to the screen.
How to load data from Local to Hadoop
Now we come to the most important topic: how to load data from the local file system into Hadoop. Unless data is loaded into HDFS, it cannot be processed. There are many ways to load data into Hadoop, and using commands is one of them.
The copyFromLocal command copies data from the local file system to the Hadoop file system:
$ hadoop fs -copyFromLocal source-path destination-path
In this command, source-path is the local file and destination-path is the path in the Hadoop file system.
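Hadoop also provides the equivalent -put command, which behaves like -copyFromLocal when the source is a local file. A minimal sketch of both forms; the paths below are illustrative, not from the original example:

```shell
# Copy a local file into HDFS (paths are examples)
hadoop fs -copyFromLocal /home/user/data.txt /user/hadoop/data.txt

# -put does the same job for local sources
hadoop fs -put /home/user/data.txt /user/hadoop/data2.txt
```

Both commands fail if the destination file already exists, so choose a fresh destination path.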
Create a file in the local file system using the cat command:
$ cat > data.txt
Start typing your data; to save the file, press Ctrl+D.
Now use the following command to display the file's contents:
$ cat data.txt
Note: data.txt is the file name used here.
Now copy this file into Hadoop using the following command:
$ hadoop fs -copyFromLocal data.txt db.txt
This command copies data.txt into Hadoop under the name db.txt.
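To confirm the copy succeeded, the HDFS commands shown earlier can be reused (file names carried over from the example above):

```shell
# List the HDFS home directory; db.txt should now appear
hadoop fs -ls

# Print the contents of the copied file
hadoop fs -cat db.txt
```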
If you want to load the file into a particular Hadoop directory, first create that directory in HDFS (as shown above with mkdir) and then pass it as part of the destination path, for example:
$ hadoop fs -copyFromLocal data.txt techaltum/data.txt
In this way we can load data from the local system into Hadoop.
How to load data from Hadoop to Local
The commands above showed how to load data from the local file system into Hadoop, that is, into HDFS. Hadoop also provides the facility to copy HDFS data back into the local file system.
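A minimal sketch of copying data back, assuming the file db.txt already sits in HDFS (the name is carried over from the earlier example; local file names here are illustrative):

```shell
# Copy an HDFS file back to the local file system
hadoop fs -copyToLocal db.txt local_db.txt

# -get is an equivalent command for the same task
hadoop fs -get db.txt local_db2.txt

# Verify the result with the ordinary local cat command
cat local_db.txt
```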