Hadoop Tutorial
Written by: Isha Malhotra

Hadoop Commands
Hadoop Commands to work with HDFS
All the commands that work with the Hadoop file system follow the same syntax: hadoop fs -command
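For example, a few common HDFS operations written with this syntax (the paths used here are only illustrative):

# List the contents of an HDFS directory
hadoop fs -ls /user/hadoop

# Copy a file from the local file system into HDFS
hadoop fs -put /home/hadoop/sample.txt /user/hadoop/

# Display the contents of a file stored in HDFS
hadoop fs -cat /user/hadoop/sample.txt

# Remove a file from HDFS
hadoop fs -rm /user/hadoop/sample.txt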
Continue...

Sqoop
What is Sqoop
Sqoop is an open-source product of Apache. SQOOP stands for SQL to Hadoop. It is a tool specially designed to transfer data between Hadoop and an RDBMS such as SQL Server,...
Continue...

Import Command in Sqoop
In Sqoop, the import command is used to import RDBMS data into HDFS. Using the import command we can import a particular table into HDFS.
In this example I will show how we can import a MySQL database table into HDFS.
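A minimal sketch of such an import is shown below; the JDBC connection string, credentials, table name and HDFS path are only examples:

sqoop import \
  --connect jdbc:mysql://localhost:3306/testdb \
  --username root \
  --password hadoop \
  --table employee \
  --target-dir /user/hadoop/employee \
  --m 1

Here --target-dir sets the HDFS directory that receives the imported data and --m 1 runs the import with a single mapper.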
Continue...

Import Command with warehouse-dir
In my last article I explained how we can use the import command, including the --m, --target-dir and --where options. In this article I am going to explore more options of the import command in Sqoop.
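As a sketch, replacing --target-dir with --warehouse-dir makes Sqoop create a subdirectory named after the table under the given parent directory (all names below are only examples):

sqoop import \
  --connect jdbc:mysql://localhost:3306/testdb \
  --username root \
  --password hadoop \
  --table employee \
  --where "salary > 30000" \
  --warehouse-dir /user/hadoop/warehouse \
  --m 1

With this command the filtered rows land under /user/hadoop/warehouse/employee.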
Continue...Hive & HiveQL
What is Hive and HiveQL
Apache Hive was initially introduced by Facebook and later taken up and developed by Apache. It is used to analyse data stored in HDFS. It is a data warehouse tool which provides a SQL-like language, HiveQL, to process ...
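As a small sketch, HiveQL statements can be run straight from the command line with hive -e; the table, columns and file path here are only examples:

hive -e "
CREATE TABLE IF NOT EXISTS employee (id INT, name STRING, salary FLOAT)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';
LOAD DATA INPATH '/user/hadoop/employee.csv' INTO TABLE employee;
SELECT name, salary FROM employee WHERE salary > 30000;
"

The first statement defines the table layout, the second loads a comma-separated file from HDFS into it, and the third runs a SQL-like query over the loaded data.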
Continue...

Pig and Pig Latin
What is Pig and Pig Latin
Pig and Pig Latin were initially introduced by Yahoo. Pig runs on the Apache Hadoop environment to process Big Data, and it provides a high-level data-flow scripting language called Pig Latin to process Hadoop data.
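A short Pig Latin sketch, saved to a script file and run with the pig command; the input path and field names are only examples:

cat > bytesum.pig <<'EOF'
logs = LOAD '/user/hadoop/logs.csv' USING PigStorage(',') AS (user:chararray, bytes:int);
grouped = GROUP logs BY user;
totals = FOREACH grouped GENERATE group AS user, SUM(logs.bytes) AS total_bytes;
DUMP totals;
EOF
pig bytesum.pig

The script loads a comma-separated file from HDFS, groups the records by user and sums the bytes column for each group.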
Continue...

HBase Tutorial
What is HBase and its Components
HBase is an open-source, column-oriented NoSQL (i.e. non-relational) database which is built on top of the Hadoop ecosystem.
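A quick sketch of basic HBase shell operations; the table, column family and row key are only examples:

hbase shell <<'EOF'
create 'employee', 'personal'
put 'employee', 'row1', 'personal:name', 'John'
put 'employee', 'row1', 'personal:city', 'Delhi'
get 'employee', 'row1'
scan 'employee'
EOF

Here create defines the table with one column family, put stores individual cell values, get reads back a single row and scan lists every row in the table.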
Continue...