Sqoop Import from MySQL to Hive & HBase

Preparing the source data in MySQL

If you followed the last post, you should already have the ‘databare_emp’ or ‘databare_employees’ table in the dataguy database. If you don’t, you can use the following script to create the table in MySQL, which we will then move to Hive and HBase using Sqoop.

CREATE DATABASE dataguy;
CREATE TABLE `dataguy`.`databare_emp` (
  `empid` INT NOT NULL,
  `fname` VARCHAR(45) NOT NULL,
  `lname` VARCHAR(45) NULL,
  `dob` DATE NOT NULL,
  `gender` CHAR(1) NOT NULL,
  `hire_date` DATE NOT NULL,
  PRIMARY KEY (`empid`));
INSERT INTO `dataguy`.`databare_emp` (`empid`,`fname`,`lname`,`dob`,`gender`,`hire_date`) VALUES (1001,'Nitin','Khandelwal','1986-08-01','M','2017-10-20');
INSERT INTO `dataguy`.`databare_emp` (`empid`,`fname`,`lname`,`dob`,`gender`,`hire_date`) VALUES (1002,'Naveen','Khandelwal','1988-01-12','M','2018-09-19');
INSERT INTO `dataguy`.`databare_emp` (`empid`,`fname`,`lname`,`dob`,`gender`,`hire_date`) VALUES (1003,'Ravi','Aggarwal','1997-12-21','M','2020-06-03');
INSERT INTO `dataguy`.`databare_emp` (`empid`,`fname`,`lname`,`dob`,`gender`,`hire_date`) VALUES (1004,'Rupika','Kaur','1992-05-07','F','2019-05-14');
INSERT INTO `dataguy`.`databare_emp` (`empid`,`fname`,`lname`,`dob`,`gender`,`hire_date`) VALUES (1005,'Anurag','Pandiya','1977-03-21','M','2000-08-11');
SELECT * FROM `dataguy`.`databare_emp`;
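
Before running any imports, it can help to confirm that Sqoop can actually reach this MySQL server from the Hadoop node you run it on. A minimal sketch, reusing the same connection string and user as the import commands below (-P prompts for the password instead of putting it on the command line):

sqoop list-tables --connect jdbc:mysql://dbserver.databare.com/dataguy --username dataguyAD -P

If databare_emp shows up in the output, the JDBC connectivity and credentials are good to go.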

Sqoop import from MySQL to Hive

We will now use the following Sqoop command to import the data into the Hive table hive_emp in the hiveDB database.

sqoop import --connect jdbc:mysql://dbserver.databare.com/dataguy --table databare_emp --username dataguyAD --password mypassword -m 1 --hive-import --create-hive-table --hive-table hiveDB.hive_emp --fields-terminated-by '\t' --delete-target-dir
--hive-import : import the data into Hive rather than just a plain HDFS directory
--create-hive-table : create a new table in Hive (the job fails if the table already exists)
--hive-table : Hive table name (databasename.tablename format)
--delete-target-dir : as the name suggests, this argument makes Sqoop delete the target directory first if it already exists
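
One thing worth noting before running the command above: --create-hive-table creates the Hive table, not the Hive database, so hiveDB should already exist. A quick sketch to create it up front if it does not (assuming the hive CLI is available on the same node):

hive -e "CREATE DATABASE IF NOT EXISTS hiveDB;"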
Hive shell
use hiveDB;
select * from hive_emp;
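
A quick side note on credentials: the commands in this post pass the password in plain text with --password, which lands in your shell history. As an alternative sketch, Sqoop can prompt for it interactively with -P (or read it from a file with --password-file):

sqoop import --connect jdbc:mysql://dbserver.databare.com/dataguy --table databare_emp --username dataguyAD -P -m 1 --hive-import --create-hive-table --hive-table hiveDB.hive_emp --fields-terminated-by '\t' --delete-target-dir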

Sqoop import from MySQL to HBase

We will now use the following Sqoop command to import the same data into an HBase table.

sqoop import --connect jdbc:mysql://dbserver.databare.com/dataguy --table databare_emp --username dataguyAD --password mypassword -m 1 --hbase-create-table --hbase-table hbase_emp --column-family employee --hbase-row-key empid
--hbase-create-table : create a new table in HBase
--hbase-table : HBase table name
--column-family : column family under which the imported columns are stored
--hbase-row-key : the source column to use as the HBase row key
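
If you prefer to create the HBase table yourself rather than letting Sqoop do it, a sketch of the equivalent step in the HBase shell (after which --hbase-create-table can be dropped from the command above):

create 'hbase_emp', 'employee'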
HBase shell
scan 'hbase_emp'
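
Since empid is the row key, you can also pull a single employee instead of scanning the whole table. For example, the row for empid 1001 inserted earlier should come back with its fname, lname, dob, gender and hire_date cells under the employee column family:

get 'hbase_emp', '1001'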
