
How to Import Table from MySQL to HBase


In this blog, we will discuss how to import a table from a MySQL database into an HBase table.
Before moving further, we should first understand why HBase came into the picture and how it overcame the limitations of RDBMS.

Why not RDBMS Architecture

Firstly, the size of data has increased tremendously, well into the range of petabytes, and RDBMS finds it challenging to handle such huge data volumes. To address this, RDBMS vendors scaled up vertically, adding more central processing units and memory to the database server.
Second, the majority of the data comes in semi-structured or unstructured formats from social media, audio, video, texts, and emails. Handling unstructured data is outside the scope of RDBMS, because relational databases cannot categorize it; they are designed and structured to accommodate structured data, such as weblogs, sensor data, and financial data.
“Big data” is being generated at a very high velocity but RDBMS lags when it comes to high velocity because it’s designed for steady data retention rather than rapid growth. Even if RDBMS is capable of handling and storing “Big Data,” it will turn out to be very expensive.
As a result, the inability of relational databases to handle "Big Data" led to the emergence of new technologies. Google came up with a solution around 2004-2005, developing a NoSQL distributed column-oriented database known as BigTable, which allows users to perform random, real-time read/write access to data stored in a distributed cluster. This is how Apache HBase was introduced: HBase is modeled after Google's BigTable and provides BigTable-like capabilities on top of Hadoop and HDFS.

Why HBase?

To know more about HBase and its working, you can refer to our posts below:
HBase Beginner’s Guide
Working of HBase Components
HBase Read Write Operations

Why Sqoop?

Sqoop is a tool designed to transfer data between Hadoop and relational databases or mainframes. We can use Sqoop to import data from a relational database management system (RDBMS), such as MySQL or Oracle, or from a mainframe into the Hadoop Distributed File System (HDFS), transform the data with Hadoop MapReduce, and then export it back into an RDBMS.
To know more about Sqoop, you can go through our posts below:
Beginner’s Guide for Sqoop
Installing Mysql and Sqoop in Hadoop
Export Files from HDFS to Mysql using Sqoop
Incremental Import in Sqoop to Load data from Mysql to HDFS
Now let us learn how to import tables from a MySQL database into HBase using the Sqoop tool.
In the example below, we will use an existing table in the MySQL emp database. Let us first log in to our MySQL database using the commands below.

Start MySQL service:

Start the MySQL service by running the below command in the terminal:

sudo service mysqld start

Once the MySQL service is started, enter the MySQL shell using the below command in the terminal.

Login to MySQL shell:

mysql -u root -p

Password: cloudera

In the above command, -u specifies the username and -p prompts for the password. Here the username is root and the password to the MySQL shell is cloudera.
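If the interactive password prompt gets in the way of scripting, the same login can be done non-interactively. A minimal sketch, assuming the tutorial's root/cloudera credentials; note that putting a password directly on the command line is acceptable on a training VM but not in production:

```shell
# Run a single statement non-interactively; -p<password> (no space)
# supplies the password on the command line.
mysql -u root -pcloudera -e "select version();"
```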

Show databases:

show databases;

As mentioned earlier, we will be using the emp database in our example, which is already available in MySQL.

Use database emp:

Run the below command to switch to the emp database:

use emp;

Show tables:
Let us use the show tables command to list the tables present in the database emp.

show tables;

We can observe from the above image that our example table employee is present in the database emp.

Describe table:

We can use the below command to describe the employee table schema.

describe employee;

The DESCRIBE command lists the following information about each column:

  • Field (the column name)
  • Type
  • Null (whether the column can hold NULL values)
  • Key (whether the column is indexed)
  • Default
  • Extra

Display the contents of the table employee:

We can use the below command to display all the rows present in the table employee.

select * from employee;


Grant all permission:

We can use the below command to grant superuser privileges to root.

grant all on *.* to 'root'@'localhost' with grant option;

MySQL privileges are critical to the utility of the system, as they allow each user to access and utilize only the areas needed to perform their work. This prevents a user from accidentally accessing an area where he or she should not have access, and it adds to the security of the MySQL server. When you connect to a MySQL server, the host from which you connect and the username you specify determine your identity. With this information, the server then grants privileges based upon that identity.
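Granting all privileges to root is fine on a sandbox VM, but a more careful setup is a dedicated, read-only user for Sqoop. The sketch below writes the statements to a file for review before applying them; the sqoop username and sqoop_pw password are illustrative, not part of the original post:

```shell
# Write the privilege statements to a reviewable file; apply them later
# with:  mysql -u root -p < grant_sqoop.sql
cat > grant_sqoop.sql <<'EOF'
CREATE USER 'sqoop'@'localhost' IDENTIFIED BY 'sqoop_pw';
GRANT SELECT ON emp.* TO 'sqoop'@'localhost';
FLUSH PRIVILEGES;
EOF
```

With this in place, the Sqoop import can run as the sqoop user with SELECT access only, instead of as root.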
The above step finishes the MySQL part.
Now, we need to create a new table in HBase to receive the table contents from the MySQL database. Follow the below steps to import the contents from MySQL into HBase.

Enter HBase shell:

Use the below command to enter the HBase shell.

hbase shell

Create table:

We can use the create command to create a table in HBase.

create 'Academp','emp_details'

We can observe from the above image that we have created a new table in HBase with the name Academp and the column family emp_details.
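Before importing, it is worth confirming the column family name, since the Sqoop command must reference it exactly. A sketch using a non-interactive HBase shell invocation (HBase must be running):

```shell
# Pipe a command into the HBase shell; the output should list the
# emp_details column family under the Academp table.
echo "describe 'Academp'" | hbase shell
```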

Scan table:

We can use the scan command to view a table's contents in HBase.

scan ‘Academp’

We can observe from the above image that no contents are available in the table Academp yet.

Sqoop import command:

Now use the below command to import the MySQL employee table into the HBase Academp table.

sqoop import --connect jdbc:mysql://localhost/emp --username root --password cloudera --table employee \
--hbase-table Academp --column-family emp_details --hbase-row-key id -m 1
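Because the command is long and carries credentials, one option is to assemble it in a small script and echo it for review before running anything against the cluster. The sketch below also swaps --password on the command line for -P (interactive prompt) and adds --hbase-create-table, both standard Sqoop options; --hbase-create-table makes Sqoop create the target table and column family if they do not already exist:

```shell
#!/bin/sh
# Build the Sqoop import command in a variable so the options can be
# inspected before anything touches the cluster.
SQOOP_CMD="sqoop import \
  --connect jdbc:mysql://localhost/emp \
  --username root -P \
  --table employee \
  --hbase-table Academp \
  --column-family emp_details \
  --hbase-create-table \
  --hbase-row-key id \
  -m 1"
echo "$SQOOP_CMD"
# Uncomment to execute on a node where Sqoop, MySQL, and HBase run:
# eval "$SQOOP_CMD"
```

The --column-family value must match the family created in the HBase shell (emp_details above), and --hbase-row-key names the MySQL column that becomes the HBase row key.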


Scan HBase table:

Now again use the scan 'Academp' command to see the table contents, which were successfully imported from the MySQL employee table.

scan 'Academp'

We can observe from the above image that we have successfully imported the contents of a MySQL table into an HBase table using Sqoop. For more updates on Big Data, Hadoop, and other technologies, visit the AcadGild blog section.
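As a final sanity check, the row counts on both sides can be compared (a sketch; both services must be running, and the credentials are the tutorial's root/cloudera):

```shell
# Count rows in the HBase table...
echo "count 'Academp'" | hbase shell
# ...and in the source MySQL table; the two numbers should match.
mysql -u root -pcloudera emp -e "select count(*) from employee;"
```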



is working with AcadGild as a Big Data Engineer and is a Big Data enthusiast with 2+ years of experience in Hadoop development. He is passionate about coding in Hive, Spark, and Scala. Feel free to contact him at [email protected] for any further queries.


  1. Thanks for sharing this. I got an error after executing sqoop import command.
    Error msg is
    — “java.lang.RuntimeException: java.lang.IllegalArgumentException: Row length is 0”
    I could not understand how to fix this… Can you help me?

  2. Nice article very useful for those working on Hbase.

    1) Can you please share any urls where we can write sql type queries on cloudera HBase .
    2) how to move data when primary key is combination of several columns in rdbms

