CHAPTER 5
Export
The previous three chapters had one thing in common: they described various use cases
of transferring data from a database server to the Hadoop ecosystem. What if you have
the opposite scenario and need to transfer generated, processed, or backed-up data from
Hadoop to your database? Sqoop also provides facilities for this use case, and the
following recipes in this chapter will help you understand how to take advantage of
this feature.
5.1. Transferring Data from Hadoop
Problem
You have a workflow of various Hive and MapReduce jobs that are generating data on
a Hadoop cluster. You need to transfer this data to your relational database for easy
querying.
Solution
You can use Sqoop's export feature, which transfers data from the Hadoop ecosystem
into a relational database. For example, to export data from the HDFS directory cities
(specified with the --export-dir parameter) into the database table cities (specified
with the --table parameter), you would use the following Sqoop command:
sqoop export \
  --connect jdbc:mysql://mysql.example.com/sqoop \
  --username sqoop \
  --password sqoop \
  --table cities \
  --export-dir cities
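
Note that sqoop export does not create the target table; cities must already exist in
the database before the command runs, with columns that match the fields in the
exported files. By default, export reads plain text files and treats commas as field
separators (adjustable with --input-fields-terminated-by). The following is a minimal
sketch of what a matching MySQL table and a line of input data might look like; the
three-column schema and the sample row are assumptions for illustration, not part of
the recipe above:

  CREATE TABLE cities (
    id      INT NOT NULL PRIMARY KEY,
    country VARCHAR(50),
    city    VARCHAR(50)
  );

  -- A corresponding comma-delimited record in a file under the
  -- export directory would then look like:
  --   1,USA,Palo Alto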
 