--m 1
--target-dir /MsBigData/Customers --append
sqoop import --connect
"jdbc:sqlserver://Your_SqlServer;database=MsBigData;
Username=demo;Password=your_password;"
--table Customers
--m 1
--hive-import --hive-table CustomerImport
NOTE
There is also a Sqoop import-all-tables command. This imports all tables and all their columns from the specified database. It functions well only if all the tables have single-column primary keys. Although you can specify a list of tables to exclude with this command, it offers less flexibility and control than importing individual tables, so in most cases it is recommended that you import tables one at a time.
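As a rough sketch of the exclusion approach (reusing the server, database, and credentials from the earlier examples; the excluded table names and the warehouse directory here are illustrative, not from the text):

```shell
# Import every table from MsBigData except the two listed ones.
# Tables without a single-column primary key would still need
# special handling (for example, forcing a single mapper with --m 1).
sqoop import-all-tables \
    --connect "jdbc:sqlserver://Your_SqlServer;database=MsBigData;Username=demo;Password=your_password;" \
    --exclude-tables AuditLog,StagingOrders \
    --warehouse-dir /MsBigData \
    --m 1
```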
Copying Data to SQL Server
The Sqoop export command enables you to export data from Hadoop to relational databases. As with the import command, it uses the table definition in the relational database to derive metadata for the operation, so it requires that the database table already exist before you can export data to the database:
sqoop export --connect
"jdbc:sqlserver://Your_SqlServer;database=MsBigData;
Username=demo;Password=your_password;"
--table Customers --export-dir /MsBigData/Customers
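Because the export requires the target table to exist, a typical workflow creates it first and then runs the export. The sketch below assumes a minimal two-column schema and comma-delimited source files in HDFS; the sqlcmd step, column list, and delimiter are illustrative assumptions, not part of the original example:

```shell
# Step 1: create the destination table (illustrative schema) via sqlcmd.
sqlcmd -S Your_SqlServer -U demo -P your_password -d MsBigData \
    -Q "CREATE TABLE Customers (CustomerID INT, Name NVARCHAR(100));"

# Step 2: export the HDFS files in /MsBigData/Customers into that table.
# --input-fields-terminated-by must match how the source files are delimited.
sqoop export \
    --connect "jdbc:sqlserver://Your_SqlServer;database=MsBigData;Username=demo;Password=your_password;" \
    --table Customers \
    --export-dir /MsBigData/Customers \
    --input-fields-terminated-by ','
```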
The arguments for the export command are similar to those for the import command; however, you have fewer options with the export. --export-dir indicates the folder in the Hadoop file system that will be used as the source for records to load into the database. The
--table