Exporting the DynamoDB table to HDFS
Using the following Hive commands, you can perform faster export operations, because Hive
0.7.1.1 uses HDFS as an intermediate step when exporting data to S3. In this
example, hdfs:///directoryName should be a valid HDFS path; the first statement
creates a Hive table called givenHiveTableName that references the DynamoDB table. Have a
look at the following commands:
CREATE EXTERNAL TABLE givenHiveTableName (col1 string, col2 bigint, col3 array<string>)
STORED BY 'org.apache.hadoop.hive.dynamodb.DynamoDBStorageHandler'
TBLPROPERTIES ("dynamodb.table.name" = "uchit",
"dynamodb.column.mapping" = "col1:givenname,col2:givenyear,col3:givendays");

INSERT OVERWRITE DIRECTORY 'hdfs:///directoryName' SELECT * FROM givenHiveTableName;
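If you export several DynamoDB tables this way, it can help to generate the two HiveQL statements from a column mapping rather than editing them by hand. The following is an illustrative sketch, not part of Hive or EMR; the helper name build_export_hql and the tuple format are assumptions made for this example:

```python
# Hypothetical helper: builds the CREATE EXTERNAL TABLE statement and the
# HDFS export statement for a DynamoDB-backed Hive table.
def build_export_hql(hive_table, dynamodb_table, columns, hdfs_dir):
    """columns: list of (hive_column, hive_type, dynamodb_attribute) tuples."""
    # Hive column definitions, e.g. "col1 string, col2 bigint"
    col_defs = ", ".join(f"{col} {typ}" for col, typ, _ in columns)
    # dynamodb.column.mapping value, e.g. "col1:givenname,col2:givenyear"
    mapping = ",".join(f"{col}:{attr}" for col, _, attr in columns)
    ddl = (
        f"CREATE EXTERNAL TABLE {hive_table} ({col_defs})\n"
        "STORED BY 'org.apache.hadoop.hive.dynamodb.DynamoDBStorageHandler'\n"
        f'TBLPROPERTIES ("dynamodb.table.name" = "{dynamodb_table}",\n'
        f'"dynamodb.column.mapping" = "{mapping}");'
    )
    export = f"INSERT OVERWRITE DIRECTORY '{hdfs_dir}' SELECT * FROM {hive_table};"
    return ddl, export

# Reproduces the statements shown above.
ddl, export = build_export_hql(
    "givenHiveTableName",
    "uchit",
    [("col1", "string", "givenname"),
     ("col2", "bigint", "givenyear"),
     ("col3", "array<string>", "givendays")],
    "hdfs:///directoryName",
)
```

The generated strings can then be submitted through the Hive CLI or a script step on the EMR cluster.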