Figure 10-21. Running in 32-bit mode
You can now schedule this package as a SQL Server Agent job and run the data load on a periodic basis. You might also want to apply transformations to the data before it loads into the target SQL Server warehouse, either to clean it or to apply the necessary business logic, using the built-in SSIS Data Flow Transformation components.
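As a rough illustration, the following T-SQL creates a SQL Server Agent job that runs the package nightly. The job name, package path, and schedule are hypothetical placeholders; adjust them to your own deployment (an Agent job can equally run packages stored in the SSIS catalog or MSDB rather than the file system). Note the /X86 switch, which tells SQL Server Agent to use the 32-bit runtime, matching the 32-bit mode requirement of the Hive ODBC driver discussed earlier.

USE msdb;
GO

-- Create the job (the name is a placeholder)
EXEC dbo.sp_add_job
    @job_name = N'Load Hive stock_analysis data';

-- Add a step that runs the SSIS package; the /FILE path is hypothetical
EXEC dbo.sp_add_jobstep
    @job_name  = N'Load Hive stock_analysis data',
    @step_name = N'Run Hive import package',
    @subsystem = N'SSIS',
    @command   = N'/FILE "C:\SSIS\HiveImport.dtsx" /X86';

-- Schedule the job to run daily at 2:00 AM
EXEC dbo.sp_add_jobschedule
    @job_name          = N'Load Hive stock_analysis data',
    @name              = N'Nightly load',
    @freq_type         = 4,        -- daily
    @freq_interval     = 1,        -- every 1 day
    @active_start_time = 020000;   -- 02:00:00

-- Target the local server so the job is picked up by Agent
EXEC dbo.sp_add_jobserver
    @job_name = N'Load Hive stock_analysis data';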
There are other programmatic ways to initiate a Hadoop job from SSIS. For example, you can develop your own custom SSIS components using .NET and use them to automate Hadoop jobs. A detailed description of this approach can be found in the following MSDN whitepaper:
http://msdn.microsoft.com/en-us/library/jj720569.aspx
Summary
In this chapter, you took a brief look at SQL Server and its Business Intelligence components. You also developed a sample package that connects to Hive using the Microsoft Hive ODBC Driver and imports data from the Hive table stock_analysis into SQL Server. Once the data is in SQL Server, you can leverage warehousing solutions such as Analysis Services to slice and dice the data, and use Reporting Services for powerful reporting on it. This also enables you to merge nonrelational data with traditional RDBMS data and extract information from the combined set as a whole.