Long Latency Scenario
Assume you own a small company that sells a key set of products
that are essentially static in nature; the base list of products just
doesn't change. New products may be added to the list, but the original
set of products remains the same. In this scenario, several of your
products are sold each day, and the sales data arrives at your data
warehouse sometime after normal business hours. Further, your company is
headquartered in the United States. The business analysts on your team
want to see sales data no later than the next working day following the
actual sale. Assume that incremental processing of your cube
takes a relatively small amount of time (just 1-2 hours) and can be
completed before the start of the next business day. It is also assumed that
data updates (information about new products added into the system) in
your relational databases arrive within a reasonable time.
The traditional approach to this scenario would be to use MOLAP dimension
storage and perform an incremental update of the dimensions after the
relational data update completes. This is a computation-intensive and fairly
costly operation. Following the dimension update, an incremental process of
the relevant measure groups is required; once that completes, the consumers
of the cube can browse the results. This approach has advantages: it works
well whenever your relational data updates occur regularly at a specific time
interval and you have sufficient time to update the cubes before users need
them. Several existing Analysis Services users in the retail space use this
solution. Data typically arrives during the night, and the cubes are
processed nightly for use the next business day.
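The nightly incremental update described above is typically issued to the server as an XMLA Batch of Process commands. The sketch below is illustrative only: the database, cube, dimension, and partition IDs are hypothetical, and the processing types shown (ProcessUpdate for a dimension, ProcessAdd for a measure-group partition) are the usual incremental choices in Analysis Services 2005.

```xml
<!-- Sketch of a nightly incremental-processing batch (hypothetical object IDs). -->
<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <!-- Step 1: incrementally update the Product dimension so newly added
       products appear as dimension members. -->
  <Process>
    <Object>
      <DatabaseID>SalesDW</DatabaseID>
      <DimensionID>Product</DimensionID>
    </Object>
    <Type>ProcessUpdate</Type>
  </Process>
  <!-- Step 2: add the day's new fact rows to the sales measure-group partition. -->
  <Process>
    <Object>
      <DatabaseID>SalesDW</DatabaseID>
      <CubeID>SalesCube</CubeID>
      <MeasureGroupID>FactSales</MeasureGroupID>
      <PartitionID>FactSales_Current</PartitionID>
    </Object>
    <Type>ProcessAdd</Type>
  </Process>
</Batch>
```

Note that ProcessAdd on a partition normally requires a binding that selects only the new rows (for example, an out-of-line binding restricted to the latest load date); otherwise existing fact rows would be double-counted.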
As with the traditional approach, you can do an incremental process of
dimensions and measure groups. Or, for the sake of completeness and
given the time, you could even do a full process of the entire cube. Again,
these operations typically take place during the night, so time is not often a
constraint. You could use SQL Server 2005 Integration Services to create
a package to do this job, as seen in Chapter 15. Alternatively, you can use
the proactive caching feature. There are two basic methods (with multiple
variations) within proactive caching that can be used to initiate data
updates: the query-based method and the time-based method; the
method you choose will depend on your needs.
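Both methods are controlled by settings on a partition's ProactiveCaching element. The fragment below is a sketch with illustrative values (XSD duration format); the interval choices are assumptions, not recommendations. ForceRebuildInterval drives the time-based variant, while the silence intervals govern how the cache reacts to change notifications.

```xml
<!-- Fragment of a partition definition (hypothetical values). -->
<ProactiveCaching xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <!-- Time-based: rebuild the MOLAP cache once a day, whether or not
       change notifications arrive. -->
  <ForceRebuildInterval>P1D</ForceRebuildInterval>
  <!-- Notification-based: wait for 10 seconds of quiet after a data change
       before rebuilding, but never delay the rebuild more than 10 minutes. -->
  <SilenceInterval>PT10S</SilenceInterval>
  <SilenceOverrideInterval>PT10M</SilenceOverrideInterval>
  <!-- Keep serving the existing MOLAP cache until the new one is ready. -->
  <OnlineMode>OnCacheComplete</OnlineMode>
</ProactiveCaching>
```

In this sketch, a purely time-based configuration would rely on ForceRebuildInterval alone, whereas the query-based method depends on the server detecting changes in the underlying relational tables.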