5. The refresh variable VAR_CheckMaxLErr tests validate.msh's STDERR
output. If a validate.err file with a length greater than zero bytes is found,
a failure notice is written to the LoadStatus table through the Procedure
PROC_LogFailureStatus, and the ValidationVariance table is truncated via the
Procedure PROC_TruncateVariance.
6. If validate.err has zero length, the flat-file output from the Report Script
validate.rep is imported into the table ValidateASOsamp through the Interface
INT_ImportValidateReport.
7. The Procedure PROC_FactVsEssbase compares the data values from Essbase to
a SQL-consolidated result set from the fact data. The difference is written to the
table ValidationVariance. With luck and clean living, the variance should (must)
always be zero for all members.
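The error-check and import logic in steps 5 and 6 can be sketched in Python. This is a minimal illustration, not the actual ODI implementation: the three helper functions are hypothetical stand-ins for the Procedures PROC_LogFailureStatus and PROC_TruncateVariance and the Interface INT_ImportValidateReport, and the file names are assumed.

```python
import os

def log_failure_status(msg):
    # Stand-in for PROC_LogFailureStatus: record a failure row in LoadStatus.
    print(f"LoadStatus: FAILED - {msg}")

def truncate_variance():
    # Stand-in for PROC_TruncateVariance: empty the ValidationVariance table.
    print("ValidationVariance truncated")

def import_validate_report(rep_path):
    # Stand-in for INT_ImportValidateReport: load the Report Script flat file.
    print(f"Importing {rep_path} into ValidateASOsamp")

def check_maxl_err(err_path="validate.err", rep_path="validate.rep.out"):
    """Step 5/6 logic: a validate.err longer than zero bytes means failure."""
    if os.path.exists(err_path) and os.path.getsize(err_path) > 0:
        log_failure_status(f"{err_path} is non-empty")
        truncate_variance()
        return False
    import_validate_report(rep_path)
    return True
```

The key decision point is simply the byte length of the MaxL stderr capture, exactly as the refresh variable VAR_CheckMaxLErr tests it.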
Three Notes about Validation
1. The SQL aggregation in PROC_FactVsEssbase sums full totals across all dimensions. If desired, multiple levels
of aggregation by dimension in SQL, with matching Essbase extracts, could be created to make the variance test
more granular.
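The difference between a full-total variance test and a per-member one can be shown with a small in-memory SQL example. This is a hedged sketch, not the book's actual PROC_FactVsEssbase code: the table names FactData and EssbaseExtract and the sample rows are assumptions for illustration.

```python
import sqlite3

# Hypothetical tables standing in for the fact data and the Essbase report extract.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE FactData(product TEXT, amount REAL);
CREATE TABLE EssbaseExtract(product TEXT, amount REAL);
INSERT INTO FactData VALUES ('Colas', 100), ('Root Beer', 50);
INSERT INTO EssbaseExtract VALUES ('Colas', 100), ('Root Beer', 40);
""")

# Full-total variance: one number for the whole cube.
total_var = con.execute("""
    SELECT (SELECT SUM(amount) FROM FactData)
         - (SELECT SUM(amount) FROM EssbaseExtract)
""").fetchone()[0]

# More granular: variance by dimension member, flagging only mismatches.
member_var = con.execute("""
    SELECT f.product, f.amount - e.amount AS variance
    FROM (SELECT product, SUM(amount) AS amount
          FROM FactData GROUP BY product) f
    JOIN (SELECT product, SUM(amount) AS amount
          FROM EssbaseExtract GROUP BY product) e
      ON f.product = e.product
    WHERE f.amount <> e.amount
""").fetchall()
```

The full total reports a 10-unit variance but not where it lives; the grouped query pinpoints the offending member, which is the payoff of the more granular test.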
2. This validation process does not guard against incomplete attribute dimension associations. If ASOsamp's
Square Footage attribute dimension is part of the retrieval and not all level-0 Stores are associated with
Square Footage, then the retrieved data results will be wrong. A separate base-dimension-to-attribute-dimension
validation must occur.
3. The process in PKG_ExtractValidateReport validates that all data sent to Essbase actually made it into the
cube. Beware, however: if the original extraction process from the true source to the FactData table failed to
pick up some data, that error will not be flagged.
One thing you could do to help trap this kind of data extraction error is to have the users create a top-level
query directly against the source system. Then write a procedure that checks against that value as well as the one
we developed here from the extracted data. Completing this process will make your IT auditors very happy. While
this is yet another challenge, you should balance it against the serenity that happy auditors bestow on their victims
versus the sheer terror they invoke when system totals do not balance. Such are the troubles and travails of the valiant
Essbase developer.
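The three-way check suggested above can be sketched as a small Python function. This is an illustrative sketch only; the function name, tolerance parameter, and sample totals are assumptions, not part of the book's solution.

```python
def audit_against_source(source_total, fact_total, essbase_total, tol=0.0):
    """Compare three top-level totals: the true source system, the extracted
    FactData table, and the Essbase cube.

    A fact-vs-Essbase match alone cannot catch rows dropped during the
    original extraction; the source-vs-fact check closes that gap."""
    return {
        "source_vs_fact": abs(source_total - fact_total) <= tol,
        "fact_vs_essbase": abs(fact_total - essbase_total) <= tol,
    }
```

For example, if extraction silently dropped rows, fact and Essbase would still agree with each other while both disagree with the source total, and only the extra check exposes it.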
2.4.6 Really Tying It All Together
So, now that we have a four-part solution to data quality in Essbase that:
•  Extracts dimension metadata to a table
•  Identifies and loads missing dimension metadata
•  Loads the fact data
•  Validates the data in Essbase against the fact data
the question arises: how does all of this get put together in a single process?
2.4.6.1 Packages I have used Packages to tie together steps within the ODI Project;
that is the traditional way to package ODI objects into a flow. The graphical linking
metaphor of drawing arrows from one object to another in Packages to denote process
steps is perfectly satisfactory for simple program flows. This approach breaks down
when the level of complexity becomes even moderately high.
To reduce complexity within Package diagrams, logical Packages can be compiled
into Scenarios that can contain their own Packages-compiled-to-Scenarios, but this
process of encapsulation gets difficult to manage as steps are added. At some point