values with zeros. This option can be much slower than using the add values or subtract
values options.
5.5.2.2.4 The Monday and Thursday Loads The remainder of the processing includes
the Monday and Thursday loads. The methods used to perform these loads are no differ-
ent from the Sunday afternoon loads, which were described previously. It is worth
mentioning that the Monday and Thursday data files are very small (compared to the
Sunday files) and they load very quickly, building the required aggregate views in a
flash. This does not disrupt the users at all. Essbase does not allow the data to be used
until the aggregate updates are complete, so there are absolutely no worries about users
accessing data that is invalid or only partially complete. All in all, once it was thor-
oughly understood, this process was fairly easy to implement and deploy.
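For reference, these incremental loads use the same pattern as the Sunday loads. A minimal sketch, assuming buffer_id 1 and the same ${APP_NAME} and ${DB_NAME} substitutions used in the commands below, would be:
import database ${APP_NAME}.${DB_NAME} data from load_buffer with
buffer_id 1 add values create slice;
The create slice clause commits the buffer as a new incremental Data Slice rather than merging it into the main slice, which is what keeps these mid-week loads so fast.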
“So what's the catch?” Yes, there is always something that is not so easy. With Data
Slices, the thing that is not so easy is fixing slices when they are wrong. Another challenge is
how to go about deleting slices. One of the frustrations early on was the fact that
a slice cannot be identified by a name. Wouldn't it be nice if you could name a slice, e.g.,
“this is my budget slice,” and then manipulate the slice by name? That is not actually
possible (as you may have guessed by now). How, then, are corrections and changes made,
especially if these corrections need to be made prior to merging the results into the main
database? In the end, several different strategies had to be developed. The strategy that
was deployed depended on what specifically was wrong, when it was caught, and what
data existed to work with.
In the most extreme case, if the entire week's processing is in question, the safest path
is to rebuild the cube from the exports taken the previous Saturday by the DR process
and simply rerun the entire week's processing after the staging tables have been fixed.
This assumes that there are staging tables to work with, and that another week
has not completely passed. This is a simple solution, easy to implement, and virtually
risk free. It has already been used twice and it works exceptionally
well. While it is an annoyance, it is not an issue.
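Once the staging tables are fixed, the rebuild itself is mechanical. A hedged sketch in MaxL, assuming the Saturday level-0 export sits at a hypothetical path such as /backup/export.txt, might look like:
alter database ${APP_NAME}.${DB_NAME} reset data;
alter database ${APP_NAME}.${DB_NAME} initialize load_buffer with buffer_id 1;
import database ${APP_NAME}.${DB_NAME} data from data_file
'/backup/export.txt' to load_buffer with buffer_id 1 on error abort;
import database ${APP_NAME}.${DB_NAME} data from load_buffer with buffer_id 1;
After this reload, the weekly process is rerun from the top, recreating the incremental slices and aggregate views as usual.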
There are two commands built into ASO that can be used to facilitate making cor-
rections. The first command operates at the database level and is not one that would be used in
this use case, because reloading any of the history except for the current week
is not an option. The command, for future reference, is:
import database ${APP_NAME}.${DB_NAME} data from load_buffer with
buffer_id 1 override all data;
This import command replaces the contents of the entire database. The second
command is appropriate for this use case. If there are six incremental Data Slices when
weekly processing is complete, and they all need to be “deleted” so the process can
be started over, then this is the command that would be used. The first load would be
prepared, and then the script for that load would be altered as follows:
import database ${APP_NAME}.${DB_NAME} data from load_buffer with
buffer_id 1 override incremental data;
This import command (using the override incremental data clause) replaces all six
incremental slices with the new Data Slice. The subsequent loads can all be rerun using the original