The code used to complete the fourth Saturday task to reaggregate the cubes using the
default aggregation is:
execute aggregate process on database ${APP_NAME}.${DB_NAME};
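Statements like this are typically driven from a shell wrapper that substitutes the job's environment variables into a MaxL script and hands it to the MaxL shell. A minimal sketch, assuming the chapter's ${APP_NAME}/${DB_NAME} naming; the file name weekly_agg.msh and the sample values are illustrative, not from the use case:

```shell
#!/bin/sh
# Illustrative values only; in the use case these come from the job environment.
APP_NAME=Sample
DB_NAME=Basic

# Substitute the variables into a MaxL script file.
cat > weekly_agg.msh <<EOF
execute aggregate process on database ${APP_NAME}.${DB_NAME};
EOF

# essmsh is the MaxL command-line shell; the invocation is left commented out
# here because it requires a running Essbase server:
# essmsh -u "$ESS_USER" -p "$ESS_PWD" weekly_agg.msh

cat weekly_agg.msh
```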
5.5.2.2.2 The Sunday Morning Process
Updating the dimensions cannot occur until Sunday morning because that is when the data warehouse updates occur. There is nothing special about building the dimensions. The cube1 process would be considered a normal dimension build. The cube2 process might be considered abnormal by some because it already has aggregations in place. The code used to complete the first Sunday morning task, building the dimensions, is:
import database ${APP_NAME}.${DB_NAME} dimensions
  connect as ${USER} identified by '${PWD}' using server rules_file '${LD_RULE1}',
  connect as ${USER} identified by '${PWD}' using server rules_file '${LD_RULE2}'
  on error append to '${ESS_ERROUT}';
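Because the on error clause appends rejected records to ${ESS_ERROUT}, the job wrapper can gate the rest of the weekend process on that file. A minimal sketch; the file name and messages here are our own, not from the chapter:

```shell
#!/bin/sh
# Simulate the error file the MaxL 'on error append to' clause would write.
ESS_ERROUT=dimbuild.err
: > "$ESS_ERROUT"   # an empty file stands in for a clean build in this sketch

# -s tests "exists and is non-empty", i.e. the build rejected records.
if [ -s "$ESS_ERROUT" ]; then
  echo "dimension build rejected records; stopping the weekend process" >&2
  exit 1
fi
echo "dimension build clean"
```

Gating on the error file keeps a bad dimension build from cascading into the aggregation and data-load steps that follow.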
The second task for Sunday morning is reaggregating cube1 after the dimension builds have completed. Cube1 has a custom view script built through the query tracking techniques described earlier in Section 6.5.1. The code used to complete the second Sunday morning task, executing this custom aggregation, is:
execute aggregate build on database ${APP_NAME}.${DB_NAME}
  using view_file '${AGG_VIEW}';
5.5.2.2.3 The Sunday Afternoon Process
The base cubes are now ready to have the new week of data loaded. This data is loaded as one or more Data Slices. As was previously indicated in this use case, there are a significant number of data sources, so each cube has at least five data loads. They all use a similar process:
•  A load buffer is created.
•  One or more loads are imported into that buffer.
•  The buffer is loaded as a slice.
The code used to complete the Sunday afternoon task to load data is:
alter database ${APP_NAME}.${DB_NAME} initialize load_buffer with
  buffer_id 1 resource_usage 1.0 property ignore_missing_values;

import database ${APP_NAME}.${DB_NAME} data
  connect as ${USER} identified by '${PWD}'
  using server rules_file ${LD_RULE} to load_buffer with buffer_id 1
  on error write to '${ESS_ERROR_DIR}${JOB_FILE_PREFIX}.${APP_NAME}.${LD_RULE}.err';

import database ${APP_NAME}.${DB_NAME} data from load_buffer with
  buffer_id 1 create slice;
This is a good place to note that there are additional options with the create slice syntax as well. Using the statement "override values create slice" will replace all #Missing
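The override form can be generated the same way as the loads above. A minimal sketch; the script file name and sample application values are illustrative:

```shell
#!/bin/sh
# Illustrative values; the chapter's jobs take these from the environment.
APP_NAME=Sample
DB_NAME=Basic

# Generate the override form of the slice commit into a MaxL script file.
cat > override_slice.msh <<EOF
import database ${APP_NAME}.${DB_NAME} data from load_buffer with
  buffer_id 1 override values create slice;
EOF

cat override_slice.msh
```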