In an environment where Analysis Services cubes are built on top of a traditional data
warehouse, we know when data is loaded into the data warehouse, so it's easier to
schedule Analysis Services processing as part of the overall data warehouse ETL.
That way, we can kick off processing when we're sure that the fact and dimension
tables have finished loading, and have finished successfully. Even if the cube needs to
be updated regularly during the day, scheduling that processing at regular intervals
using SQL Server Agent and Integration Services means that we can be sure when that
processing will take place, ensure that it does not take place so often that it impacts
query performance, and log and handle any errors appropriately.
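For example, a SQL Server Agent job step of type SQL Server Analysis Services
Command can run an XMLA Process command. The following is a minimal sketch,
with the database ID Adventure Works DW used purely as a placeholder:

  <Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
    <Object>
      <!-- process the whole database; a CubeID, DimensionID, or PartitionID
           can be added here to target a single object instead -->
      <DatabaseID>Adventure Works DW</DatabaseID>
    </Object>
    <!-- ProcessFull discards and rebuilds all data; lighter processing
         types such as ProcessUpdate or ProcessAdd also exist -->
    <Type>ProcessFull</Type>
  </Process>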
We recommend the use of proactive caching only on less formal Analysis Services
implementations, for example, when cubes are built directly on tables in an OLTP
database. In these cases, where there is no ETL or data warehouse in place and where
data volumes are relatively small, using proactive caching is often easier than having
to develop and maintain something like an Integration Services package.
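Proactive caching is normally configured from the Storage Settings dialog on a
partition or measure group, but for reference the underlying ASSL looks roughly
like the sketch below. The interval values, expressed as XSD durations, are
illustrative only, and the xsi namespace prefix is assumed to be declared in the
enclosing document:

  <ProactiveCaching>
    <!-- wait for 10 seconds of silence in the source before reprocessing -->
    <SilenceInterval>PT10S</SilenceInterval>
    <!-- serve stale MOLAP data for at most one hour while rebuilding -->
    <Latency>PT1H</Latency>
    <!-- reprocess anyway if change notifications keep arriving for 10 minutes -->
    <SilenceOverrideInterval>PT10M</SilenceOverrideInterval>
    <!-- -PT1S means no scheduled rebuild -->
    <ForceRebuildInterval>-PT1S</ForceRebuildInterval>
    <!-- inherit change-notification settings from the object's data source -->
    <Source xsi:type="ProactiveCachingInheritedBinding" />
  </ProactiveCaching>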
SSAS Data Directory maintenance
By default, all of the files used by Analysis Services to store MOLAP data are stored
in the Analysis Services data directory. The location of this directory is set up during
installation, and can be changed by setting the value of the DataDir server property.
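In SQL Server Management Studio this is done from the server properties dialog,
which can also script the change as an XMLA Alter command along the lines of the
sketch below; the server ID MYSERVER and the path E:\SSAS\Data are placeholders,
and the instance must be restarted before a new DataDir value takes effect:

  <Alter AllowCreate="true" ObjectExpansion="ObjectProperties"
         xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
    <Object />
    <ObjectDefinition>
      <Server xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
        <ID>MYSERVER</ID>
        <Name>MYSERVER</Name>
        <ServerProperties>
          <!-- point MOLAP storage at a new location -->
          <ServerProperty>
            <Name>DataDir</Name>
            <Value>E:\SSAS\Data</Value>
          </ServerProperty>
        </ServerProperties>
      </Server>
    </ObjectDefinition>
  </Alter>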
As with any other type of file handled by Windows, the files in the Analysis Services
data directory are prone to fragmentation. During processing, many files are created,
written, and deleted. If the disk is very fragmented, these files may be split across
different portions of the disk, slowing down access to them.
A regular defragmentation of the drive on which the data directory exists will
improve the overall speed of the disk subsystem, which will of course have benefits
for Analysis Services query and processing performance. As a result, we suggest
regular disk defragmentation of the Analysis Services data directory's drive as part
of the server's maintenance plan, making sure of course that it only takes place at a
time when no processing is running and no users are querying the cube.
Performing database backup
It may be stating the obvious, but after processing has finished, we should
always back up our Analysis Services database. This can be done very easily from
Integration Services using an Execute Analysis Services DDL task; once again, you
can generate the XMLA command needed to back up a database by right-clicking
on it in SQL Server Management Studio, selecting Back Up, and then pressing the
Script button on the Back Up Database dialog.
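The scripted command is an XMLA Backup command; a minimal example looks like
the following, with the database ID and backup file name as placeholders:

  <Backup xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
    <Object>
      <DatabaseID>Adventure Works DW</DatabaseID>
    </Object>
    <!-- a relative file name is written to the server's backup directory -->
    <File>Adventure Works DW.abf</File>
    <!-- without this, the command fails if the file already exists -->
    <AllowOverwrite>true</AllowOverwrite>
  </Backup>

This command can then be pasted into an Execute Analysis Services DDL task and
scheduled to run after processing completes.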
 