TIPS AND INSIGHTS FOR DATABASE PROFESSIONALS
Tip 1. Normalize first, then consider denormalizing if performance is poor.
You can maximize the probability of a good logical design by carefully creating a
conceptual model using the entity-relationship (ER) approach or UML. These
modeling methods tend to result in relational databases that are close to being or
are already normalized. Normalization tends to reduce redundancy and provides a
high level of integrity to the database. When the actual tables are not conducive to good performance (e.g., when they are so small that dominant queries must perform extra joins on them every time they are executed), consider merging two tables to eliminate the join and reduce I/O time. If the I/O time saved by the merge (and the possible denormalization it implies) is greater than the extra I/O cost, in terms of updates, of maintaining the redundant data needed to avoid a delete anomaly, then go ahead with the merge.
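As a minimal sketch of such a merge (the ORDERS and ORDER_STATUS tables here are hypothetical, not from the chapter's example), a dominant query that must join a tiny lookup table on every execution can instead read a single merged table that carries the description redundantly:

    -- Normalized: every query for an order's status description needs a join.
    CREATE TABLE order_status (
        status_code   CHAR(2)     PRIMARY KEY,
        status_descr  VARCHAR(30) NOT NULL
    );
    CREATE TABLE orders (
        order_id     INTEGER PRIMARY KEY,
        customer_id  INTEGER NOT NULL,
        status_code  CHAR(2) NOT NULL REFERENCES order_status
    );
    SELECT o.order_id, s.status_descr
    FROM   orders o
    JOIN   order_status s ON s.status_code = o.status_code;

    -- Denormalized: the description is merged into the orders table,
    -- trading redundant storage and extra update work for one less join.
    CREATE TABLE orders_denorm (
        order_id      INTEGER PRIMARY KEY,
        customer_id   INTEGER NOT NULL,
        status_code   CHAR(2) NOT NULL,
        status_descr  VARCHAR(30) NOT NULL  -- redundant copy
    );
    SELECT order_id, status_descr
    FROM   orders_denorm;

The merge pays off only when the join savings on the dominant queries outweigh the cost of keeping every redundant status_descr up to date.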
Tip 2. Denormalize addresses whenever possible. Addresses can be very long and cumbersome to access through joins, so it is often useful to store a redundant copy wherever they are frequently needed and to reach the base copy through a join only when explicitly required. Because addresses are then stored redundantly across the database, a copy that gets deleted can usually be recovered from elsewhere. The performance gain of avoiding joins (and the extra I/O they incur) most of the time is usually worth the redundancy and the extra updates it requires; addresses are fairly static and don't change often.
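A sketch of this pattern, using hypothetical CUSTOMER and CUST_ORDER tables: the master address stays with the customer, while the table that reads it most often carries a redundant copy so its dominant query needs no join.

    -- Base copy of the address.
    CREATE TABLE customer (
        customer_id   INTEGER      PRIMARY KEY,
        cust_name     VARCHAR(60)  NOT NULL,
        ship_address  VARCHAR(200) NOT NULL
    );

    -- Denormalized: each order carries its own copy of the address,
    -- so printing an invoice reads only this table.
    CREATE TABLE cust_order (
        order_id      INTEGER      PRIMARY KEY,
        customer_id   INTEGER      NOT NULL REFERENCES customer,
        ship_address  VARCHAR(200) NOT NULL  -- redundant, rarely updated
    );

    SELECT order_id, ship_address
    FROM   cust_order
    WHERE  customer_id = 1001;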
Tip 3. Make use of existing DBMS-provided software to synchronize the redundant data that supports denormalization with the base data. Examples of data synchronization software include DB2 Everyplace, Oracle Data Hubs, Oracle Streams, SQL Server Compare (Red Gate Software), and SQL Server Everywhere Edition.
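Where such tools are not a good fit, the same effect can be approximated inside the DBMS itself. The trigger below is a minimal sketch (SQLite-style trigger syntax; details vary by DBMS) that propagates address changes from the base copy to the redundant copy in the hypothetical tables of Tip 2:

    -- Keep the redundant ship_address in cust_order in sync with the
    -- base copy in customer whenever a customer's address changes.
    CREATE TRIGGER sync_ship_address
    AFTER UPDATE OF ship_address ON customer
    FOR EACH ROW
    BEGIN
        UPDATE cust_order
        SET    ship_address = NEW.ship_address
        WHERE  customer_id  = NEW.customer_id;
    END;

Hand-rolled triggers like this carry their own maintenance cost, which is one reason the DBMS-provided synchronization tools listed above are usually preferable.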
15.5 Summary
In this chapter we explored an in-depth definition and example of the use of denormalization to enhance the performance of a relational database. The example reviews the life cycle steps of logical design leading up to the denormalization step of schema refinement, which increases efficiency for query processing.
15.6 Literature Summary
The idea for extending a table for usage efficiency came from Schkolnick and Sorenson
[1980], and practical advice on denormalization is given in Rodgers [1989].