15 Denormalization
At the time, Nixon was normalizing relations with China.
I figured that if he could normalize relations, then so could I.
—Ted Codd (1923-2003)
The phrase “smaller, faster, cheaper” has long been the credo of the builders of computer chips, personal digital assistants (PDAs), quantum computers, and even printers. It is notoriously difficult to optimize all three at the same time, and most of the time there are important tradeoffs that need to be addressed. A similar phrase, “faster, better, cheaper,” was used at NASA, but it came into serious question after several critical losses in the past decade, including the infamous Mars Climate Orbiter with its disastrous mix of English and metric units in the same system. In database design we would very much like to optimize performance (fast queries), maintainability (fast updates), and integrity (avoiding unwanted deletes) if we can, but in reality these objectives often involve serious tradeoffs that need to be addressed.
First it is important to distinguish between normalization and denormalization. Normalization is the process of breaking up a table into smaller tables to eliminate the unwanted side effects of deleting certain critical rows and to reduce the inefficiency of updating redundant data, both of which are common in large universal tables. Sometimes, however, normalization is taken too far, and some queries become extremely inefficient due to the extra joins required over the smaller tables. Denormalization is the process of adding columns to some tables to reduce those joins, and is done only if the integrity of the data can still be maintained.
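As a small illustration of this tradeoff (the customer/city schema below is a hypothetical sketch, not an example from the text), consider a normalized pair of tables and the denormalization step that trades a join away for redundant data:

    -- Normalized schema: each city name is stored exactly once.
    CREATE TABLE city (
        city_id   INTEGER PRIMARY KEY,
        city_name VARCHAR(40) NOT NULL
    );

    CREATE TABLE customer (
        cust_id   INTEGER PRIMARY KEY,
        cust_name VARCHAR(40) NOT NULL,
        city_id   INTEGER REFERENCES city(city_id)
    );

    -- In normalized form, listing customers with their cities
    -- requires a join on every query.
    SELECT c.cust_name, ci.city_name
    FROM customer c
    JOIN city ci ON ci.city_id = c.city_id;

    -- Denormalization: copy city_name into customer to eliminate the join.
    ALTER TABLE customer ADD COLUMN city_name VARCHAR(40);

    UPDATE customer
    SET city_name = (SELECT ci.city_name
                     FROM city ci
                     WHERE ci.city_id = customer.city_id);

    -- The same query now reads a single table (faster) ...
    SELECT cust_name, city_name FROM customer;

    -- ... but a city renaming must now update every redundant copy,
    -- or the data becomes inconsistent.
    UPDATE city     SET city_name = 'New Albany' WHERE city_id = 7;
    UPDATE customer SET city_name = 'New Albany' WHERE city_id = 7;

The denormalized form answers the query from one table, but every change to a city name must now be applied in two places, which is precisely the update inefficiency and integrity risk the tradeoff entails.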