Observe the statistics: after denormalization, the number of recursive calls, consistent gets, and nested loops in the execution plan has decreased, and the number of sorts is lower as well.
In step 8, we dropped the objects created in this recipe, returning the SH schema to its initial state.
However, there is no such thing as a free lunch: we have introduced redundancy. If the attributes migrated from the lookup tables into the Phone table change often, then a change that touches a single row in the normalized schema must be reflected in thousands of rows in the denormalized one.
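The update anomaly can be sketched with two statements (the table and column names here are illustrative, not the exact ones from the recipe). In the normalized schema, a lookup value lives in one place; in the denormalized schema, every row that received a copy of that value must be rewritten:

```sql
-- Normalized schema: the change touches exactly one row
UPDATE countries
   SET country_name = 'Czechia'
 WHERE country_id = 42;

-- Denormalized schema: the same change must be propagated
-- to every Phone row that carries a copy of the attribute
UPDATE phone
   SET country_name = 'Czechia'
 WHERE country_id = 42;
```

The second statement may update thousands of rows (and generate correspondingly more undo and redo), which is the cost we trade for fewer joins at query time.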
The rule of thumb, in this case, is to avoid denormalization when the data is updated often, which is very common in OLTP environments. Denormalization is worth considering when the data is nearly static over time and we want to minimize the number of joins, as illustrated in this recipe.
 