model that can be purchased from a third party and subsequently customized for the organization.
Build Logical Data Model:
The logical data model is built iteratively. The first view usually is done at a high level, beginning with a subject area or conceptual data model. Subsequent levels contain more detail. The process of normalization is also applied at this stage. There are many good texts on normalization, so normal forms will not be covered here. Foreign key fields and potential indexes can also be considered here. It is not necessary to build the logical data model for performance at this time; physical considerations are left until a later process.
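The effect of normalization can be sketched in a few lines. The customer/order tables below are hypothetical, invented for illustration: a flat record that repeats customer details on every order is split into two relations linked by a foreign key.

```python
# A minimal sketch of normalization using hypothetical order data:
# the flat (unnormalized) rows repeat customer details on every order.
flat_orders = [
    {"order_id": 1, "customer_id": 10, "customer_name": "Acme", "item": "bolt"},
    {"order_id": 2, "customer_id": 10, "customer_name": "Acme", "item": "nut"},
    {"order_id": 3, "customer_id": 11, "customer_name": "Birch", "item": "screw"},
]

def normalize(rows):
    """Split flat rows into a customers table and an orders table that
    references customers through a foreign key (customer_id)."""
    customers = {}
    orders = []
    for row in rows:
        # Customer attributes move to their own relation, stored once.
        customers[row["customer_id"]] = {"name": row["customer_name"]}
        # Orders keep only the foreign key, not the repeated name.
        orders.append({"order_id": row["order_id"],
                       "customer_id": row["customer_id"],
                       "item": row["item"]})
    return customers, orders

customers, orders = normalize(flat_orders)
```

Each customer's name is now stored exactly once, which is the kind of redundancy removal the normal forms formalize.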
Verify the Data Model:
The logical data model is validated iteratively with users, the fields of the user interface, and process models. It is not unusual to make changes to the data model during this verification process. New requirements, which need to be fitted into the data model, may also be identified.
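One mechanical part of this verification can be automated: checking that every field on a user-interface screen has a home in the data model. The field names below are hypothetical examples, not from the original text.

```python
# Hypothetical field lists: attributes in the logical data model
# versus fields captured on a user-interface screen.
model_fields = {"customer_id", "name", "phone", "email"}
ui_fields = {"name", "phone", "email", "fax"}

def verify_coverage(model, ui):
    """Return UI fields with no home in the data model. Each one is a
    new requirement that must be fitted into the model."""
    return sorted(ui - model)

missing = verify_coverage(model_fields, ui_fields)  # -> ["fax"]
```

Here "fax" surfaces as a new requirement discovered during verification.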
Build Data Architecture:
The data architecture is defined in the context of the physical data environment. Factors such as the database server, distribution, components, and partitioning are considered in this step.
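To make the partitioning decision concrete, here is a minimal sketch of one common scheme, horizontal partitioning by hash of a key. The server names and the choice of key are assumptions for illustration only.

```python
# A sketch of horizontal partitioning: rows are routed to one of
# several database servers by hashing the partition key.
# The server names below are placeholders.
SERVERS = ["db-node-0", "db-node-1", "db-node-2"]

def route(customer_id: int) -> str:
    """Pick the server that stores this customer's rows."""
    return SERVERS[customer_id % len(SERVERS)]
```

The same key always routes to the same server, so all of a customer's rows stay together on one node.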
Build the Physical Data Model:
The logical data model is converted to a physical data model based on the specific database that is used. The physical data model will vary with the choice of database products and tools. The physical data model also contains objects such as indexes, foreign keys, triggers, views, and user-defined datatypes. The physical data model is optimized for performance and is usually denormalized for this reason. Denormalization can result in redundancy, but it can improve system performance. Building the physical data model is not a one-step process; do not expect to build a final version of the physical data model on the first attempt.
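As a sketch of what those physical objects look like, the DDL below targets SQLite; the exact syntax would vary with the database product chosen, and the table, column, and index names are hypothetical. It shows an index, a foreign key, a view, and a deliberately denormalized column.

```python
import sqlite3

# A sketch of a physical data model in SQLite. Names are invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL
    );
    CREATE TABLE orders (
        order_id      INTEGER PRIMARY KEY,
        customer_id   INTEGER NOT NULL REFERENCES customer(customer_id),
        item          TEXT NOT NULL,
        -- Denormalized copy of the customer name: redundant,
        -- but saves a join on common read paths.
        customer_name TEXT
    );
    -- Physical objects beyond the logical model: an index and a view.
    CREATE INDEX idx_orders_customer ON orders(customer_id);
    CREATE VIEW order_summary AS
        SELECT o.order_id, c.name, o.item
        FROM orders o JOIN customer c ON c.customer_id = o.customer_id;
""")
```

The `customer_name` column on `orders` is the denormalization trade-off in miniature: redundancy accepted in exchange for faster reads.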
Refine the Data Model:
The physical data model is refined continuously as more information becomes available and the results of stress testing and benchmarking reach the database development team. The logical data model should also be maintained as the physical data model is refined.
Complete Transaction Analysis:
Transaction analysis is used to review system transactions so that the physical data model can be refined for optimum system performance. Transaction analysis results are only meaningful after the business requirements and systems design are fairly solid. Transaction analysis produces statistics showing the access frequency for the tables in the database, time estimates, and data volumes.
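Two of those statistics, access frequency and data volume, can be sketched as simple aggregations over a transaction log. The log entries below are hypothetical examples, invented to show the shape of the analysis.

```python
from collections import Counter

# Hypothetical transaction log: (transaction, table accessed, rows touched).
log = [
    ("place_order", "orders", 1),
    ("place_order", "customer", 1),
    ("report_sales", "orders", 500),
    ("place_order", "orders", 1),
]

def access_frequency(entries):
    """Count how often each table is touched: frequently hit tables
    are candidates for indexing or denormalization."""
    return Counter(table for _, table, _ in entries)

def data_volume(entries):
    """Total rows read or written per table."""
    totals = Counter()
    for _, table, rows in entries:
        totals[table] += rows
    return totals

freq = access_frequency(log)   # orders is hit most often
volumes = data_volume(log)
```

Tables that dominate either statistic are where refinement of the physical data model pays off first.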
Populate the Data:
After the database structure is established and the
database is created, it is necessary to populate the database. This can
be done through data scripts, applications, or data conversions. This
can be an extensive set of activities that requires substantial data