Chapter 1
Methodology
Performance tuning is a broad and often misunderstood subject, so it has become common practice among technologists and application vendors to treat performance as an issue that can safely be left for a tuning exercise at the end of a project or system implementation. This approach poses several challenges: delayed project deployment, performance issues that go unnoticed or are compromised because the application is delivered late for performance optimization, or the performance optimization phase being omitted entirely because of delays in earlier stages of the development cycle. Most important, placing performance optimization at the end of the project life cycle reduces the opportunities for identifying bad design and poor algorithms in the implementation. Teams seldom realize that this can force the rewriting of areas of code that are poorly designed and perform badly.
Whether a product is being newly developed or an existing product is being enhanced with additional functionality, performance optimization should be considered from the very beginning of a project: it should be part of the requirements definition and integrated into each stage of the development life cycle. As modules of code are developed, each unit should be tested iteratively for both functionality and performance. This makes the development life cycle smoother, and performance optimization can follow standards that keep application code consistent, which improves integration, efficiency, and performance.
There are several approaches to tuning a system. Tuning can be approached artistically, like a violinist tightening the strings to reach the required note, checking every note against an electronic tuner so that every stroke matches. Alternatively, the performance engineer or database administrator (DBA) can take a more scientific, methodical approach. A methodical approach based on empirical data and evidence, much like the forensic method a crime investigator uses, is the most suitable way to solve performance problems. Analysis should be backed by evidence in the form of statistics collected at various levels and areas of the system (a sample query follows this list):
• From functional units of the application that are performing slowly
• During various times of the day (business prime time) when there is a significant user workload
• From heavily used functional areas of the application, and so forth
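
As a minimal sketch of this kind of evidence gathering, the following query samples a few system-level workload statistics from Oracle's V$SYSMETRIC dynamic performance view. The metric-name filters shown are only illustrative; in practice such snapshots would be captured repeatedly during business prime time and correlated with the functional areas under suspicion.

    -- Sketch only: sample current system-level workload metrics.
    -- The LIKE filters below are illustrative; V$SYSMETRIC exposes many more metrics.
    SELECT metric_name,
           ROUND(value, 2) AS value,
           metric_unit,
           begin_time,
           end_time
      FROM v$sysmetric
     WHERE metric_name LIKE '%CPU%'
        OR metric_name LIKE '%Active Sessions%'
     ORDER BY metric_name;

Collecting such snapshots at regular intervals, rather than once, is what turns anecdotal complaints of slowness into evidence that can be analyzed.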
The data collected helps explain the slowness or poor performance, because there can be one or several reasons why a system is slow. Slow performance may be caused by bad configuration, unoptimized or inappropriately designed code, undersized hardware, or several other factors. Unless there is unequivocal evidence of why performance is slow, a scientific approach to finding the root cause of the problem should be adopted. The old saying that "tuning a computer system is an art" may hold when you initially configure a system using the standard parameters suggested in Oracle's installation guides; but as testing goes deeper, a more scientific approach of data collection, mathematical analysis, and reasoning must be adopted. Tuning should not be a hit-or-miss exercise: it must be approached in a rigorous, scientific manner with supporting data.
 