response times for single transactions or activities in an empty system are verified
against performance requirements as an extension of black box testing. This
performance baseline testing is a wake-up call for the developers: a slow
transaction in an empty system will get no faster as more transactions are added to
the system. As the performance baseline results begin to fall in line with require-
ments, load testing of large numbers of transactions is planned and performed. The
load testing decisions about the mix of transactions and how many of each trans-
action to test come from a business workload analysis that will be discussed in
Chapter 9.
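To make the baseline idea concrete, the following Python fragment is a minimal
sketch of the kind of check involved. The transaction names, repetition count, and
response-time requirements are illustrative assumptions, not values from the text.

```python
import time

# Hypothetical response-time requirements for single transactions
# measured in an otherwise empty system (illustrative values only).
REQUIREMENTS_SECONDS = {
    "create_order": 0.5,
    "lookup_customer": 0.2,
}

def measure_transaction(run_transaction, repetitions=10):
    """Time one transaction in an empty system, averaging several runs."""
    timings = []
    for _ in range(repetitions):
        start = time.perf_counter()
        run_transaction()  # caller supplies the transaction under test
        timings.append(time.perf_counter() - start)
    return sum(timings) / len(timings)

def check_baseline(name, run_transaction):
    """Compare one transaction's measured baseline with its requirement."""
    average = measure_transaction(run_transaction)
    limit = REQUIREMENTS_SECONDS[name]
    status = "PASS" if average <= limit else "FAIL"
    print(f"{name}: {average:.3f}s (requirement {limit:.3f}s) {status}")
```

A failing check here is the wake-up call described above: if a single transaction
misses its requirement in an empty system, adding load can only make it worse.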
For the Ship or Install phase (fourth column from the left), we suggest that
it is too late to test because the application is no longer available to the development
team. Another way to say it is, "when the application is ready to ship, by definition
the testing is done."
For the Post Implementation phase (last column to the right), there are mag-
nifying glass and hammer strategies in the application under development row.
The static testing (magnifying glass) of implementation checklists and first use of
operational manuals is done after the new installation is verified correct. Lessons
learned documents are also static tested for thoroughness, completeness, and
accuracy. The first few days and weeks of new application operation are moni-
tored to compare business workload and application performance test results with
the actual business workload and actual application performance under that work-
load in production. Comparison discrepancies found in either workload or perfor-
mance become issues for either short-term solutions, for example, faster
hardware, or longer-term solutions, for example, redesigning the next release for
better performance.
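A post-implementation comparison of this kind can be sketched in a few lines;
in the example below, the 20% tolerance and the metric names are assumptions
chosen purely for illustration.

```python
TOLERANCE = 0.20  # assumed threshold: flag discrepancies larger than 20%

def compare(tested, actual):
    """Compare test-phase results with early production measurements.

    Both arguments map a metric name (a workload volume or a response
    time) to its measured value; metrics missing in production are skipped.
    """
    issues = []
    for metric, expected in tested.items():
        observed = actual.get(metric)
        if observed is None:
            continue
        deviation = abs(observed - expected) / expected
        if deviation > TOLERANCE:
            issues.append((metric, expected, observed, deviation))
    return issues

# Illustrative values: workload (orders/hour) and response time (seconds).
issues = compare(
    tested={"orders_per_hour": 1200, "create_order_s": 0.5},
    actual={"orders_per_hour": 1450, "create_order_s": 0.9},
)
for metric, expected, observed, deviation in issues:
    print(f"{metric}: tested {expected}, production {observed} "
          f"({deviation:.0%} off) -> raise an issue")
```

Each flagged metric becomes an issue to resolve either short term (for example,
faster hardware) or longer term (for example, a redesign in the next release).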
When a company purchases a software package, the development and testing
situation is similar to the Final construction phase of custom-written software. The
only application artifacts to test are the documentation and executable code. No
requirements, specifications, or source code are provided with purchased soft-
ware. So you test what is available, namely the documentation (magnifying glass)
and the input/output behavior (black box), against your company's purchase evalu-
ation criteria. Performance (hammer) testing is done in the intended production
environment with samples of real business data to validate the software package's
performance against your company's performance criteria. Companies that do not
insist on testing a purchased package as a prerequisite to the purchase will always be
disappointed with the products they buy.
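As a rough illustration of such a pre-purchase check, the sketch below times a
hypothetical vendor command-line tool against assumed evaluation criteria. The
executable name, its arguments, and the thresholds are all placeholders, not a
real vendor interface.

```python
import subprocess
import time

# Assumed purchase evaluation criteria for this sketch.
CRITERIA = {"max_seconds": 2.0, "expected_exit_code": 0}

def evaluate_package(sample_input_path):
    """Black-box check: run the packaged tool on a sample of real business
    data and verify its behavior and elapsed time against the criteria."""
    start = time.perf_counter()
    result = subprocess.run(
        ["vendor_tool", "--process", sample_input_path],  # hypothetical tool
        capture_output=True,
        text=True,
    )
    elapsed = time.perf_counter() - start
    ok = (result.returncode == CRITERIA["expected_exit_code"]
          and elapsed <= CRITERIA["max_seconds"])
    return ok, elapsed, result.stdout
```

Because only the executable and documentation are available, checks like this
one, driven by the purchase evaluation criteria, are the whole test repertoire.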
Next release testing
Changes, corrections, and additional features are an inevitable part of the
software development life cycle, regardless of whether the software is custom
code or a purchased package. Just consider how many "versions" of your word
processor you have installed in the last 5 years. For a next release, the development and