• Time automated tests. Examine the test execution timings provided by your testing framework (a timing sketch follows this list).
• Use a performance-testing tool to profile the slowest areas of your test code. In addition, many xUnit testing frameworks provide a reporting utility that displays the time taken to execute each test.
• Use inspection tools to analyze your test code and test complexity.
• Verify that your unit tests are actually unit tests, not component
or system tests. A quick way to determine this is to remove your
network cable, shut down your database, and run your tests.
Which tests still pass? The tests that still pass are (or should be)
your unit tests.
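
As a minimal illustration of timing tests, the sketch below assumes JUnit 5; the PriceCalculatorTest class and the logic it exercises are hypothetical. A class-level timeout fails any test that drifts beyond unit-test speed, while the framework's normal report still records each test's execution time.

import static org.junit.jupiter.api.Assertions.assertEquals;

import java.util.concurrent.TimeUnit;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.Timeout;

// Fail any test in this class that runs longer than 500 ms so that
// slow tests surface immediately in the build report.
@Timeout(value = 500, unit = TimeUnit.MILLISECONDS)
class PriceCalculatorTest {

    // Hypothetical class under test: pure in-memory logic, no I/O.
    static class PriceCalculator {
        double applyDiscount(double price, double rate) {
            return price * (1.0 - rate);
        }
    }

    @Test
    void appliesDiscountInMemoryOnly() {
        assertEquals(90.0, new PriceCalculator().applyDiscount(100.0, 0.10), 0.001);
    }
}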
After performing a high-level evaluation of the test environment,
you'll have a better idea of how to improve test performance. Several
strategies will suggest themselves, including the following.
• Separate automated tests by category (unit, component, and system) and run these tests at different times (e.g., unit tests at every commit, component/system tests in a secondary build); a tagging sketch follows this list. See Chapter 6 for details on test categorization.
• Refactor your tests based on the results of the inspection tools.
• Use mocks/stubs for components that may otherwise be too difficult or complex to use in the unit-testing environment. For example, a common implementation of a mock object is mocking the interface for data access (see the stub sketch after this list).
• Move long-running integration tests into their own specialized test suites.
• Execute your tests in parallel.
• Run different types of tests based on build type: a commit build is followed by secondary builds, such as a full integration build or a release build.
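
The following sketch shows one way to implement category-based separation, assuming JUnit 5 tags; the class and method names are illustrative only. With Maven Surefire, for example, a commit build might run only the fast group via a command such as mvn test -Dgroups=unit, leaving the component group to a secondary build.

import org.junit.jupiter.api.Tag;
import org.junit.jupiter.api.Test;

// Tag tests by category so each build type can select the groups it needs.
class OrderTotalTest {

    @Tag("unit")
    @Test
    void sumsLineItemsInMemory() {
        // exercises pure logic; no database or network access
    }

    @Tag("component")
    @Test
    void persistsOrderThroughRepository() {
        // touches the real data-access layer; deferred to a secondary build
    }
}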
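
The next sketch illustrates the data-access example with a hand-rolled stub rather than a particular mocking library; the CustomerDao and CustomerService names are hypothetical. Because the stub stands in for the database-backed implementation, the test runs entirely in memory.

import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.api.Test;

class CustomerServiceTest {

    // Hypothetical data-access interface, normally backed by a database.
    interface CustomerDao {
        String findNameById(long id);
    }

    // Hypothetical service under test; it depends only on the interface.
    static class CustomerService {
        private final CustomerDao dao;
        CustomerService(CustomerDao dao) { this.dao = dao; }
        String greeting(long id) { return "Hello, " + dao.findNameById(id); }
    }

    @Test
    void greetsCustomerUsingStubbedDao() {
        CustomerDao stubDao = id -> "Ada";                 // stub replaces the real DAO
        CustomerService service = new CustomerService(stubDao);
        assertEquals("Hello, Ada", service.greeting(42L));
    }
}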
Stage Builds
As mentioned earlier, another approach to reducing build duration is to
run a lightweight build followed by a “heavyweight” build (which
 