Examining test error messages and output
In the Output tab of the JUnit window, you can examine the runtime output generated by each test. This includes any output sent to the standard output stream as well as standard error (which is displayed in red text). If any exceptions were thrown, or if a test failed an assertion, this output is displayed here as well. This tab is also active while the tests are running, allowing you to monitor your tests' progress. If you have a stack trace in your test's output, any reachable source code reference is hot-linked to your editor, allowing you to quickly view or edit the source code. In the case of an assertion failure where a message was specified in the test, this message is shown along with the stack trace.
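To see what reaches the Output tab, consider a sketch like the following. It is plain Java rather than an actual JUnit test (the class, values, and messages are hypothetical), but it mimics what JUnit's assertEquals does: on a mismatch it throws an AssertionError whose message combines your text with the expected and actual values, and that message plus the stack trace is what IDEA displays.

```java
// Hypothetical example: mimics JUnit's assertion behavior in plain Java
// so the output shown in IDEA's Output tab can be reproduced standalone.
public class AssertionOutputDemo {

    // Simplified stand-in for org.junit.Assert.assertEquals(message, expected, actual)
    static void assertEquals(String message, Object expected, Object actual) {
        if (!expected.equals(actual)) {
            throw new AssertionError(
                message + " expected:<" + expected + "> but was:<" + actual + ">");
        }
    }

    public static void main(String[] args) {
        System.out.println("setting up fixture...");   // shown as standard output
        System.err.println("warning: using test db");  // shown in red as standard error
        try {
            assertEquals("tax rate should not change", 0.07, 0.08);
        } catch (AssertionError e) {
            // In IDEA, this message appears alongside the stack trace
            System.out.println(e.getMessage());
        }
    }
}
```

Running this prints the fixture message, the warning on standard error, and the combined assertion message, which is roughly what the Output tab collects for a failing test.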
IDEA conveniently isolates each test's output. Clicking an individual test in the
tree shows only the output stemming from the selected test. As you might expect,
selecting any node in the test hierarchy includes the output of all the tests below it.
Tracking the time and memory usage of each test
The Statistics tab in the JUnit window shows how much time it took to run each
test case and how much memory was consumed during its run. You can view test
statistics summarized up to the suite or test case level by selecting the appropriate
level of hierarchy in the test case tree. Here's what the columns mean:
Time elapsed — The number of seconds it took to run this test.
Usage Delta — The amount of memory apparently consumed during this test.
Usage Before — The amount of memory in use at the start of the test.
Usage After — The amount of memory in use after the test has completed.
Results — A summary of test results. For individual tests, this column shows a pass or fail; for suites and test cases, it shows the number of passed and failed tests.
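The relationship among these columns can be illustrated with a rough sketch. This is not IDEA's actual implementation; it merely assumes the common technique of sampling used heap memory via Runtime before and after a test, which also shows why the numbers are approximate.

```java
// Illustrative sketch (not IDEA's code): estimating per-test time and
// memory usage by sampling the heap before and after the test body.
public class MemoryStatsSketch {

    // Used heap = total allocated heap minus the free portion of it.
    static long usedMemory() {
        Runtime rt = Runtime.getRuntime();
        return rt.totalMemory() - rt.freeMemory();
    }

    public static void main(String[] args) {
        long before = usedMemory();              // "Usage Before"
        long start = System.nanoTime();

        int[] scratch = new int[1_000_000];      // the "test" allocates some memory
        scratch[0] = 42;

        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        long after = usedMemory();               // "Usage After"

        System.out.println("Time elapsed (ms): " + elapsedMs);
        System.out.println("Usage delta (bytes): " + (after - before));
    }
}
```

Note that if the garbage collector runs between the two samples, the delta can even come out negative, which is exactly the kind of inaccuracy the next paragraph warns about.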
Take these statistics with a grain of salt—the timing and memory usage data are collected only to give you an approximate gauge of test case performance. Many things can affect the accuracy of this data; for example, if the garbage collector runs during a test's execution, the amount of memory shown in the statistics will be wrong. Nevertheless, the Statistics view is a good way to keep an eye on the general state of things, as shown in figure 7.5. For example, if your unit tests suddenly take twice as long, you may want to investigate: perhaps there are simply more tests than before, or a recent change may have drastically degraded system performance.