The Test Scene column in the grid contains a link to the COLLADA document used as input to the test. You can
open the COLLADA document by using the View Image option in the context menu. The Test Filename column
shows the name of the document prefaced by its path in the StandardDataSets directory.
The Import, Render, Export, Import, Render, and Validate columns contain links to the output from each of
those tasks for a given test. The first Import column links to the application's native file generated from the Test Scene
COLLADA document; the second Import column links to the native file generated from the first cycle's
exported COLLADA document. The Blessed column shows what the rendered COLLADA document should look like.
Depending on what's being tested, if your Render columns don't resemble the blessed image, you know immediately
that something is wrong. (If rendering the image isn't the objective of the test, a non-match doesn't matter.) Don't
worry about anything the manual or the context menu says about blessing an execution; that's a legacy feature that is
no longer used.
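The cycle those columns describe can be sketched in code. This is a hypothetical illustration, not the real CTS or application API: the step functions here are stand-in stubs whose only job is to show the order of operations and that the second Import consumes the first cycle's exported document.

```python
# Stand-in stubs for the application-specific steps; the real CTS drives
# the application under test, which produces actual files for each step.

def import_scene(path):
    return {"source": path}            # pretend "native file" for the scene

def render(scene):
    return f"image of {scene['source']}"

def export_collada(scene):
    return scene["source"] + ".exported.dae"

def validate(path):
    return path.endswith(".dae")       # stand-in for conformance validation

def run_cycle(collada_path):
    """One Import/Render/Export/Import/Render/Validate cycle for a test."""
    native = import_scene(collada_path)    # first Import: from the Test Scene
    first_render = render(native)          # first Render
    exported = export_collada(native)      # Export
    native2 = import_scene(exported)       # second Import: from the exported doc
    second_render = render(native2)        # second Render
    valid = validate(exported)             # Validate the exported document
    return first_render, second_render, valid
```

Both Render outputs are what you would compare against the blessed image in the grid.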
What to Do When a Test Fails
The Result column tells you how each step in the test fared. Possible messages include Ignored, Warning, and Passed.
Green in the Result column and in a badge column means the test executed correctly and a badge was awarded.
Green in the Result column but red in a badge column means the application can execute the test correctly
(i.e., all the Import/Render/Export steps succeeded) but the output files are not COLLADA conformant. When a
test crashes, the Result column turns red.
When a Validate task fails, the CTS tells you how many warnings and errors you got. View the errors by clicking on
View Report in the context menu for the Validate cell in question.
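To get a feel for what the Validate step is checking, here is a rough stand-in. It assumes only that a COLLADA document is XML with a `COLLADA` root element (the 1.4 namespace shown in the comment is an assumption for illustration); the real CTS validates against the full COLLADA schema, which this sketch does not attempt.

```python
import xml.etree.ElementTree as ET

def quick_validate(collada_text):
    """Return (ok, message) for a COLLADA document given as a string.

    A toy approximation of the Validate task: checks well-formedness and
    the root element name, nothing more.
    """
    try:
        root = ET.fromstring(collada_text)
    except ET.ParseError as err:
        return False, f"error: {err}"
    # Namespaced tags look like "{http://...}COLLADA"; keep the local name.
    tag = root.tag.rsplit("}", 1)[-1]
    if tag != "COLLADA":
        return False, f"warning: unexpected root element <{tag}>"
    return True, "ok"
```

A real validation report would list each schema violation, which is what View Report shows you.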
A test can fail if a prerequisite test has not been run or has not passed. When that happens, you get a message telling
you which test(s) you need to run, identified by name rather than test number (e.g., "_Reference _Oneimage test case").
However, if you run the suite in the order presented, you shouldn't encounter this situation.
Comparing Test Runs
Comparing test runs helps pinpoint bugs and areas of nonconformance. By comparing images, you can see if you are
getting warmer or colder.
To see the results of a previous test, right-click anywhere in the test row and select Show Previous. You can
tell which tests have been run more than once by checking the Different from Previous column. You can delete
an execution from the GUI by right-clicking the test and selecting Delete Execution. The histories are stored in
the TestProcedures directory. When you compare executions, the results of the selected tests display in a pop-up
window. Beware, though: executions are described only by date and sequence number, so keep track!
Comparing Images
Comparing images helps you diagnose problems in your application. You can compare your results with Khronos's
blessed images or with images you've produced during different test runs or steps (see Figure 6-9). Comparing
images involves three dialog boxes: “Compare Image With,” where you select the images you want to use; “Image
Comparison,” where you actually look at the images side by side; and “Image Diff,” which shows you the pixel-by-pixel
differences.
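The pixel-by-pixel comparison the "Image Diff" dialog performs can be sketched as follows. This is a minimal illustration, assuming images are same-sized grids of (r, g, b) tuples; the real CTS compares rendered image files, and the `tolerance` parameter here is an invented convenience, not a CTS option.

```python
def image_diff(img_a, img_b):
    """Return the (row, col) positions where two same-sized images differ."""
    diffs = []
    for r, (row_a, row_b) in enumerate(zip(img_a, img_b)):
        for c, (pixel_a, pixel_b) in enumerate(zip(row_a, row_b)):
            if pixel_a != pixel_b:
                diffs.append((r, c))
    return diffs

def matches_blessed(rendered, blessed, tolerance=0):
    """True when rendered differs from blessed in at most `tolerance` pixels."""
    return len(image_diff(rendered, blessed)) <= tolerance
```

Comparing against the blessed image this way tells you at a glance whether a render regression is a single stray pixel or a wholesale difference.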