information from a wide range of organizations is not available for a survey. Obviously, data from CMMI evaluations or similar activities would give information on real testing practice. However, it is difficult to access the details of such evaluation processes, only a small percentage of organizations have been evaluated according to this model, and the public information from the SEI does not include details of each process area. Aware of this limitation, we decided to use as a reference the list of best practices for software testing of the Quality Assurance Institute (www.qaiusa.org): one advantage is that QAI carried out several surveys using this list in the USA from 1994 to 1999, so this reference of 20 practices was refined with their results and experience. In fact, published surveys do not focus their attention on specific process models but on customized lists of questions, ranging from detailed techniques to organizational topics. In the case of [15][16], the items covered ranged from the general testing approach (independent testing, etc.), budget, or standards to specific methods and possible barriers to the adoption of specific practices. Other studies [17] focused more on detailed methods (e.g., the use of notations such as UML, structured diagrams, etc. for documentation) and on specific data on the percentage of effort devoted to testing, even in relation to project size; the study in [18] (also based on the contacts of a network of practitioners and researchers) focused on extremely detailed aspects of software unit testing, although some of its conclusions might be common to general testing practices.
Table 2. Summary of results from survey on testing practices implemented in organizations

QAI practice | Implemented
1. Identified responsibility for testing processes in the organization? | 28.57%
2. Is there, and is used, a standard for test plans? | 23.33%
3. Is there, and is used, a standard for unit testing? | 18.10%
4. Is there, and is used, a standard for test reports? | 27.14%
5. Are test planning and execution processes parallel to the whole development process? | 28.57%
6. Is it checked whether software specifications are correct? | 39.05%
7. Besides being correctly implemented, is it checked whether customer expectations are fulfilled? | 48.57%
8. Does testing staff check whether development documents are complete and correct? | 21.43%
9. Does testing staff report defects to developers (and not to managers)? | 41.43%
10. Does testing staff identify business risks before developing the test plan? | 11.43%
11. Are there measurable objectives for each tested system? | 14.76%
12. Are testing objectives clearly linked to business risks? | 14.29%
13. Are detected defects recorded, reported, and used to improve development and testing processes? | 28.10%
14. Has testing staff defined defect expectations according to past experience? | 17.62%
15. Is there a testing process improvement process? | 18.10%
16. Are defects identified with a unique code? | 20.95%
17. Does the organization record, report, and use defect data to assess test effectiveness? | 17.62%
18. Are metrics used for planning and evaluating testing processes? | 9.05%
19. Are there specific training processes for the testing staff? | 17.62%
20. Do testing tools represent a significant element of the testing process? | 12.50%

Organizations by number of practices implemented | Implemented
No. of practices from 0 to 4 | 10.95%
No. of practices from 5 to 8 | 31.90%
No. of practices from 9 to 12 | 24.29%
No. of practices from 13 to 16 | 19.05%
No. of practices from 17 to 20 | 13.33%