1.1 Related Work
Other studies of coding processes and reliability have been conducted over the last
few decades. The majority of these have been based either on studies of large systems
[3, 8] and mainframe-based operations [8], or have analyzed software vendors [7]. In
the few cases where coding practices within individual organizations have been
quantitatively analyzed, the organizations have nearly always been large telecommu-
nications firms [1, 2, 5, 6, 8], the studies have focused on SCADA and other critical-system
providers [9], or the approaches have been non-quantitative [12, 13].
Whilst these results are extremely valuable, they do not reflect the state of affairs
within the vast majority of organizations. Because small to medium businesses far
outnumber the comparatively few large organizations with highly focused, dedicated
large-scale development teams (as can be found in any software vendor), an
analysis of in-house practice is critical to both the security and the economics of in-house
coding.
As the Internet becomes all-pervasive, internal coding functions are only
likely to become more prevalent and hence more crucial to the security of the
organization.
1.2 Our Contribution
In Section 2 we present an analysis of the empirical study completed to determine the
cost of finding, testing and fixing software bugs. In Section 3 we model the discovery of bugs or
vulnerabilities using a Cobb-Douglas function and calculate the defect rate
per SLOC (source lines of code) using Bayesian calculations. Finally, the paper is summa-
rized and concluded in Section 4.
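For orientation, the general two-input Cobb-Douglas form used in Section 3 can be written as below; the choice of inputs (for example, testing effort and code size) and the Gamma-Poisson form of the Bayesian defect-rate estimate are illustrative assumptions here rather than the exact specification used in the study.

    B = A \, x_1^{\alpha} \, x_2^{\beta}, \qquad \alpha, \beta > 0,

where B is the number of bugs or vulnerabilities discovered and x_1, x_2 are the model inputs. For the defect rate \lambda per thousand SLOC, a standard Bayesian treatment takes observed defect counts d_i over code sizes s_i (in KSLOC) as d_i \sim \mathrm{Poisson}(\lambda s_i) with a \mathrm{Gamma}(a, b) prior on \lambda, giving the posterior

    \lambda \mid d \sim \mathrm{Gamma}\!\left(a + \sum_i d_i,\; b + \sum_i s_i\right).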
2 An Analysis of Coding Practice
A series of 277 coding projects in 15 companies with in-house developers was ana-
lyzed over multiple years. The costs, both in terms of time and as a function of finan-
cial expenditure, were recorded. The analysis recorded: format string errors, integer
overflows, buffer overruns, SQL injection, cross-site scripting, race conditions, and
command injection. The code samples were analyzed by the authors using a combina-
tion of static tools and manual verification against the OWASP 1 and SANS 2 secure coding
guidelines during both the development and maintenance phases. For the 277 coding
projects, the following data fields were collected (a schematic record is sketched after this list):
- the total number of hours
  o Coding / Debugging (each recorded separately)
- tloc (thousand lines of source code)
- the number of bugs (both initially and over time as patches are released)
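For concreteness, a per-project record implied by these fields might look as follows in Java (one of the languages in the study); the field and method names are our own illustration and are not taken from the study's data set.

    // Hypothetical sketch of one per-project record; names are illustrative only.
    public record ProjectRecord(
            String projectId,       // anonymised project identifier
            double codingHours,     // hours spent coding
            double debuggingHours,  // hours spent debugging (recorded separately)
            double tloc,            // thousand lines of source code
            int initialBugs,        // bugs found at initial release
            int postReleaseBugs     // bugs found as patches are released
    ) {
        // Derived quantity: defect density per thousand lines of code.
        public double defectsPerTloc() {
            return (initialBugs + postReleaseBugs) / tloc;
        }
    }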
The coding projects were developed using a combination of the Java, C# (.NET), PHP
and C++ languages. The authors collected data between June 2008 and December
1 http://www.owasp.org/index.php/Secure_Coding_Principles
2 http://www.sans-ssi.org/
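To illustrate one of the defect classes listed above, the Java sketch below contrasts a SQL-injection-prone query with a parameterized alternative of the kind recommended by the OWASP guidelines; it is a generic illustration, not code drawn from the analyzed projects.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    public class UserLookup {

        // Vulnerable pattern: user input is concatenated directly into the query,
        // so a value such as "' OR '1'='1" changes the meaning of the SQL.
        ResultSet findUserUnsafe(Connection conn, String userName) throws SQLException {
            String sql = "SELECT * FROM users WHERE name = '" + userName + "'";
            return conn.createStatement().executeQuery(sql);
        }

        // Safer pattern: a parameterized query keeps data separate from SQL code.
        ResultSet findUserSafe(Connection conn, String userName) throws SQLException {
            PreparedStatement stmt =
                    conn.prepareStatement("SELECT * FROM users WHERE name = ?");
            stmt.setString(1, userName);
            return stmt.executeQuery();
        }
    }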