Each of these tasks is evaluated with respect to five main questions, which constitute
a simplified form of the sort of assessment a standard maturity model would require:
1. Is there a documented process for the task?
2. Is there an established process for the task?
3. If there is an established process, is it executed consistently?
4. If both an established, consistently executed process and a documented process
were found, do they match?
5. Is the process adequate for its intended purpose?
In order to produce assessment results that allow for comparison of a project's performance in different areas, the answers to these questions are encoded in a predefined, normalized form. These basic results, in turn, are used to compute indicators that are integrated into the QualOSS model, and that, similar to other QualOSS metrics, are intended to contribute to an overall view of an OSS project's quality.
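As a purely illustrative sketch, the following Python fragment shows one way such normalized answers and a per-task indicator could be represented. The yes/partially/no scale, the identifier names, and the aggregation by plain mean are assumptions made for illustration, not the actual QualOSS definitions.

```python
from enum import Enum
from statistics import mean

class Answer(Enum):
    """Hypothetical normalized encoding of an assessment answer."""
    NO = 0.0
    PARTIALLY = 0.5
    YES = 1.0

# The five questions, keyed by short hypothetical labels.
QUESTIONS = (
    "documented",   # Q1: is there a documented process?
    "established",  # Q2: is there an established process?
    "consistent",   # Q3: is it executed consistently?
    "matching",     # Q4: do documented and established process match?
    "adequate",     # Q5: is the process adequate for its purpose?
)

def task_indicator(answers: dict[str, Answer]) -> float:
    """Aggregate the five normalized answers into a single per-task
    indicator in [0, 1]. The aggregation rule (a plain mean) is an
    assumption, not the one defined by QualOSS."""
    return mean(answers[q].value for q in QUESTIONS)

# Example: a documented release process that is not executed consistently.
release_task = {
    "documented": Answer.YES,
    "established": Answer.YES,
    "consistent": Answer.PARTIALLY,
    "matching": Answer.PARTIALLY,
    "adequate": Answer.YES,
}
print(f"release process indicator: {task_indicator(release_task):.2f}")  # 0.80
```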
In order to address these questions for each of our selected tasks, we have already
defined simple evaluation procedures. In the following, we outline these procedures.
Question 1 is concerned with process documentation. Although process documentation is seldom found under that name for Open Source projects, many projects have indeed documented procedures for a variety of development tasks. The reasons for providing documentation are often related to making it easier for external contributors to perform certain tasks (e.g., submit a problem report or a so-called patch file with a correction), as well as to making certain tasks more reliable (release processes are a typical case). Our procedure for finding documentation for a task is based on searching through the Internet resources made available by a given project for the relevant information, as follows:
1. Check project resources for documentation regarding the task. Perform an Internet
search if necessary. Acceptable documentation consists of explicit documents
(Web/Wiki pages, archived mail/forum messages) that contain direct instructions for
performing the task. In some cases, these are presented as templates, or as a set of
examples.
2. If no explicit documentation was found, check whether a tool is being used to support
the task. If so, check whether the tool can be used in a self-explanatory manner; if it
can, this can be accepted as documentation.
3. If 30 minutes of searching do not yield any positive results, stop searching.
The final step confines the evaluation to a time box. This is important because we can never be sure that no documentation exists for a task, only that it could not be found with reasonable search effort.
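To make the possible outcomes of this procedure concrete, the following Python sketch models the result of a time-boxed search as a small record. The type names, fields, and example URL are hypothetical; note that a NONE_FOUND value records only that nothing was found within the time box, not that no documentation exists.

```python
from dataclasses import dataclass
from enum import Enum, auto

class DocEvidence(Enum):
    """Hypothetical classification of what the search turned up."""
    EXPLICIT_DOCUMENT = auto()      # step 1: Web/Wiki page, archived message, template
    SELF_EXPLANATORY_TOOL = auto()  # step 2: tool usable without further instructions
    NONE_FOUND = auto()             # step 3: the 30-minute time box expired

@dataclass
class DocSearchResult:
    """Outcome of the time-boxed documentation search for one task."""
    task: str
    evidence: DocEvidence
    minutes_spent: int          # search effort, capped at the 30-minute time box
    source: str | None = None   # URL or archive reference, when evidence was found

# Hypothetical example: patch-submission instructions found on a project wiki.
result = DocSearchResult(
    task="patch submission",
    evidence=DocEvidence.EXPLICIT_DOCUMENT,
    minutes_spent=12,
    source="https://example.org/wiki/SubmittingPatches",
)
```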
The second question is concerned with how established a process is. Notice that this question is, to a large extent, independent from the first one, because undocumented processes can nonetheless be well established, and documented processes may not be followed as prescribed. In order to check for established processes, standard maturity models rely on the fact that such processes leave a paper trail behind them that allows them to be observed reliably. If such a trail cannot be found, the odds are very high that the process is not established, i.e., not followed at all, or not followed in a consistent manner. Strictly speaking, of course, a paper trail cannot be found for OSS processes, but a data trail can often be seen when looking at the diverse data repositories that belong to a project, such as: