architecture. His law states that the speed gained by using an improved mode of execution is limited by how much the new mode is actually used. For example, if an execution mode that is used 10% of the time is made 100% faster, the overall performance of the system improves by only about 5%. On the other hand, if a mode of execution that is used 90% of the time is made 50% faster, the system as a whole gains roughly 43%. In other words, a small improvement to a frequently used mode of execution has a far larger impact on performance than a large improvement to a mode that is seldom used. To have the greatest effect on the efficiency of a system, one should improve the processes that account for the greatest share of the execution time. Designers of computer systems must therefore pay attention to changes in technology, identify the technologies that have gained the most speed, and make sure that the older technologies a new technology must interact with do not squander that gain.
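The two figures above follow directly from Amdahl's law, which gives the overall speedup as 1 / ((1 - f) + f / s), where f is the fraction of execution time affected and s is how much faster that fraction becomes. The short C program below is a minimal sketch, not part of the original text, that reproduces both calculations.

    #include <stdio.h>

    /* Amdahl's law: overall speedup when a fraction f of the
     * execution time is made s times faster.                  */
    static double amdahl(double f, double s)
    {
        return 1.0 / ((1.0 - f) + f / s);
    }

    int main(void)
    {
        /* Mode used 10% of the time, made 100% faster (2x). */
        printf("10%% of time, 2.0x faster: %.1f%% overall gain\n",
               (amdahl(0.10, 2.0) - 1.0) * 100.0);   /* about 5.3%  */

        /* Mode used 90% of the time, made 50% faster (1.5x). */
        printf("90%% of time, 1.5x faster: %.1f%% overall gain\n",
               (amdahl(0.90, 1.5) - 1.0) * 100.0);   /* about 42.9% */

        return 0;
    }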
As an example, the Central Processing Unit (CPU) is one technology that has seen great improvement in the recent past. Although the speed and capacity of the newest CPUs, as well as other key hardware components, keep increasing, the full benefit of that increase will not be realized if the rest of the computer system cannot keep pace. One part of the system that may not be keeping pace with ever increasing hardware speeds is the operating system. As Engler, Kaashoek, and O'Toole explain, “Traditional operating systems limit the performance, flexibility, and functionality of applications by fixing the interface and implementation of operating system abstractions such as interprocess communication and virtual memory” (Engler, Kaashoek, & O'Toole, 1995). As a further example, John Ousterhout states, “Operating systems derived from UNIX use caches to speed up reads, but they require synchronous disk I/O of operations that modify files. If this coupling isn't eliminated, a large class of file-intensive programs will receive little or no benefit from faster hardware” (Ousterhout, 1989). The new era of operating system design demands that operating systems keep pace with faster hardware or risk becoming the component that holds overall system performance stagnant.
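To make Ousterhout's point concrete, the C sketch below (hypothetical file names, standard POSIX calls) contrasts a read that can usually be served from the operating system's buffer cache with a modifying write that is flushed to disk synchronously via fsync(); on the synchronous path the program waits on the disk rather than on the faster CPU, which is the coupling he describes.

    #include <fcntl.h>
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>

    int main(void)
    {
        char buf[4096];

        /* Read path: after the first access the data typically sits in the
         * kernel's buffer cache, so repeated reads run at memory speed.    */
        int rfd = open("data.txt", O_RDONLY);          /* hypothetical file */
        if (rfd >= 0) {
            ssize_t n = read(rfd, buf, sizeof buf);    /* likely cache hit  */
            if (n > 0)
                printf("read %zd bytes, possibly from cache\n", n);
            close(rfd);
        }

        /* Write path: to keep the file consistent on disk, the update is
         * flushed synchronously; the program now waits on the disk, not
         * on the CPU.                                                      */
        int wfd = open("log.txt", O_WRONLY | O_CREAT | O_APPEND, 0644);
        if (wfd >= 0) {
            const char *rec = "update\n";
            write(wfd, rec, strlen(rec));
            fsync(wfd);                                /* synchronous disk I/O */
            close(wfd);
        }
        return 0;
    }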
the approach
As Lee Carver and others state, an operating
system is a necessary evil (Carver, Chen, &
Reyes, 1998). Therefore, computers will have
an operating system of one sort or another. The
growing requirements that operating systems
become faster and more flexible have encouraged
many researchers to consider operating systems
with radical designs. One of the new designs is
an extensible operating system.
An extensible operating system is simply an operating system that is designed to be easy to change. An operating system that can be readily modified can be better matched both to the underlying hardware and to the needs of user applications, and it can be tuned and optimized as those needs change, so it addresses the speed and flexibility concerns together. By offering the hope of higher speed and a more flexible implementation, the extensible approach seems, at least for the moment, to be one way to keep the performance of computer systems from stagnating while also meeting the rapidly changing needs of user applications.
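As a rough illustration of what “easy to change” can mean in practice, the following C fragment is a hypothetical sketch, not the interface of any real system: it lets an application supply its own page-eviction policy to a kernel-like resource manager through a function pointer, instead of being locked into one fixed implementation.

    #include <stdio.h>
    #include <stddef.h>

    /* A hypothetical extension point: the resource manager asks an
     * application-supplied policy which of n resident pages to evict. */
    typedef size_t (*evict_policy_fn)(const unsigned long *last_use, size_t n);

    /* Built-in default: evict the least recently used page. */
    static size_t evict_lru(const unsigned long *last_use, size_t n)
    {
        size_t victim = 0;
        for (size_t i = 1; i < n; i++)
            if (last_use[i] < last_use[victim])
                victim = i;
        return victim;
    }

    /* An application that knows its own access pattern can plug in
     * something different, e.g. evict the most recently used page. */
    static size_t evict_mru(const unsigned long *last_use, size_t n)
    {
        size_t victim = 0;
        for (size_t i = 1; i < n; i++)
            if (last_use[i] > last_use[victim])
                victim = i;
        return victim;
    }

    int main(void)
    {
        unsigned long last_use[] = { 40, 7, 99, 23 };  /* fake access timestamps */
        evict_policy_fn policy = evict_lru;            /* the fixed, built-in choice */
        printf("built-in policy evicts page %zu\n", policy(last_use, 4));

        policy = evict_mru;                            /* an extensible system lets the
                                                          application override it      */
        printf("application policy evicts page %zu\n", policy(last_use, 4));
        return 0;
    }

The point is only the shape of the interface: the policy decision moves from a fixed kernel implementation to code the application controls, which is the kind of flexibility an extensible operating system aims to provide.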
A group of researchers at the Massachusetts Institute of Technology has implemented its version of an extensible operating system in what it calls the Exokernel Operating System.