Following are some of the performance concepts and indicators we use in the field to quantify the performance of our cloud computing systems. We use them both as maintenance tools and as root-cause analysis tools.
Input/Output Operations per Second (IOPS)
Because we are basically dealing with data and its inevitable storage, input/output operations
per second (IOPS) is one of the top performance indicators that we consider in the industry.
A computer system is not simply defined by its core processing components; that is, it's not
defined by its individual parts but by the synergy of the whole. It won't do anyone any good
to have powerful processors and lots of RAM if it takes too long to retrieve the data from
storage and then takes as much time putting it there again. That, as we all know, is the proverbial bottleneck, and it would lower the system's overall throughput significantly. A computer system is like a chain; it is only as strong or, in this case, as fast as its slowest component. With that in mind, one of our main goals in performance monitoring should be to find potential bottlenecks and actively eliminate them.
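To make the chain analogy concrete, the short Python sketch below uses purely hypothetical per-request latencies (the numbers are illustrative, not measured) to show how a single slow storage operation can dominate the total time of a request, regardless of how fast the processor and memory are.

# A minimal sketch with assumed, illustrative latencies: even fast CPU and
# RAM cannot compensate for a slow storage tier, because per-request time
# is dominated by the slowest stage in the chain.
cpu_ms = 0.05      # assumed compute time per request
ram_ms = 0.10      # assumed memory-access time per request
hdd_ms = 10.0      # assumed time for one random HDD I/O (seek + rotation)

total_ms = cpu_ms + ram_ms + hdd_ms
print(f"Total per-request time: {total_ms:.2f} ms")
print(f"Storage share of total time: {hdd_ms / total_ms:.0%}")
# Roughly 99 percent of the time is spent waiting on the disk, so a faster
# processor or more RAM barely moves overall throughput.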
Storage systems, which usually employ devices with mechanical parts such as hard disk
drives (HDDs), are by far the slowest components of any computer system. Motors and
actuators simply cannot keep up with the speed of electrons. There are now better alternatives such as solid-state drives, but the technology is not yet mature enough to benefit from economies of scale. Because of this, IOPS remains one of the top performance indicators in any computer system, especially in cloud computing, where data manipulation and storage are central to the paradigm. Figure 5.1 shows a screenshot of iostat, a common Linux tool.
FIGURE 5.1 iostat is a common Linux tool used to measure IOPS.
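For example, the following Python sketch shells out to iostat (from the sysstat package, the same tool shown in Figure 5.1) and adds up the reads and writes per second reported for each device. The exact column names and layout vary between sysstat versions, so treat the parsing here as an assumption rather than a fixed format.

# Minimal sketch: run `iostat -dx 1 2` and report r/s + w/s per device from
# the second sample, i.e., the rates observed during the last one-second
# interval. Column names such as "Device", "r/s", and "w/s" are assumptions
# that may differ slightly between sysstat versions and locales.
import subprocess

def current_iops():
    out = subprocess.run(
        ["iostat", "-dx", "1", "2"],   # extended device stats, 2 samples, 1 s apart
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()

    iops, header = {}, None
    for line in out:
        cols = line.split()
        if not cols:
            continue
        if cols[0] == "Device":            # start of a (new) device table
            header = cols
            iops = {}                      # keep only the latest sample
        elif header and len(cols) == len(header):
            row = dict(zip(header, cols))
            if "r/s" in row and "w/s" in row:
                iops[row["Device"]] = float(row["r/s"]) + float(row["w/s"])
    return iops

if __name__ == "__main__":
    for device, value in current_iops().items():
        print(f"{device}: {value:.1f} IOPS")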
IOPS is the most common measurement used by manufacturers themselves to benchmark different storage devices. Because the conditions of benchmark testing are often standardized and the testing is done in a controlled environment, the results do not always coincide with real-world use, and in most cases the IOPS numbers advertised by manufacturers are higher than what users get from their own tests. Again, this is due to differences in the applications and other hardware that make up the various computer systems of different users.
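If you want to see that discrepancy for yourself, a rough micro-benchmark like the hypothetical Python sketch below, which issues random 4 KiB reads against an existing large file (the path /tmp/testfile.bin is only a placeholder), will usually produce numbers quite different from a vendor's data sheet, not least because the operating system's page cache inflates the results unless the file is far larger than RAM or direct I/O is used.

# Rough, hypothetical micro-benchmark: count random 4 KiB reads completed per
# second against an existing file. Results are easily skewed by the page
# cache, queue depth, and access pattern, which is one reason home-grown
# numbers rarely match vendor benchmarks run under controlled conditions.
import os
import random
import time

def measure_read_iops(path, duration_s=5.0, block=4096):
    size = os.path.getsize(path)           # file should be much larger than RAM
    fd = os.open(path, os.O_RDONLY)
    ops = 0
    deadline = time.monotonic() + duration_s
    try:
        while time.monotonic() < deadline:
            offset = random.randrange(0, size - block)
            os.pread(fd, block, offset)    # one random 4 KiB read
            ops += 1
    finally:
        os.close(fd)
    return ops / duration_s

if __name__ == "__main__":
    # /tmp/testfile.bin is a placeholder; point this at any large existing file.
    print(f"{measure_read_iops('/tmp/testfile.bin'):.0f} read IOPS")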