1.2. Benefits of Threads
When used properly, threads can reduce development and maintenance costs and improve the
performance of complex applications. Threads make it easier to model how humans work and
interact, by turning asynchronous workflows into mostly sequential ones. They can also turn
otherwise convoluted code into straight-line code that is easier to write, read, and maintain.
Threads are useful in GUI applications for improving the responsiveness of the user interface,
and in server applications for improving resource utilization and throughput. They also
simplify the implementation of the JVM: the garbage collector usually runs in one or more
dedicated threads. Most nontrivial Java applications rely to some degree on threads for
their organization.
1.2.1. Exploiting Multiple Processors
Multiprocessor systems used to be expensive and rare, found only in large data centers and
scientific computing facilities. Today they are cheap and plentiful; even low-end server and
midrange desktop systems often have multiple processors. This trend will only accelerate; as
it gets harder to scale up clock rates, processor manufacturers will instead put more processor
cores on a single chip. All the major chip manufacturers have begun this transition, and we
are already seeing machines with dramatically higher processor counts.
Since the basic unit of scheduling is the thread, a program with only one thread can run on at
most one processor at a time. On a two-processor system, a single-threaded program is
giving up access to half the available CPU resources; on a 100-processor system, it is
giving up access to 99%. On the other hand, programs with multiple active threads can
execute simultaneously on multiple processors. When properly designed, multithreaded
programs can improve throughput by utilizing available processor resources more effectively.
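To make the processor-utilization point concrete, here is a minimal sketch, not taken from the text, that divides a CPU-bound summation across one task per available processor using an Executor. The class name ParallelSum and the array-summing workload are illustrative assumptions; the java.util.concurrent calls (Executors, ExecutorService, Future) are standard library APIs.

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelSum {
    public static void main(String[] args) throws Exception {
        // One worker thread per processor is a reasonable starting point
        // for CPU-bound work.
        int nThreads = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(nThreads);

        final long[] data = new long[10_000_000];
        for (int i = 0; i < data.length; i++)
            data[i] = i;

        // Split the array into one chunk per task; on a multiprocessor the
        // chunks can be summed in parallel, on a uniprocessor they run in turn.
        int chunk = data.length / nThreads;
        List<Future<Long>> partials = new ArrayList<Future<Long>>();
        for (int t = 0; t < nThreads; t++) {
            final int from = t * chunk;
            final int to = (t == nThreads - 1) ? data.length : from + chunk;
            partials.add(pool.submit(new Callable<Long>() {
                public Long call() {
                    long sum = 0;
                    for (int i = from; i < to; i++)
                        sum += data[i];
                    return sum;
                }
            }));
        }

        long total = 0;
        for (Future<Long> f : partials)
            total += f.get();          // waits for each partial sum
        pool.shutdown();
        System.out.println("Sum = " + total);
    }
}

Sizing the pool to the number of available processors is a common choice for CPU-bound work, since threads beyond the processor count add scheduling overhead without adding parallelism.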
Using multiple threads can also help achieve better throughput on single-processor systems.
If a program is single-threaded, the processor remains idle while it waits for a synchronous
I/O operation to complete. In a multithreaded program, another thread can still run while the
first thread is waiting for the I/O to complete, allowing the application to make progress
during the blocking I/O. (This is like reading the newspaper while waiting for the water to
boil, rather than waiting for the water to boil before starting to read.)
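A minimal sketch of this kind of overlap, assuming a placeholder file named data.bin exists: a FutureTask performs the blocking read on a second thread while the main thread continues computing, and get() collects the result, blocking only if the read has not yet finished. The class name and file path are hypothetical.

import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.concurrent.Callable;
import java.util.concurrent.FutureTask;

public class OverlappedIO {
    public static void main(String[] args) throws Exception {
        // The second thread performs the blocking read
        // ("waits for the water to boil").
        FutureTask<byte[]> readTask = new FutureTask<byte[]>(new Callable<byte[]>() {
            public byte[] call() throws Exception {
                return Files.readAllBytes(Paths.get("data.bin"));  // placeholder path
            }
        });
        new Thread(readTask).start();

        // Meanwhile the main thread keeps making progress
        // ("reads the newspaper").
        long busyWork = 0;
        for (int i = 0; i < 1_000_000; i++)
            busyWork += i;

        byte[] contents = readTask.get();  // blocks only if the read is not done yet
        System.out.println("Did " + busyWork + " units of work while reading "
                + contents.length + " bytes");
    }
}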