8.3.2. Managing Queued Tasks
Bounded thread pools limit the number of tasks that can be executed concurrently. (The single-threaded executors are a notable special case: they guarantee that no tasks will execute concurrently, offering the possibility of achieving thread safety through thread confinement.)
We saw in Section 6.1.2 how unbounded thread creation could lead to instability, and addressed this problem by using a fixed-size thread pool instead of creating a new thread for every request. However, this is only a partial solution; it is still possible for the application to run out of resources under heavy load, just harder. If the arrival rate for new requests exceeds the rate at which they can be handled, requests will still queue up. With a thread pool, they wait in a queue of Runnables managed by the Executor instead of queueing up as threads contending for the CPU. Representing a waiting task with a Runnable and a list node is certainly a lot cheaper than with a thread, but the risk of resource exhaustion still remains if clients can throw requests at the server faster than it can handle them.
Requests often arrive in bursts even when the average request rate is fairly stable. Queues can
help smooth out transient bursts of tasks, but if tasks continue to arrive too quickly you will
eventually have to throttle the arrival rate to avoid running out of memory. [4] Even before
you run out of memory, response time will get progressively worse as the task queue grows.
ThreadPoolExecutor allows you to supply a BlockingQueue to hold tasks awaiting execution. There are three basic approaches to task queueing: unbounded queue, bounded queue, and synchronous handoff. The choice of queue interacts with other configuration parameters such as pool size.
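For orientation, here is a minimal sketch of how the work queue is supplied through the general-purpose ThreadPoolExecutor constructor; the pool sizes and keep-alive time below are illustrative assumptions, not tuning advice.

import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class PoolWithQueue {
    // The work queue is handed to the general-purpose ThreadPoolExecutor
    // constructor; it may be unbounded, bounded, or a SynchronousQueue.
    // Core/maximum pool size and keep-alive time here are illustrative only.
    static ThreadPoolExecutor newPool(BlockingQueue<Runnable> workQueue) {
        return new ThreadPoolExecutor(
                10,                         // corePoolSize
                10,                         // maximumPoolSize
                0L, TimeUnit.MILLISECONDS,  // keepAliveTime for idle extra threads
                workQueue);
    }
}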
The default for newFixedThreadPool and newSingleThreadExecutor is to use an unbounded LinkedBlockingQueue. Tasks will queue up if all worker threads are busy, but the queue could grow without bound if the tasks keep arriving faster than they can be executed.
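For reference, the Executors Javadoc documents newFixedThreadPool as behaving like a ThreadPoolExecutor configured with an unbounded LinkedBlockingQueue, roughly as in this sketch:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class FixedPoolSketch {
    // Roughly what Executors.newFixedThreadPool(nThreads) returns: a fixed-size
    // pool whose work queue is an unbounded LinkedBlockingQueue, so waiting
    // tasks accumulate in the queue whenever all nThreads workers are busy.
    static ExecutorService fixedPool(int nThreads) {
        return new ThreadPoolExecutor(nThreads, nThreads,
                0L, TimeUnit.MILLISECONDS,
                new LinkedBlockingQueue<Runnable>());
    }
}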
A more stable resource management strategy is to use a bounded queue, such as an ArrayBlockingQueue or a bounded LinkedBlockingQueue or PriorityBlockingQueue. Bounded queues help prevent resource exhaustion but introduce the question of what to do with new tasks when the queue is full. (There are a number of possible saturation policies for addressing this problem; see Section 8.3.3.) With a bounded work queue, the queue size and pool size must be tuned together. A large queue coupled with a small pool reduces CPU usage, OS resources, and context-switching overhead, at the cost of potentially constraining throughput.
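As an illustrative sketch of the bounded-queue approach (the pool size and queue capacity here are assumed values, not recommendations):

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class BoundedQueuePool {
    private static final int N_THREADS = 10;       // pool size, tuned to the workload
    private static final int QUEUE_CAPACITY = 100; // queue bound, tuned together with the pool size

    // When the bounded queue fills, execute() defers to the executor's
    // saturation policy (Section 8.3.3); the default policy rejects the task
    // by throwing RejectedExecutionException.
    static ThreadPoolExecutor newBoundedPool() {
        return new ThreadPoolExecutor(N_THREADS, N_THREADS,
                0L, TimeUnit.MILLISECONDS,
                new ArrayBlockingQueue<Runnable>(QUEUE_CAPACITY));
    }
}

With this configuration, a burst that leaves more than QUEUE_CAPACITY tasks waiting is rejected under the default saturation policy rather than being allowed to consume memory without bound.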