Visibility
JMM describes the conditions under which the effects produced by actions in one thread are visible to another thread.
Mainly, it describes when a new value that one thread writes to a field becomes visible to another thread. I will discuss the visibility aspect of JMM in more detail when I cover locks, synchronization, and volatile variables later in this chapter. For completeness, the following are some of the visibility rules:
•	When a thread reads the value of a field for the first time, it will read either the initial value of the field or some value that was written to that field by some other thread.
•	A write to a volatile variable is always written to the main memory. A read of a volatile variable is always performed from the main memory. That is, a volatile variable is never cached in the working memory of a thread. In effect, any write to a volatile variable is flushed to the main memory, immediately making the new value visible to other threads (see the sketch following this list).
•	When a thread terminates, the working memory of the thread is written to the main memory immediately. That is, after a thread terminates, all variable values visible only to the terminated thread are made visible to all threads.
•	When a thread enters a synchronized block, that thread reloads the values of all variables in its working memory. When a thread leaves a synchronized block, it writes all variable values from its working memory to the main memory.
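The following is a minimal sketch of the volatile rule in action. The VolatileVisibility class and its done flag are hypothetical names chosen for illustration; they do not come from any listing in this chapter.

public class VolatileVisibility {
    // Without volatile, the reader thread could cache this flag in its
    // working memory and never observe the writer's update.
    private static volatile boolean done = false;

    public static void main(String[] args) throws InterruptedException {
        Thread reader = new Thread(() -> {
            while (!done) {
                // Spin until the write to the volatile field becomes visible.
            }
            System.out.println("Reader observed done == true");
        });
        reader.start();

        Thread.sleep(100); // give the reader a moment to start spinning
        done = true;       // flushed to the main memory, visible to the reader
        reader.join();
    }
}

Because done is declared volatile, the reader thread reads it from the main memory on every iteration and is guaranteed to exit the loop once the main thread writes true.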
Ordering
JMM describes in what order actions are performed within a thread and among threads. It guarantees that all actions
performed within a thread are ordered. Actions in different threads are not guaranteed to be performed in any particular order. You may achieve some ordering among actions in different threads by using the synchronization techniques described later in this chapter.
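As a minimal sketch of this point (the class and field names below are hypothetical, not taken from a listing in this chapter), consider a writer thread that updates a data field and then a ready flag. Within the writer thread those two actions are ordered, but without synchronization or volatile the reader thread is not guaranteed to observe them in that order, or at all:

public class NoOrderingGuarantee {
    private static int data = 0;
    private static boolean ready = false; // deliberately not volatile

    public static void main(String[] args) throws InterruptedException {
        Thread reader = new Thread(() -> {
            // Without synchronization this loop may never see ready == true,
            // or it may see ready == true while data still appears to be 0.
            while (!ready) { }
            System.out.println("data = " + data); // not guaranteed to print 42
        });
        reader.setDaemon(true); // let the JVM exit even if the reader spins forever
        reader.start();

        data = 42;         // first action in the main thread
        ready = true;      // second action in the main thread
        reader.join(1000); // wait at most one second for the reader
    }
}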
Tip
Each thread in a Java program uses two kinds of memory: working memory and main memory. A thread cannot access the working memory of another thread. Main memory is shared among the threads; threads communicate with each other using the main memory. Every thread has its own stack, which is used to store local variables.
Object's Monitor and Threads Synchronization
In a multi-threaded program, a section of code that may have undesirable effects on the outcome of the program if
executed by multiple threads concurrently is called a critical section. Often, the undesirable effects result from the
concurrent use of a resource by multiple threads in the critical section. It is necessary to control the access to a critical
section in a program so only one thread can execute the critical section at a time.
In a Java program, a critical section can be a block of statements or a method. Java has no built-in mechanism to
identify a critical section in a program. However, Java has many built-in constructs that allow programmers to declare
a critical section, and to control and coordinate access to it. It is the programmer's responsibility to identify critical sections in a program and to control access to them by multiple threads. Controlling and coordinating the access to a critical section by multiple threads is known as threads synchronization. Threads synchronization
is always a challenging task when writing a multi-threaded program. In Listing 6-4, the updateBalance() and
monitorBalance() methods are critical sections and you must synchronize the threads' access to these two methods
to get a consistent output.
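The following is a minimal sketch of one way such synchronization can be expressed; it is not the book's Listing 6-4. Only the method names updateBalance() and monitorBalance() are taken from the text above, and the Account class and its method bodies are assumptions made for illustration.

public class Account {
    private int balance = 100;

    // Critical section: the read-modify-write of balance must not interleave
    // with other threads updating or reading it.
    public synchronized void updateBalance(int amount) {
        balance = balance + amount;
    }

    // Critical section: must observe a consistent value of balance.
    public synchronized void monitorBalance() {
        System.out.println("Current balance: " + balance);
    }
}

Declaring both methods synchronized makes each thread acquire the Account object's monitor before entering either method, so at most one thread at a time can execute these critical sections on the same Account object.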
 
 