many clever (or at least clever-looking) tricks were invented to reduce the impact of synchronization—some good, some bad, and some ugly. DCL falls into the "ugly" category.
Again, because the performance of early JVMs left something to be desired, lazy initialization was often used to avoid potentially unnecessary expensive operations or reduce application startup time. A properly written lazy initialization method requires synchronization. But at the time, synchronization was slow and, more importantly, not completely understood: the exclusion aspects were well enough understood, but the visibility aspects were not.
DCL purported to offer the best of both worlds—lazy initialization without paying the synchronization penalty on the common code path. The way it worked was first to check whether initialization was needed without synchronizing, and if the resource reference was not null, use it. Otherwise, synchronize and check again whether the Resource is initialized, ensuring that only one thread actually initializes the shared Resource. The common code path—fetching a reference to an already constructed Resource—doesn't use synchronization. And that's where the problem is: as described in Section 16.2.1, it is possible for a thread to see a partially constructed Resource.
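As a concrete illustration, the following is a minimal sketch of the (broken, pre-Java 5.0) idiom as just described; Resource stands in for some hypothetical expensive-to-construct object, and the class name is chosen here for illustration only.

class BrokenDclLazyInit {
    static class Resource { /* hypothetical expensive-to-construct object */ }

    private static Resource resource;               // note: NOT volatile

    public static Resource getInstance() {
        if (resource == null) {                     // first check, without synchronization
            synchronized (BrokenDclLazyInit.class) {
                if (resource == null) {             // second check, with the lock held
                    resource = new Resource();      // only one thread initializes
                }
            }
        }
        return resource;                            // common path: no synchronization
    }
}

Because the unsynchronized read of resource carries no visibility guarantee, a caller can observe a non-null reference while the writes made in the Resource constructor are not yet visible, i.e., a partially constructed object.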
The real problem with DCL is the assumption that the worst thing that can happen when reading a shared object reference without synchronization is to erroneously see a stale value (in this case, null); in that case the DCL idiom compensates for this risk by trying again with the lock held. But the worst case is actually considerably worse—it is possible to see a current value of the reference but stale values for the object's state, meaning that the object could be seen to be in an invalid or incorrect state.
Subsequent changes in the JMM (Java 5.0 and later) have enabled DCL to work if resource is made volatile, and the performance impact of this is small since volatile reads are usually only slightly more expensive than nonvolatile reads. However, this is an idiom whose utility has largely passed—the forces that motivated it (slow uncontended synchronization, slow JVM startup) are no longer in play, making it less effective as an optimization. The lazy initialization holder idiom offers the same benefits and is easier to understand.