Best practices. Given the building blocks provided by synchronization variables, what is a systematic way to write and reason about the code for shared objects?

Finally, it is important to understand how the tools we use actually work, so we dive into the details of how synchronization primitives are implemented.

Implementing synchronization primitives. How are locks and condition variables implemented?

Multi-threaded programming has a reputation for being difficult. We agree that it takes care. But this chapter provides a set of simple rules that anyone can follow to implement objects that can be shared by multiple threads.
5.1 Challenges
The start of this section outlined the core challenge of multi-threaded programming: a multi-threaded program's execution depends on the interleavings of different threads' access to shared memory, which can make it difficult to reason about or debug these programs. In particular, cooperating threads' execution may be affected by race conditions.
5.1.1 Race conditions
Definition: race condition

A race condition occurs when the behavior of a program depends on the interleaving of operations of different threads. In effect, the threads run a race between their operations, and the result of the program execution depends on who wins the race.
Reasoning about even simple programs with race conditions can be difficult. To appreciate this, we will look at several extremely simple multi-threaded programs.
The world's simplest cooperating-threads program. Suppose we run a program with two threads that do the following:
Thread 0          Thread 1
x = 1;            x = 2;

Q: What are the possible final values of x?

A: x = 1 or x = 2, depending on which thread wins or loses the "race" to set x.
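To make the race concrete, here is a minimal sketch of this two-thread program using POSIX threads. The pthreads API and the helper names set_to_one and set_to_two are our illustration, not code from the text; the two unsynchronized stores to x are a deliberate data race, so repeated runs may print either final value.

#include <pthread.h>
#include <stdio.h>

static int x = 0;

/* Each thread simply stores a constant into the shared variable x. */
static void *set_to_one(void *arg) { (void)arg; x = 1; return NULL; }
static void *set_to_two(void *arg) { (void)arg; x = 2; return NULL; }

int main(void) {
    pthread_t t0, t1;

    /* Start both threads; their stores to x race with each other. */
    pthread_create(&t0, NULL, set_to_one, NULL);
    pthread_create(&t1, NULL, set_to_two, NULL);

    /* Wait for both threads to finish before reading x. */
    pthread_join(t0, NULL);
    pthread_join(t1, NULL);

    printf("x = %d\n", x);  /* Prints 1 or 2, depending on the interleaving. */
    return 0;
}

Compiling with -pthread and running the program several times may show either outcome, which is exactly the kind of nondeterminism this section explores.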
That was easy, so let's try one that is a bit more interesting.