7.5.3 A General Upper Bound for Divide-and-Conquer Running Times
The analysis in Section 7.5.2 showed that when a problem is divided into two equal halves that are solved recursively, with O(N) overhead, the result is an O(N log N) algorithm. What if we divide a problem into three half-sized problems with linear overhead, or seven half-sized problems with quadratic overhead? (See Exercise 7.20.) In this section we provide a general formula for computing the running time of a divide-and-conquer algorithm: it allows the number of subproblems, the size of the subproblems, and the amount of additional work to assume general forms, and the result can be used without understanding the proof. The formula requires three parameters:
• A, which is the number of subproblems
• B, which is the relative size of the subproblems (for instance, B = 2 represents half-sized subproblems)
• k, which is representative of the fact that the overhead is Θ(N^k)
The formula and its proof are presented as Theorem 7.5. The proof of the formula requires familiarity with geometric sums. However, knowledge of the proof is not needed for you to use the formula.
Theorem 7.5
The solution to the equation T(N) = A T(N/B) + O(N^k), where A ≥ 1 and B > 1, is

    T(N) = O(N^(log_B A))   if A > B^k
    T(N) = O(N^k log N)     if A = B^k
    T(N) = O(N^k)           if A < B^k
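To make the case analysis concrete, here is a minimal sketch in Java (not from the text; the class and method names are invented for illustration) that takes A, B, and k and reports which case of Theorem 7.5 applies and the resulting bound.

// A minimal sketch (invented names, not from the text): given the parameters of
// T(N) = A*T(N/B) + O(N^k), report which case of Theorem 7.5 applies.
public class TheoremSketch
{
    public static String boundOf( int a, int b, int k )
    {
        if( a < 1 || b <= 1 )
            throw new IllegalArgumentException( "Theorem 7.5 requires A >= 1 and B > 1" );

        long bToK = 1;                        // compute B^k exactly with integers
        for( int i = 0; i < k; i++ )
            bToK *= b;

        if( a > bToK )                        // first case: A > B^k
            return String.format( "O(N^%.2f)", Math.log( a ) / Math.log( b ) );
        else if( a == bToK )                  // second case: A = B^k
            return "O(N^" + k + " log N)";
        else                                  // third case: A < B^k
            return "O(N^" + k + ")";
    }

    public static void main( String[ ] args )
    {
        System.out.println( boundOf( 2, 2, 1 ) );   // two halves, linear overhead: O(N^1 log N)
        System.out.println( boundOf( 3, 2, 1 ) );   // three halves, linear overhead: the N^(log_2 3) case
        System.out.println( boundOf( 3, 2, 2 ) );   // three halves, quadratic overhead: O(N^2)
    }
}

Computing B^k with integer arithmetic avoids a floating-point comparison when deciding between the second and third cases.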
Before proving Theorem 7.5, let us look at some applications. For the maximum contiguous subsequence sum problem, we have two problems, two halves, and linear overhead. The applicable values are A = 2, B = 2, and k = 1. Hence the second case in Theorem 7.5 applies, and we get O(N log N), which agrees with our previous calculations. If we recursively solve three half-sized problems with linear overhead, we have A = 3, B = 2, and k = 1, and the first case applies. The result is O(N^(log_2 3)) = O(N^1.59). Here, the overhead does not contribute to the total cost of the algorithm. Any overhead smaller than O(N^1.59) would give the same running time for the recursive algorithm. An algorithm that solved three half-sized problems but required quadratic overhead would have O(N^2) running time because the third case would apply. In effect, the overhead dominates once it exceeds the O(N^1.59) threshold. At the threshold the penalty is the logarithmic factor shown in the second case. We can now prove Theorem 7.5.
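Before turning to the proof, here is a hedged sketch (the class and method names are invented for illustration) of where a recurrence of the form T(N) = 3T(N/2) + O(N) comes from in code: three recursive calls on half-sized ranges plus linear work at each level. Counting the operations shows that doubling N roughly triples the count, as N^(log_2 3) predicts.

// A hedged sketch (invented names, not from the text): a routine whose running time
// satisfies T(N) = 3T(N/2) + O(N), so by the first case of Theorem 7.5 the total
// work is O(N^(log_2 3)) = O(N^1.59).
public class ThreeHalfSizedSketch
{
    // Returns a count of the basic operations performed on a problem of size n.
    public static long work( int n )
    {
        if( n <= 1 )
            return 1;                 // base case: constant work

        long total = n;               // linear overhead at this level (k = 1)
        int half = n / 2;

        total += work( half );        // three half-sized subproblems (A = 3, B = 2)
        total += work( half );
        total += work( n - half );
        return total;
    }

    public static void main( String[ ] args )
    {
        // Doubling N should roughly triple the operation count, since 2^(log_2 3) = 3.
        for( int n = 1000; n <= 16000; n *= 2 )
            System.out.println( "N = " + n + "  operations = " + work( n ) );
    }
}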