3.6 Analyzing Problems
You most often use the techniques of “algorithm” analysis to analyze an algorithm,
or the instantiation of an algorithm as a program. You can also use these same
techniques to analyze the cost of a problem. It should make sense to you to say that
the upper bound for a problem cannot be worse than the upper bound for the best
algorithm that we know for that problem. But what does it mean to give a lower
bound for a problem?
Consider a graph of cost over all inputs of a given size n for some algorithm
for a given problem. Define A to be the collection of all algorithms that solve
the problem (in principle, there are infinitely many such algorithms). Now,
consider the collection of such graphs for all of the (infinitely many) algorithms
in A. The worst-case lower bound for the problem is the least of all the highest
points on all of these graphs.
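In symbols (a formulation added here for clarity, not part of the text): writing cost(a, I) for the running time of algorithm a on input I, the worst-case lower bound for the problem at input size n is

    \min_{a \in A} \; \max_{|I| = n} \operatorname{cost}(a, I)

that is, the smallest of the highest points, taken over every algorithm in A.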
It is much easier to show that an algorithm (or program) is in Ω(f(n)) than it
is to show that a problem is in Ω(f(n)). For a problem to be in Ω(f(n)) means
that every algorithm that solves the problem is in Ω(f(n)), even algorithms that we
have not thought of!
So far all of our examples of algorithm analysis give “obvious” results, with
big-Oh always matching Ω. To understand how big-Oh, Ω, and Θ notations are
properly used to describe our understanding of a problem or an algorithm, it is best
to consider an example where you do not already know a lot about the problem.
Let us look ahead to analyzing the problem of sorting to see how this process
works. What is the least possible cost for any sorting algorithm in the worst case?
The algorithm must at least look at every element in the input, just to determine
that the input is truly sorted. Thus, any sorting algorithm must take at least cn time.
For many problems, this observation that each of the n inputs must be looked at
leads to an easy Ω(n) lower bound.
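To make this argument concrete, consider the cost of merely verifying that an input is sorted: even this check must touch every element, because an out-of-order value could hide in any position that is skipped. The following is a minimal Java sketch (the method name isSorted is ours, introduced for illustration); any correct sorting algorithm must do at least this much work.

    // Verifying sortedness alone requires examining all n elements.
    static boolean isSorted(int[] a) {
        for (int i = 1; i < a.length; i++) {  // n-1 comparisons: cn steps
            if (a[i - 1] > a[i]) {
                return false;                 // found an out-of-order pair
            }
        }
        return true;                          // every adjacent pair checked
    }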
In your previous study of computer science, you have probably seen an example
of a sorting algorithm whose running time is in O(n²) in the worst case. The simple
Bubble Sort and Insertion Sort algorithms typically given as examples in a first-year
programming course have worst-case running times in O(n²). Thus, the problem
of sorting can be said to have an upper bound in O(n²). How do we close the
gap between Ω(n) and O(n²)? Can there be a better sorting algorithm? If you can
think of no algorithm whose worst-case growth rate is better than O(n²), and if you
have discovered no analysis technique to show that the least cost for the problem
of sorting in the worst case is greater than Ω(n), then you cannot know for sure
whether or not there is a better algorithm.
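As a reminder of where the O(n²) upper bound comes from, here is a minimal Insertion Sort sketch in Java. In the worst case (input in reverse order), the inner loop runs i times for each position i, for a total of roughly n²/2 comparisons.

    static void insertionSort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int key = a[i];        // next value to place into the sorted prefix
            int j = i - 1;
            // Shift larger elements right; in the worst case this inner
            // loop runs i times, so the total work is in O(n^2).
            while (j >= 0 && a[j] > key) {
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = key;
        }
    }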
Chapter 7 presents sorting algorithms whose running time is in O(n log n) for
the worst case. This greatly narrows the gap. With this new knowledge, we now
have a lower bound in Ω(n) and an upper bound in O(n log n). Should we search
for a faster algorithm, or try instead to prove that no faster algorithm exists?
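For concreteness, the sketch below shows the shape of one such O(n log n) algorithm, a top-down Mergesort of the kind Chapter 7 develops. The point here is only the structure of the cost: the recursion depth is about log n, and the merge step does O(n) work at each level.

    static void mergeSort(int[] a, int[] temp, int lo, int hi) {
        if (lo >= hi) return;            // one element: already sorted
        int mid = (lo + hi) / 2;
        mergeSort(a, temp, lo, mid);     // recursion depth is about log n
        mergeSort(a, temp, mid + 1, hi);
        // Merge the two sorted halves: O(n) work per recursion level.
        for (int k = lo; k <= hi; k++) temp[k] = a[k];
        int i = lo, j = mid + 1;
        for (int k = lo; k <= hi; k++) {
            if (i > mid)                 a[k] = temp[j++];
            else if (j > hi)             a[k] = temp[i++];
            else if (temp[i] <= temp[j]) a[k] = temp[i++];
            else                         a[k] = temp[j++];
        }
    }

    static void mergeSort(int[] a) {
        mergeSort(a, new int[a.length], 0, a.length - 1);
    }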