Overall, we might use Table 6.2 as a guide to determine which solutions scale up in a reasonable way. According to the table, if the amount of work or number of steps involved in processing a data set with n items is a polynomial (e.g., n or n² or n³ or n⁴), then the time involved for relatively large amounts of data may be manageable, and the solutions may scale adequately. However, if the amount of processing requires a factorial function (as in the Traveling Salesperson Problem) or an exponential (e.g., 20ⁿ, as in chess playing, or even 2ⁿ), then the solution probably does not scale up well.
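The contrast between these growth rates can be sketched with a few lines of code. Table 6.2 itself is not reproduced here; the formulas below are illustrative stand-ins for polynomial, exponential, and factorial step counts, and the function name is our own.

```python
import math

def growth_table(sizes):
    """For each data-set size n, return illustrative step counts
    for a polynomial (n^2), an exponential (2^n), and a factorial
    (n!) algorithm."""
    return [(n, n**2, 2**n, math.factorial(n)) for n in sizes]

# Even at modest sizes, the exponential and factorial columns
# dwarf the polynomial one.
for n, poly, expo, fact in growth_table([5, 10, 20]):
    print(f"n={n:>2}  n^2={poly:>6}  2^n={expo:>10}  n!={fact:>22}")
```

At n = 20 the polynomial column is only 400 steps, while 2ⁿ exceeds a million and n! exceeds 2 × 10¹⁸, which is why such solutions fail to scale.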
We should, therefore, evaluate research boasting of dramatic advances in computer technology critically, keeping in mind the effects of the combinatorial explosion. Note, as a practical matter, that prototype systems often work on very small data sets. Thus, when we hear claims about how finished applications will work, it is important that we examine their approach very carefully. As we have seen, some types of solutions scale nicely, whereas others do not. One cannot always tell from advertising which is which, so if you are asked to invest in a company, be sure to check first!
How does complexity affect problem solving?
We have just observed that the time required to complete a task often depends on how much data are involved. How processing time relates to the size of a data set is called computational complexity; Table 6.2 shows that the time demands of some types of algorithms grow much faster than those of other types.
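The idea that processing time depends on data-set size can be made concrete by counting the basic steps two simple algorithms perform. The tasks and function names below are our own illustrations, not examples from the text: a linear search takes at most n steps, while checking every pair of items takes roughly n²/2.

```python
def linear_search_steps(data, target):
    """Count comparisons made by a linear search: at most len(data)."""
    steps = 0
    for item in data:
        steps += 1
        if item == target:
            break
    return steps

def pairwise_steps(data):
    """Count comparisons made by examining every pair of items
    (e.g., a naive duplicate check): n*(n-1)/2 for n items."""
    steps = 0
    for i in range(len(data)):
        for j in range(i + 1, len(data)):
            steps += 1
    return steps

# Doubling the data roughly doubles the linear count but
# quadruples the pairwise count.
data = list(range(100))
print(linear_search_steps(data, 99))  # 100 comparisons
print(pairwise_steps(data))           # 4950 comparisons
```

This difference in how step counts grow, rather than the raw speed of any one run, is what computational complexity measures.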
The next example illustrates that size also can affect the logic
used to solve a problem. This effect is called logical complexity .
Computers process data by running programs, and humans write
those programs. As problems and programs gain complexity, the
human authors must handle increasing logical complexity, and
this can have an important impact on the effectiveness of new
technology.