Worst-Case Analysis aims to determine upper bounds on the resource utilization
and latency that a given algorithm may exhibit, for use with resource reservation.
By only reserving enough resources to deal with the worst-case permutation of
the algorithm, the designer can avoid setting aside too many resources, needlessly
increasing the cost and complexity of the system. To understand the technique,
consider an algorithm with a fixed number of inputs and outputs. Any such
algorithm can be expressed as a table of result values indexed by the input values.
In other words, the particular input values are used to look up the answer in the
table. This type of computation structure is thus called a look-up table. In many
cases, a table such as this is optimized using Boolean algebra to create hardware far
more efficient than implementing the complete table. Two different algorithms will
have two different tables, each of which may be optimized differently, potentially to
a different degree. To explore how algorithmic changes could affect the optimized
implementation, one could use the following technique. First, randomly generate
a large collection of lookup tables with the needed input and output data sizes.
Next, use Boolean reduction and optimization techniques to generate an FPGA-
based implementation. Then, synthesize this implementation and record its resource
use and latency. After a large number of random tables have been processed, the data
can be formed into a histogram, and statistical analysis can determine the expected
percentage of all possible permutations that are covered by a given resource and
latency allocation.
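The Monte-Carlo procedure above can be sketched as follows. This is a minimal illustration, not the book's actual flow: the `estimated_cost` function is a hypothetical stand-in that counts minterms per output bit and converts them to a rough 4-input-LUT count, whereas a real flow would run Boolean optimization and FPGA synthesis and read the resource and latency reports. The table sizes, seed scheme, and 95% coverage target are likewise illustrative assumptions.

```python
import random

def random_truth_table(n_inputs, n_outputs, seed=None):
    """Generate a random truth table: one n_outputs-bit result per input combination."""
    rng = random.Random(seed)
    return [rng.getrandbits(n_outputs) for _ in range(2 ** n_inputs)]

def estimated_cost(table, n_inputs, lut_size=4):
    """Crude stand-in for synthesis results (assumption, not a real cost model):
    count the 1-minterms of each output bit and charge one LUT per lut_size
    minterms, taking whichever of the function or its complement is cheaper."""
    n_outputs = max(table).bit_length() or 1
    cost = 0
    for bit in range(n_outputs):
        minterms = sum(1 for v in table if (v >> bit) & 1)
        cheaper = min(minterms, 2 ** n_inputs - minterms)
        cost += -(-cheaper // lut_size)  # ceiling division
    return cost

# Process many random tables, then choose an allocation that covers a
# desired fraction of the observed permutations.
costs = sorted(estimated_cost(random_truth_table(6, 4, seed=i), 6)
               for i in range(2000))
coverage = 0.95
allocation = costs[int(coverage * (len(costs) - 1))]
```

Reserving `allocation` units then covers roughly 95% of the randomly generated permutations under this cost proxy; the histogram of `costs` is what the designer would inspect to judge the outliers discussed next.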
This technique provides more information to the designer to help in choosing the
amount of resources to reserve; however, it still relies on a certain amount of intuition
and probability. The results from randomly generated permutations may contain
high-resource or high-latency outliers that may represent permutations that would
never actually be used in the design. The system designer may wish to restrict the
resource allocation to cover less than 100 % of the permutations in order to reduce
cost or because the number of inputs makes full coverage infeasible. On the other
hand, the future modifications themselves may be outliers that are not captured by
this approach. An alternate method is to provide further restrictions on the generated
permutations, such that they do not represent an arbitrary algorithm that would never
be used for that system. For example, if the value of a specific input will always
be within a certain range, then any permutations for inputs that fall outside this
range can be discarded. Doing so, however, requires high levels of insight into the
functions and needs of the physics algorithms.
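The restricted-generation idea can be sketched as below. The `valid_range` bound on the input word is a hypothetical example of such domain knowledge, and forcing out-of-range rows to zero is one possible way to discard those permutations (treating them as don't-cares during optimization would be another):

```python
import random

def random_restricted_table(n_inputs, n_outputs, valid_range, rng):
    """Generate a random truth table in which only inputs inside valid_range
    carry meaningful (random) results; rows outside the range are forced to
    zero, so they never contribute to the implementation's worst case.
    valid_range is a hypothetical (lo, hi) constraint standing in for
    insight into the physics algorithms."""
    lo, hi = valid_range
    return [rng.getrandbits(n_outputs) if lo <= x <= hi else 0
            for x in range(2 ** n_inputs)]

rng = random.Random(42)
table = random_restricted_table(6, 4, (8, 23), rng)
```

Tables built this way would then feed the same optimize-synthesize-histogram loop as before, but with the out-of-range permutations excluded from the statistics.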
Fixed-Resource and Fixed-Latency Structures are alternative approaches that
guarantee the ability to accommodate arbitrary changes to the physics algorithms
without unreliable guesses about the amount of resources to reserve and without
introducing potential problems due to FPGA routing. The basic concept is simple:
build logic structures that, given a specific number of inputs and outputs, can
implement any possible algorithm in that structure. As in the worst-case analysis,
this approach may use a look-up table to store the output results for all possible
input combinations; the inputs index into that table to look up the output. However,
in this case, the table is not optimized, and is instead placed as-is into a memory
structure. The size and performance of such a table is invariant to its contents, and