First, the base functions for the approximated associations have to be predetermined (e.g., linear, logarithmic, logistic, or exponential). Under this assumption it is possible to approximate the unknown function by estimating the parameters of an appropriate regression model.
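As a minimal sketch of this first option, the following snippet assumes a logistic base function and estimates its parameters with SciPy's curve_fit; the observations, parameter names, and starting values are purely illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

# Predetermined base function: a logistic curve is assumed here as the
# shape of the association (illustrative choice).
def logistic(x, a, b, c):
    return a / (1.0 + np.exp(-b * (x - c)))

# Illustrative observations of cause (x) and effect (y).
x_obs = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
y_obs = np.array([0.1, 0.3, 0.9, 2.1, 3.6, 4.4, 4.8, 4.9])

# Estimate the parameters of the assumed regression model.
params, _ = curve_fit(logistic, x_obs, y_obs, p0=[5.0, 1.0, 4.0])
print("estimated parameters (a, b, c):", params)
```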
Second, a function can be locally approximated by developing an appropriate polynomial. This principle underlies the Taylor and Fourier series expansions. A disadvantage of these techniques is that the approximation is accurate only within a local neighbourhood of the expansion point.
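The following sketch illustrates this limitation for a truncated Taylor expansion; the target function exp(x), the expansion point, and the polynomial degree are chosen purely for illustration.

```python
import math

# Degree-3 Taylor polynomial of exp(x) developed around the point x0 = 0.
def taylor_exp(x, degree=3):
    return sum(x**k / math.factorial(k) for k in range(degree + 1))

# Near the expansion point the approximation is accurate ...
for x in (0.1, 1.0, 3.0):
    print(f"x={x}: exact={math.exp(x):.4f}, taylor={taylor_exp(x):.4f}")
# ... but the error grows rapidly outside this local neighbourhood.
```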
Third, unknown functions can also be reconstructed without prior knowledge of their shape by so-called universal approximators (Tikk et al., 2001). Hence, these techniques are able to learn a function from mere empirical observations without the need to commit to some base function. This characteristic becomes increasingly important whenever the unknown function to be approximated is nonlinear. As numerous examples show, the microeconomic functions which usually underlie strategic reasoning are almost never linear (Hillbrand, 2003, pp. 201ff.). The reasons for this observation are manifold: saturation, scale effects, and resource limitations are only a few causes of nonlinear relations between business variables. One well-known example is the association between the market price and the customer demand for a certain product: demand does not respond linearly to price changes. Rather, it is likely that there is some maximum price level the customer is willing to pay, which therefore bounds the demand. Allen (1964), for example, supposes demand functions to follow an S-shaped (also known as sigmoidal or logistic) pattern, as sketched below.
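To make this concrete, the following snippet evaluates an illustrative S-shaped demand curve; the logistic form and all parameter values are assumed here for illustration only and are not taken from Allen (1964).

```python
import numpy as np

# Illustrative logistic demand curve: demand stays high at low prices, drops
# steeply around a reference price p0, and approaches zero once the price
# exceeds what customers are willing to pay. All parameters are assumed.
def demand(price, d_max=1000.0, p0=50.0, k=0.2):
    return d_max / (1.0 + np.exp(k * (price - p0)))

prices = np.array([10.0, 40.0, 50.0, 60.0, 90.0])
print(demand(prices))  # clearly nonlinear in the price
```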
For these reasons it is essential to abandon all restrictions regarding a priori assumptions about the unknown function underlying a cause-and-effect relation. This postulate necessitates the employment of universal function approximators in order to identify the functional form of the causally proven associations. This approach therefore studies the potential and limitations of artificial neural networks (ANNs) for universal causal function approximation. The theoretical foundation of this property of ANNs is the result of the endeavor to approximate an unknown mapping by the combination of known functions. The central theorem in this area was proposed by Kolmogorov (1957), who showed that any arbitrary unknown function f can be represented by two families of nested known functions φ and ψ. This theorem is usually regarded as the central concept for universal function approximation in the relevant literature (Tikk et al., 2001, p. 2):
Theorem 3 (Kolmogorov's superposition theorem): For all n ≥ 2, and for any continuous real function f of n variables on the domain [0, 1], f: [0, 1]^n → ℝ, there exist n(2n+1) continuous, monotone increasing univariate functions ψ_pq on [0, 1], by which f can be reconstructed according to the following equation:

\[ f(x_1, \ldots, x_n) = \sum_{q=1}^{2n+1} \phi_q\!\left( \sum_{p=1}^{n} \psi_{pq}(x_p) \right) \]
Further enhancements of Kolmogorov's superposition theorem have been developed by several authors and lead to the notion of ANNs as universal function approximators (De Figueiredo, 1980; Hecht-Nielsen, 1987). Since the inner functions of Kolmogorov's theorem can be highly non-smooth, they have to be weighted with a factor λ so that specific continuous functions (squashing functions) can be used in their place. The resulting function can therefore be represented by a multilayer perceptron (MLP).
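As a minimal sketch of this correspondence, the following snippet implements a one-hidden-layer perceptron in plain NumPy whose structure mirrors the superposition form above (an outer weighted sum over squashed inner weighted sums); the sigmoid squashing function, the layer sizes, and the untrained random parameters are all chosen purely for illustration.

```python
import numpy as np

def sigmoid(z):
    # Continuous "squashing" function that takes the role of the
    # (possibly non-smooth) inner functions of the superposition.
    return 1.0 / (1.0 + np.exp(-z))

def mlp(x, V, b, w, c):
    # Outer sum over q of w_q * sigmoid(sum_p V_qp * x_p + b_q):
    # structurally analogous to the superposition equation above.
    hidden = sigmoid(V @ x + b)
    return w @ hidden + c

# Illustrative, untrained parameters for n = 2 inputs and 2n + 1 = 5 hidden units.
rng = np.random.default_rng(0)
n, hidden_units = 2, 5
V = rng.normal(size=(hidden_units, n))
b = rng.normal(size=hidden_units)
w = rng.normal(size=hidden_units)
c = 0.0

print(mlp(np.array([0.3, 0.7]), V, b, w, c))
```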