Recurrent Neural Networks
Recurrent neural networks (RNN) are a type of ANN in which the output signals of some neurons flow back to serve as inputs to neurons of the same layer or of preceding layers. RNN are a powerful tool for many complex problems, in particular those involving time series data. A training method called "backpropagation through time" can be applied to train an RNN on a given training set (Werbos, 1990). The Elman network, a two-layer backpropagation neural network with a one-step-delayed feedback from the output of the hidden layer to its input, can be trained in this way (Elman, 1990). Figure 2 shows schematically the structure of an RNN for the supply chain demand-forecasting problem. The arrows represent connections within the neural network, with the thicker ones representing recurrent connection weights.
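The one-step-delayed feedback of the Elman network can be sketched as follows. This is a minimal illustration in Python, not the implementation used in the chapter: the weights are random and untrained (fitting them would require backpropagation through time), the layer size and the `elman_forecast` name are assumptions, and only the forward pass is shown.

```python
import math
import random

def elman_forecast(series, n_hidden=4, seed=0):
    """Forward pass of a minimal Elman-style recurrent network.

    The hidden layer's output from the previous time step is fed back,
    one step delayed, as an extra input to the hidden layer -- the
    recurrent connections shown in Figure 2.  Weights are random and
    untrained; real use would fit them with backpropagation through time.
    """
    rng = random.Random(seed)
    w_in = [rng.uniform(-0.5, 0.5) for _ in range(n_hidden)]    # input -> hidden
    w_rec = [[rng.uniform(-0.5, 0.5) for _ in range(n_hidden)]  # hidden(t-1) -> hidden(t)
             for _ in range(n_hidden)]
    w_out = [rng.uniform(-0.5, 0.5) for _ in range(n_hidden)]   # hidden -> output
    h_prev = [0.0] * n_hidden
    forecasts = []
    for x in series:
        # Hidden activation depends on the current input AND the
        # previous hidden state (the recurrent part).
        h = [math.tanh(w_in[j] * x +
                       sum(w_rec[j][k] * h_prev[k] for k in range(n_hidden)))
             for j in range(n_hidden)]
        forecasts.append(sum(w_out[j] * h[j] for j in range(n_hidden)))
        h_prev = h  # one-step-delayed feedback for the next time step
    return forecasts

demand = [100, 120, 90, 130, 110, 140]   # toy demand series
print(elman_forecast(demand))            # one (untrained) output per observation
```

Because the hidden state carries information forward, the output at each step depends on the whole history of the series, not just the current observation.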
Support Vector Machines
Support vector machines (SVM) are a newer type of universal function approximator based on the structural risk minimization principle from statistical learning theory (Vapnik, 1995), as opposed to the empirical risk minimization principle on which, among others, ANN and Multiple Linear Regression (MLR) models are based. Structural risk minimization aims to reduce the true error on unseen, randomly selected test examples, whereas ANN and MLR minimize the error on the examples currently seen.
Support vector machines project the data into a higher- or lower-dimensional space and maximize the margins between classes, or minimize the error margin for regression, using support vectors. Projecting into a space with a different number of dimensions permits identifying patterns that may not be clear in the input space but become better identifiable there. Margins are "soft", meaning that a solution can be found even if the training set contains contradicting examples. The problem is formulated as a convex optimization problem with no local minima, thus the solution found is globally optimal.
Figure 2. Recurrent neural network for demand forecasting (diagram: input layer, hidden layer, and output layer, with recurrent connections feeding the hidden layer's output back to its input)