complexity and estimation accuracy, since DEKF defines each node as a mutually
exclusive weight group. When there are multiple input training sequences over the
time stamps, the complexity can grow as a power of the number of input channels.
To overcome these limitations, we do not define the mutually exclusive groups by
node. Instead, we define the mutually exclusive groups by input channel and propose
a coupling technique that compensates the estimation accuracy using multiple
sensory channel inputs. We call this newly proposed method hybrid motion
estimation based on EKF (HEKF).
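The difference in grouping can be illustrated with a short sketch. This is only an assumed reading of the partitioning idea, not the authors' implementation; the function names dekf_groups and hekf_groups and the index layout are hypothetical and chosen for illustration.

import numpy as np

# Illustrative sketch: DEKF forms one mutually exclusive weight group per
# neuron (node), whereas the proposed HEKF forms one group per sensory
# input channel. Shapes and names below are assumptions.

def dekf_groups(num_neurons, weights_per_neuron):
    # One index block (covariance block) per neuron.
    return [np.arange(i * weights_per_neuron, (i + 1) * weights_per_neuron)
            for i in range(num_neurons)]

def hekf_groups(num_channels, weights_per_channel):
    # One index block per sensory channel, so the number of blocks scales
    # with the channel count rather than the neuron count.
    return [np.arange(c * weights_per_channel, (c + 1) * weights_per_channel)
            for c in range(num_channels)]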
The contribution of this study is twofold. First, we propose a new approach that
splits the whole RMLP with a complicated neuron structure into a couple of
RMLPs with simpler neuron numbers, each adjusted to a separate input channel.
Second, we present a new method for respiratory motion estimation using EKF
that adopts a coupling technique, using the multiple channel inputs as the mutually
exclusive groups to compensate the estimation accuracy, instead of the mutually
exclusive weight groups.
This chapter is organized as follows. In Sect. 4.2, the theoretical background for
the proposed algorithm is briefly discussed. In Sect. 4.3, the proposed hybrid
implementation based on EKF for RNN with multiple sensory channel inputs is
presented in detail. Section 4.4 presents and discusses the experimental results of
the proposed filter design method: efficient estimation of the measurements, the
optimized group number for RMLP, prediction overshoot analysis, the prediction
time horizon, and the computational complexity of HEKF and DEKF. A summary of
the performance of the proposed method is presented in Sect. 4.5.
4.2 Related Work
4.2.1 Recurrent Neural Network
A recurrent neural network (RNN) is a class of neural network where connections
between units form a directed cycle. This creates an internal state of the network
which allows it to exhibit dynamic temporal behavior. A network with a rich
representation of past outputs is a fully connected recurrent neural network, known
as the Williams-Zipser network, as shown in Fig. 4.1 [ 48 ]. This network consists
of three layers: the input layer, the processing layer and the output layer. For each
neuron i (i = 1, 2, ..., N), the elements u_j (j = 1, 2, ...,
M + N + 1) of the input vector u to a neuron are as follows:
u(k) = [x(k-1), ..., x(k-M), 1, y_1(k-1), ..., y_N(k-1)]^T,        (4.1)
where M is the number of external inputs, N is the number of feedback connec-
tions, (·)^T denotes the vector transpose operation, and the (M + N + 1) × 1
dimensional vector u comprises both the external and feedback inputs to a neuron,
as well as the unity-valued constant bias input.
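As an illustration of Eq. (4.1), the following minimal sketch assembles the input vector for one time step of a fully connected (Williams-Zipser) RNN. The function name build_input_vector and the array shapes are assumptions for illustration, not part of the original text.

import numpy as np

def build_input_vector(x_hist, y_prev):
    # x_hist: delayed external inputs [x(k-1), ..., x(k-M)]
    # y_prev: previous neuron outputs [y_1(k-1), ..., y_N(k-1)]
    # Returns the (M + N + 1)-dimensional vector u(k) of Eq. (4.1),
    # with the unity-valued bias element between the two parts.
    return np.concatenate([x_hist, [1.0], y_prev])

# Example: M = 2 external inputs and N = 3 feedback connections.
u = build_input_vector(np.array([0.5, -0.1]), np.array([0.2, 0.0, 0.7]))
assert u.shape == (2 + 3 + 1,)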