else compute a new guess $\boldsymbol{\alpha}^{(j+1)}$, $\beta^{(j+1)}$, for instance by setting

$$
\alpha_i^{(j+1)} = \alpha_i^{(j)} - \omega_i^{(j)}\,\frac{D\mathcal{J}_R(u^{(j)}, \boldsymbol{\alpha}^{(j)}, \beta^{(j)})}{D\alpha_i},\qquad i = 1, 2, \ldots, K,
$$
$$
\beta^{(j+1)} = \beta^{(j)} - \omega_{K+1}^{(j)}\,\frac{D\mathcal{J}_R(u^{(j)}, \boldsymbol{\alpha}^{(j)}, \beta^{(j)})}{D\beta}, \tag{6.33}
$$

where $\omega_i^{(j)}$, $i = 1, 2, \ldots, K+1$, are numerical coefficients that drive the
convergence of the procedure.
This approach, based on (6.33), belongs to the family of steepest descent methods,
and the parameters $\omega_i$ define the step performed in updating the solution along the
line identified by the gradient. These coefficients may, in general, be dynamically
determined at each iteration. Other iterative methods may be considered for the
sake of effectiveness. Among others, a method that usually outperforms the steepest
descent approach is the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method (see,
e.g., [58]); another common choice is the Gauss-Newton method. The latter finds
the roots of $D\mathcal{J}_R/D[\boldsymbol{\alpha}, \beta] = 0$ by the Newton method, which means that, at
each iteration $j$, the paraboloid tangent to $\mathcal{J}_R$ at $(\boldsymbol{\alpha}^{(j)}, \beta^{(j)})$ is minimized.
The method is potentially second order, but it has the drawback that the
Hessian of the functional $\mathcal{J}_R$ is needed.
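To fix ideas, the following is a minimal sketch of the update (6.33) in Python, with the coefficients $\omega^{(j)}$ determined dynamically at each iteration by a simple backtracking rule. The names `J_R`, `grad_J_R`, and `steepest_descent` are hypothetical placeholders: they stand for the evaluation of the functional and of its gradient, the latter computed, e.g., with one of the techniques addressed below.

```python
import numpy as np

def steepest_descent(J_R, grad_J_R, x0, omega0=1.0, tol=1e-8, max_iter=200):
    """Sketch of iteration (6.33); x collects the controls (alpha_1,...,alpha_K, beta).

    J_R      : callable returning the value of the functional at x  (placeholder)
    grad_J_R : callable returning D J_R / D[alpha, beta] at x       (placeholder)
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_J_R(x)
        if np.linalg.norm(g) < tol:          # stopping test on the gradient
            break
        omega = omega0
        # dynamic choice of omega^(j): halve the step until J_R decreases
        while J_R(x - omega * g) >= J_R(x) and omega > 1e-12:
            omega *= 0.5
        x = x - omega * g                    # update (6.33)
    return x
```

For instance, with `J_R = lambda x: np.sum((x - 1.0)**2)` and `grad_J_R = lambda x: 2*(x - 1.0)`, the iteration converges to the minimizer of the quadratic. A BFGS alternative is available off the shelf, e.g. via `scipy.optimize.minimize(J_R, x0, jac=grad_J_R, method='BFGS')`, which builds an approximation of the Hessian from successive gradient evaluations.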
The most troublesome step in the previous algorithm is the computation of the
gradients $D\mathcal{J}_R(u^{(j)}, \boldsymbol{\alpha}^{(j)}, \beta^{(j)})/D[\boldsymbol{\alpha}, \beta]$. Let us address two possible methods.
Gradient Computation Through Sensitivities
A possible way for computing the gradients relies upon the chain rule

$$
\left.\frac{D\mathcal{J}_R}{D\alpha_i}\right|_{\alpha_i^{(j)}} = \left.\frac{\partial \mathcal{J}_R}{\partial u}\right|_{u^{(j)}}\,\left.\frac{\partial u}{\partial \alpha_i}\right|_{\alpha_i^{(j)}} + \left.\frac{\partial \mathcal{J}_R}{\partial \alpha_i}\right|_{\alpha_i^{(j)}},\qquad i = 1, 2, \ldots, K+1,
$$

where for ease of notation we set $\alpha_{K+1} = \beta$. We call sensitivities the derivatives

$$
\frac{\partial u}{\partial \alpha_i}, \qquad \forall\, i = 1, 2, \ldots, K+1,
$$

as they quantify the sensitivity of the solution to each control variable. From
$$
F(u^{(j)}, \boldsymbol{\alpha}^{(j)}) = 0 \;\Rightarrow\; \left.\frac{\partial F}{\partial u}\right|_{u^{(j)}}\,\left.\frac{\partial u}{\partial \alpha_i}\right|_{\alpha_i^{(j)}} + \left.\frac{\partial F}{\partial \alpha_i}\right|_{\boldsymbol{\alpha}^{(j)}} = 0,
$$

we have

$$
\left.\frac{\partial u}{\partial \alpha_i}\right|_{\alpha_i^{(j)}} = -\left(\left.\frac{\partial F}{\partial u}\right|_{u^{(j)}}\right)^{-1}\left.\frac{\partial F}{\partial \alpha_i}\right|_{\boldsymbol{\alpha}^{(j)}}. \tag{6.34}
$$
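As a concrete illustration, the sketch below applies (6.34) to a hypothetical linear state problem $F(u, \boldsymbol{\alpha}) = A(\boldsymbol{\alpha})u - b = 0$ with an assumed mismatch functional $\mathcal{J}_R = \frac{1}{2}\|u - u_{\mathrm{meas}}\|^2$; the setup and all names are placeholders for illustration, not the formulation above. Each sensitivity is then obtained by one linear solve with the state matrix.

```python
import numpy as np

def gradient_via_sensitivities(A, dA_list, b, u_meas):
    """Sensitivity approach (6.34) for the assumed linear state problem
    F(u, alpha) = A(alpha) u - b = 0 and J_R = 0.5 * ||u - u_meas||^2.

    A       : state matrix evaluated at the current controls (hypothetical)
    dA_list : list of the K+1 partial derivatives dA/dalpha_i (hypothetical)
    """
    u = np.linalg.solve(A, b)                    # solve the state problem F = 0
    dJ_du = u - u_meas                           # partial J_R / partial u
    grad = np.empty(len(dA_list))
    for i, dA in enumerate(dA_list):
        dF_dai = dA @ u                          # partial F / partial alpha_i
        s_i = np.linalg.solve(A, -dF_dai)        # sensitivity, Eq. (6.34)
        grad[i] = dJ_du @ s_i                    # chain rule; in this example J_R
                                                 # has no explicit alpha_i dependence
    return grad
```

Note that each control variable costs one additional linear solve with the same state matrix, so the expense of this approach grows linearly with the number $K+1$ of control variables.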