where

$$
U(\beta) = -N \cdot \mathrm{Hessian}(\beta) = N \cdot
\begin{pmatrix}
-\dfrac{\partial^2 l(\beta)}{\partial \beta_1^2} & -\dfrac{\partial^2 l(\beta)}{\partial \beta_1 \partial \beta_2} & \cdots & -\dfrac{\partial^2 l(\beta)}{\partial \beta_1 \partial \beta_p} \\[2ex]
-\dfrac{\partial^2 l(\beta)}{\partial \beta_2 \partial \beta_1} & -\dfrac{\partial^2 l(\beta)}{\partial \beta_2^2} & \cdots & -\dfrac{\partial^2 l(\beta)}{\partial \beta_2 \partial \beta_p} \\[2ex]
\vdots & \vdots & \ddots & \vdots \\[1ex]
-\dfrac{\partial^2 l(\beta)}{\partial \beta_p \partial \beta_1} & -\dfrac{\partial^2 l(\beta)}{\partial \beta_p \partial \beta_2} & \cdots & -\dfrac{\partial^2 l(\beta)}{\partial \beta_p^2}
\end{pmatrix}.
\tag{3.8}
$$
The matrix U(β) = −N · Hessian(β) is positive definite, i.e., the Hessian of l(β) is negative definite, so the log-likelihood l(β) is strictly concave in β. However, the computation is obviously more complex. In practice, we use software to carry out this process for the MLE.
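In the one-parameter case the Newton iteration underlying this process is easy to illustrate. The sketch below is not from the text: the Poisson model, the data, and the starting value are my own choices, picked because the score and Hessian have simple closed forms and the MLE is known exactly (the sample mean), so the iteration can be checked.

```python
# Hypothetical illustration: Newton's method for the MLE of a Poisson rate.
# For l(lam) = sum_i (y_i * log(lam) - lam), the score is l'(lam) = s/lam - n
# and the Hessian is l''(lam) = -s/lam^2 < 0, so l is strictly concave.

def poisson_mle_newton(y, lam0=1.0, tol=1e-10, max_iter=100):
    """Maximize the Poisson log-likelihood by Newton-Raphson."""
    n = len(y)
    s = sum(y)
    lam = lam0
    for _ in range(max_iter):
        score = s / lam - n           # l'(lam)
        hess = -s / lam ** 2          # l''(lam), negative: l is concave
        step = score / hess
        lam -= step                   # Newton update: lam - l'(lam)/l''(lam)
        if abs(step) < tol:
            break
    return lam

y = [2, 4, 3, 5, 1, 3]               # made-up count data
lam_hat = poisson_mle_newton(y)
# The iterates converge to the sample mean, the closed-form Poisson MLE.
print(lam_hat, sum(y) / len(y))
```

With p parameters the same update becomes β ← β − Hessian(β)⁻¹ · score(β), using the matrix in (3.8), which is why statistical software is used in practice.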
3.2. Non-Linear Least Square Fit
Least squares regression (LSE) is a very popular and useful tool in statistics and other fields. Suppose we want to find a relationship between a dependent (response) variable Y and an independent (predictor) variable X, in which a statistical relation is

$$Y = g(X \mid \theta) + \epsilon, \tag{3.9}$$

where ε is the error, and θ is a vector of parameters to be estimated in the function g. If g assumes a non-linear form in terms of X, we are facing a non-linear regression. Suppose X = (x₁, ..., x_m)^τ and Y = (y₁, ..., y_m)^τ. We define

$$f_i(\theta) = y_i - \hat{y}_i = y_i - g(x_i \mid \theta). \tag{3.10}$$
The non-linear least squares regression is to find the estimate θ̂ which minimizes F(θ), where F(θ) is defined as

$$F(\theta) = \frac{1}{2} \sum_{i=1}^{m} (f_i(\theta))^2 = \frac{1}{2} \|f(\theta)\|^2 = \frac{1}{2} f(\theta)^\tau f(\theta). \tag{3.11}$$
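As a concrete illustration of the residuals in (3.10) and the objective in (3.11), the sketch below uses a hypothetical non-linear model g(x | θ) = θ₁·exp(θ₂·x); the data values and parameter choices are made up and are not from the text.

```python
import math

# Hypothetical exponential model g(x | theta) = theta_1 * exp(theta_2 * x).
def g(x, theta):
    return theta[0] * math.exp(theta[1] * x)

def residuals(theta, xs, ys):
    # f_i(theta) = y_i - g(x_i | theta), as in (3.10)
    return [y - g(x, theta) for x, y in zip(xs, ys)]

def F(theta, xs, ys):
    # F(theta) = (1/2) * sum_i f_i(theta)^2 = (1/2) f(theta)^tau f(theta)
    f = residuals(theta, xs, ys)
    return 0.5 * sum(fi * fi for fi in f)

xs = [0.0, 1.0, 2.0]
ys = [1.0, 2.7, 7.4]          # roughly exp(x), so theta near (1, 1) fits well
print(F([1.0, 1.0], xs, ys))  # small objective near the good parameters
print(F([0.0, 0.0], xs, ys))  # much larger objective far from them
```

Minimizing F over θ by one of the algorithms below gives the non-linear least squares fit.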
There are many non-linear algorithms for finding θ̂. These well-developed algorithms include the Gauss-Newton method, the Levenberg-Marquardt method, and Powell's Dog Leg method (see [7] for example). In this study, we use the Gauss-Newton method. It is based on the first derivatives of the components of the vector function f. In special cases, it can give quadratic convergence, as Newton's method does for general optimization [8]. The Gauss-Newton method is based on a linear approximation to the components of f (a linear model of f) in the neighborhood of θ: for small h, we see from the Taylor expansion that

$$f(\theta + h) \approx \ell(h) := f(\theta) + J(\theta) h, \tag{3.12}$$

where J(θ) is the Jacobian matrix of f, with entries (J(θ))ᵢⱼ = ∂fᵢ(θ)/∂θⱼ.
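A minimal sketch of the resulting iteration, again with the hypothetical model g(x | θ) = θ₁·exp(θ₂·x) and made-up data (neither is from the text): minimizing the linear model (3.12) in h gives the normal equations JᵀJ h = −Jᵀf, solved here for a two-parameter θ by Cramer's rule.

```python
import math

def gauss_newton(xs, ys, theta, n_iter=20):
    """Gauss-Newton iteration for g(x | theta) = t1 * exp(t2 * x)."""
    t1, t2 = theta
    for _ in range(n_iter):
        # residuals f_i(theta) and Jacobian rows (df_i/dt1, df_i/dt2)
        f = [y - t1 * math.exp(t2 * x) for x, y in zip(xs, ys)]
        J = [(-math.exp(t2 * x), -t1 * x * math.exp(t2 * x)) for x in xs]
        # assemble the 2x2 matrix J^T J and the vector J^T f
        a = sum(j1 * j1 for j1, _ in J)
        b = sum(j1 * j2 for j1, j2 in J)
        d = sum(j2 * j2 for _, j2 in J)
        g1 = sum(j1 * fi for (j1, _), fi in zip(J, f))
        g2 = sum(j2 * fi for (_, j2), fi in zip(J, f))
        # solve [[a, b], [b, d]] h = -[g1, g2] by Cramer's rule
        det = a * d - b * b
        h1 = (-g1 * d + g2 * b) / det
        h2 = (-g2 * a + g1 * b) / det
        t1, t2 = t1 + h1, t2 + h2    # theta <- theta + h
    return t1, t2

xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [2.0 * math.exp(0.8 * x) for x in xs]   # noiseless data from theta = (2, 0.8)
print(gauss_newton(xs, ys, (1.0, 1.0)))
```

On this noiseless problem the residuals vanish at the solution, so the Gauss-Newton step matches the full Newton step and the iterates converge rapidly to (2, 0.8); with noisy data or a poor starting point, the damped variants (Levenberg-Marquardt, Dog Leg) are more robust.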