6.4 MATHEMATICS OF REGRESSION
Two approaches to regressing data are available in Aspen Plus: minimizing an objective function by solving simultaneous nonlinear equations, analogous to linear regression, and minimizing an objective function subject to the constraints of equilibrium. The partial derivatives of the objective function with respect to the parameters are calculated numerically by perturbing each parameter in turn by a small value Δa, a small fraction of the parameter, while holding the remaining parameters constant. Using the perturbed and current values of the objective function ψ, the partial derivatives are calculated from
\[
\frac{\partial \psi}{\partial a} = \frac{\psi(a + \Delta a) - \psi(a)}{\Delta a}
\tag{6.23}
\]
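The perturbation scheme of Eq. (6.23) can be sketched as a short routine: each parameter is bumped by a small fraction of its value, the objective function is re-evaluated, and a forward difference approximates the partial derivative. This is a minimal illustration, not Aspen Plus's internal implementation; the quadratic objective below is a hypothetical stand-in.

```python
def gradient_by_perturbation(psi, params, rel_step=1e-6):
    """Forward-difference partials of psi, per Eq. (6.23)."""
    base = psi(params)
    grads = []
    for i, a in enumerate(params):
        # Perturb by a small fraction of the parameter (absolute step if a = 0)
        da = rel_step * a if a != 0.0 else rel_step
        perturbed = list(params)
        perturbed[i] = a + da
        grads.append((psi(perturbed) - base) / da)
    return grads

# Hypothetical objective: psi = (a1 - 2)^2 + (a2 + 1)^2; exact gradient
# at (3, 0) is (2, 2), which the forward differences closely approximate.
psi = lambda p: (p[0] - 2.0) ** 2 + (p[1] + 1.0) ** 2
print(gradient_by_perturbation(psi, [3.0, 0.0]))
```

Forward differences introduce an error proportional to the step size, so the relative step trades truncation error against round-off; rel_step near the square root of machine precision is a common compromise.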
6.4.1 Newton-Raphson Method for Solution of Nonlinear Equations
The generalized Newton-Raphson method involves arranging the set of n equations to be solved in the form

\[
\begin{aligned}
f_1(x_1, x_2, \ldots, x_n) &= 0 \\
f_2(x_1, x_2, \ldots, x_n) &= 0 \\
&\;\;\vdots \\
f_n(x_1, x_2, \ldots, x_n) &= 0
\end{aligned}
\]
Each of the functions is expressed as a first-order Taylor series about a point \(X^k\) to represent the functions at a point \(X^{k+1}\) in the vicinity of \(X^k\):

\[
\begin{aligned}
f_1(X^{k+1}) &= f_1(X^k)
+ \left.\frac{\partial f_1}{\partial x_1}\right|_{X^k}\!\left(x_1^{k+1} - x_1^k\right)
+ \left.\frac{\partial f_1}{\partial x_2}\right|_{X^k}\!\left(x_2^{k+1} - x_2^k\right)
+ \cdots
+ \left.\frac{\partial f_1}{\partial x_n}\right|_{X^k}\!\left(x_n^{k+1} - x_n^k\right) \\
f_2(X^{k+1}) &= f_2(X^k)
+ \left.\frac{\partial f_2}{\partial x_1}\right|_{X^k}\!\left(x_1^{k+1} - x_1^k\right)
+ \left.\frac{\partial f_2}{\partial x_2}\right|_{X^k}\!\left(x_2^{k+1} - x_2^k\right)
+ \cdots
+ \left.\frac{\partial f_2}{\partial x_n}\right|_{X^k}\!\left(x_n^{k+1} - x_n^k\right) \\
&\;\;\vdots \\
f_n(X^{k+1}) &= f_n(X^k)
+ \left.\frac{\partial f_n}{\partial x_1}\right|_{X^k}\!\left(x_1^{k+1} - x_1^k\right)
+ \left.\frac{\partial f_n}{\partial x_2}\right|_{X^k}\!\left(x_2^{k+1} - x_2^k\right)
+ \cdots
+ \left.\frac{\partial f_n}{\partial x_n}\right|_{X^k}\!\left(x_n^{k+1} - x_n^k\right)
\end{aligned}
\]
Hypothesizing that all functions are zero at the point \(X^{k+1}\), evaluating all derivatives at the point \(X^k\), and changing the nomenclature such that

\[
\Delta x_i = x_i^{k+1} - x_i^k
\]

produces the following linear set of equations, which can be solved for the vector of \(\Delta x\):
\[
0 = f_1(X^{k+1}) = f_1(X^k)
+ \left.\frac{\partial f_1}{\partial x_1}\right|_{X^k}\Delta x_1
+ \left.\frac{\partial f_1}{\partial x_2}\right|_{X^k}\Delta x_2
+ \cdots
+ \left.\frac{\partial f_1}{\partial x_n}\right|_{X^k}\Delta x_n
\]
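The iteration described above can be sketched end to end: at each iterate \(X^k\) the Jacobian is built numerically by the same perturbation idea as Eq. (6.23), the linear system \(J\,\Delta x = -f(X^k)\) is solved, and the iterate is updated by \(X^{k+1} = X^k + \Delta x\). This is a minimal illustration, not Aspen Plus's solver; the two-equation test system is my own and not from the text.

```python
def newton_raphson(f, x, tol=1e-10, max_iter=50, rel_step=1e-7):
    """Solve f(X) = 0 by generalized Newton-Raphson with a numerical Jacobian."""
    n = len(x)
    for _ in range(max_iter):
        fx = f(x)
        if max(abs(v) for v in fx) < tol:
            break
        # Jacobian J[i][j] = dfi/dxj at x, by forward differences
        J = [[0.0] * n for _ in range(n)]
        for j in range(n):
            dx = rel_step * x[j] if x[j] != 0.0 else rel_step
            xp = list(x)
            xp[j] += dx
            fxp = f(xp)
            for i in range(n):
                J[i][j] = (fxp[i] - fx[i]) / dx
        # Solve J * delta = -fx by Gaussian elimination with partial pivoting
        A = [row[:] + [-fx[i]] for i, row in enumerate(J)]
        for c in range(n):
            p = max(range(c, n), key=lambda r: abs(A[r][c]))
            A[c], A[p] = A[p], A[c]
            for r in range(c + 1, n):
                m = A[r][c] / A[c][c]
                for k in range(c, n + 1):
                    A[r][k] -= m * A[c][k]
        delta = [0.0] * n
        for r in range(n - 1, -1, -1):
            s = A[r][n] - sum(A[r][k] * delta[k] for k in range(r + 1, n))
            delta[r] = s / A[r][r]
        # X^(k+1) = X^k + delta, per the linearized equations above
        x = [x[i] + delta[i] for i in range(n)]
    return x

# Illustrative system: x1^2 + x2^2 = 5 and x1*x2 = 2; from the starting
# point (2.5, 0.5) the iteration converges to the root (2, 1).
f = lambda x: [x[0] ** 2 + x[1] ** 2 - 5.0, x[0] * x[1] - 2.0]
root = newton_raphson(f, [2.5, 0.5])
print(root)
```

As with any Newton-type method, convergence is quadratic near a root but depends on a reasonable starting estimate; a starting point where the Jacobian is singular or nearly so will cause the linear solve to fail.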