$$f_2(X^{k+1}) = f_2(X^k) + \frac{\partial f_2}{\partial x_1}\bigg|_{X^k}\Delta x_1 + \frac{\partial f_2}{\partial x_2}\bigg|_{X^k}\Delta x_2 + \cdots + \frac{\partial f_2}{\partial x_n}\bigg|_{X^k}\Delta x_n = 0$$

$$\vdots$$

$$f_n(X^{k+1}) = f_n(X^k) + \frac{\partial f_n}{\partial x_1}\bigg|_{X^k}\Delta x_1 + \frac{\partial f_n}{\partial x_2}\bigg|_{X^k}\Delta x_2 + \cdots + \frac{\partial f_n}{\partial x_n}\bigg|_{X^k}\Delta x_n = 0$$
The algorithm for solving a set of nonlinear simultaneous equations by the
Newton-Raphson method is given below. If the equations to be solved are not
analytical, the derivatives must be calculated numerically.
1. Guess $X^k$.
2. Evaluate all $f_i(X^k)$.
3. If all $|f_i(X^k)| < \varepsilon$, the problem is solved.
4. Calculate all partial derivatives at $X^k$.
5. Solve the matrix-vector equation $P\,\Delta X = -F$ for $\Delta X = P^{-1}(-F)$,
where $P$ is the matrix of partial derivatives, $\Delta X$ is the vector of $x$ moves, and $F$
is the vector of function values.
6. For all $x_i$, calculate $x_i^{k+1} = x_i^k + \Delta x_i$.
7. Return to step 2, substituting $X^{k+1}$ for $X^k$.
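The seven steps above can be sketched in Python with NumPy. This is a minimal illustration, not production code: the partial derivatives of step 4 are approximated by forward differences (as suggested for non-analytical equations), and the step size, tolerance, and iteration limit are arbitrary assumptions.

```python
import numpy as np

def newton_raphson(f, x0, eps=1e-8, h=1e-7, max_iter=50):
    """Solve the vector equation f(x) = 0 by the Newton-Raphson method.

    f  : callable returning a length-n residual vector for a length-n x.
    h  : forward-difference step for the numerical partial derivatives.
    """
    x = np.asarray(x0, dtype=float)       # step 1: initial guess X^k
    for _ in range(max_iter):
        F = f(x)                          # step 2: evaluate all f_i(X^k)
        if np.all(np.abs(F) < eps):       # step 3: all |f_i(X^k)| < eps?
            return x
        n = x.size
        P = np.empty((n, n))              # step 4: matrix of partial derivatives
        for j in range(n):
            xp = x.copy()
            xp[j] += h
            P[:, j] = (f(xp) - F) / h
        dx = np.linalg.solve(P, -F)       # step 5: solve P dX = -F for dX
        x = x + dx                        # step 6: x_i^{k+1} = x_i^k + dx_i
    return x                              # step 7: loop back with X^{k+1}

# Usage: intersect the circle x^2 + y^2 = 4 with the line x = y.
root = newton_raphson(lambda v: np.array([v[0]**2 + v[1]**2 - 4.0,
                                          v[0] - v[1]]),
                      [1.0, 0.5])
```

For this example the iteration converges to $x = y = \sqrt{2}$.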
6.4.2 Direct Optimization of an Objective Function
The default regression method in Aspen Plus is maximum likelihood. In this method
the objective function is given by
$$\psi = \sum_{i=1}^{n_{\text{data}}}\left[\frac{(T_i^e - T_i^m)^2}{\sigma_{T,i}^2} + \frac{(P_i^e - P_i^m)^2}{\sigma_{P,i}^2} + \frac{(x_i^e - x_i^m)^2}{\sigma_{x,i}^2} + \frac{(y_i^e - y_i^m)^2}{\sigma_{y,i}^2}\right] \tag{6.24}$$
where the superscripts e and m refer to estimated and measured values. For each point
in the data set the following constraints apply. For simplicity, the Poynting correction
is ignored; then for each component one may write
$$y_i \phi_i^V P - \gamma_i x_i f_i = 0 \tag{6.25}$$

$$\gamma_i = \gamma_i(T, x) \tag{6.26}$$

$$\phi_i^V = \phi_i^V(T, P, y) \tag{6.27}$$
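Equation (6.24) is a straightforward weighted sum of squared deviations, so it can be evaluated directly once the estimated and measured values and their standard deviations are collected. The sketch below assumes NumPy; the dictionary layout and the function name `psi` are illustrative choices, not part of Aspen Plus.

```python
import numpy as np

def psi(est, meas, sigma):
    """Maximum-likelihood objective of Eq. (6.24).

    est, meas : dicts mapping 'T', 'P', 'x', 'y' to arrays of estimated
                and measured values over the data set.
    sigma     : dict mapping the same keys to the standard deviations
                sigma_{T,i}, sigma_{P,i}, sigma_{x,i}, sigma_{y,i}.
    """
    return sum(np.sum((est[k] - meas[k])**2 / sigma[k]**2)
               for k in ('T', 'P', 'x', 'y'))

# Usage: one data point where only the estimated temperature deviates
# from the measurement by 1 K, with all sigmas set to 1 -> psi = 1.
est = {'T': np.array([301.0]), 'P': np.array([1.0]),
       'x': np.array([0.5]),   'y': np.array([0.5])}
meas = {'T': np.array([300.0]), 'P': np.array([1.0]),
        'x': np.array([0.5]),   'y': np.array([0.5])}
sig = {k: np.array([1.0]) for k in ('T', 'P', 'x', 'y')}
value = psi(est, meas, sig)
```

In a regression run the estimated values would themselves come from solving the constraints (6.25)-(6.27) at each data point.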