$$\theta_1\bigl(b_j,\,b_j^L,\,b_j^R\bigr)=b_j\Bigl(x_{ji}+\frac{1}{6}\,x_{ji}^R\Bigr)-b_j^L\Bigl(\frac{1}{6}\,x_{ji}+\frac{1}{12}\,x_{ji}^R\Bigr);$$

$$\theta_2\bigl(b_j,\,b_j^L,\,b_j^R\bigr)=b_j\Bigl(x_{ji}-\frac{1}{6}\,x_{ji}^L\Bigr)+b_j^R\Bigl(\frac{1}{6}\,x_{ji}-\frac{1}{12}\,x_{ji}^L\Bigr).$$
Using propositions 2.1, 2.2, and 6.1-6.4, let us determine the weighted segments
$$\Bigl[\,b_0-\frac{1}{6}\,b_0^L+\sum_{j=1}^{m}\theta_1\bigl(b_j,\,b_j^L,\,b_j^R\bigr);\;\; b_0+\frac{1}{6}\,b_0^R+\sum_{j=1}^{m}\theta_2\bigl(b_j,\,b_j^L,\,b_j^R\bigr)\Bigr],\quad i=1,\dots,n,$$
for model output data
$$\tilde{Y}_i=\tilde{a}_0+\tilde{a}_1\tilde{X}_{1i}+\dots+\tilde{a}_m\tilde{X}_{mi}.$$
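The bracketed expression above gives the endpoints of the weighted segment of the model output; the observed fuzzy data carry the same $1/6$-weighting, with endpoints $y_i-\frac{1}{6}y_i^L$ and $y_i+\frac{1}{6}y_i^R$ (these terms appear in the expansion of $F$). A minimal numeric sketch of that weighting — the function name is illustrative, not the book's notation:

```python
# Sketch: weighted segment of a fuzzy value with mode y, left spread y_L and
# right spread y_R, using the 1/6 weighting that appears in the expansion of F.

def weighted_segment(y, y_L, y_R):
    """Return the weighted segment (y - y_L/6, y + y_R/6)."""
    return (y - y_L / 6.0, y + y_R / 6.0)

# Example: a fuzzy observation with mode 2.0, spreads 0.6 (left) and 1.2 (right).
lo, hi = weighted_segment(2.0, 0.6, 1.2)
```

A crisp value (both spreads zero) degenerates to the point segment $[y;\,y]$, as expected.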
Let us consider a functional
$$F=\sum_{i=1}^{n}f^{2}\bigl(\tilde{Y}_i,\,Y_i\bigr),$$
which characterizes an affinity measure between initial and model output data. It
is easy to demonstrate that
$$F=\sum_{i=1}^{n}\Bigl[\,b_0-\frac{1}{6}\,b_0^L-y_i+\frac{1}{6}\,y_i^L+\sum_{j=1}^{m}\theta_1\bigl(b_j,\,b_j^L,\,b_j^R\bigr)\Bigr]^2+$$

$$+\sum_{i=1}^{n}\Bigl[\,b_0+\frac{1}{6}\,b_0^R-y_i-\frac{1}{6}\,y_i^R+\sum_{j=1}^{m}\theta_2\bigl(b_j,\,b_j^L,\,b_j^R\bigr)\Bigr]^2.$$
The optimization problem is stated as follows:
$$F\bigl(b_j,\,b_j^L,\,b_j^R\bigr)=\sum_{i=1}^{n}f^{2}\bigl(\tilde{Y}_i,\,Y_i\bigr)\rightarrow\min;\qquad b_j^L\ge 0,\;\; b_j^R\ge 0,\;\; j=0,1,\dots,m.$$
As $\theta_1\bigl(b_j,\,b_j^L,\,b_j^R\bigr)$ and $\theta_2\bigl(b_j,\,b_j^L,\,b_j^R\bigr)$ are piecewise linear functions in the region $b_j^L\ge 0$, $b_j^R\ge 0$, $F$ is a piecewise differentiable function, and solutions of the optimization problem can be found by known methods [152].
Under the condition of nonnegative regression coefficients, the optimization
problem is formulated as follows:
$$F=\sum_{i=1}^{n}\Bigl[\,b_0-\frac{1}{6}\,b_0^L-y_i+\frac{1}{6}\,y_i^L+\sum_{j=1}^{m}\Bigl(b_j\Bigl(x_{ji}-\frac{1}{6}\,x_{ji}^L\Bigr)-b_j^L\Bigl(\frac{1}{6}\,x_{ji}-\frac{1}{12}\,x_{ji}^L\Bigr)\Bigr)\Bigr]^2+$$

$$+\sum_{i=1}^{n}\Bigl[\,b_0+\frac{1}{6}\,b_0^R-y_i-\frac{1}{6}\,y_i^R+\sum_{j=1}^{m}\Bigl(b_j\Bigl(x_{ji}+\frac{1}{6}\,x_{ji}^R\Bigr)+b_j^R\Bigl(\frac{1}{6}\,x_{ji}+\frac{1}{12}\,x_{ji}^R\Bigr)\Bigr)\Bigr]^2\rightarrow\min;$$

$$b_j\ge 0,\;\; b_j^L\ge 0,\;\; b_j^R\ge 0,\;\; j=0,1,\dots,m.$$
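As a numeric illustration, the nonnegative-coefficient objective can be evaluated and reduced directly. The sketch below implements the weighted-segment form of $F$ just stated; the parameter layout, the list-based data containers, and the naive projected pattern search are illustrative choices of mine, not the methods of [152]:

```python
# Sketch of the nonnegative-coefficient problem.  Data layout (assumed):
# X[j][i] is the mode of the j-th fuzzy regressor at observation i (index 0
# unused, since b[0] is the free term); XL/XR hold left/right spreads of the
# regressors, and y/yL/yR describe the fuzzy output data.

def F(b, bL, bR, X, XL, XR, y, yL, yR):
    """Sum of squared deviations between the left/right weighted-segment
    endpoints of the model output and of the observed data."""
    m = len(b) - 1
    total = 0.0
    for i in range(len(y)):
        left = b[0] - bL[0] / 6.0 - y[i] + yL[i] / 6.0
        right = b[0] + bR[0] / 6.0 - y[i] - yR[i] / 6.0
        for j in range(1, m + 1):
            x, xl, xr = X[j][i], XL[j][i], XR[j][i]
            left += b[j] * (x - xl / 6.0) - bL[j] * (x / 6.0 - xl / 12.0)
            right += b[j] * (x + xr / 6.0) + bR[j] * (x / 6.0 + xr / 12.0)
        total += left * left + right * right
    return total

def fit(b, bL, bR, X, XL, XR, y, yL, yR, step=0.25, iters=200):
    """Greedy coordinate moves of +-step, clamped at 0 so that all b_j,
    b_j^L, b_j^R stay nonnegative; the step is halved whenever no move
    improves F.  A toy substitute for the solvers referenced in [152]."""
    best = F(b, bL, bR, X, XL, XR, y, yL, yR)
    for _ in range(iters):
        improved = False
        for v in (b, bL, bR):
            for k in range(len(v)):
                for d in (step, -step):
                    old = v[k]
                    v[k] = max(0.0, old + d)
                    cur = F(b, bL, bR, X, XL, XR, y, yL, yR)
                    if cur < best - 1e-12:
                        best, improved = cur, True
                    else:
                        v[k] = old
        if not improved:
            step /= 2.0
            if step < 1e-6:
                break
    return best

# Crisp toy data y = 1 + 2x (all spreads zero): the exact coefficients give F = 0.
X, XL, XR = [None, [0.0, 1.0, 2.0]], [None, [0.0] * 3], [None, [0.0] * 3]
y, yL, yR = [1.0, 3.0, 5.0], [0.0] * 3, [0.0] * 3
exact = F([1.0, 2.0], [0.0, 0.0], [0.0, 0.0], X, XL, XR, y, yL, yR)
b, bL, bR = [0.0, 0.0], [0.0, 0.0], [0.0, 0.0]
fitted = fit(b, bL, bR, X, XL, XR, y, yL, yR)
```

Because each bracketed residual is linear in the parameters, $F$ is convex, so even this crude search settles near the unique minimizer on the toy data; in practice the piecewise-differentiable structure is exploited by the dedicated methods the text cites.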
 