In conclusion, if all the partial outputs $\mu_{Q'_1}, \mu_{Q'_2}, \ldots, \mu_{Q'_n}$ are consequences of the input $\mu_P$, then the final output $\mu_{Q'}$ is also a consequence of $\mu_P$. If at least one of the partial outputs $\mu_{Q'_i}$ ($1 \leq i \leq n$) is just a conjecture of the input $\mu_P$, then the final output $\mu_{Q'}$ is also a conjecture of $\mu_P$.
Nevertheless, it is not usual that $\mu_{Q'}$ turns out to be a consequence of the single input $\mu_P$. Let us introduce a necessary and sufficient condition for this in the particular case in which there is only one rule, represented by $J(a, b) = a \cdot b$ (Larsen).
Let the rule be “If $x$ is $P$, then $y$ is $Q$” ($x, y \in X$), represented by $(\mu_P \cdot \mu_Q)(x, y) = \mu_P(x) \cdot \mu_Q(y)$, with the input $x = x_0$. Then,

$$\mu_{Q'}(y) = \mu_P(x_0) \cdot \mu_Q(y), \quad y \in X.$$
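With a precise (singleton) input, the Larsen inference therefore just rescales the consequent $\mu_Q$ by the factor $\mu_P(x_0)$. A minimal numerical sketch in Python, assuming this reading; the function names and the membership functions used below are illustrative, not taken from the text:

    # Single-rule Larsen inference with a precise input x = x0:
    # the output is y -> mu_P(x0) * mu_Q(y).
    def larsen_output(mu_P, mu_Q, x0):
        scale = mu_P(x0)            # degree to which the input fires the rule
        return lambda y: scale * mu_Q(y)

    # Hypothetical membership functions, only to see the scaling at work.
    mu_P = lambda x: max(0.0, 1.0 - abs(x - 2.0) / 2.0)   # triangular around 2
    mu_Q = lambda y: min(1.0, max(0.0, y / 4.0))          # ramp up to y = 4
    mu_Qp = larsen_output(mu_P, mu_Q, x0=1.0)             # mu_P(1.0) = 0.5
    print(mu_Qp(4.0))                                     # 0.5 * 1.0 = 0.5

Note that the output never exceeds $\mu_P(x_0)$, which is what makes the consequence condition derived next so restrictive.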
Provided the input is the singleton $\mu_{\{x_0\}}$, to have $\mu_{\{x_0\}}(y) \leq \mu_{Q'}(y) = \mu_P(x_0) \cdot \mu_Q(y)$ it is necessary that, taking $y = x_0$, $1 \leq \mu_P(x_0) \cdot \mu_Q(x_0)$, and since membership degrees do not exceed $1$, this means $1 = \mu_P(x_0) = \mu_Q(x_0)$. Hence, $\mu_{Q'} \in \mathrm{Cons}(\{\mu_{\{x_0\}}\})$ implies $\mu_P(x_0) = \mu_Q(x_0) = 1$.
Provided $\mu_P(x_0) = \mu_Q(x_0) = 1$, from $\mu_{Q'}(y) = \mu_P(x_0) \cdot \mu_Q(y)$ it follows that $\mu_{Q'}(y) = \mu_Q(y)$ for all $y \in X$, and,

If $y = x_0$, $\mu_{\{x_0\}}(x_0) = 1 = \mu_Q(x_0) = \mu_{Q'}(x_0)$;
If $y \neq x_0$, $\mu_{\{x_0\}}(y) = 0 \leq \mu_{Q'}(y)$;

that is, $\mu_{\{x_0\}}(y) \leq \mu_{Q'}(y)$ for all $y \in X$.
Hence, in this particular case, the necessary and sufficient condition for $\mu_{Q'} \in \mathrm{Cons}(\{\mu_{\{x_0\}}\})$ is that $\mu_P(x_0) = \mu_Q(x_0) = 1$. Nevertheless, what happens in most cases is that $\mu_{Q'} \in \mathrm{Conj}(\{\mu_{\{x_0\}}\})$, with $\mu_{Q'} \in \mathrm{Sp}(\{\mu_{\{x_0\}}\})$ or $\mu_{Q'} \in \mathrm{Hyp}(\{\mu_{\{x_0\}}\})$.
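The condition can be checked pointwise on a finite grid of sample points. A small Python sketch, assuming the ordering-based reading used above (a consequence dominates the input everywhere; the labels for the other two cases only echo the distinction between hypotheses and speculations and are not definitions taken from the text):

    # Pointwise comparison of an output against the (singleton) input.
    def classify(mu_in, mu_out, points, eps=1e-9):
        ge = all(mu_out(t) >= mu_in(t) - eps for t in points)   # input <= output
        le = all(mu_out(t) <= mu_in(t) + eps for t in points)   # output <= input
        if ge:
            return "consequence"       # mu_out in Cons({mu_in})
        if le:
            return "hypothesis-like"   # output lies below the input
        return "speculation-like"      # incomparable with the input

    # Singleton (crisp) input: 1 at x0 and 0 elsewhere.
    def singleton(x0):
        return lambda t: 1.0 if t == x0 else 0.0

    grid = [i / 10.0 for i in range(0, 101)]
    print(classify(singleton(2.0),
                   lambda t: 1.0 if t == 2.0 else 0.3,   # dominates the input
                   grid))                                # -> "consequence"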
3.7 Two Final Examples
Let us show an example in which the output is a speculation of the input, and another in which the output is a hypothesis.
Example. Rule: “If $x$ is small, then $y$ is big”, with $X = Y = [0, 10]$ and $J(a, b) = a \cdot b$ (Larsen), with $\mu_S(x) = 1 - \frac{x}{10}$, $\mu_B(y) = \frac{y}{10}$, and $x_0 = 5$. It is

$$\mu_{Q'}(y) = J(\mu_S(5), \mu_B(y)) = \left(1 - \frac{5}{10}\right) \cdot \frac{y}{10} = \frac{y}{20},$$
and from the graphic it can be seen that neither $\mu_{\{5\}} \leq \mu_{Q'}$ nor $\mu_{Q'} \leq \mu_{\{5\}}$ holds, so the output is a speculation of the input.
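The arithmetic of the example can be checked with a short sketch; the sampling grid and the exact-equality test at the crisp point are just illustrative conveniences:

    # "If x is small, then y is big" on X = Y = [0, 10], Larsen product, x0 = 5.
    mu_S = lambda x: 1.0 - x / 10.0            # "small"
    mu_B = lambda y: y / 10.0                  # "big"
    mu_Qp = lambda y: mu_S(5.0) * mu_B(y)      # output: (1 - 5/10) * (y/10) = y/20
    mu_x0 = lambda t: 1.0 if t == 5.0 else 0.0 # singleton input at x0 = 5

    grid = [i / 10.0 for i in range(0, 101)]   # sample [0, 10] in steps of 0.1
    not_cons  = any(mu_Qp(t) < mu_x0(t) for t in grid)   # fails at t = 5 (0.25 < 1)
    not_below = any(mu_Qp(t) > mu_x0(t) for t in grid)   # e.g. at t = 10 (0.5 > 0)
    print(not_cons, not_below)                 # True True: incomparable with the input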
 
 