It should be pointed out that Zadeh's CRI is not a "result" but a meta-rule. It is a "directive" allowing us to reach a solution to our problem, and it should be noticed that when $P' = P$ it is not in general the case that $Q' = Q$. For example, in the case of ML-implications it is:
$$\mu_{Q'}(y) = \sup_{x \in X} \min\big(\mu_P(x),\, T(\mu_P(x), \mu_Q(y))\big) = \sup_{x \in X} T\big(\mu_P(x), \mu_Q(y)\big) = T\Big(\sup_{x \in X} \mu_P(x),\, \mu_Q(y)\Big) = \mu_Q(y),$$
provided that $\sup \mu_P = 1$, because $T(\mu_P(x), \mu_Q(y)) \leq \mu_P(x)$ (so the outer Min collapses) and $T$ is continuous.
But, for example, if $T = \min$, $\sup \mu_P = 0.9$ and $\sup \mu_Q = 1$, then $\mu_{Q'}(y) = \min(0.9, \mu_Q(y)) \neq \mu_Q(y)$. Notice that in all the cases in which $\mu_P$ is normalized ($\mu_P(x_0) = 1$ for some $x_0 \in X$), Mamdani-Larsen implications do verify $\mu_{Q'} = \mu_Q$ whenever $\mu_{P'} = \mu_P$.
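As a quick numerical check of both situations, the following sketch (an illustrative assumption, not taken from the text) discretizes $X$, takes $T_1 = T = \min$, and evaluates $\sup_{x \in X} \min(\mu_P(x), \min(\mu_P(x), \mu_Q(y)))$ once with a normalized $\mu_P$ and once with $\sup \mu_P = 0.9$; the triangular membership function and all names are hypothetical.

import numpy as np

# Hedged sketch: CRI with an ML-implication J = T = Min and P' = P on a
# discretized universe X; the membership shapes below are assumptions.
X = np.linspace(0.0, 10.0, 41)                       # grid with step 0.25, contains x = 4

def cri_output(mu_P_values, mu_Q_at_y):
    # mu_Q'(y) = sup_x Min(mu_P(x), Min(mu_P(x), mu_Q(y)))
    return float(np.max(np.minimum(mu_P_values, np.minimum(mu_P_values, mu_Q_at_y))))

mu_P_norm = np.maximum(0.0, 1.0 - np.abs(X - 4.0))   # normalized: sup mu_P = 1 at x = 4
print(cri_output(mu_P_norm, 0.95))                   # -> 0.95, i.e. mu_Q(y) is recovered

mu_P_sub = 0.9 * mu_P_norm                           # non-normalized: sup mu_P = 0.9
print(cri_output(mu_P_sub, 0.95))                    # -> 0.9 = Min(0.9, mu_Q(y)) != mu_Q(y)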
Fourth Step: Numerical Input
This is the case in which $\mu_{P'} = \mu_{\{x_0\}}$, i.e., $x$ is exactly $x_0$. That is, "$x$ is $P'$" is the statement "$x$ is $x_0$" and hence

$$\mu_{P'}(x) = \begin{cases} 1, & \text{if } x = x_0,\\ 0, & \text{if } x \neq x_0. \end{cases}$$
In that case,
$$\mu_{Q'}(y) = \sup_{x \in X} T_1\big(\mu_{P'}(x),\, J(\mu_P(x), \mu_Q(y))\big) = J\big(\mu_P(x_0), \mu_Q(y)\big),$$

for all $y \in Y$.
For example, let $J$ be an ML-implication; then

$$\mu_{Q'}(y) = T\big(\mu_P(x_0), \mu_Q(y)\big).$$
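To see the collapse of the supremum concretely, here is a minimal brute-force sketch (the function names and the particular membership function are assumptions, not the text's): with a singleton $\mu_{P'}$ every term of the supremum vanishes except the one at $x = x_0$, so the result equals $J(\mu_P(x_0), \mu_Q(y))$.

import numpy as np

def cri_singleton(x0, mu_P, mu_Q_y, J, X):
    # Brute-force sup_x Min(mu_P'(x), J(mu_P(x), mu_Q(y))) with T1 = Min
    mu_Pprime = np.isclose(X, x0).astype(float)      # 1 at x0, 0 elsewhere
    terms = [min(mu_Pprime[i], J(mu_P(X[i]), mu_Q_y)) for i in range(len(X))]
    return max(terms)

X = np.arange(0.0, 10.25, 0.25)                      # discretized X = [0, 10]
mu_P = lambda x: max(0.0, 1.0 - abs(x - 4.0))        # assumed shape, peaked at x = 4
J = min                                              # ML-implication with T = Min
print(cri_singleton(3.5, mu_P, 0.8, J, X))           # -> 0.5
print(min(mu_P(3.5), 0.8))                           # closed form J(mu_P(x0), mu_Q(y)) agrees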
If $X = [0, 10]$, $Y = [0, 1]$, $P =$ close to 4, $Q =$ big, with membership functions as shown in the following figure, and moreover $x_0 = 3.5$ with $J(a, b) = \min(a, b)$, then

$$\mu_{Q'}(y) = \min\big(\mu_P(3.5), \mu_Q(y)\big) = \min\big(0.5, \mu_Q(y)\big),$$

with $\mu_P(x) = x - 3$ between 3 and 4.
[Figure: membership functions $\mu_P$ of "close to 4" on $X = [0, 10]$ (triangular, reaching 1 at $x = 4$, with $\mu_P(3.5) = 0.5$) and $\mu_Q$ of "big" on $Y = [0, 1]$.]
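Assuming for illustration that $\mu_Q(y) = y$ on $[0, 1]$ for "big" (the text only gives $\mu_Q$ through the figure), the clipped output can be tabulated directly; the short sketch below is hypothetical, not the book's code.

def mu_P(x):
    # "close to 4": x - 3 on [3, 4] as in the text; the descent on [4, 5] is assumed
    return max(0.0, min(x - 3.0, 5.0 - x))

def mu_Q(y):
    return y                                         # assumed shape for "big" on [0, 1]

def mu_Q_prime(y):
    return min(mu_P(3.5), mu_Q(y))                   # = Min(0.5, mu_Q(y))

for y in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(y, mu_Q_prime(y))                          # "big" clipped at mu_P(3.5) = 0.5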
 