Image Processing Reference
S must satisfy the equation:

y S(S(x)/y) = x S(S(y)/x),   [A.3]
the general solution of which, assuming that S is twice differentiable, is:

f([A|e])^k + f([¬A|e])^k = 1.   [A.4]
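The general solution above corresponds to the negation function S(x) = (1 − x^k)^{1/k} for some k > 0. As a numerical sanity check (a sketch not present in the original text; the sample values are illustrative), the snippet below verifies that this S satisfies the functional equation [A.3] and the normalization [A.4]:

```python
def S(x, k):
    # Candidate negation function: S(x) = (1 - x^k)^(1/k)
    return (1.0 - x**k) ** (1.0 / k)

def residual(x, y, k):
    # Difference between the two sides of [A.3]: y S(S(x)/y) - x S(S(y)/x)
    return y * S(S(x, k) / y, k) - x * S(S(y, k) / x, k)

# Sample points chosen so that x^k + y^k >= 1, which keeps every
# argument of S inside [0, 1].
for k in (1.0, 2.0, 3.5):
    for x, y in ((0.9, 0.8), (0.7, 0.95), (0.85, 0.85)):
        # [A.3] holds up to floating-point error
        assert abs(residual(x, y, k)) < 1e-12
        # [A.4]: x^k + S(x)^k = 1
        assert abs(x**k + S(x, k)**k - 1.0) < 1e-12
```

Both sides of [A.3] reduce to (x^k + y^k − 1)^{1/k}, which is symmetric in x and y; that symmetry is what the check exercises.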
A.3.4. Probabilities inferred from functional equations
We then set by convention p(A|e) = f([A|e])^k, which is referred to as the probability of A conditional on e. The two functional equations then become:

p(AB|e) = p(A|Be) p(B|e),   [A.5]
p(A|e) + p(¬A|e) = 1.   [A.6]
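Relations [A.5] and [A.6] can be checked on a small discrete example (the joint distribution below is hypothetical, introduced here purely for illustration):

```python
# Hypothetical joint distribution over two binary propositions A and B,
# given background knowledge e; the numbers are illustrative only.
joint = {(True, True): 0.30, (True, False): 0.20,
         (False, True): 0.10, (False, False): 0.40}

def p_A(a):
    # Marginal p(A|e)
    return sum(v for (x, _), v in joint.items() if x == a)

def p_B(b):
    # Marginal p(B|e)
    return sum(v for (_, y), v in joint.items() if y == b)

def p_A_given_B(a, b):
    # Conditional p(A|Be), defined from the joint table
    return joint[(a, b)] / p_B(b)

# [A.5] product rule: p(AB|e) = p(A|Be) p(B|e)
assert abs(joint[(True, True)] - p_A_given_B(True, True) * p_B(True)) < 1e-12
# [A.6] normalization: p(A|e) + p(¬A|e) = 1
assert abs(p_A(True) + p_A(False) - 1.0) < 1e-12
```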
We have thus demonstrated the relations imposed axiomatically in the traditional
approach (Kolmogorov). Furthermore, we are dealing from the start with conditional
probabilities (related to a state of knowledge), whereas these probabilities were only
introduced later in the traditional theory. Finally, we infer the relation that leads to the
probability of the union:
p(A+B|e) = p(A|e) + p(B|e) − p(AB|e),   [A.7]
and therefore the additivity of the probabilities of exclusive events (usually imposed
axiomatically in the traditional theory). We also infer Bayes' rule:
p(A|Be) = p(B|Ae) p(A|e) / p(B|e).   [A.8]
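The union rule [A.7] and Bayes' rule [A.8] can likewise be verified on a hypothetical joint distribution (again, the numbers below are illustrative and not from the source):

```python
# Hypothetical joint distribution over binary propositions A and B given e.
joint = {(True, True): 0.30, (True, False): 0.20,
         (False, True): 0.10, (False, False): 0.40}

pA  = joint[(True, True)] + joint[(True, False)]   # p(A|e)
pB  = joint[(True, True)] + joint[(False, True)]   # p(B|e)
pAB = joint[(True, True)]                          # p(AB|e)
pA_or_B = 1.0 - joint[(False, False)]              # p(A+B|e)

# [A.7] union rule: p(A+B|e) = p(A|e) + p(B|e) - p(AB|e)
assert abs(pA_or_B - (pA + pB - pAB)) < 1e-12

# [A.8] Bayes' rule: p(A|Be) = p(B|Ae) p(A|e) / p(B|e)
pA_given_B = pAB / pB
pB_given_A = pAB / pA
assert abs(pA_given_B - pB_given_A * pA / pB) < 1e-12
```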
Note that this approach leads to subjective probabilities that are additive, in contradiction with the general conception in the 17th century and with that of one of the schools of thought in the 20th century.
A.3.5. Measure of uncertainty and information theory
In an approach similar to that of Cox, Jaynes defined a series of criteria in order
to obtain a measure of uncertainty [JAY 57]. In his works, he tried to bring together