were developed for infinite sequences, but did not specify how to proceed in practice.
This was the case for Venn's limit, Fisher's hypothetical infinite population (1912),
or von Mises's infinite random sequences (1919). Von Mises clearly stated the distinction between the abstract mathematical theory and its application: the essential property of these infinite random sequences is that the probability of success remains the same in every (infinite) sub-sequence. This is an abstract concept, and its application is restricted to those fields where the definition is reasonable. His argument was that it is not necessary to actually repeat the experiment indefinitely for the probability to exist; he therefore limited himself to physical probabilities and random processes, excluding problems where we would ask ourselves, for example, the probability that X dies at age 60.
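In modern notation, the frequency definition underlying Venn's limit and von Mises's sequences can be written as follows (a standard formulation, added here for reference): if $n_A$ denotes the number of occurrences of an event $A$ among the first $n$ trials, then
\[
P(A) = \lim_{n \to \infty} \frac{n_A}{n},
\]
with von Mises requiring, in addition, that the same limit hold in every infinite sub-sequence selected without knowledge of the outcomes.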
The 19th century also saw the development of the Gaussian distribution. Already known to de Moivre, it was obtained by Gauss (1823) by using the maximum likelihood principle in problems concerning the estimation of observation errors. In the middle of the 19th century, it was rediscovered by Herschel, based on geometric considerations for estimating measurement errors in the positions of stars, and also by Maxwell while he studied the speed distributions of molecules in a gas [DEM 93].
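In modern notation, the distribution in question is the Gaussian density with mean $\mu$ and standard deviation $\sigma$:
\[
p(x) = \frac{1}{\sigma \sqrt{2\pi}} \exp\left( -\frac{(x - \mu)^2}{2\sigma^2} \right),
\]
which Gauss obtained as the error law for which the arithmetic mean of the observations maximizes the likelihood.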
However, despite the strong frequentist context of the time, distinctions similar to
those made today were expressed. For example, Poisson, in his research on the probability of judgments (1837), distinguished between chance and probability. Chance is specific
to the event itself, regardless of our knowledge, whereas probability is related to our
knowledge. This distinction is similar to Lambert's. The distinction between objective
probability and subjective probability is also explicit in Cournot's Exposition de la
Théorie des Chances et des Probabilités (1843). But even though the distinction is explicit, the theories developed in the 19th century only make it possible to solve problems related to physical or objective probabilities.
Boole's works, in a way, constitute a link between frequentist and epistemological
methods. In his book Laws of Thought (1854), he attempted to combine, on an epistemological level, evaluations made locally on various attributes of the information. He supported the idea that probabilities are obtained from frequencies, but acknowledged the impossibility of estimating, in many situations, the joint frequencies, which must therefore be assessed on a subjective level.
A.1.4. The 20th century: a return to subjectivism
In the 20th century, traditional methods continued to be developed, with increasingly strong mathematical foundations, particularly under the impetus of Kolmogorov, and the frequentist method remained present and strong (particularly in signal and image processing), benefiting from the works of Neyman, Pearson, and Feller [FEL 66]. At the same time, with the birth of artificial intelligence and its growing importance,