This notion is naturally extended to that of probability measure. Given a set of elements, usually called events (the set itself is a space of events), we may assign to each of them a probability, which quantifies the possibility of its occurrence with respect to a given (observation) context. An axiomatic approach, establishing the natural conditions that a probability measure has to satisfy, was developed in the last century by Kolmogorov. It is easy to realize that probability theory is the natural mathematical setting for studying statistical phenomena: many important statistical parameters obey specific laws expressed by means of concepts developed in the theory of probability. The basic concepts of probability theory are those of random variable and probability distribution. A real random variable is a variable ranging over the real numbers which assigns to each real interval I a probability, less than or equal to 1, that the variable takes its values inside that interval.
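To make the definition above concrete, the following Python sketch (not from the original text) takes a normally distributed real random variable and computes the probability assigned to an interval [a, b], analytically through the Gaussian cumulative distribution function and empirically by sampling; the choice of distribution and all parameter values are assumptions made only for illustration.

```python
import math
import random

def normal_cdf(x, mu=0.0, sigma=1.0):
    """Cumulative distribution function of a Gaussian random variable."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def interval_probability(a, b, mu=0.0, sigma=1.0):
    """Probability assigned to the interval [a, b]: P(a <= X <= b), at most 1."""
    return normal_cdf(b, mu, sigma) - normal_cdf(a, mu, sigma)

def empirical_probability(a, b, mu=0.0, sigma=1.0, n=100_000):
    """Frequency of samples falling inside [a, b]; approximates the probability."""
    hits = sum(1 for _ in range(n) if a <= random.gauss(mu, sigma) <= b)
    return hits / n

if __name__ == "__main__":
    a, b = -1.0, 1.0
    print("P(-1 <= X <= 1), analytic:", interval_probability(a, b))
    print("P(-1 <= X <= 1), sampled :", empirical_probability(a, b))
```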
The theory of probability is a discipline that emerged in the modern age, with some anticipations by the Italian mathematician Girolamo Cardano (1501-1576) (Liber de ludo aleae, that is, The book on the dice game), Galileo (1564-1642) (Sopra le scoperte dei dadi, that is, On discoveries concerning dice), and Christiaan Huygens (1629-1695) (De ratiociniis in ludo aleae, that is, On reasoning in the dice game). The first main intuitions were developed by Pascal (1623-1662) and Fermat (1601-1665), but the first systematic treatise on the subject was Ars Conjectandi, written by Jacob Bernoulli (1654-1705), who used binomial coefficients in the distribution of boolean variables and found the law of large numbers (frequencies approximate probabilities for large populations, as sketched below), called Theorema Aureus. The idea of assigning degrees of possibility to events is a great innovation with respect to classical mathematics: implicitly, it replaces facts in suppositione objecti (facts as they are) with facts in suppositione subjecti (facts as they are judged). Namely, reality is not what it is, but what an observer judges it could be, according to a percentage of “actuality”, in the context of a potential space of events.
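As a minimal sketch of the law of large numbers just mentioned (observed frequencies approach the underlying probability as the number of trials grows), here is a short Python illustration; the event probability of 0.3 is an invented value used only for the example.

```python
import random

p_event = 0.3  # assumed probability of the event (illustrative value)

for n in (10, 100, 10_000, 1_000_000):
    successes = sum(1 for _ in range(n) if random.random() < p_event)
    print(f"n = {n:>9}: observed frequency = {successes / n:.4f} (probability = {p_event})")
```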
The most important discoveries in probability theory after Bernoulli's work were made during the 18th and 19th centuries, starting with De Moivre (1667-1754), who found the normal distribution as a curve for approximating large binomial coefficients. Bayes (1702-1761) discovered the theorem that bears his name (independently discovered also by Laplace), concerning the inversion of conditional probabilities. Laplace (1749-1827) extended De Moivre's result about the normal distribution. Finally, Gauss (1777-1855) recognized the normal distribution as the law governing the distribution of random errors, and Poisson (1781-1840) introduced the distribution that bears his name as the law of rare events.
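As a small numerical companion to two of the results recalled above, the Python sketch below (an illustration, not part of the original text) applies Bayes' inversion of conditional probabilities to a toy hypothesis/evidence pair, and compares a binomial probability with its normal (De Moivre-Laplace) and Poisson approximations; every concrete number is an invented example value.

```python
import math

# Bayes' theorem: P(H | E) = P(E | H) * P(H) / P(E), inverting the conditional.
p_h = 0.01              # prior probability of the hypothesis H (illustrative)
p_e_given_h = 0.95      # probability of the evidence E given H
p_e_given_not_h = 0.05  # probability of E given not-H
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)  # total probability of E
p_h_given_e = p_e_given_h * p_h / p_e
print("P(H | E) =", round(p_h_given_e, 4))

# Normal and Poisson approximations of a binomial probability.
def binomial_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def normal_pdf(x, mu, sigma):
    return math.exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

n, p, k = 1000, 0.01, 12  # large n, small p: the regime of the Poisson law of rare events
print("binomial:", binomial_pmf(k, n, p))
print("normal  :", normal_pdf(k, n * p, math.sqrt(n * p * (1 - p))))
print("poisson :", poisson_pmf(k, n * p))
```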
A space of events is a special boolean algebra (with 0, 1, sum, product, and negation, analogous to the operations ¬, ∧, ∨ over propositions). Moreover, a probability measure can easily be assigned to a proposition, by measuring in some way, with numbers in [0, 1], the class of models where the proposition holds (see the sketch at the end of this section). However, an important remark about probability concerns the distinction between two different aspects: i) the definition of the space of events that is most appropriate in a given context, and ii) the application of probabilistic methods for analyzing and deducing information
within a given event space. The methods of the theory of probability constitute a
conceptual framework which is, in many respects, independent of the specific space of events.
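The sketch below illustrates, in Python, the idea of assigning a probability to a proposition by measuring the class of models where it holds: under the simplifying assumption of independent boolean variables with given marginal probabilities (the variable names and weights are invented for the example), the probability of a formula is the total weight of its models.

```python
from itertools import product

# Independent boolean variables with assumed marginal probabilities (illustrative).
probs = {"a": 0.5, "b": 0.3, "c": 0.8}

def model_weight(model):
    """Probability of one truth assignment (a 'model') of the variables."""
    w = 1.0
    for var, value in model.items():
        w *= probs[var] if value else 1.0 - probs[var]
    return w

def proposition_probability(holds):
    """Sum the weights of all models in which the proposition holds."""
    names = list(probs)
    total = 0.0
    for values in product([False, True], repeat=len(names)):
        model = dict(zip(names, values))
        if holds(model):
            total += model_weight(model)
    return total

# Example proposition: (a ∧ b) ∨ ¬c
phi = lambda m: (m["a"] and m["b"]) or not m["c"]
print("P(phi) =", proposition_probability(phi))
```

For instance, with the weights above the result is 0.32 (up to floating-point rounding), since P(a ∧ b) + P(¬c) - P(a ∧ b ∧ ¬c) = 0.15 + 0.2 - 0.03.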