the ball has landed. The assistant now throws a second ball onto the table and
reports to Bayes only whether it landed to the left or the right of the original ball. If it
landed to the left, Bayes can deduce that the original ball is slightly more likely
to be in the right half of the table than in the left. The assistant tosses another
ball and reports that it has landed to the right of the first ball. From this information,
Bayes knows that the original ball cannot be at the extreme right of the table.
With more and more throws, Bayes can narrow down the range of positions for
the first ball and assign relative probabilities for different ranges. Bayes showed
how it was possible to modify his initial guess for the position of the first ball,
his prior probability, and to produce a new posterior probability by taking into
account the additional data he had been given.
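Bayes' updating procedure can be made concrete with a small numerical sketch (not part of the original account; the grid resolution and the function name update below are illustrative). The unknown position of the first ball is represented as a fraction of the table's width, the prior over that position starts flat, and each left-or-right report multiplies in the corresponding likelihood and renormalizes to give the posterior.

```python
import numpy as np

# Sketch of Bayes' billiard-table argument: x is the unknown position of the
# first ball, measured as a fraction of the table's width from the left edge.
grid = np.linspace(0.0, 1.0, 1001)   # candidate positions for the first ball
posterior = np.ones_like(grid)       # flat prior: no idea where it landed
posterior /= posterior.sum()

def update(posterior, report):
    # A later ball lands to the LEFT of the first with probability x,
    # and to the RIGHT with probability 1 - x.
    likelihood = grid if report == "left" else 1.0 - grid
    posterior = posterior * likelihood
    return posterior / posterior.sum()

# The two reports described in the text: one ball to the left of the
# original, then one to its right.
for report in ["left", "right"]:
    posterior = update(posterior, report)

print("most probable position:", grid[posterior.argmax()])
```

With only these two reports the posterior already peaks near the middle of the table, and each further throw sharpens it, which is exactly the narrowing-down of positions that Bayes describes.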
Although Bayes was the first to suggest the use of probability to represent
belief, it was the French mathematician Pierre-Simon Laplace (B.14.2) who
developed this idea into a useful tool for many different types of problems.
Laplace had become interested in probability through reading about problems
in gambling and did not initially know of Bayes' work. He published his first
paper on this subject in 1774 with the title “Mémoire sur la probabilité des
causes par les événements” (Memoir on the Probability of the Causes of Events),
so his method is often referred to simply as the “probability of causes.” One
of the first major applications of his new theory was to an analysis of the data
on births in London and Paris. He wanted to know whether the data supported
the suggestion of Englishman John Graunt that slightly more boys were born
than girls. Using the christening records from London and Paris, Laplace con-
cluded that he was “willing to bet that boys would outnumber girls for the next
179 years in Paris and for the next 8,605 years in London.” 3 Later in his life,
Laplace turned to frequentist techniques to deal with the large quantities of
reliable data on all sorts of subjects. In 1810, he proved what is now called the
central limit theorem, which justifies taking the average of many measure-
ments to arrive at the most probable value for a quantity. When the French
government published detailed data on such events as thefts, murders, and
suicides, all the governments in Europe started studying statistical data on a
whole range of subjects. Bayes' idea that the probability of future events could
be calculated by determining their earlier frequency was lost in a welter of
numbers. As the nineteenth century progressed, few people regarded as a serious
scientific approach the idea that the uncertainty of a prediction could be
modified by something as subjective as “belief.” Apart from a few isolated
instances, it was not until the middle of the twentieth century that mathema-
ticians and scientists again took seriously a Bayesian interpretation of proba-
bility and considered it a valid tool for research. Today, the Bayesian approach
has a wide variety of uses. For example, doctors employ it to diagnose diseases,
and genetic researchers use it to identify the particular genes responsible for
certain traits.
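The diagnostic use mentioned above is a direct application of Bayes' rule. The sketch below works through one such calculation; the prevalence, sensitivity, and false-positive rate are assumed values chosen for illustration, not figures from the text.

```python
# Bayes' rule applied to a screening test (all numbers are illustrative
# assumptions, not data from the text).
prevalence = 0.01        # prior: P(disease) before testing
sensitivity = 0.95       # P(positive | disease)
false_positive = 0.05    # P(positive | no disease)

# Total probability of a positive result.
p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)

# Posterior probability of disease given a positive result.
p_disease = sensitivity * prevalence / p_positive
print(f"P(disease | positive) = {p_disease:.3f}")   # about 0.16
```

Even with a fairly accurate test, a positive result here leaves the probability of disease well below one half, because the prior probability of disease is so low; this interplay between prior belief and new evidence is the heart of the Bayesian approach.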
B.14.2. Pierre-Simon Laplace
(1749-1827) was one of the giants
of mathematics and science. He
is often referred to as the “French
Newton” because his works made a
major contribution to many areas
of knowledge including astronomy,
mechanics, calculus, statistics, and
philosophy. One of his tasks as a
member of the French Academy was
to standardize European weights and
measures; in 1799, the meter
and the kilogram were introduced as
standards.
Bayes' Rule and some applications
The modern revival of Bayesian thinking began in the 1940s. In 1946,
Richard Cox, a physicist at Johns Hopkins University, looked again at the fun-
damentals of the Bayesian view of probability. In particular, he wanted to find
 