have probability distributions, and if the range of a random variable is a subset of the
real numbers, then the random variable may also have an expectation and a variance
(and hence a standard deviation). These quantities are related by several inequalities
(e.g., Markov's inequality and Chebyshev's inequality, stated below) that are
frequently used in probability theory. In the following chapter, we use probability
theory and apply it to information theory.
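For reference, the two inequalities mentioned above can be stated in a standard form (the notation used here is a common one and may differ slightly from the chapter's own conventions). For a nonnegative random variable X and any real number a > 0, Markov's inequality asserts that

\Pr[X \geq a] \;\leq\; \frac{E[X]}{a},

and for a random variable X with expectation \mu = E[X], finite variance \sigma^2 = \mathrm{Var}[X], and any t > 0, Chebyshev's inequality asserts that

\Pr\bigl[\,|X - \mu| \geq t\,\bigr] \;\leq\; \frac{\sigma^2}{t^2}.

Chebyshev's inequality follows from applying Markov's inequality to the nonnegative random variable (X - \mu)^2 with a = t^2.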