simulation, for example for comparing similar systems and in optimization and
variance reduction algorithms [5,6,7]), and they do not require special hardware.
Let s_0, s_1, s_2, ... denote the successive states of our RNG, and u_0, u_1, u_2, ... the corresponding sequence of output values. After selecting s_0 (which might be random), the successive states follow the deterministic recurrence s_i = f(s_{i-1}) for a given transition function f : S → S, and the output at step i is u_i = g(s_i) for some output function g : S → (0, 1). This output sequence is necessarily (eventually) periodic, with period ρ that cannot exceed the number of distinct states, |S|. When the state occupies b bits of memory, we have ρ ≤ 2^b, and we usually require that ρ be close to 2^b, to avoid wasting memory. Values of b for certain practical RNGs can be as much as 20,000 or even more [8,9,10], but for simulation, a few hundred is probably large enough.
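As an illustration, the state-transition structure just described can be sketched as follows. The generator shown is a small multiplicative LCG with the classical parameters m = 2^31 - 1, a = 16807, used here only as a convenient stand-in for f and g, not as a recommended generator:

```python
# A minimal sketch of the state/transition/output structure described above.
# The LCG parameters are illustrative only; real simulation RNGs have much
# larger state spaces.

M = 2**31 - 1   # number of (nonzero) states, so b is about 31 bits
A = 16807       # multiplier

def f(s):
    """Transition function f : S -> S."""
    return (A * s) % M

def g(s):
    """Output function g : S -> (0, 1)."""
    return s / M

def rng(seed, n):
    """Return u_0, ..., u_{n-1} from the seed s_0 = seed (must be nonzero)."""
    s, out = seed, []
    for _ in range(n):
        out.append(g(s))  # u_i = g(s_i)
        s = f(s)          # s_{i+1} = f(s_i)
    return out
```

Running `rng` twice with the same seed reproduces the same sequence, which is the reproducibility property mentioned below.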
Besides a long period, other standard requirements for RNGs include high
running speed (in 2008, fast RNGs can produce over 100 million U (0 , 1) random
numbers per second on a standard computer), reproducibility and portability
(the ability to reproduce the same sequence several times on the same com-
puter, and also on any standard computing platform), and the possibility to
quickly jump ahead by an arbitrarily large number of steps in the sequence. The
latter is frequently used to split the sequence into several shorter (but still very
long) disjoint subsequences, in order to provide an arbitrary number of virtual
generators (often called RNG streams) [5,6].
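For a multiplicative LCG (again a hypothetical illustration, not a generator endorsed by the text), jumping ahead ν steps reduces to a single modular exponentiation, since s_{i+ν} = (a^ν mod m) s_i mod m. This O(log ν) cost is what makes splitting the cycle into disjoint streams cheap:

```python
# Sketch of jump-ahead for the illustrative multiplicative LCG above:
# s_{i+nu} = (A**nu mod M) * s_i mod M, computed in O(log nu) time
# via Python's three-argument pow (modular exponentiation).

M = 2**31 - 1
A = 16807

def jump_ahead(s, nu):
    """Advance the state by nu steps without generating the intermediate ones."""
    return (pow(A, nu, M) * s) % M

def make_streams(seed, n_streams, stream_len):
    """Starting states of disjoint substreams of length stream_len each."""
    return [jump_ahead(seed, k * stream_len) for k in range(n_streams)]
```

Each returned state seeds one virtual generator; since the substreams are consecutive blocks of one long cycle, they are disjoint as long as each stream consumes fewer than `stream_len` values.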
However, these basic properties are not sufficient. To see why, just note that an RNG defined by u_i = s_i = (i + 1/2)/2^10000 mod 1 easily satisfies all the above properties, and its values cover the interval (0, 1) very evenly in the long run, but this RNG certainly fails to provide a good imitation of independent U(0, 1) random variables. The problem is the lack of (apparent) independence between successive values.
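The failure of independence is easy to exhibit: successive outputs of this generator differ by the fixed constant 2^-10000, so each value determines the next one exactly. A small sketch using exact rationals (floating point would underflow at this scale):

```python
from fractions import Fraction

# The counterexample generator u_i = s_i = (i + 1/2) / 2**10000 mod 1,
# implemented with exact rationals because 2**-10000 underflows to 0.0
# in ordinary floating point.

DENOM = 2**10000

def u(i):
    """i-th output of the toy generator, as an exact rational in [0, 1)."""
    return (Fraction(2 * i + 1, 2) / DENOM) % 1

# For every i (before wrap-around), u(i+1) - u(i) == 1 / 2**10000:
# the sequence is a deterministic staircase, with no independence at all.
```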
A key observation is that we have both uniformity and independence if and only if, for any integer s > 0, (u_0, ..., u_{s-1}) is a random vector with the uniform distribution over the s-dimensional unit hypercube (0, 1)^s. We want the RNG to provide a good approximation of this property. If the seed s_0 is selected at random in S, then the vector (u_0, ..., u_{s-1}) has the uniform distribution over the finite set Ψ_s = {(u_0, ..., u_{s-1}) : s_0 ∈ S}, which can be viewed as a discrete approximation of the uniform distribution over (0, 1)^s. For this approximation to be good, Ψ_s must provide a dense and uniform coverage of the unit hypercube (0, 1)^s, at least for moderate values of s. This is possible only if S has large cardinality. In fact, this is a more important reason for having a long period than the risk of exhausting the RNG cycle. More generally, we want high uniformity of the sets of the form Ψ(u) = {(u_{i_1}, ..., u_{i_d}) : s_0 ∈ S}, where u = {i_1, ..., i_d} is an arbitrary set of integers such that 0 ≤ i_1 < ··· < i_d.
But how should we measure the uniformity of Ψ_s (and of the sets Ψ(u))?
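To make Ψ_s concrete first, one can enumerate Ψ_2 exhaustively for a deliberately tiny toy LCG (hypothetical parameters m = 101, a = 12). For realistic generators, |S| is astronomically large, which is precisely why such enumeration must be avoided:

```python
# Enumerate Psi_2 = {(u_0, u_1) : s_0 in S} for a tiny toy LCG with
# hypothetical parameters m = 101, a = 12 (illustration only).

M, A = 101, 12

def next_state(s):
    return (A * s) % M

def output(s):
    return s / M

# One point of Psi_2 per possible nonzero seed s_0.
psi_2 = {(output(s0), output(next_state(s0))) for s0 in range(1, M)}
```

The 100 points of this Ψ_2 form a regular lattice pattern in the unit square; judging how evenly they cover it, without listing them, is the problem addressed next.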
There are many ways, but a crucial requirement is that the selected measure of uniformity must be computable efficiently, without generating the random numbers (or enumerating the points of Ψ_s) explicitly, because there are just too many of them. For nonlinear RNGs, easily computable measures of uniformity