Random number generation
A random number generator (RNG) is a computational or physical device designed to generate a sequence of numbers or symbols that lack any pattern, i.e. appear random.
The many applications of randomness have led to the development of several different methods for generating random data. Many of these have existed since ancient times, including dice, coin flipping, the shuffling of playing cards, the use of yarrow stalks (for divination) in the I Ching, and many other techniques. Because of the mechanical nature of these techniques, generating large quantities of sufficiently random numbers (important in statistics) required much work and time. Thus, results would sometimes be collected and distributed as random number tables. Since the advent of computational random number generators, a growing number of government-run lotteries and lottery games have been using RNGs instead of more traditional drawing methods. RNGs are also used today to determine the odds of modern slot machines.[1]
Several computational methods for random number generation exist, but often fall short of the goal of true randomness — though they may meet, with varying success, some of the statistical tests for randomness intended to measure how unpredictable their results are (that is, to what degree their patterns are discernible).
Practical applications and uses
Random number generators have applications in gambling, statistical sampling, computer simulation, cryptography, completely randomized design, and other areas where producing an unpredictable result is desirable.
In general, where unpredictability is paramount, such as in security applications, hardware generators are preferred over pseudo-random algorithms, where feasible.
Random number generators are very useful in developing Monte Carlo-method simulations, as debugging is facilitated by the ability to run the same sequence of random numbers again by starting from the same random seed. They are also used in cryptography, provided that the seed is secret: sender and receiver can generate the same set of numbers automatically to use as keys.
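For illustration, the following minimal C sketch uses the standard library's srand() and rand(), which here stand in for any seeded pseudo-random generator, to show how fixing the seed makes a run reproducible:

#include <stdio.h>
#include <stdlib.h>

/* Seeding the generator with a fixed value reproduces the same sequence
   on every run, which makes a simulation's behaviour repeatable while
   debugging. */
int main(void)
{
    srand(12345);                 /* fixed seed: the run is reproducible */
    for (int i = 0; i < 3; i++)
        printf("%d\n", rand());   /* the same three numbers every time */
    return 0;
}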
The generation of pseudo-random numbers is an important and common task in computer programming. While cryptography and certain numerical algorithms require a very high degree of apparent randomness, many other operations only need a modest amount of unpredictability. Some simple examples might be presenting a user with a "Random Quote of the Day", or determining which way a computer-controlled adversary might move in a computer game. Weaker forms of randomness are used in hash algorithms and in creating amortized searching and sorting algorithms.
Some applications which appear at first sight to be suitable for randomization are in fact not quite so simple. For instance, a system that "randomly" selects music tracks for a background music system must only appear random, and may even have ways to control the selection of music: a true random system would have no restriction on the same item appearing two or three times in succession.
"True" random numbers vs. pseudorandom numbers
There are two principal methods used to generate random numbers. One measures some physical phenomenon that is expected to be random and then compensates for possible biases in the measurement process. The other uses computational algorithms that produce long sequences of apparently random results, which are in fact completely determined by a shorter initial value, known as a seed or key. The latter type are often called pseudorandom number generators.
A "random number generator" based solely on deterministic computation cannot be regarded as a "true" random number generator, since its output is inherently predictable. How to distinguish a "true" random number from the output of a pseudo-random number generator is a very difficult problem. However, carefully chosen pseudo-random number generators can be used instead of true random numbers in many applications. Rigorous statistical analysis of the output is often needed to have confidence in the algorithm.[citation needed]
Generation methods
Physical methods
The earliest methods for generating random numbers, such as dice, coin flipping and roulette wheels, are still used today, mainly in games and gambling, as they tend to be too slow for most applications in statistics and cryptography.
A physical random number generator can be based on an essentially random atomic or subatomic physical phenomenon whose unpredictability can be traced to the laws of quantum mechanics. Sources of entropy include radioactive decay, thermal noise, shot noise, avalanche noise in Zener diodes, clock drift, the timing of actual movements of a hard disk read/write head, and radio noise. However, physical phenomena and tools used to measure them generally feature asymmetries and systematic biases that make their outcomes not uniformly random. A randomness extractor, such as a cryptographic hash function, can be used to approach a uniform distribution of bits from a non-uniformly random source, though at a lower bit rate.
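The hash-based whitening described above is one kind of randomness extractor. A different, much simpler classic extractor is the Von Neumann corrector, sketched here in C as an illustration of the debiasing idea (raw_bit() is a hypothetical function standing in for the biased hardware source):

#include <stdbool.h>

/* Von Neumann corrector: read raw bits in pairs; emit 0 for the pair (0,1),
   1 for (1,0), and discard (0,0) and (1,1). If the raw bits are independent
   and have a constant bias, the emitted bits are unbiased, although the
   output rate is lower than the input rate. */
extern bool raw_bit(void);   /* hypothetical biased bit source */

bool debiased_bit(void)
{
    for (;;) {
        bool a = raw_bit();
        bool b = raw_bit();
        if (a != b)
            return a;        /* (1,0) -> 1, (0,1) -> 0 */
    }
}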
In 2010, Kanter et al. at Bar-Ilan University were able to create a physical random bit generator at a 300 Gbit/s rate, making it faster than any created before it.[2]
Various imaginative ways of collecting this entropic information have been devised. One technique is to run a hash function against a frame of a video stream from an unpredictable source. Lavarand used this technique with images of a number of lava lamps. HotBits measures radioactive decay with Geiger–Müller tubes,[3] while Random.org uses variations in the amplitude of atmospheric noise recorded with a normal radio.
Another common entropy source is the behavior of human users of the system. While people are not considered good randomness generators upon request, they generate random behavior quite well in the context of playing mixed strategy games.[4] Some security-related computer software requires the user to make a lengthy series of mouse movements or keyboard inputs to create sufficient entropy needed to generate random keys or to initialize pseudorandom number generators.[5]
Computational methods
Pseudo-random number generators (PRNGs) are algorithms that can automatically create long runs of numbers with good random properties, but eventually the sequence repeats (or the memory usage grows without bound). The string of values generated by such algorithms is generally determined by a fixed number called a seed. One of the most common PRNGs is the linear congruential generator, which uses the recurrence
X_{n+1} = (a X_n + c) mod m
to generate numbers. The maximum number of numbers the formula can produce is the modulus, m. To avoid certain non-random properties of a single linear congruential generator, several such random number generators with slightly different values of the multiplier coefficient a can be used in parallel, with a "master" random number generator that selects from among the several different generators.[citation needed]
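A minimal C sketch of this recurrence follows; the multiplier and increment are the widely used Numerical Recipes constants, given here only as one common parameter choice, and the modulus m = 2^32 is taken implicitly through 32-bit wraparound:

#include <stdint.h>

/* Linear congruential generator X_{n+1} = (a*X_n + c) mod m, with
   m = 2^32 implied by unsigned 32-bit overflow. */
static uint32_t lcg_state = 1;                       /* the seed */

void lcg_seed(uint32_t seed) { lcg_state = seed; }

uint32_t lcg_next(void)
{
    lcg_state = 1664525u * lcg_state + 1013904223u;  /* mod 2^32 by wraparound */
    return lcg_state;
}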
A simple pen-and-paper method for generating random numbers is the so-called middle-square method suggested by John von Neumann. While simple to implement, its output is of poor quality: the sequence quickly falls into a short cycle or collapses to zero.
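A hedged C sketch of the method for four-digit numbers: square the current value and keep the middle four digits as the next value.

#include <stdint.h>

/* Middle-square method on four-digit numbers: square the current value
   (giving up to eight digits) and keep the middle four digits. The rapid
   degeneration of the sequence illustrates the method's poor quality. */
static uint32_t ms_state = 1111;                   /* a four-digit seed */

uint32_t middle_square(void)
{
    uint64_t sq = (uint64_t)ms_state * ms_state;   /* up to eight digits */
    ms_state = (uint32_t)((sq / 100) % 10000);     /* middle four digits */
    return ms_state;
}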
Most computer programming languages include functions or library routines that purport to be random number generators. They are often designed to provide a random byte or word, or a floating point number uniformly distributed between 0 and 1.
Such library functions often have poor statistical properties, and some will repeat patterns after only tens of thousands of trials. They are often initialized using a computer's real-time clock as the seed, since such a clock generally measures in milliseconds, a precision far beyond what a person could anticipate. These functions may provide enough randomness for certain tasks (for example video games) but are unsuitable where high-quality randomness is required, such as in cryptographic applications, statistics or numerical analysis. Better pseudo-random number generators such as the Mersenne Twister are widely available. Much higher quality random number sources are available on most operating systems; for example /dev/random on various BSD flavors, Linux, Mac OS X, IRIX, and Solaris, or CryptGenRandom for Microsoft Windows.
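As an illustration of the operating-system sources just mentioned, a minimal C sketch for a Unix-like system can read bytes directly from /dev/random (error handling is kept deliberately minimal):

#include <stdio.h>
#include <stdint.h>

/* Read a 32-bit value from the operating system's entropy device on a
   Unix-like system. Returns 0 on success, -1 on failure; real code
   should also handle short reads and blocking behaviour. */
int os_random_u32(uint32_t *out)
{
    FILE *f = fopen("/dev/random", "rb");
    if (f == NULL)
        return -1;
    size_t n = fread(out, sizeof *out, 1, f);
    fclose(f);
    return (n == 1) ? 0 : -1;
}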
An example of a simple pseudo-random number generator is the Multiply-with-carry method invented by George Marsaglia. It is computationally fast and has good (albeit not cryptographically strong) randomness properties (note that this example is not thread safe):[6]
#include <stdint.h>

static uint32_t m_w = 1;    /* any initializer will do, but must not be zero */
static uint32_t m_z = 2;    /* any initializer will do, but must not be zero */

uint32_t get_random(void)
{
    m_z = 36969 * (m_z & 65535) + (m_z >> 16);
    m_w = 18000 * (m_w & 65535) + (m_w >> 16);
    return (m_z << 16) + m_w;   /* 32-bit result */
}
Generation from a probability distribution
There are a couple of methods to generate a random number based on a probability density function. These methods involve transforming a uniform random number in some way. Because of this, these methods work equally well in generating both pseudo-random and true random numbers. One method, called the inversion method, involves integrating the density up to an area greater than or equal to the uniform random number (which should be generated between 0 and 1 for proper distributions) and returning the point at which that area is reached; this amounts to applying the inverse of the cumulative distribution function. A second method, called the acceptance-rejection method, involves choosing an x and y value and testing whether the value of the density at x is greater than the y value. If it is, the x value is accepted. Otherwise, the x value is rejected and the algorithm tries again.[7] [8]
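As a concrete, hedged illustration of both ideas in C, the sketch below samples an exponential distribution by inversion and a simple triangular density by acceptance-rejection; uniform01() is only a placeholder for a uniform generator on (0,1), and the particular distributions are chosen for simplicity:

#include <math.h>
#include <stdlib.h>

/* Placeholder uniform source on (0,1); any uniform generator can be used. */
static double uniform01(void)
{
    return (rand() + 1.0) / ((double)RAND_MAX + 2.0);
}

/* Inversion method: for the exponential distribution with rate lambda,
   the CDF F(x) = 1 - exp(-lambda*x) inverts to F^{-1}(u) = -ln(1-u)/lambda. */
double exponential_by_inversion(double lambda)
{
    return -log(1.0 - uniform01()) / lambda;
}

/* Acceptance-rejection: sample the triangular density f(x) = 2x on [0,1].
   Draw a candidate x uniformly, draw y uniformly on [0, 2] (the maximum of f),
   and accept x when y lies under the density; otherwise try again. */
double triangular_by_rejection(void)
{
    for (;;) {
        double x = uniform01();
        double y = 2.0 * uniform01();   /* uniform on [0, 2] */
        if (y <= 2.0 * x)               /* accept when y <= f(x) */
            return x;
    }
}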
By humans
Random number generation may also be done by humans directly. However, most studies find that human subjects have some degree of nonrandomness when generating a random sequence of, e.g., digits or letters. They may alternate too much between choices compared to a good random generator.[9]
Post-processing and statistical checks
Even given a source of plausible random numbers (perhaps from a quantum mechanically based hardware generator), obtaining numbers which are completely unbiased takes care. In addition, behavior of these generators often changes with temperature, power supply voltage, the age of the device, or other outside interference. And a software bug in a pseudo-random number routine, or a hardware bug in the hardware it runs on, may be similarly difficult to detect.
Generated random numbers are sometimes subjected to statistical tests before use to ensure that the underlying source is still working, and then post-processed to improve their statistical properties.
- See also: Statistical randomness
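One of the simplest such checks is a frequency (monobit) test; the sketch below is a simplified C version, with get_bit() standing in for the bit source under test:

#include <math.h>
#include <stdint.h>

/* Simplified monobit (frequency) check: over a stream of random bits the
   counts of ones and zeros should be nearly equal. For a fair source the
   statistic |ones - zeros| / sqrt(n) is approximately standard normal,
   so values above about 3 are suspicious. */
extern int get_bit(void);   /* hypothetical bit source under test */

int frequency_check(uint64_t n)
{
    int64_t sum = 0;                      /* +1 for a one, -1 for a zero */
    for (uint64_t i = 0; i < n; i++)
        sum += get_bit() ? 1 : -1;
    double statistic = fabs((double)sum) / sqrt((double)n);
    return statistic < 3.0;               /* 1 = plausible, 0 = suspicious */
}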
Other considerations
Random numbers uniformly distributed between 0 and 1 can be used to generate random numbers of any desired distribution by passing them through the inverse cumulative distribution function (CDF) of the desired distribution. Inverse CDFs are also called quantile functions. To generate a pair of statistically independent standard normally distributed random numbers (x, y), one may first generate the polar coordinates (r, θ), where r² ~ χ² with two degrees of freedom and θ ~ UNIFORM(0, 2π) (see Box–Muller transform).
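A hedged C sketch of the Box–Muller transform just mentioned, with rand() used only as a stand-in for a better uniform source:

#include <math.h>
#include <stdlib.h>

/* Box-Muller transform: u1, u2 uniform on (0,1) give r = sqrt(-2 ln u1)
   and theta = 2*pi*u2; (r cos theta, r sin theta) is then a pair of
   independent standard normal deviates. */
void standard_normal_pair(double *x, double *y)
{
    const double pi = 3.14159265358979323846;
    double u1 = (rand() + 1.0) / ((double)RAND_MAX + 2.0);
    double u2 = (rand() + 1.0) / ((double)RAND_MAX + 2.0);
    double r = sqrt(-2.0 * log(u1));
    double theta = 2.0 * pi * u2;
    *x = r * cos(theta);
    *y = r * sin(theta);
}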
Some 0 to 1 RNGs include 0 but exclude 1, while others include or exclude both.
The outputs of multiple independent RNGs can be combined (for example, using a bit-wise XOR operation) to provide a combined RNG at least as good as the best RNG used. This is referred to as software whitening.
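A minimal sketch of such a combination in C, where rng_a() and rng_b() are hypothetical independent 32-bit generators:

#include <stdint.h>

/* Combine two independent 32-bit generators with a bit-wise XOR; as noted
   above, the combined stream is at least as good as the best of the two. */
extern uint32_t rng_a(void);   /* hypothetical generator */
extern uint32_t rng_b(void);   /* hypothetical generator */

uint32_t combined_random(void)
{
    return rng_a() ^ rng_b();
}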
Computational and hardware random number generators are sometimes combined to reflect the benefits of both kinds. Computational random number generators can typically generate pseudo-random numbers much faster than physical generators, while physical generators can generate "true randomness."
Low-discrepancy sequences as an alternative
Some computations making use of a random number generator can be summarized as the computation of a total or average value, such as the computation of integrals by the Monte Carlo method. For such problems, it may be possible to find a more accurate solution by the use of so-called low-discrepancy sequences, also called quasirandom numbers. Such sequences have a definite pattern that fills in gaps evenly, qualitatively speaking; a truly random sequence may, and usually does, leave larger gaps.
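As an illustration, the van der Corput sequence in base 2 is one of the simplest low-discrepancy sequences; a minimal C sketch:

#include <stdint.h>

/* Van der Corput sequence in base 2: reverse the binary digits of n across
   the radix point. Successive terms fill [0,1) evenly, which is the defining
   property of a low-discrepancy (quasirandom) sequence. */
double van_der_corput(uint32_t n)
{
    double value = 0.0;
    double denom = 1.0;
    while (n > 0) {
        denom *= 2.0;
        value += (n & 1u) / denom;   /* place the lowest bit after the point */
        n >>= 1;
    }
    return value;
}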
Activities and demonstrations
The SOCR resource pages contain a number of hands-on interactive activities and demonstrations of random number generation using Java applets.
See also
- Flipism
- Hardware random number generator
- List of random number generators
- PP (complexity)
- Procedural generation
- Randomization
- Randomized algorithm
- Random number generator attack
- Random password generator
- Randomness
- Slot machine
References
- ^ "Introduction to Slot Machines". Retrieved 2010-05-14.
- ^ Kanter, Ido; Aviad, Yaara; Reidler, Igor; Cohen, Elad; Rosenbluh, Michael (2010). "An optical ultrafast random bit generator". Nature Photonics. 4 (1): 58–61.
- ^ Walker, John. "HotBits: Genuine Random Numbers". Retrieved 2009-06-27.
- ^ Halprin, Ran. "Games for Extracting Randomness" (PDF). Department of Computer Science and Applied Mathematics, Weizmann Institute of Science. Retrieved 2009-06-27.
- ^ TrueCrypt Foundation. "TrueCrypt Beginner's Tutorial, Part 3". Retrieved 2009-06-27.
- ^ Marsaglia, George (1999-01-12). "sci.stat.math". Retrieved 2010-02-10.
- ^ The MathWorks. "Common generation methods". Retrieved 2011-10-13.
- ^ The Numerical Algorithms Group. "G05 - Random Number Generators" (PDF). NAG Library Manual, Mark 23. Retrieved 2012-02-09.
- ^ W. A. Wagenaar (1972). "Generation of random sequences by human subjects: a critical survey of the literature". Psychological Bulletin. 77 (1): 65–72. doi:10.1037/h0032060.
Further reading
- Donald Knuth (1997). "Chapter 3 – Random Numbers". The Art of Computer Programming. Vol. 2: Seminumerical Algorithms (3rd ed.).
- Press, WH; Teukolsky, SA; Vetterling, WT; Flannery, BP (2007). "Chapter 7. Random Numbers". Numerical Recipes: The Art of Scientific Computing (3rd ed.). New York: Cambridge University Press. ISBN 978-0-521-88068-8.
External links
- Random and Pseudorandom on In Our Time at the BBC