Quote:There is no RND algorithm that can generate *truly* random numbers because a truly random number has no pattern. A pattern is something that can be replicated. An algorithm can be run twice with the same results: it generates a "pattern".
Aga's right. However, the "pattern" can be so long before it repeats that it is *very* difficult to distinguish from "truly random" data. (I put that in quotes because some people believe that, if one has access to *all* the information about *any* system, then all future states of that system are calculable. I happen to be one of those people!!!)

The two important observables in a PRNG (Pseudo Random Number Generator) are the output sequence and the repeat length. Consider a 64-bit generator that returns a single byte as its random data. The 64 bits are the size of the internal state, or the "system size". There is some initial state, the return function, and the update sequence. So...you might, at first guess, suppose that a 64-bit system would have a repeat length of 2^64...that is, from any given internal state, the system will return to that state after 2^64 update events. An example of such a system would be a simple counter...add 1 to the state, and the whole thing repeats after 2^64 updates. Turns out that this makes for one ultra-crappy PRNG!! In reality, a 64-bit PRNG with a cycle length of 2^48 would be pretty good...*if* the PRNG output was able to pass statistical tests for randomness. The Mersenne Twister was mentioned earlier as a good PRNG...that system carries roughly 20,000 bits of internal state (19,968 bits in the standard MT19937) with a cycle length of 2^19937 - 1. On practical computers, such a system will never repeat, even on a geologic time-scale.
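To make the state/update/return split concrete, here is a minimal toy sketch (my own code, not from any particular library) contrasting the 2^64-period counter with Marsaglia's xorshift64, which has the same 64-bit state but a repeat length of 2^64 - 1 and far better-looking output:

```cpp
#include <cstdint>
#include <cstdio>

// The worst case: a counter. Full 2^64 repeat length, but the returned
// bytes just count 0,1,2,3... -- statistically awful output.
struct Counter64 {
    uint64_t state = 1;
    uint8_t next() { ++state; return uint8_t(state); }  // return function: low byte
};

// Same 64-bit state size, much better output: Marsaglia's xorshift64.
// Repeat length is 2^64 - 1 (the all-zero state is the one stuck point).
struct XorShift64 {
    uint64_t state = 0x9E3779B97F4A7C15ull;  // any nonzero seed
    uint8_t next() {
        state ^= state << 13;   // update sequence: three shift/xor steps
        state ^= state >> 7;
        state ^= state << 17;
        return uint8_t(state);  // return function: low byte of the state
    }
};

int main() {
    Counter64 c;
    XorShift64 x;
    for (int i = 0; i < 8; ++i)
        printf("counter: %3u   xorshift: %3u\n", (unsigned)c.next(), (unsigned)x.next());
    return 0;
}
```

Note that the counter actually has the *longer* period of the two...which is exactly the point: repeat length alone says nothing about quality.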
I have created a C++ class that spits out 32-bit "random" integers on my 1.5 GHz machine at between 300 MB/sec (whole state returned between update events) and 10 MB/sec (1/30 of the internal state returned between update events), if anyone is interested. It has a calculated average cycle length of 2^(0.75 * system-size), with some weak-key states that are more like 2^(0.25 * system-size). Even with weak keys, this beast's cycle length will be something like 2^16000, with 1 KB returned per cycle.
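For anyone who wants to reproduce that kind of MB/sec figure on their own machine, here is a generic timing harness...this is *not* my class above, just a stand-in xorshift generator...swap in whatever PRNG you like:

```cpp
#include <chrono>
#include <cstdint>
#include <cstdio>

// Stand-in generator -- replace with whatever PRNG you want to time.
struct XorShift64 {
    uint64_t state = 88172645463325252ull;
    uint32_t next() {
        state ^= state << 13; state ^= state >> 7; state ^= state << 17;
        return uint32_t(state >> 32);
    }
};

int main() {
    XorShift64 g;
    const uint64_t n = 100000000;   // 100M 32-bit values = ~400 MB of output
    uint32_t checksum = 0;          // consumed so the loop can't be optimized away
    auto t0 = std::chrono::steady_clock::now();
    for (uint64_t i = 0; i < n; ++i) checksum ^= g.next();
    auto t1 = std::chrono::steady_clock::now();
    double secs = std::chrono::duration<double>(t1 - t0).count();
    printf("checksum %u, %.1f MB/sec\n", checksum, (n * 4.0) / (1024 * 1024) / secs);
    return 0;
}
```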
If you are interested in probing the quality of "random" data, Google "Diehard random". There is a version of this test suite that tests a 12 MB data set, and one (diehardII) that tests a 300 MB data set. If you test the QB RND function, you will find it fails these tests miserably...as does the C function rand(). However, there are many good-quality generators that will pass.
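The Diehard suite just reads a flat binary file, so all you need is a dump of your generator's raw bytes. A sketch, using the very rand() that fails (the file name and size are my arbitrary choices):

```cpp
#include <cstdio>
#include <cstdlib>

int main() {
    FILE* f = fopen("rand_bytes.bin", "wb");   // feed this file to Diehard
    if (!f) return 1;
    srand(12345);
    // ~12 MB, the classic Diehard input size; take the low byte of each
    // rand() call -- the weakest bits, and exactly what the tests catch.
    for (long i = 0; i < 12L * 1024 * 1024; ++i) {
        unsigned char b = (unsigned char)(rand() & 0xFF);
        fwrite(&b, 1, 1, f);
    }
    fclose(f);
    return 0;
}
```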
This topic is closely related to cryptography. Suppose that you have a file that consists of 100 MB of highly repetitive data...say, 100 MB of zeros. After encryption, you would like this data to be indistinguishable from "random" data...and this is the challenge of modern cryptography...and all good cryptographic schemes succeed in this endeavor...
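A much cruder check than Diehard, but it makes "distinguishable from random" concrete: the Shannon entropy of the byte histogram. A file of zeros scores 0 bits/byte; good ciphertext scores essentially 8.0. (A plain counter also scores near 8, so this is no substitute for the real tests...it just shows the idea.)

```cpp
#include <cmath>
#include <cstdint>
#include <cstdio>
#include <vector>

// Shannon entropy of the byte histogram, in bits per byte.
double entropy_bits_per_byte(const std::vector<uint8_t>& data) {
    uint64_t hist[256] = {0};
    for (uint8_t b : data) hist[b]++;
    double h = 0.0;
    for (int i = 0; i < 256; ++i) {
        if (hist[i] == 0) continue;
        double p = double(hist[i]) / double(data.size());
        h -= p * std::log2(p);
    }
    return h;
}

int main() {
    std::vector<uint8_t> zeros(1 << 20, 0);   // 1 MB of 'zeros' -- scores 0.0
    std::vector<uint8_t> mixed(1 << 20);      // 1 MB of scrambled bytes -- near 8.0
    for (size_t i = 0; i < mixed.size(); ++i)
        mixed[i] = uint8_t((i * 2654435761u) >> 24);
    printf("zeros: %.3f bits/byte\n", entropy_bits_per_byte(zeros));
    printf("mixed: %.3f bits/byte\n", entropy_bits_per_byte(mixed));
    return 0;
}
```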
Anyway...I (again) strayed from the topic at hand. However, aga is right...computers (which follow rules by definition) can never, from a given internal state and a given input, produce variable output. That said, it is fairly easy to make systems that produce chaotic output that cannot be predicted except by letting the program run and observing the output. Put another way...a good PRNG has the property that there are no "shortcuts" to predicting the next output bit, other than having knowledge of the system's initial internal state and the number of update events that have occurred. The QB RND does not have this property...that is, it is easy to determine the internal state from the RNG output, and thus, if you use RND to deal cards, and the stakes are high enough, someone will figure out your internal state, which will allow them to *know* the next card!!! A sketch of that attack follows.
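Here is the card-prediction attack, using the commonly reported QBasic RND constants (a 24-bit linear congruential generator whose float output is just the state divided by 2^24...so one observed value gives away everything):

```cpp
#include <cstdint>
#include <cstdio>

// Commonly reported QBasic RND constants: a 24-bit linear congruential
// generator. RND's float output is state / 2^24.
const uint32_t MUL = 16598013u;
const uint32_t ADD = 12820163u;
const uint32_t MOD = 1u << 24;

uint32_t step(uint32_t s) {
    return uint32_t((uint64_t(s) * MUL + ADD) % MOD);
}

int main() {
    uint32_t secret = 123456;                       // the dealer's hidden state
    float observed = float(step(secret)) / MOD;     // one RND value we get to see

    // The attack: a 24-bit state divided by 2^24 fits exactly in a float,
    // so the observed output *is* the whole internal state, rescaled.
    uint32_t recovered = uint32_t(observed * MOD + 0.5f);

    printf("next three RND values, predicted before the dealer draws them:\n");
    for (int i = 0; i < 3; ++i) {
        recovered = step(recovered);
        printf("  %f\n", double(recovered) / MOD);
    }
    return 0;
}
```

A good generator denies the attacker exactly this step: seeing outputs should tell you (almost) nothing about the internal state.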
If you need good random data, make sure you understand the issues well enough to get it!!