Dylan16807
In what way is randomness on a GPU hard? The NVIDIA SDK apparently comes with a few generators, and you can implement an LCG in a handful of arithmetic operations. If you need a seed, feed it in from the CPU.
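For reference, an LCG really is just a multiply and an add per output. A minimal sketch (the constants are the common Numerical Recipes ones, not anything taken from the NVIDIA SDK; in a shader you'd keep one state value per thread):

    #include <stdint.h>

    /* Minimal 32-bit linear congruential generator. */
    static uint32_t lcg_state = 1u;   /* seed fed in from the CPU */

    static uint32_t lcg_next(void)
    {
        /* Numerical Recipes constants */
        lcg_state = 1664525u * lcg_state + 1013904223u;
        return lcg_state;
    }

The catch is that the low-order bits of a power-of-two-modulus LCG have very short periods, which is part of why the output tends to look patterned.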

I don't see the benefit of having a high-cryptographic-quality RNG for rendering.


palish
You'll have to take my word for it, or try it yourself: there are visually noticeable repetitive patterns that are a direct result of the naive PRNG.

For example, I once modeled raindrops as falling billboards that struck the ground at some random (x,y) in your field of view, followed by a "splash" animation billboard at the point of impact. I started by using C rand() to pick the random (x,y) for each raindrop. And it looked terrible! The raindrops would mysteriously clump together in sinewave-like patterns, and seemingly avoid some spots altogether.

I dropped in a Mersenne Twister RNG and the problem instantly went away. I didn't change a thing except the RNG.

So yeah, a few arithmetic ops get you rand() quality. But humans are unfortunately good at noticing the overall bias in the resulting patterns.

And you'll run into similar problems when e.g. trying to implement a non-repeating Perlin noise function on a GPU. I tried it with a weighted combination of several different Perlin noise textures... but I had to be very careful about aliasing artifacts in some of the more interesting effects, while also avoiding visually repetitive patterns (which almost always occur).
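Roughly, the blending trick looks like this -- a sketch only, using a cheap hash-based value noise as a stand-in for real Perlin noise, with arbitrary weights and deliberately incommensurate scale factors so no single layer's repeat period shows through:

    #include <stdint.h>
    #include <math.h>

    /* Hash-based value noise: a stand-in for Perlin noise, just to
       illustrate blending several layers with weights. */
    static float hash2(int x, int y)
    {
        uint32_t h = (uint32_t)x * 374761393u + (uint32_t)y * 668265263u;
        h = (h ^ (h >> 13)) * 1274126177u;
        return (float)(h ^ (h >> 16)) / 4294967295.0f;
    }

    static float value_noise(float x, float y)
    {
        int xi = (int)floorf(x), yi = (int)floorf(y);
        float fx = x - xi, fy = y - yi;
        /* smoothstep weights for bilinear interpolation */
        float u = fx * fx * (3.0f - 2.0f * fx);
        float v = fy * fy * (3.0f - 2.0f * fy);
        float a = hash2(xi, yi),     b = hash2(xi + 1, yi);
        float c = hash2(xi, yi + 1), d = hash2(xi + 1, yi + 1);
        return (a * (1 - u) + b * u) * (1 - v) + (c * (1 - u) + d * u) * v;
    }

    /* Weighted combination of layers at incommensurate scales, so the
       repeat period of any single layer is hidden. */
    static float blended_noise(float x, float y)
    {
        return 0.5f * value_noise(x * 1.00f, y * 1.00f)
             + 0.3f * value_noise(x * 2.17f, y * 2.17f)
             + 0.2f * value_noise(x * 4.73f, y * 4.73f);
    }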

The value of Intel's RNG, I'd imagine, is that it's uniformly random and also unbiased -- meaning repetitive moire patterns won't show up in your results.

Of course it's not deterministic, which is tricky, but I can still think of several use cases for something like this.

marshray
So what you need is a high-quality Pseudorandom Number Generator (PRNG). You do not need a Cryptographically Secure PRNG.

This is good: what you need can be implemented significantly more efficiently than the cryptographically secure version. As you mentioned, it doesn't help your application to be truly nondeterministic.

C rand() is about the worst one you can find. Mersenne Twister is great statistically, but it's quite expensive in terms of memory: the standard MT19937 carries about 2.5 KB of state.

As I understand GPU programming, you want to maximize parallelism. Ideally, you would have something small enough to have an independent generator for each processor. You should be able to generate perfectly random-looking numbers using 64 or 128 bits of state storage and a handful of instructions per 32 bits of output.
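Something in exactly that size range, as a sketch (this is Marsaglia's xorshift128, nothing GPU-specific): 128 bits of state per thread and a few shifts and xors per 32-bit output.

    #include <stdint.h>

    /* Marsaglia's xorshift128: four 32-bit words of state per generator.
       Seed each thread's state from the CPU with distinct nonzero values. */
    typedef struct { uint32_t x, y, z, w; } xs128;

    static uint32_t xs128_next(xs128 *s)
    {
        uint32_t t = s->x ^ (s->x << 11);
        s->x = s->y;  s->y = s->z;  s->z = s->w;
        s->w = s->w ^ (s->w >> 19) ^ (t ^ (t >> 8));
        return s->w;
    }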

For example, the Skein hash function and CSPRNG can generate random data at 5 cycles per byte of output. But that's cryptographically secure; you can probably cut its state size and computational expense each in half and still get statistical randomness.
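Another way to get there that suits GPUs well (a generic sketch, not derived from Skein; the names mix64 and gpu_rand are just mine): make the generator counter-based, i.e. run a cheap integer mixer over the thread id and sample index, so there's no per-thread state to carry at all.

    #include <stdint.h>

    /* Counter-based generator: a splitmix64-style finalizer applied to a
       64-bit counter.  Thread i hashes (i, sample_index) directly, so every
       output is independent and needs no stored state. */
    static uint64_t mix64(uint64_t x)
    {
        x += 0x9e3779b97f4a7c15ull;
        x = (x ^ (x >> 30)) * 0xbf58476d1ce4e5b9ull;
        x = (x ^ (x >> 27)) * 0x94d049bb133111ebull;
        return x ^ (x >> 31);
    }

    static uint32_t gpu_rand(uint32_t thread_id, uint32_t sample)
    {
        return (uint32_t)(mix64(((uint64_t)thread_id << 32) | sample) >> 32);
    }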

tripzilch
I've been reading A New Kind of Science recently. Intrigued by some of Wolfram's grandiose claims, I ran some tests, and it indeed turns out that the Rule 30 cellular automaton is a very high-quality PRNG (as long as you pick the right bits). It's a very simple algorithm, and very parallel too.

It's just that my version picked one bit per generation (the middle one), and the period of the PRNG is determined by the size of the buffer, so it might be a bit memory-intensive for a GPU if you need several random bits per pixel. I couldn't find any patterns within the period, though. Also, it might be possible to get more than one random bit per generation, though I expect the ones right next to each other are probably too correlated.

The formula's real simple: if a cell's previous state and its two neighbours are labeled L, M and R, Rule 30 boils down to: the new state is L xor (M or R).
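A minimal CPU-side sketch of the scheme (the buffer size, the single-cell seeding, and taking the middle cell are just my choices, nothing canonical):

    #include <stdint.h>
    #include <string.h>

    #define CA_SIZE 257   /* cells in the cyclic buffer; the period grows with this */

    static uint8_t cells[CA_SIZE];

    static void rule30_seed(void)
    {
        memset(cells, 0, sizeof cells);
        cells[CA_SIZE / 2] = 1;               /* classic single-cell seed */
    }

    /* One Rule 30 step over the whole (cyclic) row, returning the middle
       cell as the next random bit. */
    static int rule30_next_bit(void)
    {
        uint8_t next[CA_SIZE];
        for (int i = 0; i < CA_SIZE; i++) {
            uint8_t l = cells[(i + CA_SIZE - 1) % CA_SIZE];
            uint8_t m = cells[i];
            uint8_t r = cells[(i + 1) % CA_SIZE];
            next[i] = l ^ (m | r);            /* rule 30: L xor (M or R) */
        }
        memcpy(cells, next, sizeof cells);
        return cells[CA_SIZE / 2];
    }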

palish
Whoa, cool. Thanks!

I don't suppose you have the code lying around...? Even if it's not pretty, it would still be a big help! =)

palish
> As I understand GPU programming, you want to maximize parallelism. Ideally, you would have something small enough to have an independent generator for each processor. You should be able to generate perfectly random-looking numbers using 64 or 128 bits of state storage and a handful of instructions per 32 bits of output.

A perfect assessment.

It's why I keep learning about every field -- so many areas turn out to be applicable in other domains.

For example, "variance shadow maps" create soft shadows using statistical analysis. (I wish they were practical. It's just an example.)

I'll check out the Skein function, thanks.

T-hawk
> But humans are unfortunately good at noticing the overall bias in the resulting patterns.

Humans are also, of course, good at noticing bias in a distribution even when there isn't any. Ask any online poker player about shufflers.
