So, Rule30RND was ok as a fun experiment, but not a PRNG that someone could take seriously.

I kept tinkering, as if I was playing a game: change this and that, run the tests, check my score.

Side note: after I published the last post, I realized that my test scripts had fundamental flaws, like adding text headers in the randomly generated bytes.

Radius-2

No matter how hard I tried, it was obvious that a Rule 30-based algorithm would not pass BigCrush unless I changed it so much that it would become something else with a sprinkle of CA.

Now, Rule 30 works like this: if C is our bit, L is its left neighbor, and R its right neighbor:
new_bit = L XOR (C OR R)

That’s a "radius-1 CA". What if I tried with a radius-2 CA, that uses two bits on the left and two on the right?
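To make the update concrete, here is a minimal sketch (not the project's actual code) of one radius-1 Rule 30 step applied to all 64 cells of a uint64 in parallel, assuming a circular row where the edges wrap around:

```go
package main

import (
	"fmt"
	"math/bits"
)

// step30 applies one radius-1 Rule 30 update to every cell of a
// 64-bit row at once: new_bit = L XOR (C OR R), with wrap-around
// boundaries (bit 63 and bit 0 are neighbors).
func step30(s uint64) uint64 {
	l := bits.RotateLeft64(s, 1)  // every cell's left neighbor
	r := bits.RotateLeft64(s, -1) // every cell's right neighbor
	return l ^ (s | r)
}

func main() {
	s := uint64(1) << 32 // start from a single live cell
	for i := 0; i < 5; i++ {
		fmt.Printf("%064b\n", s)
		s = step30(s)
	}
}
```

Starting from a single live cell, this produces the familiar chaotic Rule 30 triangle, one row per step.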

Unfortunately, there is much less literature on radius-2 CAs than on radius-1. After some searching that did not turn up a documented good candidate for what I wanted to build, and a few failed experiments, I decided to extend Rule 30 in a “symmetric” way:

new_bit = (L2 XOR L1) XOR (C OR (R1 OR R2))
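A sketch of this radius-2 step on a single 64-bit word, again assuming wrap-around boundaries (the function name and word size are my own illustration, not the project's code):

```go
package main

import (
	"fmt"
	"math/bits"
)

// step30r2 is one update of the symmetric radius-2 extension,
// new_bit = (L2 XOR L1) XOR (C OR (R1 OR R2)), applied to all
// 64 cells in parallel with wrap-around boundaries.
func step30r2(s uint64) uint64 {
	l2 := bits.RotateLeft64(s, 2)  // neighbor two cells to the left
	l1 := bits.RotateLeft64(s, 1)  // immediate left neighbor
	r1 := bits.RotateLeft64(s, -1) // immediate right neighbor
	r2 := bits.RotateLeft64(s, -2) // neighbor two cells to the right
	return (l2 ^ l1) ^ (s | (r1 | r2))
}

func main() {
	s := uint64(1) << 32
	for i := 0; i < 5; i++ {
		fmt.Printf("%064b\n", s)
		s = step30r2(s)
	}
}
```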

This gave significantly better results.

Not exceptional. The new algorithm failed 5/15 SmallCrush tests (Gap, SimpPoker, CouponCollector, WeightDistrib, HammingIndep).

But these are the tests I might be able to fix with a diffusion function.

Searching for a diffusion function

I wanted a diffusion function, so I let Claude test 7 different functions, run the tests, collect the results, and let me know when it was done.

Mixing/Diffusion Approach   SmallCrush   Performance vs math/rand
Rotation (5,14,27)          12/15        1.08×
5 XOR Rotations             13/15        0.94×
xoshiro256++ Style          11/15        1.13×
MurmurHash                  15/15        0.95×
SplitMix64                  15/15        0.93×
Hybrid                      15/15        1.03×
Per-Word Varying            15/15        0.90×

The winner was the “Hybrid” approach that combines rotation, multiplication by the golden ratio, and shift-XOR:

func mix(x uint64) uint64 {
    x ^= bits.RotateLeft64(x, 13)
    x *= 0x9e3779b97f4a7c15  // Golden ratio constant
    x ^= x >> 27
    return x
}

SmallCrush is the first step, but I needed it to pass BigCrush.

And, YES! My rule-30-radius-2-diffused algo scored 160/160 on BigCrush for the first time!

Staying true

Now, this worked, but it was also cheating: I applied mix() at every state transition (s1, s2=mix(rule30(s1)), s3=mix(rule30(s2)), ...), so this was no longer a CA algorithm.

What if I applied mix() only to the output? I.e. keep a pure Rule 30, radius-2 algorithm that generates states s1, s2, ..., but return mix(s1), mix(s2), .... Would this work?
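Sketched in Go, with hypothetical names and a single-word state for brevity (the real generator's state layout and step function differ), the output-only mixing looks like this:

```go
package main

import (
	"fmt"
	"math/bits"
)

// mix is the "Hybrid" diffusion function from above.
func mix(x uint64) uint64 {
	x ^= bits.RotateLeft64(x, 13)
	x *= 0x9e3779b97f4a7c15 // Golden ratio constant
	x ^= x >> 27
	return x
}

// step is one radius-2 Rule 30 update on a single 64-bit word,
// with wrap-around boundaries (a simplification for illustration).
func step(s uint64) uint64 {
	l2, l1 := bits.RotateLeft64(s, 2), bits.RotateLeft64(s, 1)
	r1, r2 := bits.RotateLeft64(s, -1), bits.RotateLeft64(s, -2)
	return (l2 ^ l1) ^ (s | (r1 | r2))
}

type gen struct{ state uint64 }

// Next advances the pure CA state, then diffuses only the value
// handed to the caller. The trajectory s1, s2, ... stays untouched.
func (g *gen) Next() uint64 {
	g.state = step(g.state)
	return mix(g.state)
}

func main() {
	g := &gen{state: 0x243F6A8885A308D3} // arbitrary nonzero seed
	for i := 0; i < 3; i++ {
		fmt.Printf("%016x\n", g.Next())
	}
}
```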

IT WORKED! IT WORKED!

And it was also 6% faster.

I present to you: “R30R2”

Along the way, I fixed some annoying Makefile issues, optimized how Read() reads data from the state, and tweaked a few things here and there.

I was ready to release it, but in the literature, “Rule 30” refers to a radius-1 CA, so keeping the name would have been misleading.

I decided to rename the project, the documentation, the function names, etc. And its name is... R30R2.

R30R2 is a radius-2 Rule 30-based PRNG with a diffusion function applied on output. It scores 160/160 on BigCrush, and is 2× faster than math/rand/v2.

You can find it at https://github.com/vrypan/r30r2/

RNG            BigCrush Score   Performance vs math/rand
math/rand      159/160          1.00× (baseline)
math/rand/v2   160/160 ✓        0.56×
Rule 30        160/160          1.03×

🎉🍾🍻