To anyone thinking this is not random, with 42 votes in:
The p-value is 0.895 (this is the probability of seeing at least this much non-randomness, assuming a uniform distribution)
The entropy is 2.302 bits instead of log2(5) = 2.322 bits, for 0.02 bits of KL divergence (this is the number of bits you lose by encoding one of these votes as if it were uniformly random)
If you think you see a pattern here, you should either see a doctor or a statistician.
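For anyone who wants to reproduce numbers like these, here's a quick sketch. The actual vote tally isn't shown in this thread, so the counts below are made up for illustration:

```python
import math

def entropy_bits(counts):
    """Shannon entropy (in bits) of the empirical distribution of counts."""
    n = sum(counts)
    return -sum(c / n * math.log2(c / n) for c in counts if c > 0)

def kl_from_uniform_bits(counts):
    """KL divergence D(empirical || uniform) in bits: the extra coding cost
    per vote of pretending the votes were uniform. Equals log2(k) - H."""
    n, k = sum(counts), len(counts)
    return sum(c / n * math.log2((c / n) * k) for c in counts if c > 0)

# Hypothetical tally of 42 votes over 5 options (the real tally isn't given)
votes = [9, 8, 9, 7, 9]
print(f"entropy: {entropy_bits(votes):.3f} of {math.log2(5):.3f} bits")
print(f"KL from uniform: {kl_from_uniform_bits(votes):.4f} bits")
```

Note that entropy plus KL from uniform always sums to log2(k), which is why the two quoted figures (2.302 and 0.02 bits) add up to 2.322.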
I wish I could see a doctor-statistician. Or at least a doctor who understood statistics.
Yvain might some day have his own practice.
Here is one: http://www.ted.com/talks/ben_goldacre_battling_bad_science.html
Looks like we’re better at randomness than the rest of the population. If I asked random people for a random number from 1 to 10, I wouldn’t be surprised to see substantially less than log2(10) = 3.322 bits of entropy per number (e.g., many more than 10% of people choosing 7).
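To see how much entropy a "7 bias" actually costs, a quick sketch (the 30% figure below is just an illustrative guess, not measured data):

```python
import math

def entropy_bits(p):
    """Shannon entropy (in bits) of a probability distribution."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# Uniform over 1..10: the full log2(10) ~ 3.322 bits
uniform = [0.1] * 10
# Hypothetical human bias: 30% pick 7, the rest split evenly
biased = [0.7 / 9] * 10
biased[6] = 0.30
print(f"uniform: {entropy_bits(uniform):.3f} bits")
print(f"biased:  {entropy_bits(biased):.3f} bits")
```

Even a fairly strong bias toward one number only shaves off a couple tenths of a bit, which is why small samples rarely look obviously non-random.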
Well, it’s worth noting people seem to be trainable to choose randomly: http://dl.dropbox.com/u/85192141/1986-neuringer.pdf
Apropos of the PRNG discussion in http://blog.yunwilliamyu.net/2011/08/14/mindhack-mental-math-pseudo-random-number-generators/ for which I wrote some flashcards: http://pastebin.com/CKif0fEf
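I don't know which generator the linked post and flashcards actually use, but as a generic stand-in for the mental-PRNG idea, here's a tiny Lehmer-style map that's easy to iterate in your head:

```python
def lehmer_mod11(seed=1, n=10):
    """Tiny Lehmer generator x -> 7*x mod 11. Since 7 is a primitive root
    mod 11, iterating it visits every value in 1..10 before repeating."""
    x = seed % 11 or 1  # keep the state in 1..10
    out = []
    for _ in range(n):
        x = (7 * x) % 11
        out.append(x)
    return out

print(lehmer_mod11())  # one full period: a permutation of 1..10
```

The arithmetic (multiply by 7, subtract 11s) is small enough to do mentally, though like any short-period generator the output is only "random-looking", not random.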