I can see a couple of reasons why the post does belong here:
It concerns Less Wrong itself, specifically its origin and motivation. This should be of interest to community members.
You (Eliezer) are the most visible advocate and practitioner of human rationality improvement. If it turns out that you are not particularly rational, then perhaps the techniques you have developed are not worth learning.
Psy-Kosh’s answer seems perfectly reasonable to me. I wonder why you don’t just give that answer instead of saying the post doesn’t belong here. Actually, if I had known this was one of the reasons for starting OB/LW, I probably would have paid more attention earlier, because at the beginning I was thinking, “Why is Eliezer talking so much about human biases now? That doesn’t seem so interesting compared to the Singularity/FAI stuff he used to talk about.”
E.Y. has given that answer before:
Rationality: Common Interest of Many Causes