That’s not the idea that really scares Less Wrong people.
Here’s a more disturbing one: try to picture a world where the rationality skills you’re learning on Less Wrong are actually flawed in some way, and make you less likely to discover the truth, or correct less often, for whatever reason. What would that look like? Would you be able to tell the difference?
I must say, I have trouble picturing that, but I can’t prove it’s not true (we are basically tinkering with the way our minds work without a software manual, after all).
related: http://lesswrong.com/lw/9p/extreme_rationality_its_not_that_great/