Let me start by trying to summarise Eliezer’s argument—perhaps using slightly different terminology. If I have a given stream of sensory experience, what is the correct interpretation of it? I would say it is the one that allows you to compress the stream (together with the theory that explains it) down to the smallest possible size. You can then use this compression to predict what the next bit of the sensory stream might be.
This has quite a few nice features—not least of which is that if someone else comes up with a different interpretation of the stream, you can simply weigh it against yours; if theirs weighs more, it is statistically more unlikely, and statistically more likely to give incorrect predictions as well. And weighing compressed data sets is mathematics, not a matter of opinion. You can reasonably say that their heavier ‘interpretation’ is adding information that you know—from your compression—is not in the stream. Where did this extra information come from? It’s just wrong.
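To make the weighing concrete: under a description-length prior, an interpretation that is k bits heavier gets penalised by a factor of 2^k. Below is a minimal sketch of the two-part-code bookkeeping, assuming Python’s zlib as a crude stand-in for the ideal (uncomputable) compressor; the theory strings and the stream are invented for illustration.

```python
import zlib

def description_length_bits(theory: str, data: bytes) -> int:
    # Two-part code: bits to state the theory, plus bits to state the data.
    # zlib is a crude stand-in for the ideal compressor, and it ignores the
    # theory when compressing; a real two-part code would encode the data
    # *given* the theory. This is only meant to show the bookkeeping.
    theory_bits = 8 * len(theory.encode("utf-8"))
    data_bits = 8 * len(zlib.compress(data))
    return theory_bits + data_bits

# Two rival interpretations of the same sensory stream.
stream = b"01101100" * 100
light = description_length_bits("repeat the block '01101100'", stream)
heavy = description_length_bits(
    "repeat the block '01101100', placed there by an invisible dragon", stream
)

# An interpretation that is k bits heavier is 2**k times less probable
# under a prior that weights hypotheses by description length.
k = heavy - light
print(f"the heavier interpretation costs {k} extra bits: odds penalty of 2**{k}")
```

The point is only the arithmetic: whatever extra the heavier interpretation claims to see, it must be paid for in bits, and every extra bit halves the odds.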
The next question is obvious—how could someone possibly consider themselves to be logically right to do something other than this? And here Eliezer is home and dry—this is the only logically right thing to do. Anyone doing something different is assuming the laws of statistics and reason do not apply to them. In all of this I’m with Eliezer all the way. It’s inductive reasoning, which means we only have expectations about what we have not yet seen, and not certainty. But at least we know that we can’t do better.
All of this is beyond question, and not my point. There is another major choice, which is to disbelieve rationality altogether, or regard it as of limited applicability. Throw it out—baby, bathwater, everything. And trust something else instead. And explicitly believe that this something else is NOT a purely rational means of truth, but something else. This gives you absolute license to impose any number of interpretations on the data. Of course the rationalists are blind—they tell me this data only tells me X, but I can see so much more in it than that! Two and two may be four, rationally, but in fact the whole is more than the sum of its parts. If someone proves that the extra stuff isn’t actually in the data, well fine—I knew that. These things aren’t knowable by the rational mind, one needs divine revelation, or natural intuition, or spiritual sensitivity… One comes to believe the world is fundamentally not rational, not rationally explainable, not rationally reducible, and certainly not statistically analyzable. Forget all that stuff, and just trust your animal instincts.
And here you end up at an impasse. Eliezer, in his article, states that he expects nature to give such irrational persons a few lessons in the school of hard knocks. They are living in a mindset full of confabulations and perceptual artifacts. The irrationalists would see him as living his life with his head under a bucket, restricted to what logic can tell him, and missing out on every other part of his humanity.
Who is right? Rationally, Eliezer. Irrationally, I have no idea—is there even such a thing as ‘right’ in this case? Would I even care? If one denies rationality, one can believe in anything, if ‘believe’ is the right word for it.
Just to be clear, I do not believe in extra-rational means of knowledge, and I believe rationality to be universally applicable. But I regard this as a belief, since any attempt at proof begs the question on one side or the other.
It’s not a “rationalist” thing, it’s a human thing. What are you evaluating the adequacy of rituals of cognition with? You’re already what you are, and that is what you use. There are no universally convincing arguments, and one accepts, say, Occam’s razor not because it’s “rational” but because we are the kind of agents that are compelled by this principle. Don’t distinguish between “rational” and “magical”; ask what moves you on reflection, what you believe will get you results, and whether you believe the argument for why it does.

Links:

http://lesswrong.com/lw/rn/no_universally_compelling_arguments/
http://lesswrong.com/lw/hk/priors_as_mathematical_objects/
http://lesswrong.com/lw/o5/the_second_law_of_thermodynamics_and_engines_of/
http://wiki.lesswrong.com/wiki/Futility_of_chaos
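One concrete way to see the “no universally convincing arguments” point: two agents can apply Bayes’ rule flawlessly to identical evidence and still end up far apart, because the disagreement lives in the priors rather than in the reasoning. A minimal sketch, with invented numbers:

```python
def bayes_update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    # P(H|E) = P(E|H) P(H) / [ P(E|H) P(H) + P(E|~H) P(~H) ]
    p_e = p_e_given_h * prior + p_e_given_not_h * (1.0 - prior)
    return p_e_given_h * prior / p_e

# Identical evidence, identical update rule, different starting priors.
for prior in (0.5, 0.001):
    p = prior
    for _ in range(5):  # five observations, each favouring H at 0.8 vs 0.3
        p = bayes_update(p, p_e_given_h=0.8, p_e_given_not_h=0.3)
    print(f"prior {prior}: posterior after five updates = {p:.3f}")
```

Neither agent makes an error the other could point to, which is why the argument has to bottom out in what kind of agent you already are.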
Believe it or not, Vladimir, Eliezer and I all understand the limitations of thought and the dependence on initial priors. Searching for “anti-inductive” will get you some hits. That we still claim you need to use every resource at your disposal to evaluate those very resources is significant.
“Who is right? Rationally, Eliezer. Irrationally, I have no idea—is there even such a thing as ‘right’ in this case?”
Eliezer, no, no.
“If one denies rationality, one can believe in anything, if ‘believe’ is the right word for it.”
There is one line of reasoning I find actually more effective on irrational people than on rational ones: argumentum ad baculum.
To other readers: Can anyone think of the Eliezer post that is on the tip of my tongue? I can’t find the link without recalling the keywords!
Mainly:
http://lesswrong.com/lw/k1/no_one_can_exempt_you_from_rationalitys_laws/
but these also seem relevant:
http://lesswrong.com/lw/gr/the_modesty_argument/
http://lesswrong.com/lw/h9/tsuyoku_vs_the_egalitarian_instinct/
It is not easy to escape this problem.