I believed the inscrutability was intentional: a Dark Arts technique where no one can refute your position, because no one can agree on what your position actually is. Then all criticism can be summarily dismissed by a courtier’s reply or some mystical ad hominem like “you are too low on the Kegan scale to even understand what we are saying.”
Fixing this would be a huge improvement. It would make rational discussion actually possible.
“Rationalists are afflicted with a frustrating Dunning-Kruger illusion: they cannot understand that there is something they cannot understand.”
I am quite sure there are things that have Kolmogorov complexity larger than the capacity of my brain. As a trivial example, sufficiently long random strings.
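To make the claim precise, here is the standard counting argument, under the illustrative assumption that my brain stores at most $n$ bits. There are $2^n$ binary strings of length $n$, but strictly fewer descriptions shorter than $n$ bits:

$$\sum_{k=0}^{n-1} 2^k = 2^n - 1 < 2^n,$$

so for every $n$ some string $x$ of length $n$ has Kolmogorov complexity $K(x) \ge n$; and since at most a $2^{-c}$ fraction of strings can be compressed by more than $c$ bits, the vast majority of random strings come within a few bits of that bound.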
But for the record, “there is something you don’t understand” doesn’t necessarily imply “therefore, I am right”. Neither does “there is something you don’t understand” necessarily imply “and it is this specific thing”.
“Just as irrationalists can’t understand the rationalist critique, rationalists can’t understand the meta-rational critique.”
Speaking for myself, I would mostly like to have some assurance that what you are trying to explain is not one of the following:
the Straw Vulcans are actually not rational;
people who identify as “rationalists” are still stupid humans and make a lot of stupid human mistakes;
some people who self-identify as “rationalists” are actually quite embarrassing;
and generally, merely applying a label to oneself does not make a human more rational;
some things cannot be modeled properly by overly simple (e.g. linear) equations, because they have many important details;
armchair reasoning is not a substitute for actual empirical data;
(...and some other similar stuff I forgot now...)
Because if it happens to be one of those, then we already have an agreement; we just seem not to have the common knowledge that we already have the agreement.
And if it really is something else… then we have an agreement that a simpler explanation for the simple creatures too low on the Kegan scale would really be helpful.
Frankly, this all seems to me, from the outside, like another status move. First you have a group that claims higher status by being inscrutable. And it works, just as it worked for many other groups. But then a problem appears: other people can play the inscrutability game, too.
You could try competing with them, but you don’t actually want to go too far in that direction. You want to stay within shouting distance of the rationalists; far enough to keep the higher status, but not so far as to become irrelevant. If only there were a way to have your cake and eat it, too… Oh, there is one! You can do the same status move again, and counter-signal deep wisdom by being less inscrutable.
Congratulations on becoming the world’s first meta-meta-rationalist!
But this view does not make me appreciate your move less. Zero-sum status games within a group can still produce positive externalities for the rest of the world (for example, when rich people decide to compete by donating to charity instead of buying expensive cars). Looking forward to the positive externalities of meta-meta-rationality in the form of insightful and easy-to-understand articles. If they are good, I will just call them “rationality” in my head. ;)
Try “there is no one-size-fits-all epistemology”. With a side of “no one-size-fits-all smartness”.
That issue, if it is an issue, reduplicates itself with rationalists versus pre-rationalists.
Except for the no-one-size-fits-all-epistemology epistemology with a flavor of Westernized Buddhism, I guess.
This is such a cheap trick I wonder why people keep falling for it: “Other people have epistemologies. I don’t have an epistemology. I have a meta-epistemology!” “Other people have beliefs. I don’t have beliefs. I have meta-beliefs!” “Other people use strategies. I don’t have a strategy. I have a meta-strategy!” “Other people use algorithms. I don’t use an algorithm. I use a meta-algorithm!” “Other people try to be rational. I don’t try to be rational. I try to be meta-rational!”
But this is not how it works. For certain definitions, meta-X is still a subset of X; you don’t get beyond X by saying “But I am meta!”. A universal Turing machine is still a Turing machine. A compiler or an interpreter is still a program. An algorithm which tries several algorithms and chooses the one which seems to work best is still an algorithm. Saying “meta” is not a get-out-of-jail-free card. If a statement is true for all algorithms, it is also true for the “algorithm that tries several algorithms”; there is no free lunch.
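To make this concrete, here is a minimal sketch (the names and candidate algorithms are invented for illustration) of the “algorithm that tries several algorithms”: despite the fancy description, it is an ordinary function with fixed inputs, fixed steps, and an ordinary return value, i.e., an algorithm.

```python
# Minimal sketch, with invented names: a "meta-algorithm" that tries
# several candidate algorithms and keeps the best-scoring result.
def meta_algorithm(algorithms, problem, score):
    best_result, best_score = None, float("-inf")
    for algorithm in algorithms:
        result = algorithm(problem)
        if score(result) > best_score:
            best_result, best_score = result, score(result)
    return best_result

# Usage: "choose the best of three trivial strategies" is itself
# just one more algorithm being executed.
candidates = [sorted, lambda xs: list(reversed(xs)), lambda xs: xs]
sortedness = lambda xs: sum(a <= b for a, b in zip(xs, xs[1:]))
print(meta_algorithm(candidates, [3, 1, 2], sortedness))  # [1, 2, 3]
```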
Similarly, saying: “I don’t have an epistemology; instead I have several epistemologies, and I use different ones in different situations” is a kind of epistemology. Also, some important details are swept under the rug, for example: How do you choose which epistemology is appropriate for which situation? How do you choose which epistemologies to use at all? How do you create new epistemologies? How do you decide whether the existing ones need to be updated or even discarded? “I don’t have a system, I have multiple systems.” Yeah, but then you also need a system of systems, and that is a system. Closing your eyes does not make it go away.
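A hedged sketch of the same point (all names invented): the moment you write down “multiple epistemologies, chosen per situation”, each of the questions above reappears as a concrete decision point, and the decision points together form a single system.

```python
# Sketch with invented names: a "portfolio of epistemologies".
# Every swept-under-the-rug question becomes an explicit rule here,
# and the rules together are... an epistemology.
class MultiEpistemology:
    def __init__(self, epistemologies, classify_situation):
        self.epistemologies = epistemologies          # which ones to use at all?
        self.classify_situation = classify_situation  # which fits which situation?

    def evaluate(self, situation, claim):
        # The selection rule is itself part of the system.
        chosen = self.epistemologies[self.classify_situation(situation)]
        return chosen(claim)

    def revise(self, name, new_epistemology=None):
        # Updating or discarding a member is also a rule of the system.
        if new_epistemology is None:
            del self.epistemologies[name]
        else:
            self.epistemologies[name] = new_epistemology
```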
“But this is not how it works. For certain definitions, meta-X is still a subset of X…”
And for others, it isn’t.
“If a statement is true for all algorithms, it is also true for the ‘algorithm that tries several algorithms’…”
Theoretically, but there is no such algorithm.
“Similarly, saying: ‘I don’t have an epistemology; instead I have several epistemologies, and I use different ones in different situations’ is a kind of epistemology.”
But it’s not a single algorithmic epistemology.
“Also, some important details are swept under the rug, for example: How do you choose which epistemology is appropriate for which situation?”
How do you do anything for which there isn’t an algorithm? You use experience, intuition, and other System 1 stuff.
“This is such a cheap trick…”
It isn’t in all cases. There is a genuine problem in telling whether a claim of radically superior knowledge is genuine. You can’t round them all off to fraud.
I would be content with someone just saying: “this person is a meta-rationalist, and this is what s/he has achieved”.