It’s easy to define success in martial arts. Defining ‘rationality’ is harder. Have you done so yet, Eliezer?
Even in martial arts, many schools of thought are essentially religions or cults, completely unconcerned with fighting proficiency and deeply concerned with mastering the arcane details of a sacred style passed on from teacher to student.
Such styles often come with an unrealistic conviction that the style is devastatingly effective, but with little concern for actually testing that.
See also: http://www.toxicjunction.com/get.asp?i=V2741
I’ve read a great many comments and articles by people talking about how karate black belts are being seriously beaten by people with real-world fighting experience—pimps, muggers, etc. Becoming skilled in an esoteric discipline is useful only if that discipline is useful.
Do not seek to establish yourself as a sensei. Do not seek to become a “master of the art”. Instead, try to get better at fighting—or, in this case, thinking correctly—even if you don’t get to wear a hood and chant about ‘mysteries’.
I already defined “rationality” in passing in the second sentence of the article, just in case someone came in who wasn’t familiar with the prior corpus.
You, of course, are familiar with the corpus and the amount of work I’ve already put into defining rationality; and so I have made free to vote down this comment, because of that little troll. I remind everyone that anything with a hint of trollishness is a fair target for downvoting, even if you happen to disagree with it.
Eliezer, what do you say about someone who believed the world is entirely rational and then came to theism from a completely rational viewpoint, such as Kurt Gödel did?
I’d say, “take it to the Richard Dawkins forum or an atheism IRC channel or something, LW is for advanced rationality, not the basics”.
Surely Gödel came to it through a very advanced rationality. But I’m trying to understand your own view. Your idea is that Bayesian theory can be applied throughout all conceptual organization?
My view is that you should ask your questions of some different atheist on a different forum. I’m sure there will be plenty willing to debate you, but not here.
I’m not a theist, and so you have made two mistakes. I’m trying to find out why formal languages can’t follow the semantics of concepts through categorial hierarchies of conceptual organization. (Because if they had been able to do so, then there would be no need to train in the Art of Rationality—and we could easily have artificial intelligence.) The reason I asked about Gödel is because it’s a very good way to find out how much people have thought about this. I asked about Bayes because you appear to believe that conditional probability can be used to construct algorithms for semantics—sorry if I’ve got that wrong.
“Fat chance.”
Yes, I’ve heard such stories as well (edit: and recently read an article discussing the real-world performance of Chinese and Japanese soldiers in melee/hand-to-hand combat). This is one of the reasons I think that performance in the real world is a better way to measure success at rationality than any synthetic metric.