Actually, I think “Rationalists should WIN” regardless of what their goals are, even if that includes social wrestling matches.
The “should” here is not intended as moral prescriptivism. I’m not saying that in a morally/ethically ideal world, rationalists would win. Instead, I’m using “should” to help define what the word “Rationalist” means. If a person is a rationalist, then given equal opportunity, resources, difficulty of goal, etc., they will, on average, win more often than someone who is not a rationalist. And if they happen to be an evil rationalist, well, that sucks for the rest of the universe, but that’s still what “rationalist” means.
I believe this definitional-sense of “should” is also what the originator of the “Rationalists should WIN” quote intended.
There is a bit of a problem here, in that the list of the greatest rationalists ever will be headed by people like Genghis Khan and the Prophet Muhammad.
People who win are not necessarily rationalists. A person who is a rationalist is more likely to win than a person who is not.
Consider someone who just happens to win the lottery vs someone who figures out what actions have the highest expected net profit.
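To put rough numbers on “expected net profit”, here’s a toy comparison in Python; every figure in it (odds, jackpot, ticket price, return rate) is a made-up assumption for illustration, not a claim about any real lottery:

```python
# Expected net profit of two uses of the same $2, with made-up numbers.
p_jackpot = 1e-8            # assumed chance of winning the jackpot
jackpot = 100_000_000       # assumed prize in dollars
ticket = 2.00               # assumed ticket price

lottery_ev = p_jackpot * jackpot - ticket  # = -1.00: lose a dollar on average
invest_ev = ticket * 0.07                  # = +0.14 at an assumed 7% annual return

print(f"lottery: {lottery_ev:+.2f}, invest: {invest_ev:+.2f}")
```

The lottery winner got a good outcome from a negative-expected-value action; the rationalist picks the action with the higher expected value, whatever the particular outcome.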
Edit: That said, be careful not to succumb to the argument from consequences (http://rationalwiki.org/wiki/Argument_from_consequences). Maybe Genghis Khan really was one of the greatest rationalists ever; I’ve never met the guy nor read any of his writings, so I wouldn’t know.
Even ignoring the issue that “rationalist” is not a binary variable, I don’t know how, in practice, you will be able to tell whether someone is a rationalist or not. Your definition depends on counterfactuals, and without them you can’t disentangle rationalism and luck.
I assume that you accept the claim that it is possible to define what a fair coin is, and thus what an unfair coin is.
If you observe some coin, at first it may be difficult to tell whether it’s fair or not. Perhaps the coin comes from a very trustworthy friend who assures you that it’s fair. Maybe it’s being sold in a novelty store, labelled as an “unfair coin”, and you’ve made many purchases from this store in the past and have never been disappointed. In other words, you have some “prior” probability belief that the coin is fair (or not fair).
As you watch the coin being flipped, you can keep track of the outcomes and adjust your belief. You can ask yourself, “Given the outcomes I’ve seen, is it more likely that the coin is fair or unfair?” and update accordingly.
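As a minimal sketch of that updating, here is a toy two-hypothesis model: the coin is either fair, or “unfair” in the sense (which I’m inventing purely for illustration) of P(heads) = 0.75. The priors below are made-up too:

```python
# Bayesian update on whether a coin is fair, under a toy two-hypothesis
# model: "fair" means P(heads) = 0.5; "unfair" is assumed (for
# illustration only) to mean P(heads) = 0.75.

def posterior_fair(prior_fair, flips, p_heads_unfair=0.75):
    """Return posterior P(fair) after observing flips ('H'/'T' string)."""
    p_fair, p_unfair = prior_fair, 1.0 - prior_fair
    for flip in flips:
        p_fair *= 0.5
        p_unfair *= p_heads_unfair if flip == "H" else 1.0 - p_heads_unfair
    return p_fair / (p_fair + p_unfair)

# Trustworthy friend: strong prior that the coin is fair.
print(posterior_fair(0.95, "HHHHHHHHHH"))  # ~0.25: ten heads erode the prior
# Novelty store's "unfair coin": weak prior that it is fair.
print(posterior_fair(0.10, "HTHTHT"))      # ~0.21: balanced flips pull back toward fair
```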
I think the same applies for rationalists here. I meet someone new. Eliezer vouches for her as being very rational. I observe her sometimes winning, sometimes not winning. I expend mental effort trying to judge how easy or difficult her situation was, and how much effort/skill/rationality/luck/whatever it would have taken to win in it. I try to analyze how it came about that she won when she won, or lost when she lost. I try to discount evidence where luck was a big factor. She bought a lottery ticket, and she won: should I update towards her being a rationalist or not? She switched doors in Monty Hall, but she ended up with a goat: should I update towards her being a rationalist or not? Etc.
Hm, OK. So you are saying that the degree of rationalism is an unobservable (hidden) variable and what we can observe (winning or losing) is contaminated by noise (luck). That’s a fair way of framing it.
The interesting question then becomes what kind of accuracy you can achieve in the real world, given that the noise level is high, the information available to you is limited, and your perception is imperfect (e.g. it’s not uncommon to interpret non-obvious high skill as luck).
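Here’s a toy version of that framing, under assumptions I’m inventing for illustration: rationality is a hidden scalar r in [0, 1], each situation is summarized by the win probability of a pure non-rationalist versus a perfect rationalist, and P(win | r) is linearly interpolated between the two. The lottery and Monty Hall examples above then behave as you’d hope:

```python
import numpy as np

# Toy hidden-variable model: P(win | r) = p_luck + r * (p_skill - p_luck),
# where p_luck is a pure non-rationalist's win probability in the situation
# and p_skill is a perfect rationalist's. All numbers below are invented.

def posterior_over_r(observations, grid_size=101):
    r = np.linspace(0.0, 1.0, grid_size)
    post = np.full(grid_size, 1.0 / grid_size)    # uniform prior over r
    for p_luck, p_skill, won in observations:
        p_win = p_luck + r * (p_skill - p_luck)
        post *= p_win if won else (1.0 - p_win)   # likelihood of the outcome
        post /= post.sum()                        # renormalize
    return r, post

observations = [
    (1e-7, 1e-7, True),   # lottery win: skill can't help, so no update at all
    (0.5, 2/3, False),    # Monty Hall (assume non-rationalists stay/switch at
                          # random): one goat is only weak evidence against her
    (0.3, 0.8, True),     # situations where skill matters move the posterior more
    (0.3, 0.8, True),
    (0.3, 0.8, False),
]
r, post = posterior_over_r(observations)
print("posterior mean rationality:", round(float((r * post).sum()), 3))
```

The luck-dominated observations barely move the posterior, which matches the intuition about discounting lottery wins; accuracy then comes down to how many skill-sensitive situations you get to observe, and how well you estimate each situation’s p_luck and p_skill.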
Right, I suspect just having heard about someone’s accomplishments would be an extremely noisy indicator. You’d want to know what they were thinking, for example by reading their blog posts.
Eliezer seems pretty rational, given his writings. But if he repeatedly lost in situations where other people tend to win, I’d update accordingly.
But what about the other case? People who don’t seem rational given their writings but who repeatedly win?
Possibly he’s just extremely lucky. There are seven billion people in the world—one of these people is almost certain to be luckier than all of the rest.
Possibly he is being looked after by a far more competent person behind the scenes; a spouse or a parent, perhaps, who dislikes being visible but works to help that person succeed.
Possibly that person really is more rational than you are, but his methods of success are so alien to you that your first instinct is to reject them out-of-hand.
Possibly his “writings” are actually being ghost-written by someone else.
Possibly he doesn’t much care about what he writes, going for low-effort writing in order to concentrate on winning.
Possibly he’s found one exploit that really works but won’t work if everyone does it; thus, he keeps quiet about it.
Possibly he’s deliberately writing to obscure or hide his own methods of success.
Possibly he’s found a winning strategy, but he doesn’t understand why it works, and thus invents a completely implausible “explanation” for it.
...have I missed anything?
If I understand the Peter Thiel doctrine of the secret correctly, that should be the case in many instances.
Some people are rich and can afford valuable things even if they don’t spend their money wisely. Some people might win because they have a lot of resources or connections to throw at problems.
If you define rationality as winning, why does it matter what his writings seem like?
I can’t directly observe Eliezer winning or losing, but I can make (perhaps very weak) inferences about how often he wins/loses given his writing.
As an analogy, I might not have the opportunity to play a given video game ABC against a given blogger XYZ whom I’ve never met and will never meet. But if I read his blog posts on ABC strategies, try to apply them when I play ABC, and find that my win rate vastly improves, I can infer that XYZ also probably wins often (and probably more often than I do).
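With made-up numbers, that inference might look like comparing Beta posteriors over my win rate before and after adopting XYZ’s strategies:

```python
import numpy as np

# Hypothetical records before and after reading XYZ's strategy posts.
wins_before, games_before = 12, 60
wins_after, games_after = 30, 60

# Under a uniform Beta(1, 1) prior, the posterior for a win probability
# after w wins in n games is Beta(1 + w, 1 + n - w). Compare by sampling.
rng = np.random.default_rng(0)
before = rng.beta(1 + wins_before, 1 + games_before - wins_before, 100_000)
after = rng.beta(1 + wins_after, 1 + games_after - wins_after, 100_000)

print("P(my win rate improved):", (after > before).mean())
```

If the improvement is that clear-cut, the strategies (and hence, weakly, their author) get credit; the evidence about XYZ is still indirect, but it’s much better than just hearing that he wins.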
Well, if what you want to accomplish is motivating large groups of people into supporting you and using them to conquer a large empire, you should study what they did and how they did it.
Now that you mention it, I actually don’t.