Simple hypothesis relating to Why Don’t Rationalists Win:
Everyone has some collection of skills and abilities, including things like charisma, luck, rationality, determination, networking ability, etc. Each person’s success is limited by constraints related to these abilities, in the same way that an application’s performance is limited by the CPU speed, RAM, disk speed, networking speed, etc., of the machine(s) it runs on. But just as for many applications the performance bottleneck isn’t CPU speed, for most people the success bottleneck isn’t rationality.
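The bottleneck hypothesis above can be sketched as a toy model in the spirit of Liebig’s law of the minimum. The ability names and scores below are invented purely for illustration; nothing here is measured data.

```python
# Toy model of the "success bottleneck" hypothesis: overall success is
# capped by the weakest ability, just as an app is capped by its slowest
# resource. The scores (0-10) are made up for illustration.
abilities = {
    "charisma": 6,
    "luck": 5,
    "rationality": 9,
    "determination": 7,
    "networking": 2,
}

bottleneck = min(abilities, key=abilities.get)  # the limiting ability
success = abilities[bottleneck]                 # success is capped here

# Improving a non-bottleneck ability does nothing for overall success...
abilities["rationality"] = 10
assert min(abilities.values()) == 2

# ...while improving the bottleneck shifts the cap to the next-weakest ability.
abilities["networking"] = 8
assert min(abilities.values()) == 5  # luck is now the bottleneck
```

In this toy model, instrumental rationality pays off exactly insofar as it redirects effort toward whichever ability is currently the minimum, rather than the one that is most fun to train.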
It could be worse. Rationality essays could be attracting a self-selected group of people whose bottleneck isn’t rationality. Actually I think that’s true. Here’s a three-step program that might help a “stereotypical LWer” more than reading LW:
1) Gym every day
2) Drink more alcohol
3) Watch more football
Only slightly tongue in cheek ;-)
Strongly disagree with 2) and 3). I think you mean them as a proxy for “become more social, make more connections, find ways to fit into a local culture”, but the quality of connections usually matters more than quantity. And in many circles that are likely to matter for a typical LWer, 3) is likely to be useless, and the likely benefits of 2) are achievable without drinking, or with very modest drinking.
My advice was more like “get in touch with your stupid animal side”. The social part comes later :-)
Then living in the wilderness and cutting down trees would be much better. Or some kind of manual work where you can see the fruits of your labor, e.g. gardening. I believe activities like these would be better suited to connecting the mental and physical parts of a person.
I don’t know about yours, but my stupid animal side is uninterested in alcohol and football. It wants to eat, sleep, fuck, and harass betas :-D
Drinking alcohol is very necessary for connecting with people. People who are against alcohol don’t know how much they miss out at times.
“I drink to make other people more interesting”—Ernest Hemingway
I think that depends very much on the kind of people with whom you hang out. There are people who need alcohol to open up. On the other hand there are people who have no problem opening up without alcohol.
This is so obviously wrong.
Alcohol may aid in connecting with some people some of the time.
This is just what nerdy types tell themselves, and they come up with all these rationalizations for it; most people’s skillsets don’t lend themselves to that type of socialization. These people only realize they were wrong years later, when it’s much too late.
I recommend trying “placebo alcohol”. That means getting drunk for the first time to get an experience of what it feels like, but having a non-alcoholic drink the next time and merely role-playing being drunk.
This is the exact sort of community that would delude itself in exactly this department and would never stop arguing (not saying you do this), but if someone asked me “Can you have fun/meet people without drinking?”, I would say “sort of, but you’re better off just participating anyway”.
When you drink with friends, you learn why you were wrong. There’s always going to be that “one guy” who thinks he knows better, though.
What do you mean?
I think at some point a few years ago there seemed to be an implicit assumption on LessWrong that of course you can hack your determination, rewire your networking ability, and bootstrap your performance at anything! And I don’t think people so much stopped believing that this was true in principle; rather, people started realizing how incredibly difficult and time-consuming it is to change your base skillset.
Well, there’s also the possibility that people who did successfully hack their determination, networking ability, and performance are now mostly not spending time on LW.
If that’s true, then many rationalists actually do win.
That depends on what their goal is. There are many possible levels of “winning”.
You can win on the individual level by quitting LW, focusing on your career, and improving your skills. People who achieve that can easily disappear from our radar.
You can win on the group level by creating a community of “successful rationalists”; then you are not only successful as a lone individual, but you have a tribe that shares your values and can cooperate in effective ways. We would probably notice such a group, for example because they would advertise themselves on LW for recruitment purposes.
And then you can win on the civilizational level, by raising the planetary level of sanity and building a Friendly AI. I’m almost sure we would notice that.
Okay, the third one is outside of everyday life’s scope, so let’s ignore it for now.
I don’t know how much I am generalizing here from my own example, but winning on an individual level would now feel insufficient for me, having met rationalists on the LW website and in real life. If I could increase my skills and resources significantly, I would probably spend some time trying to get others from the rationalist community to my level, because with allies I could achieve even more. So I would probably post far fewer comments on LW, but once in a while I would post an article trying to inspire people to “become stronger”.
On the other hand, perhaps you are being too insular in the communities you engage in. There are many, many groups of smart people out there in the world. Perhaps someone who got what they wanted from LW and ‘quit’ went on to gather allies who were already successful in their fields?
Thousands of small steps are required; one big epiphany is not enough. But many people expect the latter, because the very reason they seek advice is to avoid doing the former.
“Isn’t there a pill I can just take?”
X-)
The world needs more “pills I can just take.”
I don’t know about that. So far the world’s experience with “Just take this pill and everything will be fine” is… mixed.
Well, admittedly I was assuming pills that worked and had the intended effect.
Maybe some started to appreciate the struggle and the suffering, to find joy and strength in it. Then, their terminal goals pivoted.
Focusing your efforts on the right task is itself a rationality skill.
Recently one rationalist wrote on Facebook about how he used physical rationality to make his shoulder heal faster after an operation and produce less pain. Having accurate models of reality is very useful in many cases.
What is “physical rationality”?
It’s a new coinage, so the term isn’t well-defined. On the other hand, there are reasons to use the term.
One key aspect of “physical rationality” is a strong alignment between your physical body and your map of it: an absence of conflicts between System 1 and System 2 when it comes to physicality.
So I suppose things like the Alexander Technique, possibly Yoga, certain martial arts and sports might be implicated?
I don’t know all the influences in this particular case, but it’s certainly in that direction. There was a reference to the book “A Guide to Better Movement” by Todd Hargrove.
Assuming he only had one shoulder operated on, where was the control shoulder?
His doctor was dumbfounded over the result and the doctor has seen control shoulders.
Doctors being dumbfounded is a hallmark of irrationalist stories. Not saying this one is (I don’t even know the story here), but as someone who grew up around a lot of people who basically believed in magic, I can conjure up so many anecdotes of people thinking their doctors were blown away by sudden recoveries and miraculous healings. I mostly figure doctors go “oh cool, it’s going pretty well” and add a bit of color for the patient’s benefit.
A lot of doctors would be surprised if someone walked over hot coals and afterwards had no blisters or burn marks. Yet at Anthony Robbins seminars thousands walk over hot coals, and most of them don’t develop blisters.
The human body is complex, and there are a lot of real phenomena that can dumbfound doctors. If you think doctors are infallible, you might want to read http://lesswrong.com/r/discussion/lw/nes/link_evidencebased_medicine_has_been_hijacked/
Whether you take that as evidence that magic exists is a different matter.
If you don’t mind, what’s the name of the person who used physical rationality?
Given the semi-private Facebook sources, I think I would rather write you a direct message than answer publicly.
I had an idea to write a post about this problem under the name “general effectiveness”. GE is a measure of you by an outside peer, typically an employer.
If I were an employer, I would look at general effectiveness (and I really did this, as I used to hire people for small tasks in my art business). It consists of many things besides rationality, including appearance, age, gender, interest in the work, ability to arrive on time, and results on test work.
Most of these characteristics are unchangeable personal traits, so even if a given person invested a lot in studying rationality, he would not be able to change them much.
But he could change his place of work and find one more suitable to him.
There are also other ways to raise personal effectiveness. For example, if I hire a helper, I raise my effectiveness.
Instrumental rationality, among other things, points people to whichever of their skills or abilities is currently the performance bottleneck and encourages them to work on that, not the thing that’s most fun to work on. So we would still expect instrumental rationalists to win in this model.
(Yes, epistemic rationality might not lead to winning as directly.)
Why would that be? Is it that many people work in areas where it doesn’t really matter if they are mistaken? Or do people already know enough about the area they work in, so that further improvements have diminishing returns? Epistemic rationality provides a direction for where people should put their efforts if they want to become less wrong about stuff. Are people simply unwilling to put in that effort?
People may underestimate the amount and kind of information they need to turn epistemic rationality into instrumental rationality.
People may underestimate the value of clearly stated and communicated preferences.
More the latter. Most of the things that a person could learn about are things that won’t help them directly. Agreed that if one has poor epistemic rationality, it’s hard to do the instrumental rationality part correctly (“I know, I’ll fix this problem by wishing!”).
Another hypothesis—the smarter you sound the less friends you tend to have.
Fewer!
Most people like having at least one smart friend.
The trick is not to make other people feel stupid, which many (most?) smart people are very bad at.
I suspect it’s more of a golden middle kind of thing—people out in both tails of the distribution tend to have social problems.
Could it also be that being rational consumes a portion of the CPU/RAM of the human brain that would otherwise be used for something better?
Instrumental rationality is more or less defined as “doing whatever you need to in order to succeed”. If success requires e.g. networking, instrumental rationality would tell you to improve your networking ability.
For epistemic rationality I agree, it’s not a common bottleneck.
Whether luck is a skill is an interesting question :-)