I just realized that willingness to update seems very cultish from the outside. Literally.
I mean—if someone joins a cult, what is the most obvious thing that happens to them? They update heavily; towards the group teachings. This is how you can tell that something wrong is happening.
We try to update on reasonable evidence. For example, we would update on a scientific article more than on a random website. However, from the outside it seems similar to a willingness to update on your favorite (in-group) sources, and an unwillingness to update on other (out-group) sources. Just like a Jehovah's Witness would update on The Watchtower, but would remain skeptical towards Mormon literature. As if science itself is your cult… except that it's not really science as we know it, because most scientists behave outside the laboratory just like everyone else; and you are trying to do something else.
Okay, I guess this is nothing new for a LW reader. I just realized now, on an emotional level, how willingness to update, considered a virtue on LW, may look horrifying to an average person. And how willingness to update more on trustworthy evidence than on untrustworthy evidence probably seems like hypocrisy, like a rationalization for preferring your in-group ideas to out-group ideas.
So does that make stubbornness a kind of epistemic self-defence?
Almost surely, yes. If other people keep telling you crazy things, not updating is a smart choice. Not the smartest one, but it is a simple strategy that anyone can use, cheaply (because we can’t always afford verification).
Perhaps, but moving the local optimum from Politics-is-the-Mind-Killer towards a higher sanity line seems to require dropping this defensive mechanism (on the societal level, at least).
First, one must be able to tell the difference between reliable and unreliable sources of information. Only then is it safe to drop the defensive mechanism.
Just dropping the defensive mechanism could lead to anything… for example, massive religious zealotry. Or, more probably, some kind of political zealotry.
Unfortunately, one cannot simply reverse stupidity. If a creationist refuses to update towards evolution, that's bad. But if they update towards Scientology, that's even worse. So before people start updating en masse, they had better understand the difference.
For what it’s worth, the complaints I’ve heard about LW center around arrogance, not excessive compliance.
You could restate the arrogance as an expectation that others update when you say things.
Likewise, especially of people talking about fields they are not experts in.
I agree that this is a common failure mode locally on LW, but even if this community did not have this problem, Villiam_Bur's point would still have a lot of explanatory power as to why raising-the-sanity-line methods are resisted by society at large.
On the other hand, once you're in one, it's the not-updating that gives it away.
I worry that this is a case of finding a ‘secret virtue’ in one’s vices: I think we’re often tempted to pick some outstandingly bad feature of ourselves or an organization we belong to and explain it as the necessary consequence of a necessary and good feature.
My reason for thinking that this is going on here is that another explanation seems much more plausible. For one thing, you’d think the effect of seeing someone heavily update would depend on knowing them before and after. But how many people who think of LW this way think so because they knew someone before and after they became swayed by LW’s ideas?
Like Nancy, I think that the PR problem LW has isn't the impression people have that LWers are converts of a certain kind. Rather, I think what negative impression there is results from an extremely fixable problem of presentation: some of the most prominent and popular ways of expressing core LW ideas come in the form of 1) 'litanies' and pseudo-Asian mysticism, good ideas given a completely unnecessary and, for many, off-putting gilding (no one here takes the religious overtones seriously, but outsiders don't know that), and 2) explicit expressions of contempt for outsiders, such as 'raising the sanity waterline', etc.
I admit that I honestly do consider many people insane; and I always did. Even the smarter ones seem paralyzed by some harmful memes. I mean, people argue about words that have no connection with reality, while in other parts of the world children are dying from hunger. Hoaxes of every kind circulate by e-mail, and it's hard to find someone I know personally who hasn't sent me them repeatedly (even after I repeatedly explained that it was a hoax and gave them a pointer to some sites collecting hoaxes). Smart people start speaking in slogans when difficult problems need to be solved, and seem unable to understand what the problem with this kind of communication is. People do bullshit that obviously doesn't and can't work, and insist that you have to do it harder and spend more money, instead of just trying something else for a while and observing what happens. So much stupidity, so much waste. -- And the few people who know better, or at least are able to know better, are often afraid to admit it even to themselves, because the idea that we live in an insane society is scary. So even they don't resist the madness; at best they don't join it, but they pretend they don't see it. This is how I saw the world for decades before I found LW.
And yes, it is bad PR. It is impolite towards the insane people, who may feel offended and then try to punish us. But even worse, it is a bad strategy towards the sane people, who are not yet emotionally ready to admit that the rest of the world is not sane, because it goes against our tribal instincts: we must agree with the tribe, whether it is right or wrong; especially when it is wrong. If you are able to resist this pressure, it's probably not caused by higher rationality, but by lower social skills.
So how exactly should we communicate the inconvenient truths? Because we are trying to communicate truthfully, aren't we? Should we post the information openly, and have bad PR? Should we have a secret forum for forbidden thoughts, appear cultish, and risk that someone exposes the information? Should we communicate certain thoughts only in person, never online? It seems to me that "bad PR" is the least wrong option.
Is there a way to disagree with the majority, be open to new members, and not seem dangerous? Perhaps we could downplay our ambitions: stop talking about improving the world, and pretend that we are just some kind of Mensa, a few geeks solving their harmless Bayesian equations, unconnected with the real world. Or we could make a semi-secret discussion forum; it would be open to anyone after overcoming a trivial inconvenience, and it would not be indexed by Google. Then the best articles (judged by quality and PR impact) would be published on a public forum. Perhaps the articles should not all appear in the same place: everyone (including Eliezer) would have their own blog with their own articles, and LW would just contain links to them (like Digg). This would be an inconvenience for publishing, but we could provide some technical help for people who have trouble starting their own website. Perhaps we should split LW into multiple websites concerned with different topics: artificial intelligence, effective philanthropy, rationality, community forum, etc. -- All these ideas are about being less open, less direct. Which is dishonest per se, but perhaps this is what good PR means: lying in socially accepted ways; pretending what other people want you to pretend.
This could probably be a separate topic. And first we would have to decide what we want to achieve, and only then discuss how.
I admit that I honestly do consider many people insane; and I always did.
I don't think you do; I think you consider most people to be (in some sense rightly) wrong or ignorant. Just the fact that you hold people to some standard (which you must do, if you say that they fail) means you don't think of them as insane. If you've ever known someone with depression or bipolar disorder, you know that you can't tell them to snap out of it, or learn this or that, or just think it through. Even calling people insane, as an expression of contempt, is a way of holding them to a standard. But we don't hold actually insane people to standards, and we don't (unless we're jerks) hold them in contempt. You don't communicate the inconvenient truth to the insane. You don't disagree or agree with the insane. The wrong, the ignorant, the evil, yes. But not the insane.
No one here (and I mean no one) actually thinks the world is full of insane people. That's a bit of metaphor and hyperbole. If anyone seriously thought that, their behavior would be so radically strange (think 'I am Legend' or something) that you'd probably find them locked up somewhere.
Is there a way to disagree with the majority, be open to new members, and not seem dangerous?
The claim that everyone else is insane doesn't sound dangerous; it sounds resentful. Dangerous is not a problem. I don't think we need to implement any of your ideas, because the issue is purely one of rhetoric. None of the ideas themselves are a problem, because there's no problem with saying everyone else is wrong so long as you have either 1) results, or 2) good, persuasive arguments. And if all you've got is (2), tone matters, because you can only persuade people who listen to you. There's no reason at all to hide anything, or lie, or pretend or anything like that.
Speaking about typical individuals, ignorant is a good word; insane is not. As you say, it makes sense to try to explain things to an ignorant person, not to an insane person. Individuals can have things explained to them with some degree of success. I agree with you on this.
The difference becomes less clear when dealing with groups of people, with societies. Explaining things to a group of people is more often (as an anthropomorphism) like dealing with an insane person. Literally, the kind of person that hears you and understands your words, but then also hears "voices in their head" telling them it's bad to think that way, that they should keep doing the stupid stuff they were doing regardless of the problems it brought them, etc. Except that these "voices" are the other people. -- But this probably just proves that societies are not individuals.
there's no problem with saying everyone else is wrong so long as you have either 1) results, or 2) good, persuasive arguments
Yeah, having results would be good. A Friendly AI would be the best result, but until then, we need some other kind of results.
So, an interesting task would be to make a list of results of the LW community that would impress outsiders. Put that into a flyer, and we have a nice PR tool.
Explaining things to a group of people is more often (as an anthropomorphism) like dealing with an insane person.
That's fair enough. I'd stay away from groups of people. Back in the day, they used to write without vowels, so that you could only really read something if you were either exceptionally literate or were being told what it said by a teacher. I say never communicate with more than a handful of people at once, but I suppose that's not possible a lot of the time.
Perhaps it would be less confusing to treat a society as if it were a single organism, of which the people within it are analogous to cells rather than agents with minds of their own. I'm not sure how far such an approach would get, but it might be interesting.
until then, we need some other kind of results.
CFAR might be able to demonstrate such results after a few more years of their workshops. I'm not sure how they're measuring results, but I would be surprised if they were not doing so.
CFAR planned to do some statistics on how the minicamp attendees' lives have changed after a year, using a control group of people who applied to minicamps but were not admitted. Not perfect, but pretty good. And it has been approximately a year since the first minicamps (for me it will be a year in one month). But the samples are very small.
With regard to PR, I am not sure if this will work. I mean, even if the results are good, only people who care about statistical results will be impressed by them. It's a circular problem: you need to already have some rationality to be impressed by rational arguments. -- Because you may also say: yeah, those guys are trying so hard, and I will just pray or think positively and the same results will come to me, too. And if they don't, that just means I have to pray or think positively more. Or even: statistics doesn't prove anything; I feel it in my heart that rationality is cold and can't make anyone happy.
I think that people who don’t care about statistics are still likely to be impressed by vivid stories, not that I have any numbers to prove this.
I agree. But optimizing for good storytelling is different from optimizing for good science. A good scientific result would be something like: "minicamp attendees are 12% more efficient in their lives, plus or minus 3.5%". A good story would be "this awesome thing happened to a minicamp attendee" (ignoring the fact that an equivalent thing happened to a person in the control group).
Maybe the best would be to publish both, and let readers pick their favourite part.
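For concreteness, here is a minimal sketch (in Python, with invented numbers, not real CFAR data) of how a headline figure like "12% more efficient, plus or minus 3.5%" could be produced from the study design described above: compare attendees against the control group of applicants who were not admitted, and report the difference in means with a rough confidence interval. All scores, names, and numbers below are hypothetical.

```python
# Hypothetical sketch of an attendees-vs-control comparison.
# The "efficiency" scores are made up purely for illustration.
import math
import statistics

# Self-reported scores one year after the minicamp (invented data).
attendees = [1.15, 1.08, 1.22, 1.05, 1.18, 1.10, 1.25, 1.02]
controls  = [1.00, 0.95, 1.05, 0.98, 1.02, 0.97, 1.04, 1.01]

# Point estimate: difference between the group means.
diff = statistics.mean(attendees) - statistics.mean(controls)

# Standard error of the difference between two independent means.
se = math.sqrt(
    statistics.variance(attendees) / len(attendees)
    + statistics.variance(controls) / len(controls)
)

# Rough 95% interval via the normal approximation; with samples this
# small, a t-distribution (or a permutation test) would be more appropriate.
margin = 1.96 * se
print(f"estimated effect: {diff:.3f} +/- {margin:.3f}")
```

The point of the sketch is only that the "scientific result" version of the story is a point estimate plus an uncertainty band from a treatment/control comparison, which is exactly the kind of summary that impresses people who care about statistics and leaves everyone else cold.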
I’m sure they’ll be publishing both stories and statistics.
One more possibility: spin off instrumental rationality. Develop gradual introductions on how to think more clearly to improve your life.