I think you would find me less frustrating if you provided more context.
I think LessWrong as a whole would find you less frustrating if you assumed most comments from established users on domain-specific concepts or facts were more likely to be correct than your own thoughts and updated accordingly.
Established users can be wrong about many things, including domain-specific concepts or facts.
A more general heuristic that I do endorse, from Cromwell: “I beseech you, in the bowels of Christ, think it possible you may be mistaken.”
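On LessWrong, the Cromwell reference usually means Cromwell’s rule: never assign an empirical claim probability exactly 0 or 1, because Bayes’ rule can never move a prior off those extremes no matter how strong the evidence. A minimal sketch of that fact, with illustrative numbers:

```python
def bayes_update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Posterior P(H|E) from the prior P(H) and the likelihoods of the evidence."""
    numerator = prior * p_e_given_h
    return numerator / (numerator + (1 - prior) * p_e_given_not_h)

# A prior of exactly 0 is immune to any amount of evidence:
print(bayes_update(0.0, 0.99, 0.01))   # 0.0 -- stuck, regardless of the evidence
# A merely tiny prior can still be moved:
print(bayes_update(0.01, 0.99, 0.01))  # 0.5 -- strong evidence shifts it a lot
```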
I think LessWrong as a whole would find you less frustrating if you assumed most comments from established users on domain-specific concepts or facts were more likely to be correct

Agreed. That’s easier. However, sometimes the easier way is not the correct way.
In a world where the authoritative “facts” can be wrong more often than they’re right, where scientists often take a roughly superstitious approach to science, and where the educational system isn’t even optimized for the purpose of educating, what reason do I have to believe that any authority figure or expert or established user is more likely to be correct?
I wish I could trust others’ information. I have wished that my entire life. It is frequently exhausting and damn hard to question this much of what people say. But I want to be correct, not merely pleasant, and that’s life.
Eliezer intended for us to question authority. I’d have done it anyway because I started doing that ages ago. But he said in no uncertain terms that this is what he wants:
In Two More Things to Unlearn from School he warns his readers that “It may be dangerous to present people with a giant mass of authoritative knowledge, especially if it is actually true. It may damage their skepticism.”
In Cached Thoughts he tells you to question what HE says. “Now that you’ve read this blog post, the next time you hear someone unhesitatingly repeating a meme you think is silly or false, you’ll think, “Cached thoughts.” My belief is now there in your mind, waiting to complete the pattern. But is it true? Don’t let your mind complete the pattern! Think!”
Perhaps there is a way to be more pleasant while still questioning everything. If you can think of something, I will consider it.
I’m not saying that a hypothetical vague “you” shouldn’t question things. I’m saying that you specifically, User: Epiphany, seem to not be very well-calibrated in this respect and should update towards questioning things less until you have a better feel for LessWrong discussion norms and epistemic standards.
I’m not saying that a hypothetical vague “you” shouldn’t question things.

Neither was I:

what reason do I have to believe that any authority figure or expert or established user is more likely to be correct?

I’m saying that you specifically, User: Epiphany, seem to not be very well-calibrated in this respect and should update towards questioning things less until you have a better feel for LessWrong discussion norms and epistemic standards.

So, trust you guys more while I’m still trying to figure out how much to trust you? Not going to happen, sorry.
So you’re trying to figure out how much to trust “us,” but you’re only willing to update in the negative direction?
Perhaps the perception you’re having is caused by the fact that you did not know how cynical I was when I started. My trust has increased quite a bit. If I appear not to trust Alicorn very much, this is because I’ve seen what appears to be an unusually high number of mistakes. I realize that this may be due to a biased sample (I haven’t read thousands of Alicorn’s posts, maybe a dozen or so). But I’m not going to update with information I don’t have, and I don’t see it as a good use of time to go reading lots and lots of posts by Alicorn and whoever else trying to figure out how much to trust them. I will have a realistic idea of her eventually.
You might think about the reasons people have for saying the things they say. Why do people make false statements? The most common reasons probably fall under intentional deception (“lying”), indifference toward telling the truth (“bullshitting”), having been deceived by another, motivated cognition, confabulation, or mistake. As you’ve noticed, scientists and educators can face situations where complete integrity and honesty come into conflict with their own career objectives, but there’s no apparent incentive for anyone to distort the truth about the name of the Center for Applied Rationality. There’s also no apparent motivation for Alicorn to bullshit or confabulate; if she isn’t quite sure she remembers the name, she doesn’t have anything to lose by simply moving on without commenting, nor does she have much to gain by getting away with posting the wrong name. That leaves the possibility that she has the wrong name by an unintended mistake. But different people’s chances of making a mistake are not necessarily equal. By being more directly involved with the organization, Alicorn has had many more opportunities to be corrected about the name than you have. That makes it much more likely that you are the one making the mistake, as turned out to be the case.
Perhaps there is a way to be more pleasant while still questioning everything. If you can think of something, I will consider it.

You could phrase your questions as questions rather than statements. You could also take extra care to confirm your facts before you preface a statement with “no, actually”.
there’s no apparent incentive for anyone to distort the truth about the name of the Center for Applied Rationality. There’s also no apparent motivation for Alicorn to bullshit or confabulate

I know. But it’s possible for her to be unaware of the existence of CFMR, had there been two orgs. If you read the entire disagreement, you’ll notice that what it came down to is that it did not occur to me that CFMR might have changed its name. Therefore, her denial that it existed appeared to be in direct conflict with the evidence, namely two articles where people were creating CFMR.
Alicorn has had many more opportunities to be corrected about the name than you have.

I was surprised she didn’t seem to know about it, but then again, if she doesn’t read every single post on here, it’s possible she didn’t know. I don’t know how much she knows, or who she specifically talks to, or how often she talks to them, or whether she might have been out sick for a month or what might have happened. For something that small, I am not going to go to great lengths to analyze her every potential motive for being correct or incorrect. My assessment was simple for that reason.
As for wanting to trust people more, I’ve been thinking about ways to go about that, but I doubt I will do it by trying to rule out every possible reason for them to have been wrong. That’s a long list, and it’s dependent upon my imperfect ability to think of all the reasons that a person might be wrong. I’m more likely to go about it from a totally different angle: How many scientists are there? What things do most of them agree on? How many of those have been proven false? Okay, that’s an estimated X percent chance that what most scientists believe is actually true, based on a sample set of (whatever) size.
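For what it’s worth, the calculation described here is just a reference-class frequency with some uncertainty around it. A minimal sketch of that approach, where every count is a made-up placeholder rather than real survey data:

```python
import math

# Hypothetical counts, purely for illustration -- not real data:
n_consensus_claims_sampled = 200   # consensus claims examined
n_later_shown_false = 14           # of those, how many were later overturned

p_false = n_later_shown_false / n_consensus_claims_sampled
# Rough 95% interval for a proportion (normal approximation):
se = math.sqrt(p_false * (1 - p_false) / n_consensus_claims_sampled)
low, high = p_false - 1.96 * se, p_false + 1.96 * se

print(f"Estimated chance a consensus claim holds up: {1 - p_false:.0%}")
print(f"Overturn rate roughly between {low:.1%} and {high:.1%}")
```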
You could phrase your questions as questions rather than statements.

This is a good suggestion, and I normally do.

You could also take extra care to confirm your facts before you preface a statement with “no, actually”.

I did confirm my fact with two articles. That is why it became a “no, actually” instead of a question.
if she doesn’t read every single post on here, it’s possible she didn’t know

I do read every single post on here. (Well, I skim technical ones.)
I think LessWrong as a whole would find you less frustrating if you assumed most comments from established users on domain-specific concepts or facts were more likely to be correct than your own thoughts and updated accordingly.

This seems like a risky heuristic to apply generally, given the volume of domain-specific contrarianism floating around here. My own version is more along the lines of “trust, but verify”.
It’s a specific problem Epiphany has that she assumes her own internal monologue of what’s true is far more reliable than any evidence or statements to the contrary.
That’s not a problem unless it’s false. Almost all evidence and statements to the contrary are less reliable than my belief regarding what’s true.
That’s a very expensive state to maintain, since I got that way by altering my internal description of what’s true to match the most reliable evidence that I can find...
I don’t think I am right about everything, but I relate to this. I am not perfectly rational. But I decided to tear apart and challenge all my cached thoughts around half my life ago (well over a decade before Eliezer wrote about cached thoughts, of course, but it’s a convenient term for me now), and ever since then I have not been able to see authorities the same way...
I think it would be ideal if we all strove to do enough hard work that we successfully altered our internal descriptions of what’s true to match the most reliable evidence on so many different topics that we could see fatal flaws in the authoritative views more often than not.
Considering the implications of the first three links in this post, that accomplishment may not be an unrealistic one. Sadly, I don’t say this because I think we’re all so incredibly smart, but because the world is so incredibly broken.
Did you start questioning early as well?
I’ve never accepted that belief in authority on any subject could pay rent. The biggest advantage experts offer me is that they can quickly point me to the evidence I can evaluate fastest to arrive at the correct conclusion; rather than trust Aristotle that heavier items fall faster, I can duplicate any number of experiments that show that any two objects with equal specific air resistance fall at exactly the same speed.
Downside: It is more expensive to evaluate the merits of the evidence than the credentials of the expert.
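The falling-body example is one of the cheapest pieces of evidence to evaluate directly, because the standard constant-acceleration formula, t = sqrt(2h/g), contains no mass term at all. A small illustration:

```python
import math

G = 9.81  # standard gravity, m/s^2

def drop_time(height_m: float) -> float:
    """Time to fall height_m in vacuum: t = sqrt(2h / g).

    Mass never appears in the formula, which is the point of the
    Galileo-style experiment described above.
    """
    return math.sqrt(2 * height_m / G)

# Same height -> same time, whether it's a coin or an anvil:
print(f"{drop_time(10.0):.2f} s")  # ~1.43 s from 10 m, for any mass
```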
The biggest advantage experts offer me is that they can quickly point me to the evidence I can evaluate fastest to arrive at the correct conclusion

I relate to this.
There simply isn’t enough time to evaluate everything. When it’s really important, I’ll go to a significant amount of trouble. If not, I use heuristics like “how likely is it that something as easy to test as this made its way into the school curriculum and is also wrong?” If I have too little time or the subject is of little importance, I may decide the authoritative opinion is more likely to be right than my not-at-all-thought-out opinion, but that’s not the same as trusting authority. That’s more like slapping duct tape on, to me.
Slightly wrong heuristic. Go with “What proportion of things in the curriculum that are this easy to test have been wrong when tested?” The answer is disturbing. Things like ‘Glass is a slow-flowing liquid’.
Actually, ‘Glass is a slow-flowing liquid’ would take decades to test, wouldn’t it? I think you took a different meaning of “easy to test”. I meant something along the lines of “a thing that just about anyone can do in a matter of minutes without spending much money.”

Unless you can think of a fast way to test the glass-is-a-liquid theory?
Look at old windows that have been in for decades. Do they pile up on the bottom like caramel? No. Myth busted.
More interesting than simple refutation, though, is to “taboo ‘liquid’”: go look at non-Newtonian fluids and see all the cool things that matter can do. For example, ice and rock flow like a liquid on a large enough scale (glaciers, planetary mantle convection).
I actually believed that myth for ages because the panes in my childhood house were thicker on the bottom than on the top, causing visible distortion. It turns out that making perfectly flat sheets of glass was difficult when the house was built, and that for whatever reason the panes had been put in thick side down.
Oh. Yeah. Good point. Obviously I wasn’t thinking too hard about this. Thank you.
Wait, so they put the glass-is-a-liquid theory into the school curriculum, and it was this easy to test?

I don’t recall that in my own school curriculum. I’ll be thinking about whether to reduce my trust in my own schooling experience. It can’t go much further down after reading John Taylor Gatto, but if the remaining trust that is there is unfounded, I might as well kill it, too.
You can’t taboo a word used in the premise.
Non-Newtonian fluids aren’t liquids, except when they are.
Granted, they are pretty cool though.
It’s a specific problem Epiphany has that she assumes her own internal monologue of what’s true is far more reliable than any evidence or statements to the contrary.

Give three examples.
http://lesswrong.com/lw/efv/elitism_isnt_necessary_for_refining_rationality/
This is the first one that comes to mind. I might post others as I find them, but to be honest I’m too lazy to go through your logs or my IRC logs to find the examples.
That is an example of me not being aware of how others use a word, not an example of me believing I am correct when others disagree with me and then being wrong. In fact, I think that LessWrong and I agree for the most part on that subject. We’re just using the word elitism differently.
Do you have even a single example of me continuing to think I am correct about something where a matter of truth (not wording) is concerned even after compelling evidence to the contrary is presented?