Michael Smith touched on this in his keynote talk at LWCW last weekend. Don’t believe something just because you’ve heard a good argument for it, he said (I think, reconstructing from memory, and possibly extrapolating as well). If you do that, you’ll just change your mind as soon as you encounter a really good argument for the opposite (the process Yvain described). You don’t really know something until you’ve reached the state where the knowledge would grow back if it was deleted from your mind.
LW has a higher bar to believing, but is it high enough? Once an idea breaches the walls, should it sweep all before it, assisted by the meta-idea of taking ideas seriously?
Also relevant, an old comment of mine.
Post something half-baked on LW and you will be torn to shreds. Which is great, of course, and I wouldn’t have it any other way—but it doesn’t really sound like the behaviour of a website full of gullible people.
“Don’t be gullible” itself has gotchas. If I adjust my skepticism-meter too far upward, then manufactured controversy can seem like real controversy to me—a fact that tobacco companies exploited in the 1960s to sway public opinion and that many other groups continue to exploit today. Just because some seemingly smart people seem to be having a debate about a subject does not mean that the truth is beyond your understanding.
I think the better advice would be “There is no easy path to find the truth.” That is, there is no formula or template you can always follow to find the truth. Finding the truth requires continuously challenging your beliefs and questioning authority.
I always liked the more alliterative formulation of that: “no royal road to rationality”.
Once an idea breaches the walls, should it sweep all before it, assisted by the meta-idea of taking ideas seriously?
I don’t think that’s the case. If you look at the LW census, you find that most people on LW don’t consider UFAI the biggest x-risk, even though it’s the x-risk most prominently discussed on LW.