A really intelligent response, so I upvoted you, even though, as I said, it surprised me by telling me that, just as one example, tarot cards are Bad when I had not even considered the possibility, so I doubt this came from inside me.
Well you are obviously not able to predict the output of your own brain, that’s the whole point of the brain. If morality is in the brain and still too complex to understand, you would expect to encounter moral feelings that you had not anticipated.
Er, I thought it was overall pretty lame, e.g. the whole question-begging w.r.t. the ‘prior probability of omnibenevolent omnipowerful thingy’ thingy (nothing annoys me more than abuses of probability theory these days, especially abuses of algorithmic probability theory). Perhaps you are conceding too much in order to appear reasonable. Jesus wasn’t very polite.
By the way, in case you’re not overly familiar with the heuristics and biases literature, let me give you a hint: it sucks. At least the results that most folk around here cite have basically nothing to do with rationality. There’s some quite good stuff with tons of citations, e.g. Gigerenzer’s, but Eliezer barely mentioned it on Less Wrong (as fastandfrugal.com, which he endorsed), and so, as expected, Less Wrong doesn’t know about it. (Same with interpretations of quantum mechanics, as Mitchell Porter often points out. I really hope that Eliezer is pulling some elaborate prank on humanity. Maybe he’s doing it unwittingly.)
Anyway the upshot is that when people tell you about ‘confirmation bias’ as if it existed in the sense they think it does then they probably don’t know what the hell they’re talking about and you should ignore them. At the very least don’t believe them until you’ve investigated the literature yourself. I did so and was shocked at how downright anti-informative the field is, and less shocked but still shocked at how incredibly useless statistics is (both Bayesianism as a theoretical normative measure and frequentism as a practical toolset for knowledge acquisition). The opposite happened with the parapsychology literature, i.e. low prior, high posterior. Let’s just say that it clearly did not confirm my preconceptions; lolol.
Lastly, towards the esoteric end: All roads lead to Rome, if you’ll pardon a Catholicism. If they don’t, it’s not because the world is mad qua mad; it is because it is, alas, sinful. An easy way to get to hell is to fall into a fully-general-counterargument black hole, or a literal black hole maybe. Those things freak me out.
(P.S. My totally obnoxious arrogance is mostly just a passive aggressive way of trolling LW. I’m not actually a total douchebag IRL. /recursive-compulsive-self-justification)
Explain?
Explain?
Elaborate?
I love how Less Wrong basically thinks that all evidence that doesn’t support its favored conclusion is bad because it just leads to confirmation bias. “The evidence is on your side, granted, but I have a fully general counterargument called ‘confirmation bias’ that explains why it’s not actually evidence!” Yeah, confirmation bias, one of the many claimed cognitive biases that arguably doesn’t actually exist. (Eliezer knew about the controversy, which is why his post is titled “Positive Bias”, which arguably also doesn’t exist, especially not in a cognitively relevant way.) Then they talk about Occam’s razor while completely failing to understand what algorithmic probability is actually saying. Hint: It definitely does not say that naturalistic mechanistic universes are a priori more probable! It’s like they’re trolling and I’m not supposed to feed them but they look sort of like a very hungry, incredibly stupid puppy.
Explain?
http://library.mpib-berlin.mpg.de/ft/gg/gg_how_1991.pdf is exemplary of the stuff I’m thinking of. Note that that paper has about 560 citations. If you want to learn more then dig into the literature. I really like Gigerenzer’s papers as they’re well-cited and well-reasoned, and he’s a statistician. He even has a few papers about how to improve rationality, e.g. http://library.mpib-berlin.mpg.de/ft/gg/GG_How_1995.pdf has over 1,000 citations.
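For readers who don’t want to dig through the PDFs: a central claim of Gigerenzer’s 1995 paper is that people handle Bayes’ rule far better when the same problem is stated in natural frequencies rather than probabilities. A minimal sketch of why the two formats agree (the diagnostic-test numbers below are the standard textbook illustration, not figures from this thread):

```python
# Classic diagnostic-test example (illustrative numbers, not from the thread):
# base rate 1%, sensitivity 80%, false-positive rate 9.6%.

# Probability format: apply Bayes' rule directly.
p_disease = 0.01
p_pos_given_disease = 0.8
p_pos_given_healthy = 0.096

p_pos = p_disease * p_pos_given_disease + (1 - p_disease) * p_pos_given_healthy
posterior = p_disease * p_pos_given_disease / p_pos

# Natural-frequency format: imagine 1000 people.
# 10 have the disease; 8 of them test positive.
# 990 are healthy; about 95 of them test positive anyway.
sick_pos = 8
healthy_pos = 95
posterior_freq = sick_pos / (sick_pos + healthy_pos)

print(round(posterior, 3))       # ≈ 0.078
print(round(posterior_freq, 3))  # ≈ 0.078
```

Same arithmetic either way; the frequency version just makes the answer (a positive test still leaves you at under 8%) easy to read off, which is Gigerenzer’s point about how the format, not the math, drives the apparent “bias.”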
Searching and skimming, the first link does not seem to actually say that confirmation bias does not exist. It says that it does not appear to be the cause of “overconfidence bias”—it seems to take no position on whether it exists otherwise.
Okay, yeah, I was taking a guess. There are other papers that talk about confirmation/positive bias specifically, a lot of it in the vein of this kinda stuff. Maybe Kaj’s post ‘Heuristics and Biases Biases?’ here on LW references some relevant papers too. Sorry, I have limited cognitive resources at the moment; I’m mostly trying to point in the general direction of the relevant literature, because there’s quite a lot of it.
Hard to know whether to agree or disagree without knowing “more probable than what?”
Sorry. More probable than supernaturalistic universes of the sort that the majority of humans finds more likely (where e.g. psi phenomena exist).
So I think you’re quite right that “supernatural” and “natural” are sets that contain possible universes of very different complexity, and that those two adjectives are not obviously relevant to the complexity of the universes they describe. I support tabooing those terms. But if you compare two universes, one of which is described most simply by the wave function and an initial state, and another which is described by the wave function, an initial state, and an additional section of code describing the psychic powers of certain agents, then the latter universe is a priori less likely (bracketing for the moment the simulation issue). Obviously, if psi phenomena can be incorporated into the physical model without adding additional lines of code, that’s another matter entirely.
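To make the description-length argument concrete: under a Solomonoff-style prior, a universe-program of length L bits gets weight proportional to 2^(−L), so bolting an extra module onto an otherwise identical program costs a factor of 2^(−extra bits), independent of everything the two programs share. A toy sketch (the bit counts are invented for illustration; exact rationals avoid float underflow at these exponents):

```python
from fractions import Fraction

# Toy Solomonoff-style prior: weight(program) = 2 ** (-length_in_bits),
# computed exactly with Fraction so huge exponents don't underflow to 0.0.
def prior_weight(length_bits: int) -> Fraction:
    return Fraction(1, 2 ** length_bits)

physics_only = 10_000                  # "wave function + initial state", say
physics_plus_psi = physics_only + 500  # same program plus a 500-bit psi module

ratio = prior_weight(physics_plus_psi) / prior_weight(physics_only)
assert ratio == Fraction(1, 2 ** 500)
# The penalty depends only on the *extra* 500 bits, not on the 10,000 bits
# the two descriptions share -- which is also why a psi effect that falls
# out of the existing physics "for free" incurs no penalty at all.
```

This also shows why the labels “natural” and “supernatural” do no work in the argument: the prior only sees description lengths, never the adjectives we attach to the contents.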
Returning to the simulation issue: I take your position to be that there are conceivable “meta-physics” (meant literally; not necessarily referring to the branch of philosophy) which can make local complexities more common? Is that a fair restatement? I have a suspicion that this is not possible without paying the complexity back at the other end, though I’m not sure.
Boltzmann brain, maybe?
Explain?
What was said that’s a synonym for or otherwise invoked the confirmation bias?
It’s mentioned a few times in this thread re AspiringKnitter’s evidence for Christianity. I’m too lazy to link to them, especially as it’d be so easy to get the answer to your question with Ctrl+F “confirmation”—so easy, in fact, that I’m not sure I interpreted your question correctly?