The rationalist community [...] rationalist standards [...] in this community
Uh, remind me why I’m supposed to care what some Bay Area robot cult thinks? (Although I heard there was an offshoot in Manchester that might be performing better!) The scare quotes around “rationalist” “community” in the second paragraph are there for a reason.
The OP is a very narrowly focused post, trying to establish a single point (Being Wrong Doesn’t Mean You’re Stupid and Bad, Probably) by appealing to probability theory as normative reasoning (and some plausible assumptions). If you’re worried about someone thinking you’re stupid and bad because you were wrong, you should just show them this post, and if they care about probability theory as normative reasoning, then they’ll realize that they were wrong and stop mistakenly thinking that you’re stupid and bad. On the other hand, if the person you’re trying to impress doesn’t care about probability theory as normative reasoning, then they’re stupid and bad, and you shouldn’t care about impressing them.
outside cultural baggage
Was there ever an “inside”, really? I thought there was. I think I was wrong.
people will only raise their estimate of incompetence by a Bayesian 0.42%.
But that’s the correct update! People who update more or less than the Bayesian 0.42% are wrong! (Although that doesn’t mean they’re stupid or bad, obviously.)
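(For concreteness, here is what a Bayesian update of that kind looks like mechanically. The prior and likelihoods below are made-up numbers for illustration only; they are not the assumptions behind the OP’s 0.42% figure.)

```python
# Hypothetical numbers for illustration, not the ones behind the OP's 0.42%.
prior_incompetent = 0.05          # P(incompetent), before observing the mistake
p_wrong_given_incompetent = 0.55  # P(makes this mistake | incompetent)
p_wrong_given_competent = 0.50    # P(makes this mistake | competent)

# Bayes' rule: P(incompetent | wrong) = P(inc) * P(wrong | inc) / P(wrong)
evidence = (prior_incompetent * p_wrong_given_incompetent
            + (1 - prior_incompetent) * p_wrong_given_competent)
posterior_incompetent = prior_incompetent * p_wrong_given_incompetent / evidence

# With these made-up inputs the estimate rises by under half a percentage point.
update = posterior_incompetent - prior_incompetent
print(f"posterior: {posterior_incompetent:.4f}, "
      f"update: {update * 100:.2f} percentage points")
```

The point is that when the likelihood of the mistake is nearly the same for competent and incompetent people, the correct posterior barely moves off the prior.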
they are referring to things with standard definitions that are precise enough to do math with.
This is an isolated demand for rigor and I’m not going to fall for it. I shouldn’t need to have a reduction of what brain computations correspond to people’s concept of “stupid and bad” in order to write a post like this.
using a small sample of data is worse than defaulting to base rates
What does this mean? If you have a small sample of data and you update on it the correct amount, you don’t do worse, in expectation, than you would have without the data.
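A quick simulation makes the point (the population parameters here are hypothetical, chosen only for illustration): a predictor that updates on one noisy observation by Bayes’ rule scores no worse on average, by log loss, than one that sticks to the base rate.

```python
import math
import random

random.seed(0)

# Hypothetical population parameters for illustration.
BASE_RATE = 0.2    # P(incompetent) in the population
P_ERR_INC = 0.7    # P(visible mistake | incompetent)
P_ERR_COMP = 0.3   # P(visible mistake | competent)

def posterior(made_mistake: bool) -> float:
    """P(incompetent | observed signal), by Bayes' rule from the base rate."""
    p_inc = P_ERR_INC if made_mistake else 1 - P_ERR_INC
    p_comp = P_ERR_COMP if made_mistake else 1 - P_ERR_COMP
    return BASE_RATE * p_inc / (BASE_RATE * p_inc + (1 - BASE_RATE) * p_comp)

def log_loss(p: float, truth: bool) -> float:
    """Penalty for predicting probability p when the truth is `truth`."""
    return -math.log(p if truth else 1 - p)

n = 100_000
loss_base = loss_bayes = 0.0
for _ in range(n):
    incompetent = random.random() < BASE_RATE
    mistake = random.random() < (P_ERR_INC if incompetent else P_ERR_COMP)
    loss_base += log_loss(BASE_RATE, incompetent)            # ignore the sample
    loss_bayes += log_loss(posterior(mistake), incompetent)  # update on it

print(f"avg log loss, base rate only: {loss_base / n:.4f}")
print(f"avg log loss, after updating: {loss_bayes / n:.4f}")
```

The updater’s average loss comes out lower: a small sample, handled correctly, is worth a little, and never less than nothing in expectation.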
you’re at a tech conference and looking for interesting people to talk to, do you bother approaching anyone wearing a suit on the chance that a few hackers like dressing up?
Analyzing the signaling game governing how people choose to dress at tech conferences does look like a fun game-theory exercise; thanks for the suggestion! I don’t have time for that now, though.