Signalling doesn’t have to be that straightforward. A clever individual (of which we have a few) may choose to be significantly more circumspect, and imply that a piece of knowledge is obvious by omitting it from a statement that presupposes it, or by alluding to it off-hand. We do this all the time, but I’d say it probably has more to do with mind projection than anything else. It often simply won’t occur to us to tailor a statement to its audience.
However, I don’t know if this is a ploy we can entirely defeat just by making obviousness a bad word. If anything, that just requires anyone attempting such a ploy to be more circumspect about it…
That’s just not quite right. There are no external errors in measuring probability, since both the unit and the measure come from internal processes. Errors in perceiving reality and errors in evaluating the strength of an argument will invariably come from oneself, or else from ambiguity in the argument itself (which would make it a worse argument anyway).
Intelligent people do make bad ideas seem more believable and stupid people do make good ideas seem less believable, but you can still expect the intelligent people to be right more often. Otherwise, what you’re describing as intelligence… ain’t. That doesn’t mean you should believe something just because a smart person said it—just that you shouldn’t believe it less.
This goes back to the whole reverse-stupidity thing. Trying to make yourself unbiased by compensating in the opposite direction doesn’t remove the bias; you’re still adjusting from the baseline it has established.
On a similar note, I may simply have given you an uncharitable reading and assumed you meant something you didn’t. Such a misunderstanding won’t change the truth of what I’m saying about the reading I took from your words, and it won’t change the truth of what you were actually trying to say. Even if there’s a bias on my part, it skews perception rather than reality.