It seems to me ultra-BS is perhaps continuous with hyping up one particular way that reality might in fact be, in a way that is disproportionate to your actual probability, and that is also continuous with emphasizing a way that reality might in fact be which is actually proportionate with your subjective probability.
About public belief: I think people do tend to pick up, at least vaguely, on what the words they encounter are optimized for. If you have the facts on your side but optimize for the recipient’s belief, you don’t have much of an advantage over someone optimizing for the opposite belief when the facts are too complicated. Well, actually, I’m not so confident about that, but I am confident about this: if you optimize for tribal signifiers—for appearing, to others on “your” side, to be supporting the correct side—then you severely torpedo your credibility for convincing the other side. And I do think that tends to occur whenever something gets controversial.
It seems to me ultra-BS is perhaps continuous with hyping up one particular way that reality might in fact be, in a way that is disproportionate to your actual probability, and that is also continuous with emphasizing a way that reality might in fact be which is actually proportionate with your subjective probability.
Yep! I think this is a pretty good summary. You want to understand reality just well enough that you can say things that sound plausible (and are in line with your reasoning), while omitting just enough factual information that your case isn’t undermined.
I once read a post (I forget where) arguing that an amateur historian can convince an uneducated onlooker of almost any historical argument, simply because history is so full of empirical examples. Whatever argument you’re making, you can almost always find at least one example supporting your claim. Whether the rest of history contradicts the point is irrelevant, since the uneducated onlooker doesn’t know history. The same principle applies here: finding plausible points to support an implausible argument is almost trivially easy.
About public belief: I think people do tend to pick up, at least vaguely, on what the words they encounter are optimized for. If you have the facts on your side but optimize for the recipient’s belief, you don’t have much of an advantage over someone optimizing for the opposite belief when the facts are too complicated. Well, actually, I’m not so confident about that, but I am confident about this: if you optimize for tribal signifiers—for appearing, to others on “your” side, to be supporting the correct side—then you severely torpedo your credibility for convincing the other side. And I do think that tends to occur whenever something gets controversial.
Yeah, I definitely agree. At some point you reach a hard limit on how much an uneducated onlooker is able to understand. They may have a vague idea, but your guess is as good as mine as to what that looks like. If the onlooker can’t tell which of two experts to believe, they’ll have even more trouble judging two people spouting BS. (If the judges were perfect Bayesian reasoners, you should expect them to do the logical equivalent of ignoring everything my opponent and I say, since we’re likely both wrong in every way that matters.) Thus, they mostly default to tribal signals, and when that isn’t possible, to whichever side appears more confident and convincing.
It’s not really possible to argue against tribal signals, because at that point logic flies out the window and what matters is whether you’re on someone’s ‘side’, whatever that means. That’s why you don’t usually see tribal appeals in debate (unless you’re me, and prep two separate cases for the two main tribes).
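As a side note, the “perfect Bayesian judge ignores both debaters” point above can be made concrete with a toy Bayes-rule sketch (my own illustration, not anything from the thread): if a skilled BS-er produces equally plausible-sounding arguments whether or not their claim is true, the arguments carry zero evidence and the judge’s posterior stays at their prior.

```python
def posterior(prior, p_args_if_true, p_args_if_false):
    """Bayes' rule: P(claim true | arguments heard).

    p_args_if_true / p_args_if_false are the probabilities of hearing
    these arguments given the claim is true / false.
    """
    num = prior * p_args_if_true
    return num / (num + (1 - prior) * p_args_if_false)

# A skilled BS-er sounds equally plausible either way, so the
# likelihoods are equal and the arguments are uninformative:
print(posterior(0.5, 0.9, 0.9))  # 0.5 -- the judge learns nothing
print(posterior(0.3, 0.9, 0.9))  # 0.3 -- still just the prior
```

When the likelihood ratio is 1, the update washes out entirely, which is the formal version of “ignore everything both debaters say.”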