You are correct that someone is unlikely to form a new belief about how the world works just because they hear someone say so. On the other hand, they’re very likely to form new beliefs about you, and, in politics, about your fellows and your constituents, on the basis of what they hear you say.
Some things can be very clear, sure. First, I’d invite you to consider what you would do, as they say, in the least convenient possible world. If you did have to work around impossible communication barriers that forced you to either shut up or deceive somebody, what would you do?
As for whether this supposed situation is realistic, that’s easy enough to show. Just think how often politicians accuse each other of “cutting” funding to some vital thing because they’ve proposed not increasing spending as much as some previous plan.
That’s a very easily understood ambiguity for a nerd. The next one is harder.
Suppose you say you support gun control. If you’re not naive, you now know that people will make many assumptions about your other stances: on taxes, abortion, and so on. They’ll use one answer to infer your ingroup, and use your ingroup to deduce your other answers. Your answer will, you now know, make them believe false things about your other stances. Maybe the best thing to do is to weigh how many true and false beliefs will be caused by any answer you give and try to rank the answers somehow...
The next is worse.
Realistically, no one person can be expected to be an expert on even half the issues that, say, the president of the USA will have strong influence upon. Honestly, candidates should probably say something like “I have no idea how to solve these problems. I have a few ideas about X and Y, but, honestly, superior experts can probably think of better ideas, and I should just listen to them. My only real qualifications are my judgment in choosing which experts to guide me, the list of goals I will try to achieve with whatever power I attain, and my ability to play the political game in order to make those goals happen.”
Refreshingly honest, self-aware, insightful, and suicidal, wouldn’t you say? Nerds, and especially rationalists, would love to hear an answer like that. But all the insight would be lost on politically minded normal people, and once that’s been washed out, all that would be left would be various admissions of ignorance, which would be interpreted as admissions of weakness and inability. There’s a whole host of false beliefs that such an answer would cause, but the central one would be that this candidate would suck as president.
But the gulf between this truth and even a moderately acceptable answer is so huge that you pretty much have to lie if you want to win. You could always give up on winning and seek honesty instead, but, as discussed in the previous paragraph, that would just mean deceiving potentially fewer people, not none.