Social Insight: When a Lie Is Not a Lie; When a Truth Is Not a Truth
The point has already been made that if you wish to truly be honest, it is not enough to speak the truth.
I generally don't tell people I'm an atheist (I describe my beliefs without using any common labels). Why? I know that if I say the words "I am an atheist," they will hear the following concepts:
- I positively believe there is no God
- I cannot be persuaded by evidence any more than most believers can be, i.e., I have a kind of faith in my atheism
- I wish to distance myself from members of religious tribes
As I said, the point has already been made: if I know that they will hear those false ideas when I say a certain phrase, how can I claim to be honest in speaking it, knowing that I will cause them to have false beliefs? Hence the saying: if you wish to protect yourself, speak the truth. If you wish to be honest, speak so that truth will be heard.
Many a politician convincingly lies with truths by saying things that they know will be interpreted in a certain positive (and false) way, but which they can always defend as having been intended to convey some other meaning.
---
The New
There is a counterpart to this insight, come to me as I’ve begun to pay more attention to the flow of implicit social communication. If speaking the truth in a way you know will deceive is a lie, then perhaps telling a lie in a way that you know will communicate a true concept is not a lie.
I’ve relaxed my standards of truth-telling as I’ve come to understand this. “You’re the best” and “You can do this” statements have been opened to me, no qualifiers needed. If I know that everyone in a group has to say “I have XYZ qualification,” but I also know that no one actually believes anybody when they say it, I can comfortably recite those words, knowing that I’m not actually leading anybody to believe false things, and thus, am not being dishonest.
Politicians use this method, too, and I think I’m more or less okay with it. You see, we have a certain problem that arises from intellectual inequality. There are certain truths which literally cannot be spoken to some people. If someone has an IQ of 85, you literally cannot tell them the truth about a great number of things (or they cannot receive it). And there are a great many more people who have the raw potential to understand certain difficult truths, but whom you cannot reasonably tell these truths (they’d have to want to learn, put in effort, receive extensive teaching, etc).
What if some of these truths are pertinent to policy? What do you do, say a bunch of phrases that are “true” in a way you will interpret them, but which will only be heard as...
As what? What do people hear when you explain concepts they cannot understand? If I had to guess, very often they interpret this as an attack on their social standing, as an attempt by the speaker to establish themselves as a figure of superior ability, to whom they should defer. You sound uppity, cold, out-of-touch, maybe nerdy or socially inept.
So, then...if you’re socially capable, you don’t say those things. You give up. You can’t speak the truth, you literally cannot make a great many people hear the real reasons why policy Z is a good idea; they have limited the vocabulary of the dialogue by their ability and willingness to engage.
Your remaining moves are to limit yourself to their vocabulary, or say something outside of that vocabulary, all the nuance of which will evaporate en route to their ears, and which will be heard as a monochromatic “I think I’m better than you.”
The details of this dynamic at play go on and on, but for now, I’ll just say that this is the kind of thing Scott Adams is referring to when he says that what Trump has said is “emotionally true” even if it “doesn’t pass the fact checks” (see dialogue with Sam Harris).
In a world of inequality, you pick your poison. Communicate what truths can be received by your audience, or...be a nerd, and stay out of elections.
I am very suspicious of arguments which boil down to “You’re too dumb to understand the truth, so I’m justified in telling you lies”.
Do you think anyone can understand anything? (And are simplifications lies?)
Everyone builds their own maps, and yes, they can be usefully ranked by how well they match the territory.
Truth/lie is not a boolean, but a whole range of possible values between two obvious extremes. For any threshold that you might set, you can find and argue about uncertain edge cases.
In practice, I find that “intent to deceive” works well, though, of course, there are situations when it breaks down.
How do you detect that?
In the usual way: by testing the congruence with reality.
That’s just another word for the same thing? What does one do operationally?
One tests. Operationally.
Think science/engineering/Popper/the usual stuff.
Usually, predictive accuracy is used as a proxy for correspondence to reality, because one cannot check map-territory correspondence by standing outside the map-territory relationship and observing (in)congruence directly.
Right.
There are caveats because, e.g., you can never prove that a map is (entirely) correct, you can only prove that one is wrong, but these are not new and are well known.
It's worse than that, and they're not widely enough known.
Eh. “All models are wrong but some are useful”.
Do you happen to have alternatives?
Not saying your epistemology can do things it can't do.
Motte: We can prove things about reality.
Bailey: We can predict observations.
That doesn’t seem to be meaningful advice given how “X should not claim it can do things it can’t do” is right there near “You should do good things and not do bad things”.
And aren’t your motte & bailey switched around?
You know that is not universally followed?
I could never imagine such a thing! Next thing you’ll be telling me that people do stupid things on a regular basis :-P
In Science and Sanity, "The map is not the territory" isn't a statement that the map can never be correct and is always wrong, but rather that it's not meaningful to call a map right or wrong. Instead of being right or wrong, different maps have more or less correspondence to reality and to other maps.
Why would one care about correspondence to other maps?
That seems about right, although the whole "lies are truth and truths are lies" way of phrasing it makes it seem unnecessarily Orwellian.
Like, the concept of the ‘white lie’ seems to cover this pretty well. You do it when you talk to kids.
Yes, that’s one of the prime examples.
Honestly, the problem with this approach is that it tends to degenerate to “when my side tells lies, they’re still emotionally true; when the other side makes inconvenient statements that are true, I can dismiss them as emotionally false”.
That’s harder to do when you have an explicit understanding.
That is what most people are already doing. It has its advantages and disadvantages, but there are no advantages to being oblivious to how people are thinking.
This is really cool stuff and I think you're hitting on some important things. I think you're basically right in most of it, and yes, this is what Scott Adams is talking about when he says that Trump says things that are "emotionally true" even if they "don't pass the fact checks."
I have a few minor quibbles though.
Yes, they take it as an attack on their social standing. It is very hard to communicate new concepts without positioning yourself as someone who might have something to teach, and that requires your audience to position themselves as someone who might have something to learn. This gets ten times harder when the thing you have to teach isn’t just something about a topic they’ve never thought about before, but something they feel confident about and would feel dumb being wrong on. Notice, for example, how people refer to Scott Adams as the “sex hypnotism guy”, try to twist his words and ask why he’s supporting a “master liar”, and do other things in attempts to sleazily discredit him instead of just letting people laugh at him for being an idiot that believes in voodoo hypnotism or beating him on the object level and giving more persuasive arguments. You’re absolutely right when you suggest that people might take it as a threat to their status.
Here’s where I think you go wrong:
You don’t come off as “socially inept” unless you also come across like you’re unaware of what you’re doing or are doing it on accident. When it is clear that you understand what you’re doing and are doing it intentionally, it comes off as intimidating, and if you do the rest of it right, not in a bad way.
In the cases that count, what you do is use their (limited) vocabulary to build a picture that they cannot comprehend/violates the beliefs they’re attached to, and leave them with it to do what they want. Yes, they will still often take it as an assault on their status and a claim of “I am better than you”, so you want to make really really sure that you aren’t motivated in part by an attempt to make them look bad, but rather to teach them/to give them a chance to teach you (which requires you putting out your model so that they can show you what’s wrong with it). Since, if you’re doing this right, you aren’t claiming to be better than them, any “I’m better than you” feelings will be entirely internally generated and they’ll know this. That’s why it will feel intimidating to them, but not in a bad way. When they are the ones saying “I think he’s better than me” and you are the one saying “no, really, I’m not. I just know this one thing and I’m telling you so that you can know this one thing too” (and meaning it), then that is a very good outcome given the situation. That’s not to say some people won’t try to pin their feelings on you anyway, but they don’t have to stick, and not everyone will.
Basically, the trick is to use their vocabulary to point out contradictions and keep inviting them into that cognitive dissonance while not at all pushing them in or actively implying that you’re better than them. If you do it right, they’ll realize that you’re right, that you know something they don’t, and that you won’t think any less of them for saying “huh, never thought of that” nor that you think it’s your position to be giving out approval or disapproval. I really do admire people who can put their ego aside and learn things, and I aspire to be that way myself. When that comes across correctly, I generally don’t have problems explaining weird/potentially threatening concepts to people.
Yes, even when you do a good job people will react hostilely to you, try to misrepresent you, and try to paint you as someone who thinks they're better than everyone else. And yes, if you don't have the time/energy/ability to deal with this, the right answer is to not do this. Personally, while I understand the justification for saying things like "you're the best!" and while this insight you conveyed has also caused me to move in that direction, I would still be very cautious about how you do that kind of thing. For one, I really value the ability to say "no, that dress doesn't make your ass look fat" and for it to be taken at face value and remove all anxiety, and for compliments to be undiluted in meaning when they come from me. More relevant to the point at hand though, if you aren't careful you might end up telling them that you can't threaten their status, and that would be a harmful lie. To speak metaphorically, if you can come off like Clifford the Big Red Dog, that's a really good thing. It means people can feel safe around you because the idea of you turning on them just doesn't occur to them in the first place. However, if you can't pull off "I am a ridiculously oversized predator with fangs the size of your head, and you're not afraid of me because you trust me," it is not a worthwhile compromise to defang yourself, let your muscles atrophy, and let them keep you in a cage so that they no longer fear you.
If that is the choice at hand, I think it’s better to be fairly quiet and just not really engage with those types, because at least then there’s the option for them to ask why you’re so quiet and you can give them the honest answer that you didn’t think they wanted to hear what you would have to say—and that gives them the chance to decide that they do.
When you do have the time/energy/ability to deal with it, that hostility is a feature, not a bug. It’s peacocking and inviting shit tests, in PUA terms. Heck, look at how much that hostility “got in the way of” Trump’s political campaign so far.
I don’t think it’s the same thing. Trump’s speech leads to people adopting wrong beliefs.
There are many issues where Trump lies about something where the truth would be simple to explain and would be understood by average people. When Trump tells the public that Jon Stewart invited Trump multiple times when Jon Stewart did no such thing, it might be "emotionally true" in the sense that it's what people who watch Trump want to believe.
Trump tells lies that are wrong on a very simple factual level and lead to people believing simple factual falsehoods.
The post has more to do with lies that other politicians tell. Bernie Sanders, for example, said in one of the debates that America is the richest country on earth. There are countries with a higher per capita GDP, but that's beside the point that Sanders was making for the debate.
It's interesting that the best example you could come up with appears to be an obscure bit of trivia. I wasn't able to figure out the exact details by searching, but Jon Stewart certainly said many things that sounded like he was implying he'd love to have Trump on his show, e.g., this. I suspect what may have happened is that Jon Stewart (whose whole schtick is telling lies and half-truths, using a laugh track in lieu of a counter-argument, and pleading "just joking" when called on it) likes to imply he would totally beat Trump in an argument. A much more fun thing to say until Trump implies you're just desperate to have him on the show for the ratings boost.
Which was? I’m guessing it was something along the lines of “America is the richest country on earth therefore we can afford to adopt ”.
I'm sorry, I got the name wrong. I meant to say John Oliver and got the last name wrong. I was referencing information from one of his videos on Trump. I think Last Week Tonight generally follows at least Karl Rove's 100% truth test.
Pieces of trivia make good examples because they are less politically charged. If you read "politics is the mind-killer" and understand it, then you make an effort to choose nonpolitical examples so you can think more rationally.
Rationally analyzing a person like Trump isn’t easy and looking at examples that are in that trivia reference class instead of looking at highly charged political examples is much better if your goal is to understand the kind of person that Trump happens to be.
I think it was something about how America has more people who suffer in poverty than many European countries.
This doesn’t exactly inspire me to trust your memory about other details of the story.
Specifically, he appears to have made a joke that could reasonably be interpreted as an invitation to Trump (specifically inviting an alias Trump once used), then said “that was only a joke” when Trump called him on it.
I admittedly haven't watched it, but isn't that the show that perfected the "laugh track in place of counter-argument without other breaks so viewers don't have time to rationally process what's being said" format?
The goal of my post isn't to convince you. There's a bunch of politics involved and, additionally, it's about a distinction between states for which I believe jimmy (to whom I replied above) has mental models, but where there's a good chance that you don't. The best way to explain those to you would likely be to talk about hypnosis in a nonpolitical context, and I don't want to get into that at this point.
And why does this discussion of psychological states depend on you asserting false statements about contemporary politics?
I don't think that it depends on them. The fact that you think it does indicates that the context of politics puts you into a defensive way of approaching this conversation, and that's a state in which it's unlikely to be easy to communicate a complex subject, and there's no real reason for me to put in that work.
Then why are you asserting them?
You don't think it's the same thing as what Trump is doing, or the same thing that Scott Adams is referring to when he says Trump is doing it?
There are a bunch of things that are getting mixed up here. Clearly Trump tells lies that lead to people believing simple factual falsehoods. That much doesn’t even contradict that main thesis here, and it also applies to anyone that believed Bernie when he said that America is the richest country on earth.
I think what you meant is probably that Trump says things that lead people to be misled on the things that actually matter (as judged by you) and that he's not actually a great example of saying the "truest" things, in this strange but important sense. I actually agree with you there too, though I think I blame Trump less for this than you do because I think he's legitimately bad at figuring out what is true, and so when he might say something about vaccines causing autism, for example, it's more about him being genuinely wrong than knowing the right answer and maliciously lying about it. Hanlon's razor, basically.
Additionally, I think you'd argue that Trump doesn't seem to care enough about the truth and is reckless in that way, and I'd probably agree with you there too. None of this challenges Adams's main point here though, which is that Trump's messages, despite being easily fact-checked as false, contain (other) things which Trump does not actively disbelieve and which are evaluated as both important and true by his followers, even if Christian (or Jimmy, or anyone else) thinks that those things are false as well.
It's important to look at how people respond to proof that his statements don't pass the fact checks. If they feel betrayed by Trump or if there's cognitive dissonance induced, then your criticism is valid and it's simple lying and pandering to wishful thinking. If, on the other hand, you get "lol, don't care," then you're missing the point and aren't actually addressing what they think is important and true. I see both in Trump's followers, but the interesting part is that I see far more of the latter than I have with any other politician. In other words, I think Adams has a point.
I don’t think this is clear at all. At least the statements of his that people object to the loudest aren’t lies.
I don’t think the issue of whether or not Trump was invited by Last Week Tonight is an issue that “actually matters”.
But let's go to an issue that matters: "Do vaccines cause autism?" It's factually wrong, but I also think that a majority of Trump's followers don't believe it. The demographics of vaccine denialism are not equivalent to those of Trump's supporters.
If you take a Trump belief like “exercise is bad for your health” it’s even more clear. That’s not the kind of lie that someone who simply wants to do persuasion tells. It’s also a very strange lie to tell for a person who learned their persuasion skills from Tony Robbins.
I'm not sure I follow all the details of what you're saying, but it seems like your main point is along the lines of "That's not the kind of lie that someone who simply wants to do persuasion tells," and with that I completely agree.
That seems to be a reasonable reading and I think we are in agreement.
Some of what Trump says is both emotionally and empirically wrong. The concept of “emotional truth” isn’t a carte blanche to claim that anything you want is “true in some way;” it’s a different way of communicating, and can be used to deceive as well as inform.
Some things Trump says are empirically wrong, but emotionally true, and those I have some measure of sympathy for.
Honestly, I'm not sure how much Scott Adams even believes what he says. I suspect part of it is that his target audience is people for whom "don't worry, Trump doesn't actually believe these things, he's just saying them to hypnotize the masses" is less threatening than "actually, these things Trump says are true." If you want the latter, I recommend Steve Sailer.
I'm at the moment at my 7th seminar of Chris Mulzer, who was trained by Richard Bandler. Scott Adams suggests that Trump learned hypnosis from Tony Robbins, who was also trained by Richard Bandler.
I understand the kind of lies that Bandler and his students tell, the intellectual groundwork behind them, and what those people want to communicate. To me, Trump doesn't pattern-match with that. He rather pattern-matches with "psychopath," based on a model I built from people who actually have a clinical diagnosis of psychopathy.
I'm trying to pinpoint that difference. Unfortunately, that isn't easy, especially with an audience that doesn't have a good mental model of how a hypnotist like Richard Bandler lies.
I totally agree that he doesn’t look like “trained hypnotist that thinks things through and has a nuanced plan for what he’s trying to communicate”. Looking at Trump and concluding “don’t worry guys, get him in a private room and he’ll drop the act and explain exactly how this all makes sense” would be a mistake.
At the same time, what he’s doing is effective, and largely for similar reasons. The important difference is that you can’t really trust him to be doing anything other than emotional gradient following, and he’s a reason to get serious and step up your game to make sure that important things aren’t underrepresented, rather than to sit back and trust that things are in the hands of an expert.
I’m actually just starting to look into hypnosis a bit. I found a blog by an LW person at https://cognitiveengineer.blogspot.com/
You have any recommendations? I’m getting enough to tell there’s something interesting being described, but not enough to get it quite down pat.
Not just "an LW person": the author is jimmy, to whom I replied above.
For myself, reading literature and listening to audiobooks didn't give me any skills in the subject. I learned the largest chunk of my skills from Chris Mulzer. I also went to other people and read afterwards about the subject, but I'm not an autodidact in it. jimmy, on the other hand, is an autodidact. In http://lesswrong.com/lw/pbt/social_insight_when_a_lie_is_not_a_lie_when_a/dw9g?context=3 , both I and jimmy consider the strategy of getting Reality Is Plastic: The Art of Impromptu Hypnosis and doing the exercises in it with a willing subject to be a good starting point for developing actual skill.
At the moment there's an idea in my head that it would be possible to create a better course for learning hypnosis from the beginning than what's out there. If you find someone who wants to practice hypnosis with you in person, I would be willing to give more specific guidance about what to do. Maybe jimmy also wants to pitch in and we can create a kind of course together.
I don't think reading blog posts or forum posts is enough to develop actual skill, but if your goal is just information, there's the forum http://www.uncommonforum.com/viewforum.php?f=16 where jimmy, myself, and a bunch of other people had a few long discussions about hypnosis in the past.
After this post of yours I think you might be really interesting to talk to on the subject. Let me know if you want to chat sometime (I’m that LW person mentioned).
We’re mostly on the same page, really.
Much of what I’ve said applies to politics with large electorates, where the default case is that you can’t effectively teach new concepts and people don’t want to learn them, anyway.
In small groups, by all means, there are times when it’s a very powerful move to try and teach people. There are even times, in all arenas, where saying “I’m better than you” is a useful move, you just don’t want to be limited to that one move.
I also strongly value being honest and known to be honest. I find "you're the best" statements to be acceptable insofar as the other person KNOWS what I mean and is not deceived in any way. The key insight here is that the explicit meaning of the words is not the real meaning of a statement in many contexts. In other words, don't ask what the words of a sentence mean; ask what it means for someone to say those words in this situation. "You're the best" doesn't actually mean "I would bet money on you against Muhammad Ali," and nobody thinks it does, which is why it doesn't communicate any false information. It doesn't communicate ANY information about how the world works, nor does it try to; it's more like the verbal equivalent of a shot of caffeine.
Ah, I didn’t realize you were focused on large scale politics and figured you were using it as merely one example.
I’m not so sure I agree on that completely. Certainly it’s more in that direction, and you aren’t going to be able to explain complex models to large electorates, and I don’t have time to coherently express my reasoning here, but it certainly appears to me that teaching is possible on the margin and that this strategy still works on larger scales with more of those inherent limitations.
I agree that “you’re the best” isn’t dishonest so long as the person knows what you mean. My point wasn’t about honesty so much as whether you want to dilute your message. I should be clear that it doesn’t always apply here and I don’t claim to have the full answer about exactly how to do it, but I have found value in avoiding certain types of these “honest literal-untruths” or whatever you’d like to call them. In cases where one might want to say “you got this!” as normal encouragement, abstaining from normal encouragement makes it easier to convey real confidence in the person when you know for a fact that they can do it. Both have value, but I do feel like the latter is often undervalued while the former is overvalued.
I’ve classically been a literalist super-honest guy, and now intend to be super-honest about what I make the other person hear.
I think them knowing I'm being honest about what they hear is sufficient to grant me all the benefits I've enjoyed in the past, while avoiding some of the disadvantages.
This is fair.
But this is a completely different case. Lies told to stupid people are still lies, the stupid people don’t understand the truth behind them, and you have communicated nothing. You could argue that those lies are somehow justified, but there is no parallel between lying to stupid people and things like “You’re the best”.
Can you say it again while tabooing “lie?”
My guess is that you’re saying that if X says something that they know will be interpreted as abc, then it is a lie even if abc is true, if X personally interprets the statement as xyz, or perhaps if the “true” meaning of the thing is xyz instead of abc
Case 1: Alice tells Bob that “X is true”, Bob then interprets this as “Y is true”
Case 2: Alice tells Bob that “X is true”, because Bob would be too stupid to understand it if she said “Y is true”. Now Bob believes that “X is true”.
These two cases are very different. You spend the first half of your post in case 1, and then suddenly jump to case 2 for the other half.
Supposing that Y is the correct answer to a question, but you are incapable of communicating it to your audience, some kind of less true or differently true substitute must be used, in terms of the language that they speak and understand.
Sure, and if X really is the best approximation of Y that Bob can understand, then again Alice is not dishonest. Although I’m not sure what “approximation” means exactly.
But there is also a case where Alice tells Bob that “X is true”, not because X is somehow close to Y, but because, supposedly, X and Y both imply some Z. This is again a very different case. I think this is just pure and simple lying. That is, the vast majority of lies ever told fall into this category (for example, Z could be “you shouldn’t jail me”, X could be “I didn’t kill anyone” and Y could be “sure, I killed someone, but I promise I won’t do it again”).
In general, the problem is that you didn’t give specific examples, so I don’t really know what case you’re referring to.
Suppose X is the case. When you say "X," your counterpart will believe Y, which is wrong. So, even though "X" is the truth, you should not say it.
Your new idea as I understand it: suppose saying "Z" will lead your counterpart to believe X. So, even though saying "Z" is, technically, lying, you should say "Z" because the listener will come to hold a true belief.
(I’m sorry if I misunderstood you or you think I’m being uncharitable. But even if I misunderstood I think others might misunderstand in a similar way, so I feel justified in responding to the above concept)
First, I dislike that approach because it makes things harder for people who could understand, if only others would stop lying to them, or who would prefer to be told the truth along the lines of "study macroeconomics for two years and you will understand."
Second, that seems to me to be a form of the-end-justifies-the-means that, even though I think of myself as a consequentialist, I’m not 100% comfortable with. I’m open to the idea that sometimes it’s okay, and even proper, to say something that’s technically untrue, if it results in your audience coming to have a truer world-view. But if this “sometimes” isn’t explained or restricted in any way, that’s just throwing out the idea that you shouldn’t lie.
Some ideas on that:
Make sure you don’t harm your audience because you underestimate them. If you simplify or modify what you say to the point that it can’t be considered true any more because you think your audience is limited in their capacity to understand the correct argument, make sure you don’t make it harder to understand the truth for those that can. That includes the people you underestimated, people that you didn’t intend to address but heard you all the same and people that really won’t understand now, but will later. (Children grow up, people that don’t care enough to follow complex arguments might come to care).
It's not enough that your audience comes to believe something true. It needs to be justified true belief. Or alternatively, your audience should not only believe X but know it. For a discussion of what is meant by "know," see most of the field of epistemology, I guess. Like, if you tell people that voting for candidate X will give them cancer and they believe you, they might come to the correct belief that voting for candidate X is bad for them. But saying that is still unethical.
I guess if you could give people justified true belief, it wouldn't be lying at all, and the whole idea is that you need to lie because some people are incapable of justified true belief on matter X. But then it should at least be "justified in some sense." Particularly, your argument shouldn't work just as well if "X" were false.
I think you’ve hit upon one of the side effects of this approach:
All the smart people will interpret your words differently and note them to be straightforwardly false. You can always adjust your speaking to the abilities of the intelligent and interested, and they’ll applaud you for it, but you do so at the cost of reaching everybody else.
I understand your post to be about difficult truths related to politics, but you don’t actually give examples (except “what Trump has said is ‘emotionally true’”) and the same idea applies to simplifications of complex material in science etc. I just happened upon an example from a site teaching drawing in perspective (source):
The author was lied to about the possible number of vanishing points in a drawing. But instead of recognizing the falsehood, he was simply confused.
[Quick comprehension check]: I think that you are saying that it is important to acknowledge when our notions of truths and lies break down because saying a thing that is apparently “true” can have connotations we didn’t intend, thus making it “false”. And you’re flipping it around to say that the opposite is also valid—that you can say a thing which is apparently “false”, yet the way that it’s interpreted could make it more “true”.
I think you are saying that there are other factors in communication, namely the context you convey with your words, i.e., meaning that is imparted which is distinct from the actual referents of the words in your utterances. And that this meaning is also important to keep in mind, because we can’t "just" communicate with the words themselves, apart from connotation and context. It’s all part of the package.
I think that you then took this to show that there are often times when knowledge about true things isn’t easily transmissible due to a lack of prerequisite knowledge. And that this causes problems when that information might be important.
[Actual response, if the above was accurate]: I think the part of this essay about how getting points across is often difficult is important. There are certain tradeoffs to be wary of, like when someone asks you a question and you give an abridged/simplified answer to optimize for communication rather than accuracy. (EX: Giving someone a stripped-down description of your medical condition when they ask you why you’re taking a pill.)
Thus, one of the questions we might want to take out of this is “How can we convey information many inferential steps away from the other party, especially when it’s beneficial to them?” which seems like it could be resolved several ways:
1) Take the time to build up their prerequisites.
2) Convince them you’re competent / trustworthy such that they can defer to your judgment.
3) Tell them false things such that they do the thing the information would have convinced them to do.
(I don’t really like these options. Feel free to take this as an open invitation to spend 3 minutes thinking of other things.)
Anyway, it’s less clear to me that you can tell people false stuff to make them believe true stuff. It feels more like you can tell people false stuff to achieve 2 or 3, but not 1.
Suppose that a vast group of statements that sound (they really, REALLY sound) like propositions about economic cause and effect are ALL interpreted by a great many people always and only as either “Yay blues” or “Boo blues.”
In that case, your ability to tell the truth is limited by their way of filtering your statements, and your ability to lie is equally hampered. All you can do is decide whether to say Yay or Boo, or to say nothing at all (which will also often be interpreted one way or the other if you’re involved in politics). It is an illusion that you’re saying something about the minimum wage, for example. All you’re really saying is "Yay blues!" as far as a great many people are concerned.
And if you’re aware of this and count on it, you can choose to use statements that way on purpose, such that that IS all you’re really saying.
This is most of politics.
Thanks for the additional clarification.