Hufflepuff Cynicism
Summary: In response to catching a glimpse of the dark world, and especially of the extent of human hypocrisy with respect to the dark world, one might take a dim view of one’s fellow humans. I describe an alternative, Hufflepuff cynicism, in which you instead revise downward your sense of what the standards were all along. I give arguments for and against this perspective.
This came out of conversations with Steve Rayhawk, Harmanas Chopra, Ziz, and Kurt Brown.
In See the Dark World, Nate describes a difficulty which people have in facing the worst of a situation. In one of his examples, in which Alice and Bob grapple with the idea that the market price of saving a life is $3000, Bob’s response is to deny what this means about people—to avoid concluding that humans aren’t generally willing to pay more than 3k to save a life, Bob decides that money is not a relevant measure of how much people would sacrifice to save a life. This is a non sequitur, but it allows Bob to sleep at night.
Alice, on the other hand, decides that a life really is only worth about 3k. This, to me, is something like Hansonian cynicism—when revealed preferences differ from stated preferences, assume that people are lying about what they value.
Nate calls both Alice’s and Bob’s responses “tolerification”, and recommends undoing tolerification by entertaining the what-if question describing the dark world (following the leave-a-line-of-retreat method).
Ziz gives a somewhat similar analysis in Social Reality: when you recognize the distance between the social reality most people are willing to endorse and the real reality, you have three options:
Make a buckets error where your map of reality overwrites your map of social reality, and you have the “infuriating perspective”, typified by less-cunning activists and people new to their forbidden truths. “No, it is not ‘a personal choice’, which means people can’t hide from the truth. I can call people out and win arguments”.
Make a buckets error where your map of social reality overwrites your map of reality, and you have the “dehumanizing perspective” of someone who is a vegetarian for ethical reasons but truly feels it when they say “it’s a personal choice”, the atheist who respects religion-the-proposition, to some extent the trans person who feels the gender presentation they want would be truly out of line…
But it was all right, everything was all right, the struggle was finished. He had won the victory over himself. He loved Big Brother.
Learn to deeply track the two as separate, and you have the “isolating perspective”. It is isolating to let it entirely into your soul, the knowledge that “people are good and rational” is pretense.
This is importantly different from Nate’s analysis in several ways, but I’ll leave that as an exercise for the reader.
What’s interesting to me is that in certain situations like this, I seem to take what Ziz calls the “isolating perspective” in a way which is not isolating at all—I can cleanly separate the social reality from what’s real, but it doesn’t feel especially isolating. Humans are like cats. Sure, sometimes they claw up the upholstery, but it’s no use being upset—you have the choice of declawing them, or trying to train them by spraying them with water. Humans are not automatically strategic. Work with what’s there.
I call this Hufflepuff cynicism because I think most people assume a cynic is like a Slytherin (or, in Ziz’s case, a Sith) -- but Hufflepuffs, above all, have to deal with people as they are. It seems to me that Hufflepuff cynicism is where you end up if you start out as a starry-eyed young Hufflepuff who just wants to do good, and then you become a teacher or a doctor and have to help a whole bunch of people who are more ignorant on a subject than you are. Helping people doesn’t mean correcting every error. It means nudging people in the right direction as best you can. This perspective allows you to avoid creating Hufflepuff Traps.
ETA: a more precise definition of Hufflepuff cynicism:
Track social reality and real reality separately, but don’t get upset when this allows you to recognize ways that people don’t live up to standards which they would claim to endorse.
Adapt your sense of the standards to roughly conserve the amount of “violation of basic standards” you see, rather than letting it increase as you see better.
Speak to others in the way you expect them to understand: work around truths which you suspect they’re unable or unwilling to accept. Feel no negative judgement of them in doing so.
Have a strong enough assumption of good faith around other people’s beliefs and actions that you want to Chesterton’s-fence them before correcting them.
This certainly isn’t a perfect response. One problem is that sometimes voicing my real thoughts leads to horrible awkwardness or social punishment. Tracking social reality and real reality separately isn’t very helpful in circumstances where I really want to be honest for one reason or another. However, I recently talked with some people about my Hufflepuff cynicism, and they had some arguments against it which surprised me and which I found fairly compelling.
You Can’t Choose between Lying to Others and Lying to Yourself
In Honesty: Beyond Internal Truth, Eliezer discusses two paths to rationality: scrupulous truth-telling, versus allowing yourself to lie to others so that you can keep your internal beliefs uncontaminated by social incentives. Eliezer preferred the first route, but granted the possibility of the second. However, it seems like what really happens when you lie to others is that you start to believe the lies. Paul Christiano discusses how humans are bad at lying in If We Can’t Lie to Others, We Will Lie to Ourselves. Scott Alexander recently discussed how, in hindsight, many of his attempts to separately track what’s true and what he can say have failed:
Sometimes I can almost feel this happening. First I believe something is true, and say so. Then I realize it’s considered low-status and cringeworthy. Then I make a principled decision to avoid saying it – or say it only in a very careful way – in order to protect my reputation and ability to participate in society. Then when other people say it, I start looking down on them for being bad at public relations. Then I start looking down on them just for being low-status or cringeworthy. Finally the idea of “low-status” and “bad and wrong” have merged so fully in my mind that the idea seems terrible and ridiculous to me, and I only remember it’s true if I force myself to explicitly consider the question. And even then, it’s in a condescending way, where I feel like the people who say it’s true deserve low status for not being smart enough to remember not to say it. This is endemic, and I try to quash it when I notice it, but I don’t know how many times it’s slipped my notice all the way to the point where I can no longer remember the truth of the original statement.
It is not at all clear that you can really make a decision to separately track what you say to others and what is true. Ziz’s “isolating perspective” and my “Hufflepuff cynicism” may sound like the obvious path out of the catch-22, but perhaps this is not as sustainable as it may seem. Eliezer’s path is much closer to Ziz’s “infuriating perspective”.
Becoming Complicit
A second argument against Hufflepuff cynicism is that you make yourself complicit in a bad social equilibrium. At its root, Hufflepuff cynicism is the claim that it doesn’t make sense to hold people up to standards any higher than they’re already mostly meeting. From one perspective, this is just common sense—what are standards for, if they’re being broken constantly? From another perspective, however, this is just refusing to help. And, in cases of hypocrisy, it may mean refusing to help kick a system out of a broken equilibrium.
The part of me that runs on Hufflepuff cynicism tends to correct someone exactly once, and if they don’t change, forever assume that they’re not interested in or open to advice on the matter. I’m often reluctant to do even that, because of a deep-seated assumption that people mostly would have corrected an error if they were interested and able.
Here’s an anecdote from Kurt Brown:
The other day, someone was estimating time for me, and for a moment, it seemed that he thought I thought that it was 3 (rather than 3:30, the true time). His claim that something started in 2 hours (at 5) appeared to me to be intentionally propagating the error that I thought he thought I had made, presumably because he didn’t feel the need to correct me. I was disturbed and asked something like “this math doesn’t work out, and it looks like you should know this—are you huffcyn-ing me, bro??”
He was not, he had simply made the same error in a confusing way. I draw no conclusions, but it was an interesting experience!
Refraining from correcting someone about what time it is isn’t exactly like something my huffcyn algorithm would do, because there’s no plausible story in which the person is just incapable or unwilling to correct their time estimate. However, it feels worryingly close to things I might do.
The worst examples of this, and the most likely to actually happen, involve living within a corrupt system. The Hufflepuffian cynic is likely to pragmatically refuse to speak out, or speak out exactly once and then remain forever silent. This means a system in which all cynics are Hufflepuff cynics is not as self-correcting as one in which people are more likely to take the “infuriating perspective”.
Conclusion?
I don’t know whether to recommend Hufflepuff cynicism or not. It may help you face the dark world, bringing your beliefs closer in line with the truth. On the other hand, it may make you complicit in the perpetuation of the dark world itself. The way I see it now, there’s a catch-22 whose resolution is unclear.
If you try to cleanly distinguish between what you can say and what’s true, the latter may lose relevance over time until it is forgotten. If you try to only speak the truth, you may be ostracized or give in to the social incentives by editing your beliefs to be acceptable.
Hopefully, it is at least helpful to have the term in your lexicon, to identify Hufflepuff cynicism when it occurs (be it good or ill).
Quoting myself from Facebook:
I think this is an important question and I don’t have an answer, but a first step I’d push for is to scream when you encounter something worth screaming about, and to allow yourself to live in a world where you may have to scream a lot.
One thing—maybe the only thing—that creeps people out about the sort of people who go around talking about uncomfortable truths is that they seem way too goddamn comfortable with them. If that’s what you’re doing then you’re not letting them sink in enough to scream.
I’d love to hear your thoughts on A Fable of Science and Politics. Would you say that Barron’s attitude is better than Ferris’s, at least sometimes?
I think Barron sees something important that Ferris is missing, but that he’s also making a mistake.
Screaming is almost never truth-maximizing. Truths are neither comfortable nor uncomfortable; they just are true. The fact that many are creeped out by your apparent comfort with certain truths, and that they prefer to deny those truths in order to be comfortable, is just one of those truths you’ll need to get comfortable with.
Why is comfort truth-maximizing? From Twelve Virtues:
If the chair is made of barbed wire, does the Way not oppose my comfort in sitting there? I think Hufflepuff cynicism more or less agrees that screaming is not truth-maximizing (“If you must scream,” says my inner Hufflepuff cynic, “do so exactly once!”), but I’m not sure I agree.
I almost didn’t click through to If We Can’t Lie to Others, We Will Lie to Ourselves, but it’s an excellent post that I’m glad I didn’t miss.
I second this; that’s a great post. It’s one of the most well-articulated arguments against radical honesty I’ve seen (though not the only one, certainly).
Idk, I don’t need any special cynicism to accept the fact that people are wildly different in any given skill (science, empathy, strategic thinking, etc). It’s just normal. And in any skill there’s also tons of people above me, so there’s no point looking down too much.
There are a bunch of cases where you don’t need any kind of cynicism. I didn’t try to say anything about the special category of truths for which you do need some special mindset, but I think there is such a category. I think it is hard to write about what the category is, due to the incentive structure, but Nate and Ziz both do a decent job of gesturing toward it.
This post seems a little bit vague and I’m not sure I fully understand what it’s pointing at. So I will respond to the best of my understanding, but I might be a little off the mark?
I think the way you interact with a person should depend a lot on the particular person. The key question is “what common ground do you have?” Communicating across inferential distance is hard, and it’s more or less impossible if you don’t even know what the inferential distance is. You can make incremental progress in communicating ideas, but it requires introducing a new idea only when the ground for it was already prepared by previous ideas it depends on.
I think that this doesn’t contradict strong commitment to honesty. Honesty means you should not say falsehoods, but it doesn’t mean that you have to be fully transparent with everyone and it doesn’t mean you should voice every possible opinion at every possible situation.
The real challenge is what to do when you’re speaking publicly. I think that, once again, the answer depends on who your target audience is. But of course you should take into account that people outside the intended target audience might also hear the same words. Here I feel that I might be out of my depth, since I was never good at politics, and I’m not ready to make the claim that you can be good at politics without lying. However, I do feel that this should be a more or less binary decision. If you decided to go into politics, that is one thing. But if you decided not to go into politics, then be honest. Maybe we need good people of both kinds? Politicians can acquire power and use it for good ends. However, only the Honest can be truly committed to seeking truth, and only the Honest can make alliances based on real trust.
I added a more cleanly delineated definition of Hufflepuff cynicism; hopefully that’ll help clarify. On my view, everything you’re saying is compatible with Hufflepuff cynicism. Someone who is not a Hufflepuff cynic might respond that a “strong commitment to honesty” is not very meaningful if you lie by omission all the time, especially if it gives a false image about something important—not that you should always tell the whole truth even if it hurts someone’s feelings or gets you fired, etc., but that you should sometimes be unwilling to compromise even when your words will not be received in a way which will increase the accuracy of the other person’s beliefs.
Thank you Abram, that helps.
It sounds like I endorse this Hufflepuff cynicism myself. If honesty does not contradict HC then I don’t see why HC should cause you to lie to yourself. And I don’t think HC implies becoming complicit: I see it more as looking for opportunities to nudge things in the right direction. I also don’t necessarily agree with the rule of correcting exactly once. You might correct any number of times if you think hard enough about the form in which you deliver your criticism and improve over time. I certainly have to correct my son many times before ey learn anything, lol.
Regarding the notion that “you should sometimes be unwilling to compromise even when your words will not be received in a way which will increase the accuracy of the other person’s beliefs,” I think that precisely because your goal is increasing the accuracy of the other person’s beliefs, you should not always be blunt about it. In order to change someone’s beliefs you need to find a way which will cause em to listen.
I agree in principle, but in practice, I am concerned. Dancing around taboo topics in your speech makes it easy to dance around them in your head, too.
Again, I agree in principle, but am concerned in practice. Sometimes, nudging isn’t enough. HC can look an awful lot like cowardice / risk aversion.
Perhaps this is more of a bug in my implementation of HC. :)
It’s somewhat difficult for me to play the anti-HC side, but I suspect it would be something about “sometimes the accuracy of their beliefs isn’t all that’s at stake”.
Cynic, n: a blackguard whose faulty vision sees things as they are, not as they ought to be.
I’m not sure whether I agree with this, or am baffled by it. The bafflement comes mainly from the very high level of jargon.
I don’t think its basic premise is false. I think it is accurately describing a catch-22. It is easy to misread it as an argument that you choose between lying to others and lying to yourself, but in fact Paul himself notes that humans are not actually very good at lying. Or at sticking to the truth. A slightly more accurate title might have been: “If we can’t lie to others, we will lie to ourselves. Also, we can’t lie to others.”
FYI: the link https://www.lesswrong.com/posts/5YzC6wECNxGP3XGKA/willpower-and-hufflepuff-traps is broken
And sometimes, it’s about money.
-----
Me: “There was a patch of forest here.”
Internal voice: “There are other places, some of them defendable.”
Me: “They burned it down.”
IV: “Yeah, but you don’t get to see the success stories. You know there are success stories.”
Me: “The ground’s still hot.”
IV: “And that’s what will certainly happen to those other places if you don’t survey them first.”
Me: “It’s about money.”
IV: ”...okay, let’s go get a life.”