Bind Yourself to Reality
So perhaps you’re reading all this, and asking: “Yes, but what does this have to do with reductionism?”
Partially, it’s a matter of leaving a line of retreat. It’s not easy to take something important apart into components, when you’re convinced that this removes magic from the world, unweaves the rainbow. I do plan to take certain things apart, on this blog; and I prefer not to create pointless existential anguish.
Partially, it’s the crusade against Hollywood Rationality, the notion that understanding the rainbow subtracts from its beauty. The rainbow is still beautiful, and on top of that you get the beauty of physics.
But even more deeply, it’s one of those subtle hidden-core-of-rationality things. You know, the sort of thing where I start talking about ‘the Way’. It’s about binding yourself to reality.
In one of Frank Herbert’s Dune books, IIRC, it is said that a Truthsayer gains their ability to detect lies in others by always speaking truth themselves, so that they form a relationship with the truth whose violation they can feel. It wouldn’t work, but I still think it’s one of the more beautiful thoughts in fiction. At the very least, to get close to the truth, you have to be willing to press yourself up against reality as tightly as possible, without flinching away, or sneering down.
You can see the bind-yourself-to-reality theme in “Lotteries: A Waste of Hope.” Understanding that lottery tickets have negative expected utility does not mean that you give up the hope of being rich. It means that you stop wasting that hope on lottery tickets. You put the hope into your job, your school, your startup, your eBay sideline; and if you truly have nothing worth hoping for, then maybe it’s time to start looking.
It’s not dreams I object to, only impossible dreams. The lottery isn’t impossible, but it is an un-actionable near-impossibility. It’s not that winning the lottery is extremely difficult, demanding a desperate effort; the problem is that no amount of work changes the odds.
I say all this, to exemplify the idea of taking emotional energy that is flowing off to nowhere, and binding it into the realms of reality.
This doesn’t mean setting goals that are low enough to be “realistic”, i.e., easy and safe and parentally approved. Maybe this is good advice in your personal case, I don’t know, but I’m not the one to say it.
What I mean is that you can invest emotional energy in rainbows even if they turn out not to be magic. The future is always absurd but it is never unreal.
The Hollywood Rationality stereotype is that “rational = emotionless”; the more reasonable you are, the more of your emotions Reason inevitably destroys. In “Feeling Rational” I contrast this with “That which can be destroyed by the truth should be” and “That which the truth nourishes should thrive”. When you have arrived at your best picture of the truth, there is nothing irrational about the emotions you feel as a result of that—the emotions cannot be destroyed by truth, so they must not be irrational.
So instead of destroying emotional energies associated with bad explanations for rainbows, as the Hollywood Rationality stereotype would have it, let us redirect these emotional energies into reality—bind them to beliefs that are as true as we can make them.
Want to fly? Don’t give up on flight. Give up on flying potions and build yourself an airplane.
Remember the theme of “Think Like Reality”, where I talked about how when physics seems counterintuitive, you’ve got to accept that it’s not physics that’s weird, it’s you?
What I’m talking about now is like that, only with emotions instead of hypotheses—binding your feelings into the real world. Not the “realistic” everyday world. I would be a howling hypocrite if I told you to shut up and do your homework. I mean the real real world, the lawful universe, that includes absurdities like Moon landings and the evolution of human intelligence. Just not any magic, anywhere, ever.
It is a Hollywood Rationality meme that “Science takes the fun out of life.”
Science puts the fun back into life.
Rationality directs your emotional energies into the universe, rather than somewhere else.
Just because you feel an emotion based on something true doesn’t mean that the emotion is reasonable. Many emotions are simply not capable of grasping all the details of reality; they base themselves on a vague picture. That vague picture may be true, but in many cases the details may well make the emotions based on the vague picture unreasonable.
So if I look up at the stars and feel a boundless wonder and awe at the immense distances, I am being vague and not really reasonable at all? If I feel joy for all the people saved by modern medicine? If I feel pain for all the poverty and suffering?
How exactly are these unreasonable? The latter two drive me to do good in the world.
I think you can charitably assume that if you’ve come up with a particular emotion based on a vague picture that’s reasonable, then that’s not the one that Unknown is talking about.
To answer from my own perspective: scope insensitivity can lead emotions to make improper comparisons. It is not very good if the emotion stirred by one person being in danger motivates you just as much as one million people being in danger.
Unknown: “Just because you feel an emotion based on something true, doesn’t mean that the emotion is reasonable”
I’m not really sure what it means for an emotion to be “reasonable”. Suppose you successfully create a thriving internet startup company and make a lot of money. Is it “reasonable” to feel happy about that?
This whole discussion flirts with the hard problem of moral realism vs. antirealism. For, if you could give a convincing rational answer to the question “what is it reasonable to feel happy about?”, you would have a realist theory of ethics.
@Eliezer: Partially, it’s a matter of leaving a line of retreat.
Yes, I like it that you’ve made this explicit.