What is rationalism about?

Rationalism is about the real world. It may or may not strike you as an especially internally consistent, philosophically interesting worldview, but that is not what rationality is about. Rationality is about watching things happen in the real world and, when they surprise you, updating your understanding of the world so that they wouldn't surprise you again.
Why care about predicting things in the world well?
Almost no matter what you ultimately care about, being able to predict ahead of time what’s going to happen next will make you better at planning for your goal.
One central rationalist insight is that thoughts are for guiding actions. Think of your thinking as the connective tissue sandwiched between the sense data that enters your sense organs and the behaviors your body returns. Your brain is a function from a long sequence of observations (all the sensory inputs you’ve ever received, in the order you received them) to your next motor output.
Understood this way, the point of having a brain and having thoughts is to guide your actions. If your thoughts aren’t all ultimately helping you better steer the universe (by your own lights) … they’re wasted effort. Thoughts aren’t meant to be causally-closed-off eddies that whirl around in the brain without ever decisively leaving it as actions. They’re meant to transform observations into behaviors! This is the whole point of thinking! Notice when your thoughts are just stewing, without going anywhere, without developing into thoughts that’ll go somewhere … and let go of those useless thoughts. Your thoughts should cut.
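The "brain as a function from observation history to motor output" framing above can be made concrete with a toy sketch. Everything here (the type aliases, the traffic-light policy) is an illustrative invention, not anything from the text:

```python
# Toy sketch: an agent is just a mapping from its whole
# observation history to its next action.
from typing import Callable, Sequence

Observation = str
Action = str
# The whole "brain" is one function of the full history.
Agent = Callable[[Sequence[Observation]], Action]

def reflex_agent(history: Sequence[Observation]) -> Action:
    """Toy policy: act only on the most recent observation."""
    if history and history[-1] == "light is red":
        return "brake"
    return "keep going"

print(reflex_agent(["light is green", "light is red"]))  # prints: brake
```

A richer agent would use the whole history rather than just its last element, but the type signature, one action out per history in, is the point of the framing.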
If you can imagine a potential worry, then you can generate that worry. Rationalism is, in part, the skill of never being predictably surprised by things you already foresaw.
It may be that you need to “wear another hat” in order to pull that worry out of your brain, or to model another person advising you to get your thoughts to flow that way, but whatever your process, anything you can generate for yourself is something you can foresee and consider. This aspect of rationalism is the art of “mining out your future cognition,” to exactly the extent that you can foresee it, leaving whatever’s left over a mystery to be updated on new observations.
For a true Bayesian, it is impossible to seek evidence that confirms a theory. There is no possible plan you can devise, no clever strategy, no cunning device, by which you can legitimately expect your confidence in a fixed proposition to be higher (on average) than before. You can only ever seek evidence to test a theory, not to confirm it.
This realization can take quite a load off your mind. You need not worry about how to interpret every possible experimental result to confirm your theory. You needn’t bother planning how to make any given iota of evidence confirm your theory, because you know that for every expectation of evidence, there is an equal and opposite expectation of counterevidence. If you try to weaken the counterevidence of a possible “abnormal” observation, you can only do it by weakening the support of a “normal” observation, to a precisely equal and opposite degree. It is a zero-sum game. No matter how you connive, no matter how you argue, no matter how you strategize, you can’t possibly expect the resulting game plan to shift your beliefs (on average) in a particular direction.
You might as well sit back and relax while you wait for the evidence to come in.
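The zero-sum bookkeeping described above is conservation of expected evidence: the prior equals the probability-weighted average of the posteriors, so the expected update is exactly zero. A quick numeric check, with arbitrary illustrative probabilities:

```python
# Conservation of expected evidence, checked numerically.
# All probabilities below are arbitrary illustrative choices.
p_h = 0.3            # prior P(H)
p_e_given_h = 0.8    # likelihood P(E | H)
p_e_given_not_h = 0.4  # likelihood P(E | not-H)

# Total probability of seeing the evidence E
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

# Posteriors by Bayes' theorem, for each possible outcome
post_if_e = p_e_given_h * p_h / p_e                # update up if E
post_if_not_e = (1 - p_e_given_h) * p_h / (1 - p_e)  # update down if not-E

# Probability-weighted average of the posteriors equals the prior
expected_posterior = post_if_e * p_e + post_if_not_e * (1 - p_e)
print(abs(expected_posterior - p_h) < 1e-9)  # prints: True
```

Strengthening the update you would make on one outcome necessarily strengthens the opposite update on the other outcome (or shifts its probability) by a compensating amount, so no scheme of observation can move your beliefs in expectation.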
Minor spoilers for mad investor chaos and the woman of asmodeus (planecrash Book 1).

The citation link in this post takes you to an NSFW subthread in the story.
“If you know where you’re going, you should already be there.”
…
“It’s the second discipline of speed, which is fourteenth of the twenty-seven virtues, reflecting a shard of the Law of Probability that I’ll no doubt end up explaining later but I’m not trying it here without a whiteboard.”
“As a human discipline, ‘If you know your destination you are already there’ is a self-fulfilling prediction about yourself, that if you can guess what you’re going to realize later, you have already realized it now. The idea in this case would be something like, because mental qualities do not have intrinsic simple inertia in the way that physical objects have inertia, there is the possibility that if we had sufficiently mastered the second layer of the virtue of speed, we would be able to visualize in detail what it would be like to have recovered from our mental shocks, and then just be that. For myself, that’d be visualizing where I’ll already be in half a minute. For yourself, though this would be admittedly harder, it’d be visualizing what it would be like to have recovered from the Worldwound. Maybe we could just immediately rearrange our minds like that, because mental facts don’t have the same kinds of inertia as physical objects, especially if we believe about ourselves that we can move that quickly.”
“I, of course, cannot actually do that, and have to actually take the half a minute. But knowing that I’d be changing faster if I was doing it ideally is something I can stare at mentally and then change faster, because we do have any power at all to change through imagining other ways we could be, even if not perfectly. Another line of that verse goes, ‘You can move faster if you’re not afraid of speed.’”
…
“Layer three is ‘imaginary intelligence is real intelligence’ and it means that if you can imagine the process that produces a correct answer in enough detail, you can just use the imaginary answer from that in real life, because it doesn’t matter what simulation layer an answer comes from. The classic exercise to develop the virtue is to write a story featuring a character who’s much smarter than you, so you can see what answers your mind produces when you try to imagine what somebody much smarter than you would say. If those answers are actually better, it means that your own model of yourself contains stupidity assertions, places where you believe about yourself that you reason in a way which is incorrect or just think that your brain isn’t supposed to produce good answers; such that when you instead try to write a fictional character much smarter than you, your own actual brain, which is what’s ultimately producing those answers, is able to work unhindered by your usual conceptions of the ways in which you think that you’re a kind of person stupider than that.”