PatrickDFarley
The non-tribal tribes
Great comment, and I will have to think more about this. Your examples do seem to support the utility of self-identity-based motivation.
I think maybe my statement “you can’t lie to yourself if you know it’s a lie” is forcing a frame where self-talk is either a genuine attempt at truth or a lie. But with “this is easy for me because I’m a fighter” and similar statements, it seems they can be received by the mind in a different way: more like a self-fulfilling prophecy.
I guess it’s an open question for me, then, where to use that kind of self-talk. On one end is the danger of becoming miserable in pursuit of an identity that was actually kind of arbitrary; on the other, you miss out on what may be the most powerful kind of motivation.
The present perfect tense is ruining your life
Book review: Range by David Epstein
Action derivatives: You’re not doing what you think you’re doing
I agree, but I’d lump all of that into “Analyze the circumstances that caused it”. Maybe I should’ve included more external examples like these.
This method is interesting to me and I’d like to get into it someday. Personally, I keep finding that whenever I decline to write something down, that one thing comes back to bite me a few days later (because I’ve forgotten it). Do you find that you’re able to mentally keep track of things better than before, even if they’re just vaguely in the back of your mind?
Laziness death spirals
Why pay mind to what’s correlated with being right, when you have the option of just seeing who’s right?
I’m arguing that being right is the same as “holding greater predictive power”, so any conversation that’s not geared toward “what’s the difference in our predictions?” is not about being right, but rather about something else, like “Do I fit the profile of someone who would be right” / “Am I generally intelligent” / “Am I arguing in good faith” etc.
These things are indeed correlated with being right, but aren’t you risking Goodharting? What does it really mean to “be right” about things? If you’re native to LessWrong, you’ll probably answer something like, “to accurately anticipate future sensory experiences”. Isn’t that all you need? Find an opportunity for you and your friend to predict measurably different futures, then see who wins. All the rest is distraction.
And if you predict all the same things, then you have no real disagreement, just semantic differences.
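To make “see who wins” concrete, here’s a minimal sketch in Python (my own illustration; the questions, probabilities, and the Brier scoring rule are assumptions, not anything from this thread). Each of us assigns a probability to the same set of yes/no predictions, and once the outcomes resolve, whoever has the lower score was closer to being right:

```python
# Toy example: settle a disagreement by comparing predictions.
# Each party assigns a probability to the same yes/no questions;
# after the outcomes resolve, the lower Brier score was "more right".
# (All questions, probabilities, and outcomes here are made up.)

def brier_score(forecasts, outcomes):
    """Mean squared error between probability forecasts and 0/1 outcomes."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(outcomes)

questions = ["X happens by June", "the Y policy passes", "the Z result replicates"]
mine   = [0.8, 0.3, 0.6]  # my probabilities for each question
theirs = [0.4, 0.7, 0.6]  # my friend's probabilities
actual = [1,   0,   1]    # what really happened (1 = yes, 0 = no)

print("me:    ", brier_score(mine, actual))    # ~0.097
print("friend:", brier_score(theirs, actual))  # ~0.337

# If we had assigned identical probabilities everywhere, the scores
# would tie exactly: no real disagreement, just semantic differences.
```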
Fun to do with names. Patrick: English version of a Latin name, Patricius, which means “noble”, referring to the Roman nobility, which was originally composed of the patres familias, the heads of large families. From pater (father), which is Latin but goes back to Proto-Indo-European. From the Proto-Indo-European root pah, which means “to protect/shepherd”.
Is this an epistemology?
I have experiences, and some interpretations of those experiences allow me to predict future experiences.
I didn’t say it was the answer to everything. The original phrasing was “more truthful.”
These are tautologies. What is the point you’re getting at?
What would it mean for rationality to be “objectively better”? It depends on what the objective is. If your objective is “predictive power,” then by some definitions you are already a rationalist.
Is your issue that predictive power isn’t a good objective, or that there are better methods for prediction than those discussed on this site?
If there existed a paradigm that is more truthful than ‘rationality’ as you have been taught it, how would you even know?
Easy. Predictive power.
It seems like you have strong feelings about rationality without actually knowing what that word means here.
I really like that last bit about chronological cycles of increasing S-level to “win against” the current level, until physical reality smacks us in the face and we reset. Let me try something:
(Physically) Hard times create S1 men; S1 men create (physically) good times.
(Physically) Good times create S2 men (because there’s free alpha in manipulating S1); S2 men create (socially) hard times (because now you don’t know whom to trust about S1 issues).
(Socially) Hard times create S3 men (because tribalism builds/confirms social trust); S3 men create (socially) good times (you have a whole tribe or church or culture-war faction that you trust).
(Socially) Good times create S4 men (because there’s free alpha in manipulating S3); S4 men create (physically) hard times (because they’re disconnected from physical reality).
My scorched-earth policy on New Year’s resolutions
I’m gonna be lazy and say:
If it comes up tails, you get nothing.
If that ^ is a given premise in this hypothetical, then we know for certain it is not a simulation (because in a simulation, after tails, you’d get something). Therefore the probability of receiving a lollipop here is 0 (unless you receive one for a completely unrelated reason).
You’re right, I’m assuming the reader belongs to a “real” tribe, i.e. red or blue. I should’ve tweaked it for the LW crosspost.