Well, it usually takes the form of people telling you that being highly rational is “over-analysing”, or that logic is cold and ignores important emotional considerations of various kinds, or that focusing on rationality ignores the reality that people aren’t machines and don’t want to live such a cold and clinical life, etc. Basically it’s just “I don’t want to be that rational”. So I wonder: what makes people honestly think “I want to be very rational”? (grammar apologies lol)
Ah, I have met that kind of person. Usually I get the same feeling as when someone is debating politics, leading me to assume that the rejection of rationality is signaling belonging to a certain tribe, one where it is important that everyone feels good about themselves, or something like that.
Personally, I was raised to think critically, so I can’t draw from personal experience. What convinced the ancient Greeks to embrace rationality, to start questioning the world around them? Maybe we should look there.
Yeah, it’s useless to try to rationally argue for rationality with someone who doesn’t authentically accept the legitimacy of rationality in the first place. I guess all of us are like this to some degree, but some more than others, for certain.
Not a bad suggestion. I know a little about the Ancient Greek philosophers, though nothing specific springs to mind.
Sometimes we might really, actually be over-analyzing things, and our true goals may be better discovered by paying more attention to what System 1 is telling us. If we don’t figure this out for ourselves, it might be other rational people who point it out. If someone says:
“If you’re trying to solve this problem, I believe you’re over-analyzing it. Try paying more attention to your feelings, as they might indicate what you really want to do.”
how are we supposed to tell whether what they’re saying comes from:
someone trying to genuinely help us solve our problem(s) in a rational way
or
someone dismissing attempts at analyzing a problem at all?
It can only be one or the other. Now, someone might not have read Less Wrong, but that doesn’t preclude them from noticing when we really are over-analyzing a problem. When someone responds like this, how are we supposed to tell if they’re just strawmanning rationality, or really trying to help us achieve a more rational response?
This isn’t some rhetorical question for you. I’ve got the same concerns as you, and I’m not sure how to ask this particular question better. Is it a non-issue? Am I using confusing terms?
I like the exploration of how emotions interact with rationality that seems to be going on over there.
For me over-analysis would be where further analysis is unlikely to yield practically improved knowledge of options to solve the problem at hand. I’d probably treat this as quite separate from bad analysis or the information supplied by instinct and emotion. In a sense then emotions wouldn’t come to bear on the question of over-analysis generally. However, I’d heartily agree with the proposition that emotions are a good topic of exploration and study because they provide good option selection in certain situations and because knowledge of them might help control and account for emotionally based cognitive bias.
I guess the above would inform the question of whether the person you describe is rationally helping or just strawmanning. My sense is that in many cases the term is thrown around as a kind of defence against the mental discomfort that deep thought and the changing of ideas might bring, but perhaps I’m being too judgemental. Other times of course the person is actually identifying hand-wringing and inaction that we’re too oblivious to identify ourselves.
In terms of identifying true goals, I wonder if the contextuality and changeability of emotion would render it a relevant but ultimately unreliable source for deriving true goals. For example, in a fierce conflict it’s fairly tempting to perceive your goals as fundamentally opposed or opposite to your opponent’s, but I wonder if that’s really a good position to form.
In the end though, people’s emotions are relevant to their perception of their goals, so I suspect we do have to address emotions in the case for rationality.
Does CFAR have its own discussion forum? I can’t see any so far. Do you know what CFAR thinks about the “winning” approach held by many LWers?
CFAR has its own private mailing list, which isn’t available to individuals who haven’t attended a CFAR event before. As a CFAR alumnus, though, I can ask them your questions on your behalf. If I get a sufficient response, I can summarize their insight in a discussion post. I believe CFAR alumni are 40% active Less Wrong users, and 60% not. The base of CFAR, i.e. its staff, may have a substantially different perspective from the hundreds of workshop members who compose the broader community.
I think I’d be quite interested to know what % of CFAR people believe that rationality ought to include a component of “truthiness”. Anything that could help on that?
I feel better about my actions when I can justify them with arguments.
But to be honest, I have never met someone who regards rationality as not worthwhile. Or maybe I have just forgotten the experience.
I believe there are people like that, but how can we tell them apart from people who appropriately take into account their emotions in their decision-making and/or can’t explain how or why they’re rational, even though they really are?
I don’t 100% follow your comment, but I find the content of those links interesting. Care to expand on that thought at all?