I think I’m quite rational and have a decent understanding of aspects of rationality that I haven’t managed to implement yet. I think karma is a very imperfect measure, but I’ll note that I have more than 100 and less than 400.
Huh? Karma on this site primarily shows positive contribution/participation (if you are able to contribute positively); treating it as a "measure of rationality" is a bizarre perspective. Please try to outline your position on the specific questions I suggested, or something of that kind. It's hard to make such a decision even in a well-known case, and harder still to construct fully general advice.
For another example, why do you think it’s important to get away from magical thinking? Is it? What is your motivation for thinking about rationality, and for dispelling the other person’s confusion? “Compatibility” of worldviews?
I would probably give you a response you liked better if I understood why you were asking what you were asking.
Why are you an atheist, (why) do you believe science works...
Because the evidence favors atheism and suggests science leads to truth more often than other approaches to belief formation? I could link to arguments but I don't see the point in trying to explain these things in my own words. Does it help to know that I usually agree with your comments and with the LW consensus, where it exists? Is the implication that the more rational I am, the more of a problem my partner's rationality will be?
what is the difference between one person who is actually right and another person who is merely confused, etc.?
I don’t think I understand this question.
why do you think it’s important to get away from magical thinking? Is it?
I think the importance of getting away from magical thinking varies across people and contexts. I'm not confident I know how important it is, or even whether it's helpful, for some people. It's clear that getting away from magical thinking can sometimes help people achieve their personal goals and help make the world a better place.
What is your motivation for thinking about rationality,
I enjoy the process regardless of the consequences. But I also hope that it will help me in my career and help me contribute to the world.
and for dispelling the other person’s confusion? “Compatibility” of worldviews?
I think my partner and I both experience some level of discomfort at knowing that our worldviews are in significant conflict, even though this conflict seems to coexist with a high degree of respect for the accomplishments of the other. It is unfortunate that we basically have to avoid certain topics of conversation that we both find important, and that our emotional reactions to things often differ.
You might check my responses to Alicorn to learn more. Once again, thank you very much for responding.
I would probably give you a response you liked better if I understood why you were asking what you were asking.
This is a delicate topic, but I think Vladimir is trying to tell whether you really use rationality to the degree you claim, or whether you rather accept certain opinions of people you see as rationalists, and wish others shared them. In the latter case, it doesn’t matter that the clash is between rationalist and irrationalist opinions: the conflict is isomorphic to any relationship between people of different religions or political parties, and much of the advice for those people should work for you. It’s the former case that would require more particular advice.
Because the evidence favors atheism and suggests science leads to truth more often than other approaches to belief formation? I could link to arguments but I don’t see the point in trying to explain these things in my own words. Does it help to know that I usually agree with your comments and with the LW consensus, where it exists?
I’m afraid that, in the absence of seeing your thought process, much of this looks like guessing the teacher’s password to me. I’d be happy to be corrected, though.
EDIT: Wow, that sounds really tactless and dismissive of me. I retract my accusation, on the basis of (1) not having any real justification and (2) it would set a bad precedent, especially for the sort of reception newcomers get.
It's interesting that people seem to a) be as skeptical of my rationality as they seem to be, and b) think that is the crux of the matter.
Regarding a), if someone tells me that they've been reading OB/LW for quite a while and that they think they are considerably more rational than their romantic partner, I think it is very likely that they are correct. But maybe if I were on the other side I would react differently. If I knew of an easy way to prove my rationality I would, but I don't. Even writing an original rational essay wouldn't prove much, because I could easily be irrational in other domains.
Regarding b), I’m not sure exactly how important it is that potential advice-givers have a very accurate estimate of my rationality (and my girlfriend’s rationality). Perhaps it would be helpful to focus on more specific aspects of our beliefs and approaches to experiencing and acting in the world.
I lean towards preference utilitarianism, though I don’t walk the walk as well as I should.
I attempt to calculate the costs and benefits of various choices; she does this too sometimes, but doesn't like applying it reflexively.
She believes in spirits; I'm into Dennett and Dawkins (though I see positive aspects to religion/spirituality).
My partner and I both agree that:
She is much more emotional and I am more rational.
She is more prone to depression.
She has more faith in intuition; I'm more skeptical of it.
Let's say you've read everything I've written here and you think I'm probably no more rational than my partner. OK, that's fine; I'd be happy to hear advice that works for two equally irrational people with different beliefs/values/approaches to experiencing and acting in the world.
I think my partner and I both experience some level of discomfort at knowing that our worldviews are in significant conflict, even though this conflict seems to coexist with a high degree of respect for the accomplishments of the other. It is unfortunate that we basically have to avoid certain topics of conversation that we both find important, and that our emotional reactions to things often differ.
So the program of understanding each other doesn't make progress. I agree with Alicorn: it's essential to establish a mode of communication where you can steadily work on disagreements, with the goal of ultimately resolving them in full. The arguments shouldn't turn into color politics, polarizing and alienating.
A bit of advice, based on my experience, for a long-term conversion strategy:
Work on understanding your own position better; make sure you know why you believe what you believe before trying to convince another person to change their mind. Maybe you are wrong.
Make the mode of interaction and your goals clear when you are arguing; distinguish analysis from social interaction.
Educate the person about fundamentals, thus steadily crafting tools for making deeper arguments in specific discussions.
Prefer shifting the discussion towards education about the more general mistake that (might have) contributed to a specific mistake or confusion. In the long term, this is more important than resolving a specific problem, and it's easier on the other person's feelings, since you are educating on an abstract theme rather than attacking a conviction.
Don’t argue the objects of emotional attachment, ever. Instead, work on finding an angle of approach (as suggested above, maybe something more fundamental) that allows you to make progress without directly confronting the issue.
Not everyone is going to change; some people are too dim, too shallow, or persistently not interested.