As to your first paragraph, one of the more liberating things about internalizing the Sequences and inculcating myself with Less Wrong memes is that I’ve come to hold very few opinions strongly. When you move yourself to a sufficiently morally relativistic framework, you stop identifying with your opinions. Instead of saying there are things you “just have to believe,” I would say there are things that it is instrumentally rational to behave as if you believe.
Regarding your second paragraph: I find that I simply get into fewer arguments. Because I have let go of most of the opinions that typically weigh people down and make them respond at an emotional level to being contradicted, every conversation that includes a disagreement becomes a joint truth-seeking venture. Instead of arguments, I have “discussions.” If the other party in the discussion is not interested in truth-seeking but is instead interested in being right, I stop humoring them and change the subject. If they are someone I can be honest with, I will point out that they seem to have an irrational bias regarding the topic in question.
It seems like your first mistake was getting involved in a theological debate. Science is flawed, but “religion” doesn’t even have enough predictive power to be falsified. I would step back at least one level and urge you to ask yourself what your objectives are in participating in such a confused discussion in the first place. I myself have indulged in totally stupid internet arguments, which I can only attribute to a sort of perverse pique that strikes me at times, but I will generally admit that I’m already failing to be rational just by participating in them.
“Stop identifying with your opinions” is a classic Less Wrong idea, but moral relativism is not.
Perhaps it’s not an explicitly stated idea, but it’s probably a fairly common one.
Sure, there are plenty of moral relativists here; but given that Eliezer’s metaethics sequence explicitly contrasts itself with traditional moral relativism and that Luke’s moral reductionism sequence holds out several levels of the objective-subjective distinction (to each of which moral relativism gives the subjective answer) as open questions (pending the rest of the sequence), I’d say that the Less Wrong consensus is “reject naive moral realism” rather than “embrace moral relativism”.
You should perhaps unpack what you mean by the label “moral relativism” at this point.
Yeah, I definitely should. What I was trying to say was that most LWers think that morality is an aspect of minds, not an aspect of the outside world (no universal morality). I think I misunderstood the term; reading about it on Wikipedia this time, it seems that moral relativism rejects a common human morality. It appears I was using the word wrong.
Then you’re probably right about this being a standard position on LW, but you used wrong or misleading terminology. “Rejection of universal morality” might be a suitable description, though there are fine points this doesn’t capture, such as morality being “subjectively objective”: everyone has their own “personally objective” morality that they can’t alter in any way (so there is a possibility of getting it wrong, and value in figuring out what it is).
(“Being an aspect of mind” also runs into problems, since there’s no clear dividing line that would make things other than your own mind absolutely useless in figuring out what your morality is.)
That’s really insightful. Lately I have been getting into a few more debates, religious and not, because of a decreased tolerance for flawed ideas. I got stuck in a “That which can be destroyed by the truth should be” loop, convinced I was right, in a discussion with my mother, actually.
Still trying to figure out how to become more truth-seeking, but it’s hard, since I’m not nearly rational enough. I wonder what the best way to act is if I don’t want debates or discussions but still feel compelled to hint at my own opinions. For example, a friend thought the best way to make things better was to pray for me, which sparked a pretty heated argument, something I didn’t want.
I’ll just try to get my head out of my arse, but I still find it frustrating how obviously wrong people (including me) can be.
What sequence would you recommend if I repeatedly approach this from the wrong angle?
Does an external link work instead? Because I found Paul Graham’s essay “Keep Your Identity Small” to make the point a bit more succinctly.
Definitely helpful, much appreciated!
The Reductionism Sequence has been the most important for me, in terms of its impact on my mental processes. In particular, I think it helps you see what other people are doing wrong, so that you can respond to their errors in a non-confrontational manner. Spending a lot of time essentially meditating on the concepts underlying “dissolving the question” has really changed how I see things and how I deal with disagreements with other people.
I’m not claiming to be some paragon of perfect rationality here; I still lose my patience sometimes, but it’s a process.
An attempt to be more rational, then? Thanks, I think I need to reread that anyway. That and a few others. It’ll require some work, sure, but few things in life are easy. It’s a start anyway, cheers! Think I’ll do a bit better in… a few weeks, once I’ve mulled it over.
I wish I had multiple upvotes, so I’ll just say that this is good sense.