Here I would say, “Screw ethics and meta-ethics. All I’m saying is I want to do what I feel like doing, even if you and other elites don’t agree with it.”
I think that there is a genuine concern that many people have when they try to ask ethical questions and discuss them with others, and that this process can lead to doing better in terms of that concern. I am speaking vaguely because, as I said earlier, I don’t think that I or others really understand what is going on. This has been an important process for many of the people I know who are trying to make a large positive impact on the world. I believe it was part of the process for you as well. When you say “I want to do what I want to do” I think it mostly just serves as a conversation-stopper, rather than something that contributes to a valuable process of reflection and exchange of ideas.
I personally suspect your error lies in not considering the problem from perspectives other than “what does Brian Tomasik care about right now?”
Sure, but this is not a factual error, just an error in being a reasonable person or something. :)
I think it is a missed opportunity to engage in a process of reflection and exchange of ideas that I don’t fully understand but seems to deliver valuable results.
When you say “I want to do what I want to do” I think it mostly just serves as a conversation-stopper, rather than something that contributes to a valuable process of reflection and exchange of ideas.
I’m not always as unreasonable as suggested there, but I was mainly trying to point out that if I refuse to go along with certain ideas, it’s not dependent on a controversial theory of meta-ethics. It’s just that I intuitively don’t like the ideas and so reject them out of hand. Most people do this with ideas they find too unintuitive to countenance.
On some questions, my emotions are too strong, and it feels like it would be bad to budge my current stance.
I think it is a missed opportunity to engage in a process of reflection and exchange of ideas that I don’t fully understand but seems to deliver valuable results.
Fair enough. :) I’ll buy that way of putting it.
Anyway, if I were really as unreasonable as it sounds, I wouldn’t be talking here and putting the preservation of my current goals at risk.
I’m not always as unreasonable as suggested there, but I was mainly trying to point out that if I refuse to go along with certain ideas, it’s not dependent on a controversial theory of meta-ethics. It’s just that I intuitively don’t like the ideas and so reject them out of hand. Most people do this with ideas they find too unintuitive to countenance.
Whether you want to call it a theory of meta-ethics or not, and whether it is a factual error or not, you have an unusual approach to dealing with moral questions that places an unusual amount of emphasis on Brian Tomasik’s present concerns. Maybe this is because there is something very different about you that justifies it, or maybe it is some idiosyncratic blind spot or bias of yours. I think you should put weight on both possibilities, and that this pushes in favor of more moderation in the face of values disagreements. Hope that helps articulate where I’m coming from in your language. This is hard to write and think about.
an unusual approach to dealing with moral questions
Why do you think it’s unusual? I would strongly suspect that the majority of people have never examined their moral beliefs carefully and so their moral responses are “intuitive”—they go by gut feeling, basically. I think that’s the normal mode in which most of humanity operates most of the time.
I think other people are significantly more responsive to values disagreements than Brian is, and that this suggests they are significantly more open to the possibility that their idiosyncratic personal values judgments are mistaken. You can get a sense of how unusual Brian’s perspectives are by examining his website, where his discussions of negative utilitarianism and insect suffering stand out.
I think other people are significantly more responsive to values disagreements
That’s a pretty meaningless statement without specifying which values. How responsive do you think “other people” would be to values disagreements about child pornography, for example?
I suspect Nick would say that if there were respected elites who favored increasing the amount of child pornography, he would give some weight to the possibility that such a position was in fact something he would come to agree with upon further reflection.
Maybe this is because there is something very different about you that justifies it, or maybe it is some idiosyncratic blind spot or bias of yours.
Or, most likely of all, it’s because I don’t care to justify it. If you want to call “not wanting to justify a stance” a bias or blind spot, I’m ok with that.
Hope that helps articulate where I’m coming from in your language. This is hard to write and think about.
:)