If we feel like it. I personally would say yes. What would you say?
Yes, regardless of whether it is true. Morality is for lying about.
If that’s your particular moral judgement...
Possibly. However, there is significant semantic ambiguity centring around ‘for’ and the concept of purpose. There is a legitimate literal reading on which the claim is not moral at all (although it would be rather exaggerated).
A moral judgement that I do make unambiguously is that people should not be expected to answer loaded moral questions of that kind transparently. Most people are, fortunately, equipped with finely tuned hypocrisy instincts, so they can answer with bullshit in full sincerity. I don’t expect those who have defective hypocrisy instincts to self-sabotage by sharing their private, internally coherent value system.
I also note that I will reply to questions of that form with obfuscation, overt insincerity or outright non-response even when I would comfortably answer yes. In this case ‘yes’ is a rather weak answer, given that ‘factor in’ does not specify the degree of weighting. Yet many questions (or challenges) of the same form are far less mellow, are not anyone else’s business unless I choose to make them so, and either have no answer that sounds acceptable or have an appropriate response that (once multiplied out) sounds evil.
For example, if the degree of ‘factoring in others’ intuitions’ were specified, it could be the case that factoring in others is the ‘evil’ response, despite being egalitarian. Kind of like how I consider CEV to be an incredibly stupid plan even though, at a superficial level, it sounds like the goody-goody heroic altruist response.
But come to think of it, your question was about what should be factored into a utilitarian calculation. So my answer would really have to be null, because utilitarian morality is an abomination that I would never be using in the first place!
A robust and candid position.
That’s mostly the question I wanted to discuss here. If you want my own personal opinion, I think that it should be considered, but we shouldn’t assign a massive amount of weight to it. I’ve studied enough psychology to learn that humans are often not reliable. I also wouldn’t be very inclined to count the “moral intuitions” of militant religious groups. In this case, however, I’m more unsure.
I mean, are you going for a “moral realist” position, where other people might have insight into the “true” morality?
Or is it that other people’s moral intuitions might bring up issues that you hadn’t thought of, or illustrate the consequences of some of your own positions?
Or is it political: would it be better to have a more consensus-based view (for practical or moral reasons), even if we disagree with certain aspects of it?