I don’t understand what this means. How does one signal to one’s ‘ego’? What information is being conveyed, and to whom?
I’m talking about self-deception, essentially. A perfectly rational agent would not be capable of it, but people aren’t perfectly rational agents: they are capable of self-deception, sometimes deliberately and sometimes unconsciously. Wishful thinking and confirmation bias are instances of this.
These could both be true at different explanatory levels. What are we taking to be the site of ‘really caring’? The person’s conscious desires? The person’s conscious volition and decision-making? The person’s actions and results?
Consider revealed preferences: are someone’s actions more consistent with their stated goals, or with status-seeking and signalling?
What’s the import of the distinction? Presumably we should treat actions as obligatory when classifying them that way makes the world a better place, and as non-obligatory but praiseworthy when that classification makes the world a better place.
I’m not sure I can follow you here. This looks like circular reasoning.
I’m not sure what RobBB meant, but something like this, perhaps:
Utilitarianism doesn’t have fundamental concepts of “obligatory” or “supererogatory”, only “more good” and “less good”. A utilitarian saying “X is obligatory but Y is supererogatory” unpacks to “I’m going to be more annoyed at you, moralize more, and cooperate less with you if you fail to do X than if you fail to do Y”. A utilitarian can pick a strategy for which things to get annoyed about, moralize about, or withhold cooperation over according to which strategy maximizes utility.