Not only that, they can even significantly affect other people’s lives, as long as they don’t leave everyone involved worse off (on net, aggregated over agents).
I think we can deal with the “net” business quickly.
Jack and Joe both desire Jill. Jack perceives that he can win Jill’s affections in competition with Joe, that Jill would be just as happy with either of them, but that Joe would be happier with Jill than he (Jack) would be.
The “net maximum” would be Jill with Joe. Do you expect Jack to bow out and leave Jill to Joe? Will you punish him if he doesn’t? Will you punish those who fail to punish Jack?
It seems from what you have said that your answers would be yes, yes, and yes, and you’re back to being a punishing utilitarian.
My answers are no, no, no. I don’t expect Jack to sacrifice his happiness for Joe.
Are you positing a non-iterated affection competition, here? One-shot, there’s one girl, two guys, end of story, no further interactions, nothing else of value to be exchanged?
If so, then I certainly agree with you. In such a constrained universe, social behavior is useless.
If not, then depending on the details of what further interactions are likely, there probably exists a strategy with a higher expected value for Jack that involves his bowing out.
Do I expect Jack to adopt that strategy spontaneously? I dunno; it depends on how smart Jack is, and what information he has. Most humans won’t, though, so I don’t expect it of Jack either. I expect humans to defect in the prisoner’s dilemma, also.
Will I punish him if he doesn’t? Will I punish those who fail to punish Jack? Almost certainly not. Ordinarily, the opportunity cost Jack incurs by pursuing a suboptimal strategy is far outweighed by the cost I would incur by adopting a strategy of punishing people for being suboptimal. You could probably construct a scenario in which I would, though I expect it would be contrived.
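To put toy numbers on that expected-value claim, here is a minimal sketch in Python. Every quantity in it is an invented assumption for illustration: the payoffs, the per-interaction value of Joe’s reciprocity, and the number of future interactions are all made up, not derived from the scenario itself.

```python
# Toy model of Jack's decision. All numbers below are assumptions
# chosen purely for illustration.

PURSUE_PAYOFF = 10.0              # Jack's payoff if he pursues and wins Jill (assumed)
BOW_OUT_PAYOFF = 0.0              # Jack's immediate payoff if he bows out (assumed)
RECIPROCITY_VALUE = 3.0           # assumed value of Joe's goodwill per future interaction
EXPECTED_FUTURE_INTERACTIONS = 5  # assumed number of future interactions with Joe

def expected_value(bow_out: bool, iterated: bool) -> float:
    """Jack's expected value under the toy assumptions above."""
    base = BOW_OUT_PAYOFF if bow_out else PURSUE_PAYOFF
    # In the iterated case, bowing out buys reciprocity from Joe
    # across the expected future interactions.
    if iterated and bow_out:
        base += RECIPROCITY_VALUE * EXPECTED_FUTURE_INTERACTIONS
    return base

# One-shot universe: pursuing dominates, as conceded above.
assert expected_value(bow_out=False, iterated=False) > \
       expected_value(bow_out=True, iterated=False)

# Iterated universe: with these particular numbers, bowing out wins.
assert expected_value(bow_out=True, iterated=True) > \
       expected_value(bow_out=False, iterated=True)
```

With these particular numbers, pursuing wins one-shot (10 vs. 0) and bowing out wins iterated (15 vs. 10), which is all the sketch is meant to show; different assumptions can flip the iterated answer.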
I don’t expect Jack to sacrifice his happiness for Joe.
No, neither do I.