Are you positing a non-iterated affection competition, here? One-shot, there’s one girl, two guys, end of story, no further interactions, nothing else of value to be exchanged?
If so, then I certainly agree with you. In such a constrained universe, social behavior is useless.
If not, then depending on the details of what further interactions are likely, a strategy with a higher expected value for Jack probably exists which would involve Jack bowing out.
Do I expect Jack to adopt that strategy spontaneously? I dunno, it depends on how smart Jack is and what information he has. Most humans won’t, though, so I don’t expect it of Jack either. I expect humans to defect in the prisoner’s dilemma, too.
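To make the one-shot/iterated distinction concrete, here's a toy sketch (the payoff numbers are standard prisoner's-dilemma values I'm supplying for illustration, not anything from the discussion): in a single round, defecting dominates no matter what the other player does, but against a reciprocating opponent over repeated rounds, the cooperative strategy ends up with the higher total.

```python
# Standard prisoner's-dilemma payoffs: (my move, their move) -> my payoff.
# "C" = cooperate, "D" = defect.
PAYOFF = {
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def one_shot(me, them):
    """Payoff for a single, never-repeated interaction."""
    return PAYOFF[(me, them)]

def iterated(my_strategy, their_strategy, rounds=100):
    """My total payoff over repeated play, where each strategy
    sees the opponent's previous move (None on the first round)."""
    total, my_last, their_last = 0, None, None
    for _ in range(rounds):
        my_move = my_strategy(their_last)
        their_move = their_strategy(my_last)
        total += PAYOFF[(my_move, their_move)]
        my_last, their_last = my_move, their_move
    return total

always_defect = lambda opp_last: "D"
# Tit-for-tat: cooperate first, then copy the opponent's last move.
tit_for_tat = lambda opp_last: "C" if opp_last in (None, "C") else "D"

# One-shot: defection dominates (5 > 3 against C, 1 > 0 against D).
print(one_shot("D", "C"), one_shot("C", "C"))   # 5 3
# Iterated against a reciprocator: the defector wins round one,
# then gets punished every round after; the cooperator does far better.
print(iterated(always_defect, tit_for_tat))     # 104
print(iterated(tit_for_tat, tit_for_tat))       # 300
```

The point of the sketch is just that "bow out" (cooperate) can carry a higher expected value than "compete" (defect) once further interactions are on the table, even though it loses in the constrained one-shot universe.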
Will I punish him if he doesn’t? Will I punish those who fail to punish Jack? Almost certainly not. Ordinarily, the opportunity costs Jack incurs by pursuing a suboptimal strategy are far outweighed by the costs I would incur by adopting a strategy of punishing people for being suboptimal. You could probably construct a scenario in which I would, though I expect it would be contrived.
I don’t expect Jack to sacrifice his happiness for Joe.
No, neither do I.