which does not have to causally pay off for the organism (instead, the payoff is acausal)
My point is more about giving the standard amount to the standard charities, rather than earmarking it all for the most efficient one.
I’m not sure what you mean here. Can you give an example?
Suppose I have a gene that makes me cooperate in a prisoner’s dilemma with my relatives. This gene benefits me, because now I can cooperate with my cousins and get the better payoff (assuming my cousins also have this gene!). But you know what would be even better? If my cousins cooperated with me but I defected. So from a causal decision theory standpoint, my best route is to ignore my instincts and defect.
But if I had a gene that said “defect with my cousins,” that would mean my cousins defect back, and so we all lose. So our instincts can be beneficial even when the individually best strategy doesn’t line up with them (because our instincts can be correlated with other humans’).
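The comparison above can be sketched with the standard Prisoner’s Dilemma payoff numbers T=5, R=3, P=1, S=0 (illustrative values, not taken from this thread):

```python
# My payoff given (my_move, cousin's_move), using the standard PD values
# T=5 > R=3 > P=1 > S=0 (illustrative numbers only).
PAYOFF = {
    ("C", "C"): 3,  # both cooperate (R)
    ("C", "D"): 0,  # I cooperate, cousin defects (S)
    ("D", "C"): 5,  # I defect, cousin cooperates (T)
    ("D", "D"): 1,  # both defect (P)
}

# Causal view: my cousin's move is treated as fixed, so defecting
# looks strictly better no matter what they do.
for their_move in ("C", "D"):
    assert PAYOFF[("D", their_move)] > PAYOFF[("C", their_move)]

# Correlated view: the shared gene determines both moves, so the real
# comparison is "everyone cooperates" vs. "everyone defects".
cooperate_gene = PAYOFF[("C", "C")]  # 3
defect_gene = PAYOFF[("D", "D")]     # 1
print(cooperate_gene > defect_gene)  # the cooperate gene comes out ahead
```

The point of the sketch is that which comparison is relevant depends on whether your move and your cousins’ moves are independent or correlated.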
This reasoning assumes that you are special and significantly different from your cousins. If you’re not, your cousins follow the same strategy and you all defect, gene or no gene.
That’s what acausal benefit means.
Google: No results found for “acausal benefit”
Can you elaborate?
It’s mostly limited to this site, and I don’t know how much that exact wording is used, but it refers to things like Newcomb’s problem, where you can get some benefit from what you do, but you’re not actually causing it.
I should add that when I told Manfred I didn’t understand, it was more that I didn’t understand how it applied to that situation.
I’m familiar with the concept of an acausal trade. But I don’t understand how it applies to the situation of playing Prisoner’s Dilemma with your cousins.
The wiki article on acausal trade may prove helpful.