Depending on the group size, the underdog might not be the underdog anymore with your support.
If it’s a small-group conflict (or you have significant power), it’s likely that you can determine which side wins.
The underdogs may have more at stake than the favorites, and so would be willing to give more in return for help. If Bob steals half of Fred’s bananas every day, Bob gets to be a little better fed, and Fred dies.
If you help Fred out, he owes you his life, but Bob doesn’t care nearly as much if your intervention just means he goes back to eating only his own bananas (that, or you kill him).
If you choose to help Bob instead, your help isn’t worth much, since he had the situation under control anyway.
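To make the asymmetry concrete, here’s a toy calculation (a sketch only; the banana counts, survival threshold, and utility numbers are all made up for illustration):

```python
# Toy model of the stakes asymmetry (all numbers are invented for illustration).
# "Helping Fred" flips the outcome from "Bob takes half" to "each eats his own".

BANANAS_PER_DAY = 10      # what each forager gathers on his own (hypothetical)
SURVIVAL_THRESHOLD = 8    # bananas/day needed to stay alive (hypothetical)

def utility(bananas):
    """Crude utility on an arbitrary scale: dying dominates everything else."""
    if bananas < SURVIVAL_THRESHOLD:
        return -1000      # death
    return bananas        # otherwise, marginal value of extra food

# Status quo: Bob steals half of Fred's bananas.
fred_now = utility(BANANAS_PER_DAY / 2)    # 5 bananas -> below threshold -> dies
bob_now  = utility(BANANAS_PER_DAY * 1.5)  # 15 bananas

# After you intervene on Fred's behalf: each eats his own bananas.
fred_helped = utility(BANANAS_PER_DAY)     # 10 bananas -> survives
bob_after   = utility(BANANAS_PER_DAY)     # back to 10 bananas

print("Value of your help to Fred:", fred_helped - fred_now)  # huge (life vs. death)
print("Cost of your help to Bob:  ", bob_now - bob_after)     # small (5 bananas)
```

The point is just that flipping the outcome moves Fred across a life-or-death threshold while costing Bob a marginal snack, so the price Fred will pay for help dwarfs anything Bob would pay to preserve the status quo.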
I think this instinct may in fact be evolutionarily optimized for conflicts between individuals; in most group conflicts in the ancestral environment, you probably already belonged to one of the sides.
But yes, it does seem to generalize too readily to conflicts where you personally wouldn’t sway the balance.
EDIT: How could we test any of the above theories? My theory seems to predict that describing the conflict as “one single entity versus another” (thereby triggering modes of thought optimized for third parties observing single combat) will produce a stronger underdog bias than describing each side as a collection of entities (with one collection much larger than the other).
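If someone actually ran that experiment, the analysis could be as simple as comparing underdog-support rates across the two framings. A minimal sketch (the counts below are invented placeholders, not data):

```python
# Sketch of the framing test: did more subjects side with the underdog under a
# "single combatant vs. single combatant" framing than under a "small group vs.
# large group" framing? All counts here are placeholders, not real results.
from scipy.stats import chi2_contingency

# Rows: framing condition; columns: [sided with underdog, sided with favorite].
observed = [
    [70, 30],  # single-entity framing (hypothetical counts)
    [55, 45],  # group framing (hypothetical counts)
]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2={chi2:.2f}, p={p:.3f}")
```

A significantly higher underdog-support rate under the single-entity framing would support the theory; no difference would count against it.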