we generally prefer that other people make such discoveries rather than not make them; each discovery has positive externalities for everyone else
… but we want other people to not discover the SAME gold mines we do.
Everybody working the same gold mine is in a zero-sum game, but I still benefit from somebody else working some other gold mine.
In general, it’s often the case that a game is only zero-sum along some Pareto frontier—e.g. negotiation games—so we get some competitive behavior, but can still have nonzero-sum gains. People can fight in a zero-sum manner over how to divide the gains from trade, even though the gains themselves are nonzero-sum.
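A toy numerical sketch of that point (the surplus value and split sizes here are made up for illustration): dividing a fixed surplus from trade is zero-sum along the Pareto frontier, yet trading at all is still a mutual gain over not trading.

```python
# Toy bargaining game: two traders split a fixed surplus from trade.
# Any full division of the surplus is Pareto-optimal; moving along that
# frontier is zero-sum, yet both parties still prefer trading to not.

SURPLUS = 10  # total gains from trade (hypothetical units)

def payoffs(share_a):
    """Payoffs (A, B) when A captures `share_a` of the surplus."""
    return share_a, SURPLUS - share_a

# Along the frontier: A's gain is exactly B's loss (zero-sum).
a1, b1 = payoffs(7)
a2, b2 = payoffs(3)
assert (a1 - a2) == -(b1 - b2)

# Versus not trading at all (payoffs 0, 0): both are strictly
# better off under any interior split (nonzero-sum).
for share in range(1, SURPLUS):
    a, b = payoffs(share)
    assert a > 0 and b > 0
```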
If we’re both working for the common good and don’t want to discover the same mines, then the rational strategy is not keeping secrets. It is exactly the opposite: coordinate with each other on what to work on.
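A minimal sketch of why, with made-up mine yields: two prospectors who agree to cover different mines do better in total than two who choose independently in secret and sometimes collide on the same mine.

```python
import itertools

# Hypothetical yields for two mines. Prospectors at the same mine
# split its yield; a mine no one works yields nothing.
YIELDS = {"A": 10, "B": 6}

def total_yield(choice_1, choice_2):
    """Combined payoff for the pair given their mine choices."""
    mines_worked = {choice_1, choice_2}
    return sum(YIELDS[m] for m in mines_worked)

# Under secrecy, each picks a mine independently; average the
# combined payoff over all possible choice pairs.
pairs = list(itertools.product(YIELDS, repeat=2))
secret_avg = sum(total_yield(c1, c2) for c1, c2 in pairs) / len(pairs)

# Under coordination, they simply agree to work different mines.
coordinated = total_yield("A", "B")

assert coordinated > secret_avg
```

Coordinating always captures both yields, while independent secret choices waste effort whenever both prospectors land on the same mine.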
If we’re both working for the common good, then it doesn’t matter whether the game is zero-sum or nonzero-sum in the first place. If everybody’s working for the common good, then there isn’t really much point to game theory at all.
The point is that nonzero-sum does not imply that the rational strategy is to cooperate all the time.
I know that nonzero-sum does not imply that the rational strategy is to cooperate all the time. I would agree with the OP if ey said that it’s sometimes rational to keep secrets. But eir actual recommendation was entirely unconditional. Incidentally, OP also seems to act against the spirit of eir own advice by giving this advice in the first place.
Sharing knowledge and building a model based on it is still going to be cognitively consolidative even if later counteracted. For example, in a jury system the jury only has access to evidence that has passed admissibility challenges. Telling jurors something and later telling them to disregard any conclusions or impressions based on it is rather laborious. The decision to disregard can be weaker than the reflex to integrate sensed data.
There might also be effects where a person assigned a mine known to be of lesser yield works it less enthusiastically. If you think what you are doing is the best method known to man, you might throw your whole mind behind it far more wholeheartedly, even if your assessment of the civilizational peak is in error and due to ignorance.
Then there is the case where a combination of separate individual efforts would be more efficient / smarter than banding together to be smart. Establishing a strategy that everyone thinks is sound and everyone can get on board with can yield a very small common denominator. Limiting the variety of psychological profiles working on a subfield can make it more internally coherent. You tell someone about berries and they try to “help” and end up poisoning themselves or the whole tribe. One way of ensuring that knowledge can’t be abused is to limit access to it. Berry-picking adaptations might make you a dangerous fisher, so it might make sense to keep the details of your methods within your own operations.