If you’re smart enough then you should reverse this advice into “There are hundred-dollar bills lying on the sidewalk”.
This is sort of true, but you need to factor in that you’re not competing against the average person, you’re competing against the smartest person who could pick up the bill and wasn’t too busy picking up other bills.
Hundred-dollar bills lying on the sidewalk are called “alpha”. Alpha is a tautologically self-keeping secret.
The notion of “alpha” seems to be relevant only to zero-sum games, where revealing a method is bad because you don’t want the other players to succeed. Personally, I am more interested in the sort of “bills” whose picking up is a win in altruistic (“utilitarian”) terms rather than selfish terms. (Of course, sometimes to score an altruistic win you need to outmaneuver someone opposed to it, whether out of foolishness or out of malice.)
I’m using “alpha” as the term is used in quantitative finance. This includes non-zero-sum games. For example, if you were the first person to diversify across continents then you could lower your own risk profile (and therefore increase risk-adjusted returns) without increasing risk for anyone else.
Distributing the value extracted from a gold mine is zero-sum. Prospecting for gold is non-zero sum.
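A minimal numerical sketch of the diversification point above. Everything here is hypothetical: two uncorrelated markets with identical, made-up return statistics, just to show that splitting capital lowers volatility (and so raises risk-adjusted return) without raising anyone else’s risk.

```python
# Hypothetical illustration (numbers are invented, not from the comment above):
# splitting capital across two uncorrelated markets with the same expected
# return keeps the mean return but shrinks volatility by ~1/sqrt(2),
# so the risk-adjusted return (mean / volatility) goes up.
import numpy as np

rng = np.random.default_rng(0)
n_samples = 100_000

returns_a = rng.normal(loc=0.06, scale=0.20, size=n_samples)  # market A
returns_b = rng.normal(loc=0.06, scale=0.20, size=n_samples)  # market B, uncorrelated with A

portfolios = {
    "concentrated": returns_a,                        # all capital in market A
    "diversified": 0.5 * returns_a + 0.5 * returns_b,  # split 50/50 across both
}

for name, r in portfolios.items():
    print(f"{name:>12}: mean={r.mean():.3f}  vol={r.std():.3f}  mean/vol={r.mean() / r.std():.2f}")
# The diversified portfolio keeps the same mean but has ~1/sqrt(2) of the
# volatility, so its mean/vol ratio is about 1.4x higher.
```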
But, if the game is not zero-sum, why is it a “self-keeping secret”? Why wouldn’t someone who finds it tell everyone? Why “you should never flaunt these discoveries in the first place”?
This is a good question and I had to think about it for six months before I could come up with a good answer.
Whether or not alpha is “non-zero sum” is a red herring. I was wrong concerning that point. There is a more important reason not to flaunt discoveries of alpha.
Alpha is a bet against your society’s beliefs. If you proselytize a powerful alpha source then other people will (in the short term) think you are nuts. This is no use to anyone.
If you want people to trust you in the long term then you can do one of two things:
1. Publicly bet reputation by flaunting your discoveries.
2. Privately bet capital.
If you are confident in your alpha source then you should bet capital regardless of whether you also bet reputation. If you are already betting capital then betting reputation is redundant. You gain no additional upside. But there is a long-term cost to your reputation if you turn out to be wrong. There is an unavoidable short-term cost to your reputation regardless of whether you turn out to be wrong.
Flaunting discoveries of alpha creates downside without upside.
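A toy expected-value sketch of the claim above. The payoff numbers are entirely hypothetical; the only point is that, once capital is already bet, flaunting adds costs but no extra upside under these assumptions.

```python
# Toy model with invented payoffs (not from the original argument).
# You always bet capital on your alpha source, which is right with probability p.
# Flaunting it publicly adds a short-term "they think you're nuts" cost, plus an
# extra long-term reputation hit if you turn out to be wrong; it adds no upside
# beyond what the capital bet already earns you when you are right.
def expected_long_term_trust(p, flaunt, trust_if_right=5.0,
                             short_term_cost=1.0, wrong_penalty=3.0):
    expected = p * trust_if_right            # trust earned by being visibly right with capital
    if flaunt:
        expected -= short_term_cost          # paid whether or not you turn out to be right
        expected -= (1 - p) * wrong_penalty  # extra hit only if you were wrong
    return expected

for p in (0.9, 0.6):
    quiet = expected_long_term_trust(p, flaunt=False)
    loud = expected_long_term_trust(p, flaunt=True)
    print(f"p={p}: capital only {quiet:.2f}  vs  capital + flaunting {loud:.2f}")
# Under these assumed payoffs, flaunting never raises expected long-term trust.
```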
I have three problems with this argument.
First, it’s not always possible to bet capital. For example, suppose you figured out quantum gravity. How would you bet capital on that?
Second, secrecy is costly and it’s not always worth it to pay the price. For example, it’s much easier to find collaborators if you go public with your idea instead of keeping it secret.
Third, sometimes there is no short-term cost to reputation. If your idea goes against established beliefs, but you have really good arguments for it, other people won’t necessarily think you’re nuts, or at least the people who think you’re nuts might be outweighed by the people who think you’re a genius.
In this case:
we generally prefer that other people make such discoveries rather than not make them; each discovery has positive externalities for everyone else
… but we want other people to not discover the SAME gold mines we do.
Everybody working the same goldmine is in a zero-sum game, but I still benefit from somebody else working some other gold mine.
In general, it’s often the case that a game is only zero-sum along some Pareto frontier—e.g. negotiation games—so we get some competitive behavior, but can still have nonzero-sum gains. People can fight in a zero-sum manner over how to divide the gains from trade, even though the gains themselves are nonzero-sum.
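A tiny numerical version of that last point, with a made-up surplus: dividing the gains from trade is zero-sum along the frontier, but every point on the frontier still beats not trading at all.

```python
# Invented gains-from-trade example (nothing here comes from the comment above).
SURPLUS = 10.0  # total gain created by the trade; no trade yields (0, 0)

def split(share_to_buyer):
    """One point on the Pareto frontier: the buyer gets this share of the surplus."""
    return share_to_buyer * SURPLUS, (1.0 - share_to_buyer) * SURPLUS

for share in (0.2, 0.5, 0.8):
    buyer, seller = split(share)
    # Moving between these points is zero-sum: the total is always SURPLUS.
    print(f"buyer {buyer:.1f}, seller {seller:.1f}, total {buyer + seller:.1f}")
# Any of these splits beats the no-trade outcome of (0.0, 0.0), so the fight
# over the division is zero-sum even though the gains themselves are not.
```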
If we’re both working for the common good and don’t want to discover the same mines, then the rational strategy is not keeping secrets. It is exactly the opposite: coordinate with each other on what to work on.
If we’re both working for the common good, then it doesn’t matter whether the game is zero-sum or nonzero-sum in the first place. If everybody’s working for the common good, then there isn’t really much point to game theory at all.
The point is that nonzero-sum does not imply that the rational strategy is to cooperate all the time.
I know that nonzero-sum does not imply that the rational strategy is to cooperate all the time. I would agree with the OP if ey said that it’s sometimes rational to keep secrets. But eir actual recommendation was entirely unconditional. Incidentally, OP also seems to act against the spirit of eir own advice by giving this advice in the first place.
Sharing knowledge, and building a model based on it, is still going to be cognitively consolidative even if it is later counteracted. For example, in a jury system the jury only has access to evidence that has passed admissibility challenges. Telling jurors something and later telling them to disregard any conclusions or impressions based on it is rather laborious; the decision to disregard can be weaker than the reflex to integrate sensed data.
There might also be an effect where a person assigned a mine known to be of lesser yield works on it less enthusiastically. If you think what you are doing is the best method known to man, you might throw your whole mind behind it far more wholeheartedly, even if your assessment of the civilizational peak is in error and due to ignorance.
Then there is the case where the combination of individual efforts would be more efficient, or smarter, than banding together to be smart. Establishing a strategy that everyone thinks is sound and everyone can get on board with can end up at a very small common denominator, and limiting the variety of psychological profiles working on a subfield can make it more internally coherent. You tell someone about berries and they try to “help” and end up poisoning themselves or the whole tribe; one way of ensuring that knowledge can’t be abused is to limit access to it. Berry-picking adaptations might make you a dangerous fisher, so it might make sense to keep the details of your methods within your own operations.
I think the idea is like:
1. Consider “metals”. The fact that there is an industry in which deposits are discovered and mines are created to extract these resources is not a secret.
2. But if you find a gold deposit, you should buy the land in order to create a mine there, or sell it at a higher price, rather than telling the current owner of the land.
3. People in businesses which use metals, as well as miners and shipping companies, may have something to gain from there being more mines. They may have advice on how to a) find deposits, or b) set up mines, or they may even c) tell you where some deposits are. (People who know their land has a deposit may advertise this fact, and seek a buyer that will pay the higher price in order to build a mine there.)
4. But while people who make and run mines might tell you how to do a) or b), they are unlikely to tell you c). (Unless there are a lot of such deposits, perhaps distant from their mines, or there are only a few mines and mine owners want to trade investments in one mine for investments in more mines (which don’t exist yet) in order to reduce their risk.)
So the article says:
1. There are hundred dollar bills lying on the street.
2. Pick them up.
3. ???
4. Don’t tell other people—see 2. (Unless you are provably rich from this method, and have moved on to serve the role of 3: being paid to help people figure out where to find, and how to pick up, the hundred-dollar bills lying on the street.)
But if you find a gold deposit, you should buy the land in order to create a mine there, or sell it at a higher price, rather than telling the current owner of the land.
Should you? That really depends on your goals and code of ethics.
“Should” in the sense that you can* “gain” something by doing so. (Perhaps the OP should have been clearer about this, or “should” shouldn’t have been used.)
*It was intended as a metaphor, which implicitly assumed that there are ‘dollar bills lying on the ground that you can pick up’.