Goodhart’s Law is a very nice corollary to the Snafu Principle: Communication is impossible in a hierarchy.
Temple Grandin has written about the importance of finding relevant, measurable standards—the example she gives is the number of cattle falling down on the way to slaughter. Not falling down means the genes, food, lighting, walking surface, etc., are all good enough.
Thing to check: Do measures used as targets for policy always become completely useless, or do they sometimes just become less and less useful without ever being totally useless? Does culture matter? I suspect that the amount of judgement which people are allowed to mix into the system varies a lot.
She seems to believe that thinking visually is more likely to produce such standards than thinking verbally, as most people do.
I’m not convinced of that—dog and cat show standards are an example of well-defined visual standards not producing reliably good results.
I have no idea if it’s even possible to have that good a standard for financial markets.
I’ve heard “you can’t manage what you can’t measure”, but I think “you can’t manage what you can’t perceive” is better. Is it possible to generalize the idea of the king traveling incognito to see how the kingdom is doing?
Once management recognizes that there is something to measure, I think they do an OK job measuring it—secret shoppers come to mind. But there's a subtler failure: when you take for granted that G = G*, you don't even think to verbalize your true values, and so can't measure them.
The secret shoppers are a variant of “the king going incognito”—but not as good in some ways because they may be tasked with evaluating according to a checklist, and thus could still be trapped by G vs G*.
I believe that the problem isn’t that true values aren’t verbalized, it’s that they can’t be fully verbalized. Language is too low-bandwidth to capture all the aspects of a situation.
The point of a king going incognito isn’t just to enforce existing, verbalized rules, it’s to see how things are in the kingdom. It’s a bit easier for a king than an AI because a king is more like a subject than an AI is like people.
"Does culture matter?"

Yes. Culture is, in part, a process of making people behave in a predictable fashion. If you make your subordinates as similar to you as possible, then there is a good chance that they will perceive G instead of G*. But you have to be committed to juggling G and G*, and accept the risk that someone actively pursuing G* may get ahead of you.
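The G vs G* trap being discussed can be sketched numerically. This is a toy simulation I'm adding for illustration (all the weights and names are my own assumptions, not anything from the thread): the true goal G is real quality, the proxy G* is a checklist score, and an agent who games the checklist looks better on the measurement while doing worse on the thing that matters.

```python
import random

random.seed(0)

# Hypothetical setup: G (true quality) depends on genuine effort plus
# noise; G* (the measured proxy) mostly rewards checklist-gaming and
# only weakly tracks real effort. Weights are illustrative assumptions.

def true_quality(effort_on_goal):
    # G: genuine effort, with a little observational noise
    return effort_on_goal + random.gauss(0, 0.1)

def proxy_score(effort_on_goal, effort_on_proxy):
    # G*: 20% real effort, 80% checklist-gaming
    return 0.2 * effort_on_goal + 0.8 * effort_on_proxy

honest = {"goal": 1.0, "proxy": 0.0}  # puts all effort into G
gamer = {"goal": 0.1, "proxy": 0.9}   # puts most effort into G*

for name, w in [("honest", honest), ("gamer", gamer)]:
    g = true_quality(w["goal"])
    g_star = proxy_score(w["goal"], w["proxy"])
    print(f"{name}: true G = {g:.2f}, measured G* = {g_star:.2f}")
```

The gamer's measured G* (0.74) beats the honest worker's (0.20) even though the honest worker is the one producing real quality—which is exactly the risk of someone "actively pursuing G*" getting ahead of you.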