Another famous example. At one time, somewhere in India there were a lot of cobras, which are dangerous. So the government (it happened to be the British Raj at the time) decided to offer a bounty for dead cobras. That worked for a while, until people figured out that they could breed cobras for the bounty. Then the government worked out what was going on, and cancelled the bounty. So then the cobra breeders released all their now-valueless cobras into the wild.
(According to Wikipedia this particular instance isn’t actually well documented, but a similar one involving rats in Hanoi is.)
This effect also exists in software development:
http://thedailywtf.com/articles/The-Defect-Black-Market
Famous Dilbert cartoon on this topic.
I see this failure in analysis all the time.
When people want to change the behavior of others, they find some policy or incentive that would encourage the change they desire, but never stop to ask how else people might react to that change in incentives.
Anyone ever come across any catchy name or formulation for this particular failure mode?
Perverse incentives.
Cobra effect (see the Wikipedia page I linked before). Law of unintended consequences. (Perhaps the former is a little too narrow and the latter a little too broad.)
Isn’t this an example of a reflection problem? We induce this change in a system, in this case an evaluation metric, and now we must predict not only the next iteration but the stable equilibria of this system.
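The point about predicting the equilibrium rather than the first iteration can be made concrete with a toy model of the cobra bounty (all numbers hypothetical, a minimal sketch, not a claim about the actual history): as long as breeding a cobra is cheaper than catching a wild one, the only stable response to the bounty is cobra farming, and the wild population never falls.

```python
# Toy model of the cobra bounty. All parameters are made up for illustration.
BOUNTY = 10.0      # reward per dead cobra
HUNT_COST = 8.0    # effort to catch a wild cobra
BREED_COST = 2.0   # cost to raise a cobra in captivity

def best_response(bounty, hunt_cost, breed_cost):
    """The action a profit-maximizer picks once the bounty is in place."""
    hunt_profit = bounty - hunt_cost
    breed_profit = bounty - breed_cost
    return "hunt" if hunt_profit >= breed_profit else "breed"

def simulate(periods, wild_population):
    """Track the wild population when responders play best responses."""
    bred = 0
    for _ in range(periods):
        if best_response(BOUNTY, HUNT_COST, BREED_COST) == "hunt":
            wild_population -= 1   # the bounty works as intended
        else:
            bred += 1              # the bounty gets farmed instead
    return wild_population, bred

wild, bred = simulate(periods=100, wild_population=500)
print(wild, bred)  # 500 100 — everyone breeds, the wild population is untouched
```

The designer who only checks the first iteration sees "a bounty pays people to kill cobras"; the equilibrium analysis has to ask which of the available responses the bounty actually makes most profitable.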
Goodhart’s Law
Oops, double post; V_V already said that.
I believe this is called a “red queen race”
This is not correct, at least in common usage.
A Red Queen’s Race is an evolutionary competition in which absolute position does not change. The classic example is the arms race between foxes and rabbits that results in both becoming faster in absolute terms, but the rate of predation stays fixed. (The origin is Lewis Carroll: “It takes all the running you can do, just to stay in the same place.”)
You mean relative, not absolute.
I’ve also seen a more general interpretation: the Red Queen situation is where staying still (doing nothing) makes you worse off as time passes; you need to run forward just to stay in the same place.
Yes, yes I did. Thanks for the correction.
I think this is analogous to what’s happening here—you create better incentives, they create better ways to get around those incentives, nothing changes. I didn’t know that this wasn’t the common usage, as I got it from this Overcoming Bias post:
http://www.overcomingbias.com/2014/06/bias-is-a-red-queen-game.html