You may be confusing the questions “starting from a blank slate, would you expect players to go here?” and “given that players are (somehow) already here, would they stay here?” Saying that something is a Nash equilibrium only implies the second thing.
You’d punish a player for setting their dial lower because you expect that this will actually make the temperature higher (on average, in the long run). And you expect that it will make the temperature higher because you expect everyone to punish them for it. This is self-referential, but it’s internally consistent. It’s probably not what a new player would come up with on their own if you suddenly dropped them into this game with no explanation, but if everyone already believes it then it’s true.
(If you can’t immediately see how it’s true given that belief, try imagining that you are the only human player in this game and the other 99 players are robots who are programmed to follow this strategy and cannot do otherwise. Then, what is your best strategy?)
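If you want to see that concretely, here is a minimal simulation sketch of the robot scenario. Every specific in it is an illustrative assumption rather than a canonical rule of the game: 100 players, temperature equal to the average of the dials, a per-round payoff of negative temperature, and robots that answer any dial below 99 with one round of everyone-at-100 punishment.

```python
# Minimal sketch of the dial game: 99 punisher robots plus one human.
# Assumed for illustration: temperature = average of all dials, each
# round's payoff = -temperature, and one round of punishment (robots
# jump to 100) whenever any dial was below 99 the previous round.

N_PLAYERS = 100
ROUNDS = 1000

def average_payoff(human_strategy):
    """Human's average per-round payoff against 99 punisher robots."""
    total = 0.0
    punishing = False  # did any dial drop below 99 last round?
    for _ in range(ROUNDS):
        robot_dial = 100 if punishing else 99
        human_dial = human_strategy(punishing)
        temperature = ((N_PLAYERS - 1) * robot_dial + human_dial) / N_PLAYERS
        total += -temperature  # lower temperature is better for everyone
        punishing = human_dial < 99  # robots never dip below 99 themselves
    return total / ROUNDS

conform = lambda punishing: 99  # go along with the equilibrium
defect = lambda punishing: 30   # set the dial where you actually want it

print("conform:", average_payoff(conform))  # -99.0
print("defect: ", average_payoff(defect))   # roughly -99.3, i.e. worse
```

Given the robots’ strategy, conforming wins: turning your own dial down only moves the average by 1/100 of the change, while the punishment it triggers pushes the other 99 dials up a full degree the next round, so deviation costs you more than it saves.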
You’re correct; I was confused in exactly that way.
Once that confusion was cleared up by the replies, I became confused as to why the hell (ha?) we were talking about this example at all. My current view is that it’s just a bad example and we should discuss a different one, since there have got to be better examples of counterintuitive bad outcomes from more reasonable-sounding punishment strategies.