But I only trust Alicorn and Eliezer because I’ve discussed morality with both of them in a situation where they had no incentive to lie; it was only in the very unusual conditions of Less Wrong that they could send such a signal believably. Religion is a much easier signal to send and receive without being a moral philosopher.
In games like this, my strategy (or at least my ideal strategy—I haven’t actually played enough games like this to do it) is to declare that I will keep my agreements in about 70% of the games I play, a statement I try to adhere to.
That gives the other players an incentive to co-operate with me, since in all likelihood I really am co-operating, and they will maximize their expected wins by always co-operating. On the other hand, it still gives me an extra advantage 30% of the time.
That’s the theory, at least. I’m not sure how well it works in practice, nor do I know how well it’ll work against somebody else pulling the same strategy on me.
Anyway, now that I’ve publicly declared my 70% strategy, I’d be interested in playing. (At least assuming that this will be play-by-e-mail with turns at least 24 hours long or equivalent, so that time zones won’t be an issue.) ETA: Oh, you specified 2-3 turns a week and I missed it. Yeah, that’s perfectly fine.
If I have a choice of cooperating with you, or another player that keeps eir agreements 80% of the time, guess who I am going to cooperate with.
That creates a bidding war with a hard limit at 100%, and personally I would not find a claim of 100% honesty in Diplomacy games credible. 70% seems doable; the closer the claimed honesty ratio is to 100% the less credible it is.
Let credibility be established by track records.
Of course, but then they are not credible in advance. Also, it would take many agreements to reliably tell the difference between 70% and 50%, let alone between 70% and 60%. If you play 50 games of Diplomacy—and by the luck of the draw, you might not be in a position to make agreements with someone every time; Turkey and England presumably don’t have a lot of contact—then 70% means 35 games in which no agreement is broken. With a binomial standard deviation of sqrt(50 × 0.7 × 0.3) ≈ 3.2 games, 25 kept agreements (a 50% rate) is only about three sigma away, and the separation shrinks quickly once you drop the games where no agreement was possible. And fifty games is a lot of Diplomacy.
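For what it’s worth, here’s a quick sketch of that arithmetic under a plain binomial model (scipy is assumed to be available for the exact tail probability):

```python
from math import sqrt
from scipy.stats import binom

n = 50                     # games in which an agreement could be tested
p_honest, p_null = 0.7, 0.5
sigma = sqrt(n * p_honest * (1 - p_honest))   # ~3.2 games
z = (n * p_honest - n * p_null) / sigma       # ~3.1 sigma
# Chance a true 50% cooperator keeps 35+ of 50 agreements anyway:
false_positive = binom.sf(34, n, p_null)      # ~0.003
print(f"sigma={sigma:.2f}, z={z:.2f}, false positive={false_positive:.4f}")
```

Halve the number of testable games and the separation drops to roughly two sigma, which is not much to hang a reputation on.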
You could pre-commit to rolling percentile dice.
Yes, but could you credibly do so? You can commit to anything you like; it’s the credibility, not the commitment, that is the problem. Credibility can only be established by long-run tests, and then you run into the problem of distinguishing 70% from 50%.
If someone just blindly asserted that they’d “try in general” to achieve some honesty percentage and didn’t have logs of past games and similar paraphernalia as evidence of honest calculation I would tend to dismiss it as “verbal rationality signaling” that wasn’t likely to be calibrated enough to matter. Real calibration takes paper and pencil.
However, something even more rigorous seems feasible using bit commitment and coin flipping protocols from cryptography. You could do the cryptographic protocol at the beginning of the game with the desired probability of being required to lie and then reveal your “lying commitment status” at the end of the game to demonstrate your meta-trustworthiness. The meta-trustworthiness parameter seems like the thing that Yvain was getting at with philosophically minded ethicists, because such people have demonstrably committed to meta-trustworthiness as a part of their real world identity—as a part of their character.
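A minimal sketch of that commit-reveal idea, assuming a plain hash-based bit commitment rather than a full coin-flipping protocol (all names here are invented for illustration):

```python
import hashlib
import secrets

def commit(will_lie: bool) -> tuple[str, bytes]:
    """Pick a lying/truth-telling status and commit to it.

    Returns a public digest to publish at game start and a private
    opening to reveal at game end.
    """
    nonce = secrets.token_bytes(16)  # random salt so the one bit can't be brute-forced
    opening = nonce + (b"LIE" if will_lie else b"TRUTH")
    return hashlib.sha256(opening).hexdigest(), opening

def verify(digest: str, opening: bytes) -> bool:
    """Anyone can check the revealed opening against the published digest."""
    return hashlib.sha256(opening).hexdigest() == digest

# At game start: draw lying status with the declared 30% probability, then commit.
will_lie = secrets.randbelow(10) < 3
digest, opening = commit(will_lie)
# ...publish `digest`; play the game; afterwards publish `opening`...
assert verify(digest, opening)
```

Note that a single commitment only proves what you had decided, not that the decision really came from a 30% draw; it is only over many games, as your revealed statuses average out to the declared rate, that the meta-trustworthiness claim gets tested.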
Assuming you have good character, the trick is that at the beginning of the game you’d really have pre-committed to either lying or truth-telling, and your commitment status becomes a hidden variable whose value people might reasonably guess from your behavior in the game. That adds another layer to the gameplay: you try to avoid doing anything that would reveal your commitment status one way or the other, because the ambiguity is finite and potentially valuable.
You could imagine a game where everyone at the beginning commits “pairwise privately” to varying levels of trustworthiness with everyone else, and then estimating the full trust network of probabilities and commitment states (and other players’ knowledge about the state of the network) becomes part of the task over the course of the game.
Actually, this version of the game sounds fun. If I had more free time I’d be tempted to get into an online game of Diplomacy as an experiment :-)
You could roll the die where no one else can see it, then take a picture with your cell phone. At the end, you can prove that you were doing what the die said.
Nothing stops you from rolling the die until you get the number you want, and then publishing only that result.
To be credible, you’d need an RNG that you can’t use without everyone else knowing that you did (even if they don’t learn the specific result), which usually means a third party.
Online D&D uses trusted die-roller websites that keep logs of all the rolls made under one’s name; you could have a variant where they just publish a hash of the results, rather than the results themselves.
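As a sketch of that variant, a roller could publish only a running digest during play and release the full log afterwards for audit (the class and its interface here are invented, not any real site’s API):

```python
import hashlib
import secrets

class HashedRoller:
    """Toy third-party die roller: rolls are logged privately and only a
    running digest is published; the log is released post-game for audit."""

    def __init__(self) -> None:
        self._log: list[bytes] = []
        self._digest = hashlib.sha256()

    def roll(self, player: str, sides: int = 100) -> tuple[int, str]:
        result = secrets.randbelow(sides) + 1
        entry = f"{player} rolled {result} on d{sides}".encode()
        self._log.append(entry)
        self._digest.update(entry)  # chain each roll into the public digest
        # `result` goes privately to `player`; only the digest is public
        return result, self._digest.hexdigest()

    def release_log(self) -> list[bytes]:
        """Post-game: anyone can replay the entries to recompute the digest."""
        return list(self._log)
```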
One thing about the 70% strategy is that you will be expected to defect in the top 30% of situations where you would gain the most from it, and to cooperate in the bottom 70% of situations in which you’re not passing on such a juicy defection opportunity anyway.
Not if I determine my loyalty by a secret die roll before the game.
I was going to determine all of my moves using the I Ching.
You could probably write out an equation that takes the distribution of decisions (expressed as payoffs for defection vs. cooperation) and computes the optimal defection probability (as a function of the expected value of defection, or perhaps the ratio of cooperation to defection payoffs) to use for this strategy.
Bongo’s comment rightly implies that the result will depend on whether other players observe the probability you will betray them as a function of payoff, or just the probability overall.
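Here’s a toy version of that calculation, making the distinction above explicit by assuming the other players observe only your overall defection rate; the payoff distribution, the value of an honored deal, and the linear trust falloff are all invented modeling choices:

```python
import numpy as np

rng = np.random.default_rng(0)
payoffs = rng.exponential(1.0, size=100_000)  # assumed distribution of defection payoffs
V_COOP = 1.0                                  # assumed value of an honored agreement

def expected_value(d: float) -> float:
    """Expected value per game if you defect on the top-d fraction of deals."""
    if d <= 0:
        per_deal = V_COOP
    else:
        threshold = np.quantile(payoffs, 1 - d)
        top_mean = payoffs[payoffs >= threshold].mean()
        per_deal = (1 - d) * V_COOP + d * top_mean
    trust = 1 - d  # assumed: deals offered to you shrink linearly with your rate
    return trust * per_deal

rates = np.linspace(0, 1, 101)
best = max(rates, key=expected_value)
print(f"optimal overall defection rate under these assumptions: {best:.2f}")
```

If opponents could instead see your betrayal probability conditional on the stakes, they would simply refuse the high-payoff deals, and the optimum would shift accordingly.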