That creates a bidding war with a hard limit at 100%, and personally I would not find a claim of 100% honesty in Diplomacy games credible. 70% seems doable; the closer the claimed honesty ratio is to 100%, the less credible it is.
Let credibility be established by track records.
Of course, but then they are not credible in advance. It would also take many agreements to reliably tell 70% from 50%, or even 70% from 30%. Suppose you play 50 games of Diplomacy (and by the luck of the draw you might not be in a position to make agreements with someone every time; Turkey and England presumably don't have a lot of contact). At 70% honesty that is 35 games in which no agreement is broken; at 50% it is 25, a gap of under three standard deviations. And fifty games is a lot of Diplomacy.
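A quick sanity check of that arithmetic, under the simplifying assumption that every one of the 50 games yields exactly one agreement (the names and numbers here are illustrative, not from the original):

```python
import math

def sigma_gap(n_agreements, p_honest, p_null):
    """How far apart, in standard deviations of the null rate, the expected
    kept-agreement counts at two honesty levels are for n observed agreements."""
    expected_gap = n_agreements * (p_honest - p_null)
    null_sd = math.sqrt(n_agreements * p_null * (1 - p_null))
    return expected_gap / null_sd

# 50 agreements, 70% vs 50% honesty: only about 2.8 sigma apart.
print(round(sigma_gap(50, 0.7, 0.5), 2))  # 2.83
# 70% vs 30% is much easier to distinguish.
print(round(sigma_gap(50, 0.7, 0.3), 2))  # 6.17
# With only 30 usable agreements the 70%-vs-50% gap shrinks further.
print(round(sigma_gap(30, 0.7, 0.5), 2))  # 2.19
```

Since agreements are only possible in some fraction of games, the effective sample is smaller than 50 and the distinction gets even murkier.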
You could pre-commit to rolling percentile dice.
Yes, but could you credibly do so? You can commit to anything you like; it’s the credibility, not the commitment, that is the problem. Credibility can only be established by long-run tests, and then you run into the problem of distinguishing 70% from 50%.
If someone just blindly asserted that they'd "try in general" to achieve some honesty percentage, and didn't have logs of past games and similar paraphernalia as evidence of honest calculation, I would tend to dismiss it as "verbal rationality signaling" that wasn't likely to be calibrated enough to matter. Real calibration takes paper and pencil.
However, something even more rigorous seems feasible using bit commitment and coin flipping protocols from cryptography. You could run the cryptographic protocol at the beginning of the game, parameterized by the desired probability of being required to lie, and then reveal your "lying commitment status" at the end of the game to demonstrate your meta-trustworthiness. The meta-trustworthiness parameter seems like the thing that Yvain was getting at with philosophically minded ethicists, because such people have demonstrably committed to meta-trustworthiness as a part of their real-world identity, as a part of their character.
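A minimal sketch of the commitment half of that idea, using a salted hash in place of a full cryptographic coin-flipping protocol (the weighted coin flip itself is still made privately here; the function names are my own):

```python
import hashlib
import secrets

def commit(decision: str):
    """Commit to a decision ('lie' or 'truth') without revealing it.
    Returns the public commitment and the secret nonce needed to open it."""
    nonce = secrets.token_bytes(16)  # random salt blinds the commitment against guessing
    digest = hashlib.sha256(nonce + decision.encode()).hexdigest()
    return digest, nonce

def verify(digest: str, nonce: bytes, decision: str) -> bool:
    """At game's end, anyone can check that the revealed decision
    matches the digest published at the start."""
    return hashlib.sha256(nonce + decision.encode()).hexdigest() == digest

# Start of game: flip your private weighted coin, publish only the digest.
digest, nonce = commit("truth")
# End of game: reveal (nonce, decision); others verify it matches.
print(verify(digest, nonce, "truth"))  # True
print(verify(digest, nonce, "lie"))    # False
```

The salt matters: with only two possible decisions, an unsalted hash could be inverted by brute force the moment it was published.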
Assuming you have good character, the trick is that at the beginning of the game you really would have pre-committed to either lying or truth-telling. Your commitment status then becomes a hidden variable whose value people might reasonably guess from your behavior in the game, so there would be another layer to the gameplay: you try to avoid doing anything that would reveal your commitment status one way or the other, because the ambiguity is finite and potentially valuable.
You could imagine a game where everyone at the beginning commits "pairwise privately" to varying levels of trustworthiness with everyone else. Estimating the full trust network of probabilities and commitment states (and the other players' estimates of that network) then becomes part of the task over the course of the game.
Actually, this version of the game sounds fun. If I had more free time I'd be tempted to get into an online game of Diplomacy as an experiment :-)
You could roll the dice so no one else can see them, then take a picture with your cell phone. At the end, you can prove that you were doing what the dice said.
Nothing stops you from rolling the dice until you get the number you want, and then publishing only that result.
To be credible, you'd need an RNG that you can't use without everyone else knowing that you did (even if they don't learn the specific result), which usually means a third party.
Online D&D uses trusted die-roller websites that keep logs of all the rolls made under one's name; you could have a variant where they publish only a hash of the results, rather than the results themselves.
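A sketch of how such a hashed-log variant might work, assuming a hypothetical third-party roller service (the class and method names are invented for illustration):

```python
import hashlib
import random
import secrets

class HashedRollLog:
    """Hypothetical third-party die roller: publishes a salted hash of each
    roll at roll time, and reveals the actual results only at game's end."""

    def __init__(self):
        self._log = []  # (salt, result) pairs, kept secret until reveal

    def roll_d100(self) -> str:
        result = random.randint(1, 100)
        salt = secrets.token_hex(8)
        self._log.append((salt, result))
        # The hash is published immediately, so you can't quietly re-roll:
        # every roll you make leaves a visible record.
        return hashlib.sha256(f"{salt}:{result}".encode()).hexdigest()

    def reveal(self):
        """End of game: disclose all (salt, result) pairs for verification."""
        return list(self._log)

    @staticmethod
    def check(published_hash: str, salt: str, result: int) -> bool:
        return hashlib.sha256(f"{salt}:{result}".encode()).hexdigest() == published_hash

roller = HashedRollLog()
h = roller.roll_d100()          # hash goes public now
salt, result = roller.reveal()[0]  # result goes public at game's end
print(HashedRollLog.check(h, salt, result))  # True
```

Because every roll emits a public hash the moment it happens, re-rolling until you like the outcome would be visible to everyone, which is exactly the property the third party provides.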