If someone just blindly asserted that they'd "try in general" to achieve some honesty percentage, and didn't have logs of past games and similar paraphernalia as evidence of honest calculation, I would tend to dismiss it as "verbal rationality signaling" that wasn't likely to be calibrated enough to matter. Real calibration takes paper and pencil.
However, something even more rigorous seems feasible using bit-commitment and coin-flipping protocols from cryptography. You could run the protocol at the beginning of the game, committing to lie with the desired probability, and then reveal your "lying commitment status" at the end of the game to demonstrate your meta-trustworthiness. The meta-trustworthiness parameter seems like the thing Yvain was getting at with philosophically minded ethicists, because such people have demonstrably committed to meta-trustworthiness as part of their real-world identity, as part of their character.
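For concreteness, here's a minimal commit-reveal sketch in Python. The names and the 30% figure are my own illustrations, not a standard protocol; a full version would use a joint coin-flipping protocol so the other players could check that the draw was actually made at the advertised probability, rather than trusting you to roll honestly in private.

```python
import hashlib
import random
import secrets

def commit(role: str) -> tuple[str, bytes]:
    """Return a (public commitment, secret nonce) pair binding you to `role`."""
    nonce = secrets.token_bytes(16)
    digest = hashlib.sha256(nonce + role.encode()).hexdigest()
    return digest, nonce

def verify(commitment: str, role: str, nonce: bytes) -> bool:
    """Check a revealed (role, nonce) pair against the earlier commitment."""
    return hashlib.sha256(nonce + role.encode()).hexdigest() == commitment

# Game start: privately draw your role (here, a 30% chance of being
# required to lie) and publish only the hash.
role = "liar" if random.random() < 0.30 else "honest"
commitment, nonce = commit(role)
print("My commitment for this game:", commitment)

# Game end: reveal (role, nonce); anyone can re-run verify() to confirm
# you were bound to that role from the first turn.
assert verify(commitment, role, nonce)
```

The hash hides your role during play but binds you to it, which is exactly the property the game needs: you can't retroactively claim to have been an "honest" player after a profitable betrayal.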
Assuming you have good character, the trick is that at the beginning of the game you'd have genuinely pre-committed to either lying or truth-telling, and your commitment status becomes a hidden variable whose value other players can only guess from your behavior in the game. That adds another layer to the gameplay: you try to avoid doing anything that would reveal your commitment status one way or the other, because the ambiguity is finite and potentially valuable.
You could imagine a game where everyone commits "pairwise privately" at the beginning to varying levels of trustworthiness with every other player, and then estimating the full trust network of probabilities and commitment states (and other players' knowledge about the state of the network) becomes part of the task over the course of the game.
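Building on the commit/verify helpers above, the pairwise setup might look like this (again just an illustrative sketch; the player names and probabilities are made up):

```python
import itertools

players = ["alice", "bob", "carol"]
public_commitments = {}  # (from, to) -> published hash
private_draws = {}       # (from, to) -> (role, nonce), kept secret by `from`

# Each ordered pair gets its own commitment, so your trustworthiness
# toward alice can differ from your trustworthiness toward bob.
for me, you in itertools.permutations(players, 2):
    role = "liar" if random.random() < 0.30 else "honest"
    digest, nonce = commit(role)
    public_commitments[(me, you)] = digest
    private_draws[(me, you)] = (role, nonce)

# During play, everyone tries to infer this hidden network from behavior;
# at game end each player reveals their own draws and everyone runs verify().
```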
Actually, this version of the game sounds fun. If I had more free time I'd be tempted to get into an online game of Diplomacy as an experiment :-)