unilaterally deciding to stop faking data… is nice, but isn’t actually going to help unless it is part of a broader, more concerted strategy.
I could imagine this being true in some sort of hyper-Malthusian setting where any deviation from the Nash equilibrium gets you immediately killed and replaced with an otherwise-identical agent who will play the Nash equilibrium.