If you apply the Solomonoff prior, the amounts of money offered grow far faster than their probabilities decrease, because there are small programs that compute gigantic numbers. So a stipulation that the probabilities must decrease faster than the payoffs grow faces that obstacle, and is wishful thinking anyway.
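A toy sketch of that divergence, under the simplifying (and not-from-the-text) assumption that an n-symbol description gets prior weight about 2⁻ⁿ: a few symbols suffice to name a tetration-sized payoff, so weight × payoff blows up.

```python
# Toy illustration: a description n symbols long gets prior weight
# roughly 2**-n, but n symbols can name numbers growing far faster
# than 2**n, so expected value (weight * payoff) diverges.

def tetration(base, height):
    """Iterated exponentiation, base ^^ height: a tower of `height` exponents."""
    result = 1
    for _ in range(height):
        result = base ** result
    return result

# A handful of symbols -- "tetration(2, 5)" -- names 2**65536,
# a 65537-bit number.
payoff = tetration(2, 5)

# Suppose the description costs about 20 bits of prior weight (2**-20).
# The payoff outruns that penalty by tens of thousands of bits, and a
# taller tower makes the gap as large as you like.
description_bits = 20
print(payoff.bit_length())                     # 65537
print(payoff.bit_length() - description_bits)  # 65517: roughly log2(weight * payoff)
```

The point of the sketch is only that the payoff's bit-length grows without bound while the description length (and hence the prior penalty) stays small.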
Is there a thing called “adversarial prior”?
Maybe there should be. I have an intuition that if the game theory is done right, the Solomonoff argument is neutralised. Who you will face in a game depends on your strategy for playing it. A mugger-payer will face false muggers. More generally, the world you encounter depends on your strategy for interacting with it. This is not just because your strategy determines what parts you look at, but also because the strategies of the agents you meet depend on yours, and yours on theirs, causally and acausally. The Solomonoff prior describes a pure observer who cannot act upon the world.
But this is just a vague gesture towards where a theory might be found.
There absolutely should be if there isn’t already. Would love to work with an actual mathematician on this....