That’s because the maximizer is now conditioning not on the probability that it is running on 1kg vs. 2kg hardware, but on the probability that you/Omega selected the 1kg or the 2kg machine to talk to, which intuitively sounds much closer to 50/50 based on your arguments.
But now suppose that I offered the same deal to the maximizer.
There was a post about this point recently somewhere around here, arguing that your solution to the Monty Hall problem should depend on what you know about the algorithm behind the moderator’s choice to open a door, and which one.
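To make that point concrete, here is a minimal simulation sketch (mine, not from that post) contrasting the two moderator algorithms usually discussed: one who knows where the car is and always opens a goat door, and one who opens a random unchosen door and merely happened to reveal a goat. The function and variable names are illustrative.

```python
import random

def trial(moderator_knows: bool) -> tuple[bool, bool]:
    """One Monty Hall round. Returns (switch_wins, valid).

    moderator_knows=True: the moderator deliberately opens a goat door.
    moderator_knows=False: the moderator opens a random unchosen door;
    rounds where the car is revealed are discarded (valid=False), since
    the puzzle as posed assumes a goat was shown.
    """
    car = random.randrange(3)
    pick = random.randrange(3)
    others = [d for d in range(3) if d != pick]
    if moderator_knows:
        opened = random.choice([d for d in others if d != car])
    else:
        opened = random.choice(others)
        if opened == car:
            return False, False  # car revealed; condition this round away
    # The one remaining door the contestant would switch to
    switch = next(d for d in range(3) if d not in (pick, opened))
    return switch == car, True

for knows in (True, False):
    results = [trial(knows) for _ in range(100_000)]
    wins = sum(w for w, ok in results if ok)
    valid = sum(ok for _, ok in results)
    print(f"moderator_knows={knows}: switching wins {wins / valid:.3f}")
```

Under the classic rules switching wins about 2/3 of the time, but with the ignorant moderator, conditioned on a goat having been shown, it wins only about 1/2, so the right answer really does depend on the moderator’s algorithm.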