Yup. In short, I think there is a very simple point at which a slowly dividing computer should start acting as two agents, and that is when, in the Sleeping Beauty problem, I make two bets with it rather than one.
Hmm. If we use that sort of functional definition, does that mean that offering the thicker computer more money changes the amount of experience? Nope, it still has the correct betting behavior if it uses a probability of 0.5.
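To spell out the arithmetic behind both of these claims, here is a minimal sketch. It assumes the standard Sleeping Beauty betting setup, which the exchange above doesn't fully specify: a fair coin, a ticket that pays the stake if the coin landed tails, a price equal to credence times stake, and two bets on the tails branch when the computer has divided; the function names are mine.

```python
# Minimal sketch under assumed conventions (not spelled out in the dialogue):
# a fair coin is flipped; on heads the agent settles one bet, on tails it
# settles `n_bets_if_tails` bets (one per agent it has divided into).
# Each bet: pay `credence * stake` for a ticket worth `stake` if tails.

def expected_profit(credence: float, stake: float, n_bets_if_tails: int) -> float:
    cost = credence * stake
    heads = 0.5 * (0.0 - cost)                      # one losing ticket
    tails = 0.5 * n_bets_if_tails * (stake - cost)  # n winning tickets
    return heads + tails

def breakeven_credence(n_bets_if_tails: int) -> float:
    # Solve 0.5 * (-c*s) + 0.5 * n * (s - c*s) = 0 for c, giving c = n / (n + 1).
    n = n_bets_if_tails
    return n / (n + 1)

# One bet either way: the break-even credence is 0.5, and raising the stake
# (offering the thicker computer more money) leaves it at 0.5.
print(breakeven_credence(1))           # 0.5
print(expected_profit(0.5, 1.0, 1))    # 0.0
print(expected_profit(0.5, 100.0, 1))  # 0.0

# Two bets on the tails branch: the break-even credence shifts to 2/3, the
# point at which it pays to reason as two agents rather than one.
print(breakeven_credence(2))           # ~0.667
```

The stake enters only as an overall scale factor on the expected profit, so it never moves the break-even credence; only the number of bets being settled does.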