Re: Does that mean that I should mechanically overwrite my beliefs about the chance of a lottery ticket winning, in order to maximize my expectation of the payout?
No, it doesn’t. It means that the process going on in intelligent agents’ heads can be accurately modelled as calculating expected utilities, and then selecting the action that corresponds to the largest of these.
Agents are better modelled as Expected Utility Maximisers than as Utility Maximisers. Whether an Expected Utility Maximiser actually maximises utility depends on whether it is in an environment where its expectations pan out.
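A minimal sketch of the distinction (not from the original comment; the lottery probabilities and payoffs below are made-up illustrative numbers): an Expected Utility Maximiser chooses actions according to its own beliefs, so overwriting those beliefs changes which action it picks without changing the actual payoffs.

```python
def expected_utility(action, beliefs):
    """Sum of utilities weighted by the agent's believed probabilities."""
    return sum(p * u for p, u in beliefs[action])

def choose(actions, beliefs):
    """Pick the action with the highest *expected* utility under the agent's beliefs."""
    return max(actions, key=lambda a: expected_utility(a, beliefs))

# Hypothetical lottery: a ticket costs 1 and pays 1000 on a win.
actions = ["buy_ticket", "keep_money"]

# Accurate beliefs: the win probability is tiny, so buying has negative expectation.
accurate = {
    "buy_ticket": [(0.0001, 999), (0.9999, -1)],
    "keep_money": [(1.0, 0)],
}

# Overwritten beliefs: the agent has talked itself into a 90% win chance.
deluded = {
    "buy_ticket": [(0.9, 999), (0.1, -1)],
    "keep_money": [(1.0, 0)],
}

print(choose(actions, accurate))  # keep_money
print(choose(actions, deluded))   # buy_ticket: higher expected utility, worse actual utility
```

The second agent still maximises its expectation, but because its expectations don't pan out, it fails to maximise utility.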