I think Nick Tarleton refuted this in the other subthread—a lottery here means a lottery over states of the world, and those states include your knowledge state, so a lottery whose outcome you only learn later isn't really the same lottery.
It’s still true that this is a reason to disprefer realistic lotteries where you learn the outcome later, but maybe this is better termed “unpredictability aversion” than “risk aversion”? After all, it can happen even when all lottery outcomes are equally desirable. (Example: you like soup and potatoes equally, but prefer either to a lottery over them because you want to know whether to get a spoon or a fork.)
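To make the soup/potatoes case concrete, here's a minimal sketch with made-up numbers, where c is a hypothetical penalty for having grabbed the wrong utensil:

$$u(\text{soup},\ \text{knew beforehand}) = u(\text{potatoes},\ \text{knew beforehand}) = 1$$
$$u(\text{soup},\ \text{surprised}) = u(\text{potatoes},\ \text{surprised}) = 1 - c, \qquad c > 0$$
$$\mathbb{E}\big[u(\text{lottery revealed late})\big] = \tfrac12(1-c) + \tfrac12(1-c) = 1 - c \;<\; 1 = u(\text{either sure outcome})$$

The two meals are valued identically, yet either sure outcome beats the late-revealing lottery—no diminishing marginal utility over outcomes is involved, which is why "unpredictability aversion" seems like the better label.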
(In that link, I’m actually just restating Thom Blake’s argument.)
Thanks for the link!