“That implies that one does not calculate expected utility.”
My impression has been that Eliezer means X and writes “Y”, where “Y” could be read as meaning either X or Z; you say “Eliezer means Z, which implies this other obviously wrong thing”; then Eliezer becomes upset because you have misinterpreted him, and you become upset because he ignores your pointing out the ambiguity of “Y”. Then hilarity is spawned.
The comment that started this now-tedious thread said:
When you said that, it seemed to me that you were saying that you shouldn’t play the lottery even if the expected payoff—or even the expected utility—were positive, because the payoff would happen so rarely.
Does that mean you have a formulation for rational behavior that maximizes something other than expected utility? Some nonlinear way of summing the utility from all possible worlds?
Sounds like asking to me. I clearly was not claiming to know what you were thinking.
A data point for ya.
Ambiguities can simply be asked. I might or might not answer depending on whether I had time. Speaking for a person is a different matter.
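For concreteness, here is a minimal sketch of the distinction the quoted question is pointing at: the standard rule of maximizing expected utility versus some alternative rule that discounts payoffs that would happen very rarely. The lottery numbers and the rarity-threshold rule below are hypothetical illustrations, not anything anyone in the thread proposed.

```python
def expected_utility(outcomes):
    """Standard rule: sum probability * utility over all possible worlds."""
    return sum(p * u for p, u in outcomes)

def rarity_discounted_utility(outcomes, threshold=1e-6):
    """One possible nonlinear rule: drop outcomes rarer than a threshold.

    Only an illustration of what "maximizing something other than expected
    utility" could look like; not a rule endorsed in the thread.
    """
    return sum(p * u for p, u in outcomes if p >= threshold)

# A hypothetical ticket: pay 1 util for a 1-in-10^8 chance of 2 * 10^8 utils.
ticket = [(1e-8, 2e8 - 1), (1 - 1e-8, -1)]

print(expected_utility(ticket))           # 1.0: positive, so "buy" under the standard rule
print(rarity_discounted_utility(ticket))  # ~-1.0: the rare payoff is ignored, so "don't buy"
```

Under the standard rule, a positive-expected-utility ticket is worth buying no matter how rare the payoff; a rule like the second one refuses it precisely because the payoff would happen so rarely, which is the kind of "nonlinear way of summing the utility from all possible worlds" the quoted comment was asking about.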