A rational person, by definition, maximizes expected utility. You’re fighting a definition.
Be careful about arguing by definition.
Expected utility isn’t a real thing; it’s an artificial construct defined as that which a rational person maximizes. What other definition of expected utility could you provide such that a rational person might not maximize expected utility? If we define X to be equal to 3Y, then someone claiming to have found an example in which X does not equal 3Y has to be wrong, because they are “fighting a definition.”
I believe the author of the top post made a mistake because he didn’t realize that it’s tautologically impossible for a rational person not to maximize expected utility.
In the post you cite, Eliezer wrote: “But eyeballing suggests that using the phrase by definition, anywhere outside of math, is among the most alarming signals of flawed argument I’ve ever found.” Expected utility is math.
If you look up the Wikipedia entry on expected utility, you find: “There are four axioms of the expected utility theory that define a rational decision maker.”
If you look up the von Neumann–Morgenstern utility theorem on Wikipedia, you find: “In 1944, John von Neumann and Oskar Morgenstern exhibited four relatively modest axioms of ‘rationality’ such that any agent satisfying the axioms has a utility function. That is, they proved that an agent is (VNM-)rational if and only if there exists a real-valued function u defined on possible outcomes such that every preference of the agent is characterized by maximizing the expected value of u, which can then be defined as the agent’s VNM-utility.”
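For concreteness, here is that statement in symbols. This is a minimal sketch paraphrasing the standard formulation quoted above, not the Wikipedia text itself; the notation (the preference relation, and lotteries L, M, N) is my own choice, not from the quote.

```latex
% Setup: \preceq is the agent's preference relation over lotteries,
% where a lottery L assigns probability p_i to outcome x_i.
%
% The four axioms, informally:
%   Completeness:  L \preceq M or M \preceq L
%   Transitivity:  L \preceq M and M \preceq N together imply L \preceq N
%   Continuity:    L \preceq M \preceq N implies there is p \in [0,1]
%                  with pL + (1-p)N \sim M
%   Independence:  L \preceq M iff pL + (1-p)N \preceq pM + (1-p)N,
%                  for every N and every p \in (0,1]
%
% Theorem (von Neumann-Morgenstern): the axioms hold if and only if
% there exists a real-valued u on outcomes such that
\[
  L \preceq M
  \quad\Longleftrightarrow\quad
  \mathbb{E}[u(L)] = \sum_{i} p_i\, u(x_i) \;\le\; \mathbb{E}[u(M)].
\]
% On this reading, "maximizing expected utility" just restates what
% preference means under u, which is the tautology claimed above.
```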
I arguably try to think rationally (posting on LessWrong, my thinking feels clearer [1], and it helped remind me to respond rather than react in one particularly trying recent situation), but this is why definitions, and wearing them as labels, may be best avoided. I don’t wear the label “rationalist”, but I try to use the techniques found here to think better. This is not quite the same thing.
[1] Which, to use fictitious examples, reminds me of all the stories where the clearly batshit-insane protagonist speaks of how clear their thoughts feel now. That is, their internal editor is on the blink. I hope mine isn’t, but only results will tell me that.
n.b.: the parent comment may be wrong, but it’s applying thinking in a manner I thought was worth encouraging. Hence, an upvote.