For example, suppose that E and F are both the event “humanity survives for millions of years” and you have the opportunity to push a button that will guarantee this with probability p and otherwise guarantee that it does not happen. If you’re willing to push it when p = 99.999%, that means you assign a probability less than 99.999% to humanity surviving for millions of years. If you’re not willing to push it when p = 0.001%, that means you assign a probability greater than 0.001% to humanity surviving for millions of years.
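A minimal sketch of how this elicitation could work, assuming an idealized agent whose push/no-push choices are consistent (the `would_push` function is a hypothetical stand-in for the agent’s observed decisions):

```python
# Sketch of the button elicitation, assuming an idealized agent whose
# push/no-push answers are consistent. `would_push` is a hypothetical
# stand-in for the agent's observed decisions: a coherent agent pushes
# exactly when the button's p exceeds their own probability for E.

def elicit_probability(would_push, tol=1e-6):
    """Bisect on p until push/no-push answers bracket the agent's P(E)."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        p = (lo + hi) / 2
        if would_push(p):
            hi = p  # willing to push at p, so P(E) < p
        else:
            lo = p  # unwilling to push at p, so P(E) > p
    return lo, hi

# Example: an agent who (implicitly) assigns P(E) = 0.42.
lo, hi = elicit_probability(lambda p: p > 0.42)
print(f"revealed P(E) lies in [{lo:.6f}, {hi:.6f}]")
```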
I think this type of definition is intrinsically circular: you define probability in terms of rational decisions in uncertain environments, but in order to define what it means to decide rationally in an uncertain environment, you need probability. Hence the circularity.
What do you mean by this?
Essentially all forms of decision theory or game theory are based on expected utility maximization (up to some details).
In order to define expected utility maximization, you need a concept of expectation, which means that you need probability theory.
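Concretely, the quantity being maximized already takes a distribution over states as an input; here is a toy illustration (states, actions, and numbers all invented):

```python
# Expected utility is a probability-weighted average over states, so the
# maximand cannot even be written down without a distribution as input.
# States, actions, and numbers below are invented for illustration.

utility = {  # u(action, state)
    ("carry umbrella", "rain"): 5, ("carry umbrella", "sun"): 2,
    ("go without",     "rain"): 0, ("go without",     "sun"): 6,
}

def expected_utility(action, prob):
    """E[u | action] = sum over states of P(state) * u(action, state)."""
    return sum(p * utility[(action, state)] for state, p in prob.items())

prob = {"rain": 0.3, "sun": 0.7}  # the probabilistic input the definition needs
best = max(("carry umbrella", "go without"),
           key=lambda a: expected_utility(a, prob))
print(best)  # -> "go without" under these particular numbers
```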
Savage’s theorem shows that, for any agent whose preferences satisfy his axioms, there exist a utility function and a subjective probability distribution such that the agent behaves as if it were maximizing expected utility. It doesn’t disentangle the utility function from the subjective probability distribution. So what you say is true in some sense, but the agent’s behavior still places constraints on the two things.
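For reference, a finite-state sketch of the shape of that representation (Savage’s own setup uses an infinite state space and a finitely additive measure; the finite sum here is only for readability): preferences $\succeq$ over acts satisfy the axioms iff there exist $P$ and $u$ with

$$f \succeq g \iff \sum_{s \in S} P(s)\,u(f(s)) \ge \sum_{s \in S} P(s)\,u(g(s)),$$

where $u$ is determined only up to positive affine transformation.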
A rational agent maximizes expected utility by definition.
Ok. So if you observe the behavior of an agent, and assume it performs expected utility maximization, you can determine some constraints on its utility function and subjective probability distribution. Fair enough.
Still, this doesn’t allow us to tell whether the revealed subjective probability distribution is accurate in any reasonable sense: a person who prefers life over death and nevertheless starves himself to death due to the belief that people are trying to poison him may be perfectly rational for some choice of subjective probability distribution. We tend to call these kinds of probability distributions “psychotic disorders”, but there is nothing in the theory of subjective probability that allows us to rule them out as wrong.
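A toy version of the poisoning example (all utilities and beliefs invented) makes the point: once the subjective probability of poisoning is extreme enough, refusing food genuinely maximizes expected utility, and nothing inside the formalism flags the belief as defective.

```python
# Toy version of the poisoning example; all utilities and beliefs invented.
# The agent prefers life (u = 10) over death (u = -50), with a poisoned
# death (u = -100) worst of all. For an extreme enough belief
# q = P(poisoned), refusing food maximizes expected utility, i.e. the
# agent counts as "rational".

u = {("eat",    "poisoned"): -100, ("eat",    "safe"):  10,
     ("refuse", "poisoned"):  -50, ("refuse", "safe"): -50}

def eu(action, q):
    """Expected utility of `action` given subjective P(poisoned) = q."""
    return q * u[(action, "poisoned")] + (1 - q) * u[(action, "safe")]

for q in (0.1, 0.5, 0.99):
    choice = max(("eat", "refuse"), key=lambda a: eu(a, q))
    print(f"P(poisoned) = {q}: choose {choice}")
# -> eat, eat, refuse: the same utility function licenses starvation
#    once the subjective probability becomes extreme enough.
```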