Doesn’t seem workable to me: being “completely ignorant” suggests an improper prior. An agent with a proper prior over its utility function can integrate over it and maximize expected utility, and which action maximizes expected utility will depend on that prior.
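A minimal numerical sketch of that last point (the candidate utility functions, actions, and numbers here are my own illustration, not anything from the original): with a proper prior over candidate utility functions, the expected-utility-maximizing action is just the argmax of the prior-weighted mixture, so changing the prior can change which action wins.

```python
import numpy as np

# Illustrative example: two candidate utility functions over three actions,
# and a prior over which one is the agent's "true" utility function.
U = np.array([
    [1.0, 0.0, 0.5],   # U1(a) for actions a0, a1, a2
    [0.0, 1.0, 0.5],   # U2(a) for actions a0, a1, a2
])

def best_action(prior):
    """Action maximizing expected utility under the given prior over U1, U2."""
    expected_utility = prior @ U   # E[U(a)] = sum_i p_i * U_i(a)
    return int(np.argmax(expected_utility))

print(best_action(np.array([0.9, 0.1])))  # -> 0 (prior favors U1)
print(best_action(np.array([0.1, 0.9])))  # -> 1 (prior favors U2)
```

Different priors pick different actions, so the agent's behavior is not prior-independent; “completely ignorant” would have to mean some particular proper prior (e.g. uniform over the candidates), at which point it is no longer ignorance in any strong sense.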