Nah, you can’t choose to un-donate, whereas you can always make up for lost time. So giving is a case where some mild indecision may be worthwhile.
Obviously the current expected value of your action should equal the expectation of what you’ll think it is in the future. But getting more info can still pay off; the expected value of information works like this:
Let’s say you have a charity budget, and two charities, A and B. Since your budget is a small fraction of the budget of each charity, assume your utility is linear in this decision, so you’ll give all your money to the better charity. You think there’s a 60% chance that charity A produces 1000 utilons from your donation and charity B produces 100, and a 40% chance that A only produces 10 and B still produces 100. The expected utility of giving to A is 60% × 1000 + 40% × 10 = 604. The expected utility of giving to B is 60% × 100 + 40% × 100 = 100, so you are planning to give to A.
But let’s say that by doing some small amount of research (assumed costless for simplicity), you can expect to become, correctly, nearly certain which situation obtains (one gets probability ~1, the other ~0). Now if you become certain that A produces 1000 utilons (which you expect to happen 60% of the time), your choice is the same. But if you become certain that A produces only 10 utilons, you give to B instead. So your expected utility is now 60% × 1000 + 40% × 100 = 640, a net gain of 36 expected utilons.
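For concreteness, here’s a minimal Python sketch of that calculation, just plugging in the numbers above (the variable names are only for illustration). The whole value-of-information move is taking the max inside the expectation instead of outside:

```python
# Numbers straight from the example above; names are just for illustration.
p = 0.6                            # P(the "good" world where A yields 1000)
utility = {                        # utilons from your donation in each world
    "A": {"good": 1000, "bad": 10},
    "B": {"good": 100,  "bad": 100},
}

def eu(charity):
    """Expected utility of giving to `charity` under current beliefs."""
    return p * utility[charity]["good"] + (1 - p) * utility[charity]["bad"]

eu_before = max(eu("A"), eu("B"))  # decide now: give to A, ~604 utilons

# With costless, perfectly informative research you first learn which world
# you're in, then give to whichever charity is better in that world.
eu_after = (p * max(utility["A"]["good"], utility["B"]["good"])
            + (1 - p) * max(utility["A"]["bad"], utility["B"]["bad"]))  # ~640

print(eu("A"), eu("B"))            # ~604 and 100
print(eu_after - eu_before)        # value of information: ~36 utilons
```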
You seem to have missed the point.
The point being what?
All information gained after making a decision is irrelevant for the purpose of making said decision. See also: The Bottom Line.
Not if you can remake the decision. I read “I intend [...]” to mean “I expect to make this decision, based on the evidence available now, but will gather more evidence first, which may change my mind.”
But people don’t think very well when there’s already an expected outcome, so peter_hurford should either give up or work on becoming more curious.
Your rebuttal is irrelevant to my point. Did you mean to reply to something else?