That reminds me of this delightful and hilarious (edit: and true!) thing Eliezer said once:
Let me try to clear up the notion that economically rational agents must be cold, heartless creatures who put a money price on everything.
There doesn’t have to be a financial price you’d accept to kill every sentient being on Earth except you. There doesn’t even have to be a price you’d accept to kill your spouse. It’s allowed to be the case that there are limits to the total utility you know how to generate by spending currency, and for anything more valuable to you than that, you won’t exchange it for a trillion dollars.
Now, it *does* have to be the case for a von Neumann-Morgenstern rational agent that if a sum of money has any value to you at all, you will exchange anything else you have—or any possible event you can bring about—*at some probability* for that sum of money. So it *is* true that as a rational agent, there is some *probability* of killing your spouse, yourself, or the entire human species that you will cheerfully exchange for $50.
I hope that clears up exactly what sort of heartless creatures economically rational agents are.
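The claim in that second paragraph is just the vNM continuity axiom made concrete. A quick sketch of the arithmetic (the symbols here are mine, not Eliezer's): write $u$ for the agent's utility function, $\mathrm{SQ}$ for the status quo, and $D$ for the disaster. The agent prefers the gamble "probability $p$ of $D$, otherwise $\mathrm{SQ}$ plus \$50" whenever

```latex
(1-p)\,u(\mathrm{SQ} + \$50) + p\,u(D) > u(\mathrm{SQ}),
```

which rearranges to

```latex
p < \frac{u(\mathrm{SQ} + \$50) - u(\mathrm{SQ})}{u(\mathrm{SQ} + \$50) - u(D)}.
```

So long as $u(D)$ is finite and the \$50 has strictly positive utility, the right-hand side is strictly positive, and some $p > 0$ satisfies the inequality. Note this is consistent with the first paragraph: no *certain* trade is required, only a trade at *some* probability, which can be astronomically small.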
Yeah, I mean, it’s pretty clear to me when I’m talking about things that make me “cheerful” that my feelings are fairly scope-insensitive.