Iteration isn’t the only thing you’re changing—you’re also increasing the top payout and smoothing the results to reduce the probability of getting nothing. This is going to trigger your biases differently.
As you note, money does not actually translate to utility very well. For a whole lot of people, the idea of $240 feels almost as good as the idea of $1000.
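To make that concrete, here's a minimal sketch in Python (assuming a hypothetical square-root utility curve as a stand-in for diminishing marginal utility) comparing the sure $240 against a 25% chance of $1000:

```python
import math

def utility(dollars):
    # Hypothetical concave utility; sqrt is just a stand-in for the
    # way additional dollars feel less and less valuable.
    return math.sqrt(dollars)

sure_thing = utility(240)                          # utility of a certain $240
gamble = 0.25 * utility(1000) + 0.75 * utility(0)  # expected utility of the gamble

print(f"U($240 for sure):        {sure_thing:.2f}")
print(f"EU(25% chance of $1000): {gamble:.2f}")
# Under this curve the sure $240 wins (~15.5 vs ~7.9), even though the
# gamble has the higher expected dollar value ($250 vs $240).
```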
It would help to play with the problem and translate it into something more linear in utility. But such things are really hard to find; almost everything gets weighted weirdly by brains. In fact, if you phrase it as “certain death for 250 people vs a 25% chance of death for 1000,” you often get different answers than with “certain saving of 240 people vs a 25% chance of saving 1000.”
IMO, part of instrumental rationality is recognizing that your intuitive mapping from outcomes to utility is wrong in a lot of cases. Money really is close to linear in value for small amounts, and you are living a suboptimal life if you don’t override your instincts.
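As a rough illustration of that last claim, here's a sketch (assuming 100,000 repetitions, a number picked purely for illustration) that simulates always taking the certain $240 versus always taking the 25% shot at $1000; with money treated linearly, the gamble's higher expected value wins once the bet is repeated:

```python
import random

random.seed(0)
N = 100_000  # assumed number of repetitions, chosen only for illustration

always_sure = 240 * N
always_gamble = sum(1000 if random.random() < 0.25 else 0 for _ in range(N))

print(f"Always take the certain $240: ${always_sure:,}")
print(f"Always take the 25% gamble:   ${always_gamble:,}")
# Over many plays the gamble tracks its $250-per-play expected value,
# so it finishes ahead of the $240-per-play sure thing, even though any
# single play has a 75% chance of paying nothing.
```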